Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz
2016-03-01
Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, and despite users getting actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, there are individual factors, such as age, video game knowledge, and the predisposition to immersion, that may be associated with the quality of the virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the user best suited to a VRE delivered through a head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with a cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is associated with higher levels of virtual reality effectiveness, that is, a higher predisposition to immersion and fewer cybersickness symptoms in the VRE, than the console gamer and nongamer profiles. These findings can be a useful orientation in clinical practice and future research, as they help identify which users are more predisposed to benefit from immersive VREs.
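As a rough illustration of the analysis pipeline described in the abstract above, the following Python sketch approximates multiple correspondence analysis by one-hot encoding categorical user attributes and reducing the resulting indicator matrix with truncated SVD, then clustering the low-dimensional coordinates. The column names, example values, and the choice of three clusters are illustrative assumptions, not data or settings from the study.

    import pandas as pd
    from sklearn.decomposition import TruncatedSVD
    from sklearn.cluster import KMeans

    # Hypothetical categorical user attributes; not the study's data.
    users = pd.DataFrame({
        "age_band": ["18-25", "26-35", "18-25", "36-45", "26-35", "18-25"],
        "gaming": ["pc", "console", "pc", "none", "console", "none"],
        "immersion": ["high", "medium", "high", "low", "medium", "low"],
        "cybersickness": ["low", "medium", "low", "high", "medium", "high"],
    })

    # One-hot indicator matrix; truncated SVD of this matrix is a simple
    # stand-in for multiple correspondence analysis.
    indicator = pd.get_dummies(users).astype(float)
    coords = TruncatedSVD(n_components=2, random_state=0).fit_transform(indicator)

    # Cluster the low-dimensional coordinates to recover user profiles.
    profiles = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coords)
    print(profiles)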
2017-08-01
ARL-TN-0839 ● AUG 2017 ● US Army Research Laboratory. User Guide: How to Use and Operate Virtual Reality Equipment in the Systems Assessment and Usability Laboratory
Virtual reality for intelligent and interactive operating, training, and visualization systems
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Schluse, Michael
2000-10-01
Virtual Reality methods allow a new and intuitive way of communication between man and machine. The basic idea of Virtual Reality (VR) is the generation of artificial, computer-simulated worlds that the user can not only look at but also actively interact with, using a data glove and a data helmet. The main emphasis for the use of such techniques at the IRF is the development of a new generation of operator interfaces for the control of robots and other automation components, and of intelligent training systems for complex tasks. The basic idea of the methods developed at the IRF for the realization of Projective Virtual Reality is to let the user work in the virtual world as he would act in reality. The user's actions are recognized by the virtual reality system and, by means of new and intelligent control software, projected onto the automation components, such as robots, which then perform the actions necessary to execute the user's task in reality. In this operation mode the user no longer has to be a robot expert to define tasks for robots or to program them, because the intelligent control software recognizes the user's intention and automatically generates the commands for nearly every automation component. Virtual Reality methods are thus ideally suited as universal man-machine interfaces for the control and supervision of a broad class of automation components, and for interactive training and visualization systems. The Virtual Reality system of the IRF, COSIMIR/VR, forms the basis for different projects, starting with the control of space automation systems in the projects CIROS, VITAL and GETEX, the realization of a comprehensive development tool for the International Space Station and, last but not least, the realistic simulation of fire extinguishing, forest machines and excavators, which will be presented in the final paper in addition to the key ideas of this Virtual Reality system.
Jones, Jake S.
1999-01-01
An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch.
Jones, J.S.
1999-01-12
An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment are disclosed. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch. 4 figs.
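As a minimal sketch of the gaze-directed command mechanism described in the two patent records above, the following Python fragment implements dwell-based selection: while the user's gaze ray rests on a virtual button, the button is highlighted (the "perceptible change"), and once a dwell threshold is reached, and an optional confirmation switch is pressed, the button's command is issued. All names, thresholds, and the frame loop are illustrative assumptions rather than details from the patent.

    import time

    DWELL_SECONDS = 0.8  # gaze must rest on the button this long before it fires

    class VirtualButton:
        def __init__(self, name, command):
            self.name = name
            self.command = command      # callable that issues the command to the computer
            self.highlighted = False    # the "perceptible change" shown to the user
            self._gaze_start = None

        def update(self, gazed_at, confirm_pressed=True):
            """Call once per rendered frame with the result of the gaze hit test."""
            if not gazed_at:
                self.highlighted = False
                self._gaze_start = None
                return
            if self._gaze_start is None:
                self._gaze_start = time.monotonic()
            self.highlighted = True
            if time.monotonic() - self._gaze_start >= DWELL_SECONDS and confirm_pressed:
                self.command()           # send the command associated with this button
                self._gaze_start = None  # require a fresh dwell before the next press

    # Example: a button that fires once the user has gazed at it for the dwell time.
    save_button = VirtualButton("save", lambda: print("save command sent"))
    for _ in range(30):                  # simulate roughly one second of frames at 30 Hz
        save_button.update(gazed_at=True, confirm_pressed=True)
        time.sleep(1 / 30)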
Virtual reality training improves balance function.
Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng
2014-09-01
Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training is widely used in rehabilitation therapy for balance dysfunction. This paper summarizes the literature suggesting that virtual reality training can improve balance dysfunction in patients following neurological disease. When patients perform virtual reality training, the prefrontal and parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus helping the cortex to control balance and improving motor function.
Virtual reality training improves balance function
Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng
2014-01-01
Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training is widely used in rehabilitation therapy for balance dysfunction. This paper summarizes the literature suggesting that virtual reality training can improve balance dysfunction in patients following neurological disease. When patients perform virtual reality training, the prefrontal and parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus helping the cortex to control balance and improving motor function. PMID:25368651
NASA Astrophysics Data System (ADS)
Rengganis, Y. A.; Safrodin, M.; Sukaridhoto, S.
2018-01-01
The Virtual Reality Laboratory (VR Lab) is an innovation over conventional learning media that presents the whole learning process in a laboratory. Many tools and materials are needed by the user to carry out practical work in it, so the user can experience a new learning atmosphere through this innovation. Technologies are now more sophisticated than before, and carrying them into education can make it more effective and efficient. Supporting technologies are needed to build the VR Lab, such as a head-mounted display and a hand motion gesture device: the head-mounted display is used to view the 3D environment of the virtual reality laboratory, while the hand motion gesture device captures the user's real hand so that it can be visualized in the virtual reality laboratory. The integration of these devices is used in this research. The results show that using the newest technologies in the learning process can make it more interesting and easier to understand.
NASA Astrophysics Data System (ADS)
Beckhaus, Steffi
Virtual Reality aims at creating an artificial environment that can be perceived as a substitute to a real setting. Much effort in research and development goes into the creation of virtual environments that in their majority are perceivable only by eyes and hands. The multisensory nature of our perception, however, allows and, arguably, also expects more than that. As long as we are not able to simulate and deliver a fully sensory believable virtual environment to a user, we could make use of the fully sensory, multi-modal nature of real objects to fill in for this deficiency. The idea is to purposefully integrate real artifacts into the application and interaction, instead of dismissing anything real as hindering the virtual experience. The term virtual reality - denoting the goal, not the technology - shifts from a core virtual reality to an “enriched” reality, technologically encompassing both the computer generated and the real, physical artifacts. Together, either simultaneously or in a hybrid way, real and virtual jointly provide stimuli that are perceived by users through their senses and are later formed into an experience by the user's mind.
World Reaction to Virtual Space
NASA Technical Reports Server (NTRS)
1999-01-01
DRaW Computing developed virtual reality software for the International Space Station. Open Worlds, as the software has been named, can be made to support Java scripting and virtual reality hardware devices. Open Worlds permits the use of VRML script nodes to add virtual reality capabilities to the user's applications.
Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms.
Rutkowski, Tomasz M
2016-01-01
The paper reviews nine robotic and virtual reality (VR) brain-computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI-lab research group during its association with the University of Tsukuba, Japan. The nine novel approaches are discussed in application to direct brain-robot and brain-virtual-reality-agent control interfaces using tactile and auditory BCI technologies. The BCI user's intentions are decoded from brainwaves in real time using non-invasive electroencephalography (EEG) and are translated into thought-based-only control of a symbiotic robot or virtual reality agent. A communication protocol between the BCI output and the robot or the virtual environment is realized in a symbiotic communication scenario using the user datagram protocol (UDP), which constitutes an internet of things (IoT) control scenario. Results obtained from healthy users reproducing simple brain-robot and brain-virtual-agent control tasks in online experiments support the research goal of interacting with robotic devices and virtual reality agents using symbiotic thought-based BCI technologies. An offline BCI classification accuracy boosting method, using a previously proposed information-geometry-derived approach, is also discussed in order to further support the reviewed robotic and virtual reality thought-based control paradigms.
Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms
Rutkowski, Tomasz M.
2016-01-01
The paper reviews nine robotic and virtual reality (VR) brain-computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI-lab research group during its association with the University of Tsukuba, Japan. The nine novel approaches are discussed in application to direct brain-robot and brain-virtual-reality-agent control interfaces using tactile and auditory BCI technologies. The BCI user's intentions are decoded from brainwaves in real time using non-invasive electroencephalography (EEG) and are translated into thought-based-only control of a symbiotic robot or virtual reality agent. A communication protocol between the BCI output and the robot or the virtual environment is realized in a symbiotic communication scenario using the user datagram protocol (UDP), which constitutes an internet of things (IoT) control scenario. Results obtained from healthy users reproducing simple brain-robot and brain-virtual-agent control tasks in online experiments support the research goal of interacting with robotic devices and virtual reality agents using symbiotic thought-based BCI technologies. An offline BCI classification accuracy boosting method, using a previously proposed information-geometry-derived approach, is also discussed in order to further support the reviewed robotic and virtual reality thought-based control paradigms. PMID:27999538
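The communication layer described in these two records, where the decoded BCI output is forwarded to a robot or virtual agent over UDP, can be sketched in a few lines of Python. The host, port, and message format below are illustrative assumptions, not the protocol actually used by the BCI-lab projects.

    import socket

    ROBOT_ADDRESS = ("192.168.0.42", 9000)   # assumed address of the robot or VR agent

    def send_decoded_intention(command: str) -> None:
        """Forward a decoded BCI command (e.g. 'left', 'right', 'stop') over UDP."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(command.encode("utf-8"), ROBOT_ADDRESS)

    # Example: the EEG classifier has just decoded a 'move forward' intention.
    send_decoded_intention("forward")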
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez Anez, Francisco
This paper presents two development projects (STARMATE and VIRMAN) focused on supporting training on maintenance. Both projects aim at specifying, designing, developing, and demonstrating prototypes allowing computer-guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. The objective is to create a computer tool for the elaboration of maintenance training courses and for training delivery based on 3D virtual reality models of complex components. The training delivery includes 3D recorded displays of maintenance procedures with all complementary information needed to understand the intervention. Users are requested to perform the maintenance intervention while trying to follow the procedure, and can be evaluated on the level of knowledge achieved. Instructors can check the evaluation records left during the training sessions. VIRMAN is simple software supported by a regular computer and can be used in an Internet framework. STARMATE is a step forward in the area of virtual reality. STARMATE is a European Commission project in the frame of 'Information Societies Technologies'. A consortium of five companies and one research institute shares its expertise in this new technology. STARMATE provides two main functionalities: (1) user assistance for achieving assembly/disassembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, a growing area in Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene generated by a computer, augmenting the reality with additional information. The user interface consists of see-through goggles, headphones, a microphone and an optical tracking system. All these devices are integrated in a helmet connected to two regular computers. The user has his hands free for performing the maintenance intervention and can navigate the virtual world thanks to a voice recognition system and a virtual pointing device. The maintenance work is guided with audio instructions, and 2D and 3D information is displayed directly in the user's goggles. A position-tracking system allows 3D virtual models to be displayed in the positions of their real counterparts independently of the user's location. The user can create his own virtual environment, placing the required information wherever he wants. The STARMATE system is applicable to a large variety of real work situations. (author)
Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.
Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A
2013-01-01
Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
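The first of the four technical steps listed in the abstract above, color-based segmentation of the hand, can be sketched with standard OpenCV calls. The HSV skin-tone bounds below are rough illustrative values, not the thresholds used by the authors, and the commented compositing lines only indicate where the mask would feed into the mixed reality rendering.

    import cv2
    import numpy as np

    def segment_hand(frame_bgr):
        """Return a binary mask of skin-colored pixels in a BGR camera frame."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        lower = np.array([0, 40, 60], dtype=np.uint8)     # assumed skin-tone bounds
        upper = np.array([25, 180, 255], dtype=np.uint8)
        mask = cv2.inRange(hsv, lower, upper)
        kernel = np.ones((5, 5), np.uint8)                # remove speckle noise so the
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # hand forms one region
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        return mask

    # Example compositing of the real hand over a rendered virtual scene:
    # frame = cv2.imread("camera_frame.png")
    # virtual = cv2.imread("rendered_scene.png")
    # mask = segment_hand(frame)
    # composite = np.where(mask[..., None] > 0, frame, virtual)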
Virtual reality: past, present and future.
Gobbetti, E; Scateni, R
1998-01-01
This report provides a short survey of the field of virtual reality, highlighting application domains, technological requirements, and currently available solutions. The report is organized as follows: section 1 presents the background and motivation of virtual environment research and identifies typical application domains, section 2 discusses the characteristics a virtual reality system must have in order to exploit the perceptual and spatial skills of users, section 3 surveys current input/output devices for virtual reality, section 4 surveys current software approaches to support the creation of virtual reality systems, and section 5 summarizes the report.
Can virtual reality be used to conduct mass prophylaxis clinic training? A pilot program.
Yellowlees, Peter; Cook, James N; Marks, Shayna L; Wolfe, Daniel; Mangin, Elanor
2008-03-01
To create and evaluate a pilot bioterrorism defense training environment using virtual reality technology. The present pilot project used Second Life, an internet-based virtual world system, to construct a virtual reality environment to mimic an actual setting that might be used as a Strategic National Stockpile (SNS) distribution site for northern California in the event of a bioterrorist attack. Scripted characters were integrated into the system as mock patients to analyze various clinic workflow scenarios. Users tested the virtual environment over two sessions. Thirteen users who toured the environment were asked to complete an evaluation survey. Respondents reported that the virtual reality system was relevant to their practice and had potential as a method of bioterrorism defense training. Computer simulations of bioterrorism defense training scenarios are feasible with existing personal computer technology. The use of internet-connected virtual environments holds promise for bioterrorism defense training. Recommendations are made for public health agencies regarding the implementation and benefits of using virtual reality for mass prophylaxis clinic training.
1998-03-01
Research Laboratory's Virtual Reality Responsive Workbench (VRRWB) and Dragon software system which together address the problem of battle space... and describe the lessons which have been learned. Keywords: interactive graphics, workbench, battle space visualization, virtual reality, user interface.
Inertial Motion-Tracking Technology for Virtual 3-D
NASA Technical Reports Server (NTRS)
2005-01-01
In the 1990s, NASA pioneered virtual reality research. The concept was present long before, but, prior to this, the technology did not exist to make a viable virtual reality system. Scientists had theories and ideas; they knew that the concept had potential, but the computers of the 1970s and 1980s were not fast enough, sensors were heavy and cumbersome, and people had difficulty blending fluidly with the machines. Scientists at Ames Research Center built upon the research of previous decades and put the necessary technology behind them, making the theories of virtual reality a reality. Virtual reality systems depend on complex motion-tracking sensors to convey information between the user and the computer and to give the user the feeling that he is operating in the real world. These motion-tracking sensors measure and report an object's position and orientation as it changes. A simple example of motion tracking would be the cursor on a computer screen moving in correspondence to the shifting of the mouse. Tracking in 3-D, necessary to create virtual reality, however, is much more complex. To be successful, the perspective of the virtual image seen on the computer must be an accurate representation of what is seen in the real world. As the user's head or camera moves, turns, or tilts, the computer-generated environment must change accordingly with no noticeable lag, jitter, or distortion. Historically, the lack of smooth and rapid tracking of the user's motion has thwarted the widespread use of immersive 3-D computer graphics. NASA uses virtual reality technology for a variety of purposes, mostly training of astronauts. The actual missions are costly and dangerous, so any opportunity the crews have to practice their maneuvering in accurate situations before the mission is valuable and instructive. For that purpose, NASA has funded a great deal of virtual reality research, and benefited from the results.
E-Learning Application of Tarsier with Virtual Reality using Android Platform
NASA Astrophysics Data System (ADS)
Oroh, H. N.; Munir, R.; Paseru, D.
2017-01-01
The spectral tarsier is a primitive primate that can only be found in the province of North Sulawesi. To study this primate, an e-learning application with Augmented Reality technology can be used, in which a marker held up to the computer's camera lets the user interact with a three-dimensional tarsier object. However, that application only shows the tarsier object in three dimensions, without its habitat, and requires a lot of resources because it runs on a personal computer. Virtual Reality is another technology that can show three-dimensional objects, with the advantage of making the user feel as if venturing into the virtual world, and on the Android platform it requires fewer resources. Virtual Reality technology on the Android platform was therefore adopted so that users can view and interact not only with the tarsiers but also with their habitat. The results of this research indicate that users can learn about the tarsier and its habitat well. Thus, the use of Virtual Reality technology in the e-learning application of tarsiers can help people to see, know, and learn about the spectral tarsier.
Review of virtual reality treatment for mental health.
Gourlay, D; Lun, K C; Liya, G
2001-01-01
This paper describes recent research that proposes virtual reality techniques as a therapy for patients with cognitive and psychological problems. Specifically, this applies to victims of conditions such as traumatic brain injury, Alzheimer's disease and Parkinson's disease. Additionally, virtual reality therapy offers an alternative to current desensitization techniques for the treatment of phobias. Some important issues are examined, including means of user interaction, skills transfer to the real world, and side effects of virtual reality exposure.
Virtual Reality for Pediatric Sedation: A Randomized Controlled Trial Using Simulation.
Zaveri, Pavan P; Davis, Aisha B; O'Connell, Karen J; Willner, Emily; Aronson Schinasi, Dana A; Ottolini, Mary
2016-02-09
Team training for procedural sedation for pediatric residents has traditionally consisted of didactic presentations and simulated scenarios using high-fidelity mannequins. We assessed the effectiveness of a virtual reality module in teaching preparation for and management of sedation for procedures. After developing a virtual reality environment in Second Life® (Linden Lab, San Francisco, CA) where providers perform and recover patients from procedural sedation, we conducted a randomized controlled trial to assess the effectiveness of the virtual reality module versus a traditional web-based educational module. A 20 question pre- and post-test was administered to assess knowledge change. All subjects participated in a simulated pediatric procedural sedation scenario that was video recorded for review and assessed using a 32-point checklist. A brief survey elicited feedback on the virtual reality module and the simulation scenario. The median score on the assessment checklist was 75% for the intervention group and 70% for the control group (P = 0.32). For the knowledge tests, there was no statistically significant difference between the groups (P = 0.14). Users had excellent reviews of the virtual reality module and reported that the module added to their education. Pediatric residents performed similarly in simulation and on a knowledge test after a virtual reality module compared with a traditional web-based module on procedural sedation. Although users enjoyed the virtual reality experience, these results question the value virtual reality adds in improving the performance of trainees. Further inquiry is needed into how virtual reality provides true value in simulation-based education.
Virtual Reality for Pediatric Sedation: A Randomized Controlled Trial Using Simulation
Davis, Aisha B; O'Connell, Karen J; Willner, Emily; Aronson Schinasi, Dana A; Ottolini, Mary
2016-01-01
Introduction: Team training for procedural sedation for pediatric residents has traditionally consisted of didactic presentations and simulated scenarios using high-fidelity mannequins. We assessed the effectiveness of a virtual reality module in teaching preparation for and management of sedation for procedures. Methods: After developing a virtual reality environment in Second Life® (Linden Lab, San Francisco, CA) where providers perform and recover patients from procedural sedation, we conducted a randomized controlled trial to assess the effectiveness of the virtual reality module versus a traditional web-based educational module. A 20 question pre- and post-test was administered to assess knowledge change. All subjects participated in a simulated pediatric procedural sedation scenario that was video recorded for review and assessed using a 32-point checklist. A brief survey elicited feedback on the virtual reality module and the simulation scenario. Results: The median score on the assessment checklist was 75% for the intervention group and 70% for the control group (P = 0.32). For the knowledge tests, there was no statistically significant difference between the groups (P = 0.14). Users had excellent reviews of the virtual reality module and reported that the module added to their education. Conclusions: Pediatric residents performed similarly in simulation and on a knowledge test after a virtual reality module compared with a traditional web-based module on procedural sedation. Although users enjoyed the virtual reality experience, these results question the value virtual reality adds in improving the performance of trainees. Further inquiry is needed into how virtual reality provides true value in simulation-based education. PMID:27014520
Lewis, Gwyn N; Rosie, Juliet A
2012-01-01
To review quantitative and qualitative studies that have examined the users' response to virtual reality game-based interventions in people with movement disorders associated with chronic neurological conditions. We aimed to determine key themes that influenced users' enjoyment and engagement in the games and develop suggestions as to how future systems could best address their needs and expectations. There were a limited number of studies that evaluated user opinions. From those found, seven common themes emerged: technology limitations, user control and therapist assistance, the novel physical and cognitive challenge, feedback, social interaction, game purpose and expectations, and the virtual environments. Our key recommendations derived from the review were to avoid technology failure, maintain overt therapeutic principles within the games, encompass progression to promote continuing physical and cognitive challenge, and to provide feedback that is easily and readily associated with success. While there have been few studies that have evaluated the users' perspective of virtual rehabilitation games, our findings indicate that canvassing these experiences provides valuable information on the needs of the intended users. Incorporating our recommendations may enhance the efficacy of future systems to optimize the rehabilitation benefits of virtual reality games.
Virtual reality and telerobotics applications of an Address Recalculation Pipeline
NASA Technical Reports Server (NTRS)
Regan, Matthew; Pose, Ronald
1994-01-01
The technology described in this paper was designed to reduce latency in user interactions in immersive virtual reality environments. It is also ideally suited to telerobotic applications such as interaction with remote robotic manipulators in space or in deep-sea operations. In such circumstances, the significant latency in the observed response to user stimulus, which is due to communications delays, and the disturbing jerkiness due to low and unpredictable frame rates in compressed video feedback or computationally limited virtual worlds, can be masked by our techniques. The user is provided with highly responsive visual feedback independent of the communication or computational delays involved in providing physical video feedback or in rendering virtual world images. Virtual and physical environments can be combined seamlessly using these techniques.
Reality Check: Basics of Augmented, Virtual, and Mixed Reality.
Brigham, Tara J
2017-01-01
Augmented, virtual, and mixed reality applications all aim to enhance a user's current experience or reality. While variations of this technology are not new, within the last few years there has been a significant increase in the number of artificial reality devices or applications available to the general public. This column will explain the difference between augmented, virtual, and mixed reality and how each application might be useful in libraries. It will also provide an overview of the concerns surrounding these different reality applications and describe how and where they are currently being used.
Transduction between worlds: using virtual and mixed reality for earth and planetary science
NASA Astrophysics Data System (ADS)
Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.
2017-12-01
Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.
Building intuitive 3D interfaces for virtual reality systems
NASA Astrophysics Data System (ADS)
Vaidya, Vivek; Suryanarayanan, Srikanth; Seitel, Mathias; Mullick, Rakesh
2007-03-01
An exploration of techniques for developing intuitive and efficient user interfaces for virtual reality systems. The work seeks to understand which paradigms from the better-understood world of 2D user interfaces remain viable within 3D environments. In order to establish this, a new user interface was created that applied various understood principles of interface design. A user study was then performed in which it was compared with an earlier interface for a series of medical visualization tasks.
A haptic interface for virtual simulation of endoscopic surgery.
Rosenberg, L B; Stredney, D
1996-01-01
Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware that allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.
Encarnação, L Miguel; Bimber, Oliver
2002-01-01
Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches promised to provide valuable means for the involved interactive data analysis, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical employment. This paper addresses two of the shortcomings of such technology: Intuitive interaction with multi-dimensional data in immersive and semi-immersive environments as well as stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.
Linking Audio and Visual Information while Navigating in a Virtual Reality Kiosk Display
ERIC Educational Resources Information Center
Sullivan, Briana; Ware, Colin; Plumlee, Matthew
2006-01-01
3D interactive virtual reality museum exhibits should be easy to use, entertaining, and informative. If the interface is intuitive, it will allow the user more time to learn the educational content of the exhibit. This research deals with interface issues concerning activating audio descriptions of images in such exhibits while the user is…
ERIC Educational Resources Information Center
Yildirim, Gürkan
2017-01-01
Today, it is seen that developing technologies are tried to be used continuously in the learning environments. These technologies have rapidly been diversifying and changing. Recently, virtual reality technology has become one of the technologies that experts have often been dwelling on. The present research tries to determine users' opinions and…
The Influences of the 2D Image-Based Augmented Reality and Virtual Reality on Student Learning
ERIC Educational Resources Information Center
Liou, Hsin-Hun; Yang, Stephen J. H.; Chen, Sherry Y.; Tarng, Wernhuar
2017-01-01
Virtual reality (VR) learning environments can provide students with concepts of the simulated phenomena, but users are not allowed to interact with real elements. Conversely, augmented reality (AR) learning environments blend real-world environments so AR could enhance the effects of computer simulation and promote students' realistic experience.…
Validation of virtual reality as a tool to understand and prevent child pedestrian injury.
Schwebel, David C; Gaines, Joanna; Severson, Joan
2008-07-01
In recent years, virtual reality has emerged as an innovative tool for health-related education and training. Among the many benefits of virtual reality is the opportunity for novice users to engage unsupervised in a safe environment when the real environment might be dangerous. Virtual environments are only useful for health-related research, however, if behavior in the virtual world validly matches behavior in the real world. This study was designed to test the validity of an immersive, interactive virtual pedestrian environment. A sample of 102 children and 74 adults was recruited to complete simulated road-crossings in both the virtual environment and the identical real environment. In both the child and adult samples, construct validity was demonstrated via significant correlations between behavior in the virtual and real worlds. Results also indicate construct validity through developmental differences in behavior; convergent validity by showing correlations between parent-reported child temperament and behavior in the virtual world; internal reliability of various measures of pedestrian safety in the virtual world; and face validity, as measured by users' self-reported perception of realism in the virtual world. We discuss issues of generalizability to other virtual environments, and the implications for application of virtual reality to understanding and preventing pediatric pedestrian injuries.
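The construct validity check described above comes down to correlating matched road-crossing measures from the virtual and real environments. The short Python sketch below shows that computation with scipy; the variable contents are invented placeholder numbers, not data from the study.

    from scipy.stats import pearsonr

    # Placeholder paired measures (e.g., seconds of safe crossing time) for the
    # same participants in the real and virtual crossings; not the study's data.
    real_world = [4.1, 5.3, 3.8, 6.0, 4.9, 5.5]
    virtual_world = [4.4, 5.1, 3.5, 6.3, 4.7, 5.9]

    r, p = pearsonr(real_world, virtual_world)
    print(f"construct validity correlation: r = {r:.2f}, p = {p:.3f}")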
Sensor supervision and multiagent commanding by means of projective virtual reality
NASA Astrophysics Data System (ADS)
Rossmann, Juergen
1998-10-01
When autonomous systems with multiple agents are considered, conventional control and supervision technologies are often inadequate, because the amount of information available is often presented in a way that effectively overwhelms the user with displayed data. New virtual reality (VR) techniques can help to cope with this problem, because VR offers the chance to convey information in an intuitive manner and can combine supervision capabilities with new, intuitive approaches to the control of autonomous systems. In the approach taken, control and supervision issues were equally stressed and finally led to the new ideas and the general framework for Projective Virtual Reality. The key idea of this new approach to an intuitively operable man-machine interface for decentrally controlled multi-agent systems is to let the user act in the virtual world, detect the changes, and have an action-planning component automatically generate task descriptions for the agents involved, projecting actions that have been carried out by users in the virtual world into the physical world, e.g. with the help of robots. Thus the Projective Virtual Reality approach is to split the job between the task deduction in the VR and the task 'projection' onto the physical automation components by the automatic action-planning component. Besides describing the realized projective virtual reality system, the paper also describes in detail the metaphors and visualization aids used to present different types of (e.g. sensor) information in an intuitively comprehensible manner.
Using Immersive Virtual Reality for Electrical Substation Training
ERIC Educational Resources Information Center
Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana
2015-01-01
Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…
NASA Astrophysics Data System (ADS)
Demir, I.
2015-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain and weather information, and water simulation, and offers an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes, including virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.
Evaluating the Effects of Immersive Embodied Interaction on Cognition in Virtual Reality
NASA Astrophysics Data System (ADS)
Parmar, Dhaval
Virtual reality is on its advent of becoming mainstream household technology, as technologies such as head-mounted displays, trackers, and interaction devices are becoming affordable and easily available. Virtual reality (VR) has immense potential in enhancing the fields of education and training, and its power can be used to spark interest and enthusiasm among learners. It is, therefore, imperative to evaluate the risks and benefits that immersive virtual reality poses to the field of education. Research suggests that learning is an embodied process. Learning depends on grounded aspects of the body including action, perception, and interactions with the environment. This research aims to study if immersive embodiment through the means of virtual reality facilitates embodied cognition. A pedagogical VR solution which takes advantage of embodied cognition can lead to enhanced learning benefits. Towards achieving this goal, this research presents a linear continuum for immersive embodied interaction within virtual reality. This research evaluates the effects of three levels of immersive embodied interactions on cognitive thinking, presence, usability, and satisfaction among users in the fields of science, technology, engineering, and mathematics (STEM) education. Results from the presented experiments show that immersive virtual reality is greatly effective in knowledge acquisition and retention, and highly enhances user satisfaction, interest and enthusiasm. Users experience high levels of presence and are profoundly engaged in the learning activities within the immersive virtual environments. The studies presented in this research evaluate pedagogical VR software to train and motivate students in STEM education, and provide an empirical analysis comparing desktop VR (DVR), immersive VR (IVR), and immersive embodied VR (IEVR) conditions for learning. This research also proposes a fully immersive embodied interaction metaphor (IEIVR) for learning of computational concepts as a future direction, and presents the challenges faced in implementing the IEIVR metaphor due to extended periods of immersion. Results from the conducted studies help in formulating guidelines for virtual reality and education researchers working in STEM education and training, and for educators and curriculum developers seeking to improve student engagement in the STEM fields.
Ergonomic aspects of a virtual environment.
Ahasan, M R; Väyrynen, S
1999-01-01
A virtual environment is an interactive graphic system, mediated through computer technology, that allows a certain level of reality or a sense of presence when accessing virtual information. To create reality in a virtual environment, ergonomics issues are explored in this paper, with the aim of developing the design of presentation formats and related information so that a user-friendly application can be attained and maintained.
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Mandal, Avikarsha; Javahiraly, Nicolas; Curticapean, Dan
2016-09-01
Practical exercises are a crucial part of many curricula. Even simple exercises can improve the understanding of the underlying subject. Most experimental setups require special hardware: to carry out, e.g., a lens experiment, the students need access to an optical bench, various lenses, light sources, apertures and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques in order to let the students prepare with a simulated experimental setup. Within the context of our intended blended learning concept we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility of allowing interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that allows for the tracking of the user's hands and fingers in three dimensions. It is conceivable to let the user interact with the simulation's virtual elements through his or her hand position, movement and gestures. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device's strengths and weaknesses and want to point out useful and less useful application scenarios.
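A hand-tracked interaction of the kind discussed above can be reduced to mapping a tracked palm or fingertip position onto a simulation parameter. The sketch below assumes a hypothetical get_palm_position() helper standing in for the tracker SDK (it is not the actual LEAP Motion API) and maps vertical hand motion to the position of a lens on a virtual optical bench; all names and ranges are illustrative.

    def get_palm_position():
        """Hypothetical stand-in for a hand-tracker SDK call.
        Returns (x, y, z) in millimetres above the sensor."""
        return (0.0, 180.0, 20.0)

    def hand_to_lens_position(palm_y_mm, y_min=100.0, y_max=400.0,
                              bench_min=0.0, bench_max=1.0):
        """Map palm height above the sensor to a lens position on a normalized
        optical bench (0.0 = left end, 1.0 = right end)."""
        clamped = max(y_min, min(y_max, palm_y_mm))
        span = (clamped - y_min) / (y_max - y_min)
        return bench_min + span * (bench_max - bench_min)

    # Example: move the virtual lens according to the current hand height.
    x, y, z = get_palm_position()
    lens_pos = hand_to_lens_position(y)
    print(f"lens position on bench: {lens_pos:.2f}")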
Virtual Reality: You Are There
NASA Technical Reports Server (NTRS)
1993-01-01
Telepresence, or "virtual reality," allows a person, with assistance from advanced technology devices, to figuratively project himself into another environment. This technology is marketed by several companies, among them Fakespace, Inc., a former Ames Research Center contractor. Fakespace developed a teleoperational motion platform for transmitting sounds and images from remote locations. The "Molly" matches the user's head motion and, when coupled with a stereo viewing device and appropriate software, creates the telepresence experience. Its companion piece is the BOOM, the user's viewing device that provides the sense of involvement in the virtual environment. Either system may be used alone. Because suits, gloves, headphones, etc. are not needed, a whole range of commercial applications is possible, including computer-aided design techniques and virtual reality visualizations. Customers include Sandia National Laboratories, Stanford Research Institute and Mattel Toys.
Surgery applications of virtual reality
NASA Technical Reports Server (NTRS)
Rosen, Joseph
1994-01-01
Virtual reality is a computer-generated technology which allows information to be displayed in a simulated, but lifelike, environment. In this simulated 'world', users can move and interact as if they were actually a part of that world. This new technology will be useful in many different fields, including the field of surgery. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and perform surgical procedures (telesurgery), and predict the outcomes of surgery. The authors of this paper describe the basic components of a virtual reality surgical system. These components include: the virtual world, the virtual tools, the anatomical model, the software platform, the host computer, the interface, and the head-coupled display. In the chapter they also review the progress towards using virtual reality for surgical training, planning, telesurgery, and predicting outcomes. Finally, the authors present a training system being developed for the practice of new procedures in abdominal surgery.
[Virtual reality in neurosurgery].
Tronnier, V M; Staubert, A; Bonsanto, M M; Wirtz, C R; Kunze, S
2000-03-01
Virtual reality enables users to immerse themselves in a virtual three-dimensional world and to interact in this world. The simulation is different from the kind in computer games, in which the viewer is active but acts in a nonrealistic world, or on the TV screen, where we are passively driven in an active world. In virtual reality elements look realistic, they change their characteristics and have almost real-world unpredictability. Virtual reality is not only implemented in gambling dens and the entertainment industry but also in manufacturing processes (cars, furniture etc.), military applications and medicine. Especially the last two areas are strongly correlated, because telemedicine or telesurgery was originated for military reasons to operate on war victims from a secure distance or to perform surgery on astronauts in an orbiting space station. In medicine and especially neurosurgery virtual-reality methods are used for education, surgical planning and simulation on a virtual patient.
Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback
NASA Astrophysics Data System (ADS)
Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve
2011-03-01
Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it was a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that a user can place their hands inside of the virtual environment without occluding the 3D image. Built using open-source software and consumer level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.
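The head-coupled perspective that HUVR relies on, tracking the viewer's eye position and regenerating the view so the 3D image stays registered with the mirror, is commonly computed as a generalized off-axis perspective projection. The numpy sketch below follows that standard construction; the screen-corner coordinates and near/far planes are illustrative assumptions, not HUVR's actual calibration.

    import numpy as np

    def off_axis_projection(pa, pb, pc, pe, near, far):
        """Off-axis perspective matrix for a screen with corners pa (lower-left),
        pb (lower-right), pc (upper-left) and a tracked eye position pe."""
        pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))
        vr = pb - pa; vr = vr / np.linalg.norm(vr)          # screen right axis
        vu = pc - pa; vu = vu / np.linalg.norm(vu)          # screen up axis
        vn = np.cross(vr, vu); vn = vn / np.linalg.norm(vn) # screen normal
        va, vb, vc = pa - pe, pb - pe, pc - pe              # corners relative to eye
        d = -np.dot(va, vn)                                 # eye-to-screen distance
        l = np.dot(vr, va) * near / d
        r = np.dot(vr, vb) * near / d
        b = np.dot(vu, va) * near / d
        t = np.dot(vu, vc) * near / d
        # Standard frustum matrix built from the asymmetric extents.
        m = np.zeros((4, 4))
        m[0, 0] = 2 * near / (r - l); m[0, 2] = (r + l) / (r - l)
        m[1, 1] = 2 * near / (t - b); m[1, 2] = (t + b) / (t - b)
        m[2, 2] = -(far + near) / (far - near)
        m[2, 3] = -2 * far * near / (far - near)
        m[3, 2] = -1.0
        return m

    # Example: a 0.5 m x 0.3 m screen centred at the origin, eye 0.6 m in front of it.
    proj = off_axis_projection(pa=(-0.25, -0.15, 0.0), pb=(0.25, -0.15, 0.0),
                               pc=(-0.25, 0.15, 0.0), pe=(0.05, 0.02, 0.6),
                               near=0.1, far=10.0)
    print(np.round(proj, 3))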
NASA Technical Reports Server (NTRS)
Johnson, David W.
1992-01-01
Virtual realities are a type of human-computer interface (HCI) and as such may be understood from a historical perspective. In the earliest era, the computer was a very simple, straightforward machine. Interaction was human manipulation of an inanimate object, little more than the provision of an explicit instruction set to be carried out without deviation. In short, control resided with the user. In the second era of HCI, some level of intelligence and control was imparted to the system to enable a dialogue with the user. Simple context sensitive help systems are early examples, while more sophisticated expert system designs typify this era. Control was shared more equally. In this, the third era of the HCI, the constructed system emulates a particular environment, constructed with rules and knowledge about 'reality'. Control is, in part, outside the realm of the human-computer dialogue. Virtual reality systems are discussed.
ERIC Educational Resources Information Center
Montoya, Mauricio Hincapié; Díaz, Christian Andrés; Moreno, Gustavo Adolfo
2017-01-01
Nowadays, the use of technology to improve teaching and learning experiences in the classroom has been promoted. One of these technologies is augmented reality, which allows overlaying layers of virtual information on real scene with the aim of increasing the perception that user has of reality. Augmented reality has proved to offer several…
Designing 3 Dimensional Virtual Reality Using Panoramic Image
NASA Astrophysics Data System (ADS)
Wan Abd Arif, Wan Norazlinawati; Wan Ahmad, Wan Fatimah; Nordin, Shahrina Md.; Abdullah, Azrai; Sivapalan, Subarna
There is high demand to improve the quality of presentation in the knowledge-sharing field in order to keep pace with rapidly growing technology. The need to develop technology-based learning and training led to the idea of developing an Oil and Gas Plant Virtual Environment (OGPVE) for the benefit of our future. A panoramic virtual reality learning environment is essential to help educators overcome the limitations of traditional technical writing lessons. Virtual reality helps users understand better by providing simulations of real-world and hard-to-reach environments with a high degree of realistic experience and interactivity. Thus, in order to create courseware that achieves this objective, accurate images of the intended scenarios must be acquired. The panorama shows the OGPVE and helps users form ideas about what they have learnt. This paper discusses part of the development of panoramic virtual reality. The important phases in developing a successful panoramic image are image acquisition and image stitching (mosaicing). The combination of wide field-of-view (FOV) and close-up images used in this panoramic development is also discussed.
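Image stitching of the kind described above, combining overlapping photographs into one panorama, is available off the shelf in OpenCV. The sketch below is a minimal example that assumes a set of overlapping JPEG frames on disk; it is not the toolchain used by the OGPVE authors.

    import cv2

    # Overlapping photographs of the scene, ordered left to right (assumed filenames).
    frames = [cv2.imread(name) for name in ("plant_01.jpg", "plant_02.jpg", "plant_03.jpg")]

    stitcher = cv2.Stitcher_create()          # defaults to panorama mode
    status, panorama = stitcher.stitch(frames)

    if status == 0:                           # 0 corresponds to Stitcher::OK
        cv2.imwrite("plant_panorama.jpg", panorama)   # panorama ready for the VR viewer
    else:
        print(f"stitching failed with status {status}")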
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface; (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
Virtual Reality: Directions in Research and Development.
ERIC Educational Resources Information Center
Stuart, Rory
1992-01-01
Discussion of virtual reality (VR) focuses on research and development being carried out at NYNEX to solve business problems. Component technologies are described; design decisions are considered, including interactivity, connectivity, and locus of control; potential perils of VR are discussed, including user dissociation; and areas of promise are…
Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo
2014-04-15
Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii and its peripheral nunchuks-balance board, head mounted displays and joystick allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes in virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brain, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.
New Desktop Virtual Reality Technology in Technical Education
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.
2008-01-01
Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…
Virtual reality enhanced mannequin (VREM) that is well received by resuscitation experts.
Semeraro, Federico; Frisoli, Antonio; Bergamasco, Massimo; Cerchiari, Erga L
2009-04-01
The objective of this study was to test acceptance of, and interest in, a newly developed prototype of virtual reality enhanced mannequin (VREM) on a sample of congress attendees who volunteered to participate in the evaluation session and to respond to a specifically designed questionnaire. A commercial Laerdal HeartSim 4000 mannequin was developed to integrate virtual reality (VR) technologies with specially developed virtual reality software to increase the immersive perception of emergency scenarios. To evaluate the acceptance of a virtual reality enhanced mannequin (VREM), we presented it to a sample of 39 possible users. Each evaluation session involved one trainee and two instructors with a standardized procedure and scenario: the operator was invited by the instructor to wear the data-gloves and the head mounted display and was briefly introduced to the scope of the simulation. The instructor helped the operator familiarize himself with the environment. After the patient's collapse, the operator was asked to check the patient's clinical conditions and start CPR. Finally, the patient started to recover signs of circulation and the evaluation session was concluded. Each participant was then asked to respond to a questionnaire designed to explore the trainee's perception in the areas of user-friendliness, realism, and interaction/immersion. Overall, the evaluation of the system was very positive, as was the feeling of immersion and realism of the environment and simulation. Overall, 84.6% of the participants judged the virtual reality experience as interesting and believed that its development could be very useful for healthcare training. The prototype of the virtual reality enhanced mannequin was well liked, without interference from the interaction devices, and deserves full technological development and validation in emergency medical training.
Open multi-agent control architecture to support virtual-reality-based man-machine interfaces
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel
2001-10-01
Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for the commanding and supervision of complex automation systems. The user-interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task-deduction component, and automatic action-planning capabilities. In order to realize man-machine interfaces for complex applications, not only the Virtual Reality part has to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems that are controlled by Virtual Reality based man-machine interfaces. The architecture not only provides a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations that make it easier for the user to command and supervise a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate information from sensors at different levels of abstraction in real time helps to make the realized automation system very responsive to real-world changes. In this paper, the architecture is described comprehensively, its main building blocks are discussed, and one realization built on an open-source real-time operating system is presented. The software design and the features of the architecture that make it generally applicable to the distributed control of automation agents in real-world applications are explained. Furthermore, its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is described as one example.
Research on three-dimensional visualization based on virtual reality and Internet
NASA Astrophysics Data System (ADS)
Wang, Zongmin; Yang, Haibo; Zhao, Hongling; Li, Jiren; Zhu, Qiang; Zhang, Xiaohong; Sun, Kai
2007-06-01
To disclose and display water information, a three-dimensional visualization system based on Virtual Reality (VR) and the Internet is researched for demonstrating the "digital water conservancy" application and for the routine management of reservoirs. To explore and mine in-depth information, after completing a high-resolution DEM model of reliable quality, topographical analysis, visibility analysis and reservoir volume computation are studied. In addition, parameters including slope, water level and NDVI are selected to classify landslide-prone zones in the water-level-fluctuating zone of the reservoir area. To establish the virtual reservoir scene, two kinds of methods are used for experiencing immersion, interaction and imagination (3I). The first virtual scene contains more detailed textures to increase realism and runs on a graphical workstation with the virtual reality engine Open Scene Graph (OSG). The second virtual scene is intended for Internet users, with fewer details to assure a fluent frame rate.
Using Virtual Reality For Outreach Purposes in Planetology
NASA Astrophysics Data System (ADS)
Civet, François; Le Mouélic, Stéphane; Le Menn, Erwan; Beaunay, Stéphanie
2016-10-01
2016 has been a year marked by a technological breakthrough: the availability, for the first time, of technologically mature virtual reality devices to the general public. Virtual Reality consists in visually immersing a user in a 3D environment reproduced either from real and/or imaginary data, with the possibility to move and eventually interact with the different elements. In planetology, most of the places will remain inaccessible to the public for a while, but a fleet of dedicated spacecraft such as orbiters, landers and rovers makes it possible to virtually reconstruct these environments, using image processing, cartography and photogrammetry. Virtual reality can then bridge the gap to virtually "send" any user into the place and enjoy the exploration. We are investigating several types of devices to render orbital or ground-based data of planetological interest, mostly from Mars. The simplest system consists of a "cardboard" headset, on which the user can simply use his cellphone as the screen. A more comfortable experience is obtained with more complex systems such as the HTC Vive or Oculus Rift headsets, which include a tracking system that is important to minimize motion sickness. The third environment that we have developed is based on the CAVE concept, where four 3D video projectors are used to project on three 2x3 m walls plus the ground. These systems can be used for scientific data analysis, but also prove to be perfectly suited for outreach and education purposes.
De Leo, Gianluca; Diggs, Leigh A; Radici, Elena; Mastaglio, Thomas W
2014-02-01
Virtual-reality solutions have successfully been used to train distributed teams. This study aimed to investigate the correlation between user characteristics and sense of presence in an online virtual-reality environment where distributed teams are trained. A greater sense of presence has the potential to make training in the virtual environment more effective, leading to the formation of teams that perform better in a real environment. Being able to identify, before starting online training, those user characteristics that are predictors of a greater sense of presence can lead to the selection of trainees who would benefit most from the online simulated training. This is an observational study with a retrospective postsurvey of participants' user characteristics and degree of sense of presence. Twenty-nine members from 3 Air Force National Guard Medical Service expeditionary medical support teams participated in an online virtual environment training exercise and completed the Independent Television Commission-Sense of Presence Inventory survey, which measures sense of presence and user characteristics. Nonparametric statistics were applied to determine the statistical significance of user characteristics to sense of presence. Comparing user characteristics to the 4 scales of the Independent Television Commission-Sense of Presence Inventory using Kendall τ test gave the following results: the user characteristics "how often you play video games" (τ(26)=-0.458, P<0.01) and "television/film production knowledge" (τ(27)=-0.516, P<0.01) were significantly related to negative effects. Negative effects refer to adverse physiologic reactions owing to the virtual environment experience such as dizziness, nausea, headache, and eyestrain. The user characteristic "knowledge of virtual reality" was significantly related to engagement (τ(26)=0.463, P<0.01) and negative effects (τ(26)=-0.404, P<0.05). Individuals who have knowledge about virtual environments and experience with gaming environments report a higher sense of presence that indicates that they will likely benefit more from online virtual training. Future research studies could include a larger population of expeditionary medical support, and the results obtained could be used to create a model that predicts the level of presence based on the user characteristics. To maximize results and minimize costs, only those individuals who, based on their characteristics, are supposed to have a higher sense of presence and less negative effects could be selected for online simulated virtual environment training.
Three-Dimensional User Interfaces for Immersive Virtual Reality
NASA Technical Reports Server (NTRS)
vanDam, Andries
1997-01-01
The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.
Meeting and Serving Users in Their New Work (and Play) Spaces
ERIC Educational Resources Information Center
Peters, Tom
2008-01-01
This article examines the public services component of digital and virtual libraries, focusing on the end-user experience. As the number and types of "places" where library users access library collections and services continue to expand (now including cell phones, iPods, and three-dimensional virtual reality environments populated by avatars),…
Proposal for Implementing Multi-User Database (MUD) Technology in an Academic Library.
ERIC Educational Resources Information Center
Filby, A. M. Iliana
1996-01-01
Explores the use of MOO (multi-user object oriented) virtual environments in academic libraries to enhance reference services. Highlights include the development of multi-user database (MUD) technology from gaming to non-recreational settings; programming issues; collaborative MOOs; MOOs as distinguished from other types of virtual reality; audio…
DJ Sim: a virtual reality DJ simulation game
NASA Astrophysics Data System (ADS)
Tang, Ka Yin; Loke, Mei Hwan; Chin, Ching Ling; Chua, Gim Guan; Chong, Jyh Herng; Manders, Corey; Khan, Ishtiaq Rasool; Yuan, Miaolong; Farbiz, Farzam
2009-02-01
This work describes the process of developing a 3D Virtual Reality (VR) DJ simulation game intended to be displayed on a stereoscopic display. Using a DLP projector and shutter glasses, the user of the system plays a game in which he or she is a DJ in a night club. The night club's music is playing, and the DJ is "scratching" in correspondence to this music. Much in the flavor of Guitar Hero or Dance Dance Revolution, a virtual turntable is manipulated to project information about how the user should perform. The user only needs a small set of hand gestures, corresponding to the turntable scratch movements to play the game. As the music plays, a series of moving arrows approaching the DJ's turntable instruct the user as to when and how to perform the scratches.
Role of virtual reality for cerebral palsy management.
Weiss, Patrice L Tamar; Tirosh, Emanuel; Fehlings, Darcy
2014-08-01
Virtual reality is the use of interactive simulations to present users with opportunities to perform in virtual environments that appear, sound, and less frequently, feel similar to real-world objects and events. Interactive computer play refers to the use of a game where a child interacts and plays with virtual objects in a computer-generated environment. Because of their distinctive attributes that provide ecologically realistic and motivating opportunities for active learning, these technologies have been used in pediatric rehabilitation over the past 15 years. The ability of virtual reality to create opportunities for active repetitive motor/sensory practice adds to their potential for neuroplasticity and learning in individuals with neurologic disorders. The objectives of this article are to provide an overview of how virtual reality and gaming are used clinically, to present the results of several example studies that demonstrate their use in research, and to briefly remark on future developments. © The Author(s) 2014.
Augmented Reality versus Virtual Reality for 3D Object Manipulation.
Krichenbauer, Max; Yamamoto, Goshiro; Taketom, Takafumi; Sandor, Christian; Kato, Hirokazu
2018-02-01
Virtual Reality (VR) Head-Mounted Displays (HMDs) are on the verge of becoming commodity hardware available to the average user and feasible to use as a tool for 3D work. Some HMDs include front-facing cameras, enabling Augmented Reality (AR) functionality. Apart from avoiding collisions with the environment, interaction with virtual objects may also be affected by seeing the real environment. However, whether these effects are positive or negative has not yet been studied extensively. For most tasks it is unknown whether AR has any advantage over VR. In this work we present the results of a user study in which we compared user performance measured in task completion time on a 9 degrees of freedom object selection and transformation task performed either in AR or VR, both with a 3D input device and a mouse. Our results show faster task completion time in AR over VR. When using a 3D input device, a purely VR environment increased task completion time by 22.5 percent on average compared to AR. Surprisingly, a similar effect occurred when using a mouse: users were about 17.3 percent slower in VR than in AR. Mouse and 3D input device produced similar task completion times in each condition (AR or VR). We further found no differences in reported comfort.
The Virtual Tablet: Virtual Reality as a Control System
NASA Technical Reports Server (NTRS)
Chronister, Andrew
2016-01-01
In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating issues with difficulty of visually determining the interface location and lack of tactile feedback discovered in the development of previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment. My contribution was to combine the pieces of hardware, write software to incorporate each piece of position or orientation data into a coherent description of the object's location in space, place the virtual analogue accordingly, and project the control interface onto it, resulting in a functioning object which has both a physical and a virtual presence. Additionally, the virtual environment was enhanced with two live video feeds from cameras mounted on the robotic device being used as an example target of the virtual interface. The working VT allows users to naturally interact with a control interface with little to no training and without the issues found in previous efforts.
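A hypothetical sketch of the pose composition described above: the orientation reported by the sensor package (here a quaternion) and the centroid of the motion-capture marker positions are merged into a single transform for placing the surrogate's virtual analogue. The function names and numbers are illustrative, not the project's actual code.

```python
# Hedged sketch: merge IMU orientation and motion-capture marker positions
# into one 4x4 transform for the handheld surrogate. Values are illustrative.
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def surrogate_pose(imu_quat, marker_positions):
    """Build the surrogate's world transform from IMU orientation and the
    centroid of the reflective-marker positions (Nx3 array)."""
    pose = np.eye(4)
    pose[:3, :3] = quat_to_matrix(imu_quat)
    pose[:3, 3] = np.asarray(marker_positions).mean(axis=0)
    return pose

# Example: identity orientation, markers clustered around (0.1, 1.2, 0.5) metres.
markers = [[0.08, 1.2, 0.5], [0.12, 1.2, 0.5], [0.10, 1.18, 0.52]]
print(surrogate_pose((1.0, 0.0, 0.0, 0.0), markers))
```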
An integrated pipeline to create and experience compelling scenarios in virtual reality
NASA Astrophysics Data System (ADS)
Springer, Jan P.; Neumann, Carsten; Reiners, Dirk; Cruz-Neira, Carolina
2011-03-01
One of the main barriers to creating and using compelling scenarios in virtual reality is the complexity and time-consuming effort of modeling, element integration, and the software development needed to properly display and interact with the content on the available systems. Still today, most virtual reality applications are tedious to create, and they are hard-wired to the specific display and interaction system available to the developers when creating the application. Furthermore, it is not possible to alter the content or the dynamics of the content once the application has been created. We present our research on designing a software pipeline that enables the creation of compelling scenarios with a fair degree of visual and interaction complexity in a semi-automated way. Specifically, we are targeting drivable urban scenarios, ranging from large cities to sparsely populated rural areas, that incorporate both static components (e.g., houses, trees) and dynamic components (e.g., people, vehicles) as well as events, such as explosions or ambient noise. Our pipeline has four basic components. First, an environment designer, where users sketch the overall layout of the scenario and an automated method constructs the 3D environment from the information in the sketch. Second, a scenario editor used for authoring the complete scenario: incorporating the dynamic elements and events, fine-tuning the automatically generated environment, defining the execution conditions of the scenario, and setting up any data gathering that may be necessary during the execution of the scenario. Third, a run-time environment for different virtual-reality systems that provides users with the interactive experience as designed with the designer and the editor. And fourth, a bi-directional monitoring system that allows for capturing and modifying information from the virtual environment. One of the interesting capabilities of our pipeline is that scenarios can be built and modified on the fly as they are being presented in the virtual-reality systems. Users can quickly prototype the basic scene using the designer and the editor on a control workstation. More elements can then be introduced into the scene from both the editor and the virtual-reality display. In this manner, users are able to gradually increase the complexity of the scenario with immediate feedback. The main use of this pipeline is the rapid development of scenarios for human-factors studies. However, it is applicable in a much more general context.
Possibilities and Determinants of Using Low-Cost Devices in Virtual Education Applications
ERIC Educational Resources Information Center
Bun, Pawel Kazimierz; Wichniarek, Radoslaw; Górski, Filip; Grajewski, Damian; Zawadzki, Przemyslaw; Hamrol, Adam
2017-01-01
Virtual reality (VR) may be used as an innovative educational tool. However, in order to fully exploit its potential, it is essential to achieve the effect of immersion. To more completely submerge the user in a virtual environment, it is necessary to ensure that the user's actions are directly translated into the image generated by the…
Generating Contextual Descriptions of Virtual Reality (VR) Spaces
NASA Astrophysics Data System (ADS)
Olson, D. M.; Zaman, C. H.; Sutherland, A.
2017-12-01
Virtual reality holds great potential for science communication, education, and research. However, interfaces for manipulating data and environments in virtual worlds are limited and idiosyncratic. Furthermore, speech and vision are the primary modalities by which humans collect information about the world, but the linking of visual and natural language domains is a relatively new pursuit in computer vision. Machine learning techniques have been shown to be effective at image and speech classification, as well as at describing images with language (Karpathy 2016), but have not yet been used to describe potential actions. We propose a technique for creating a library of possible context-specific actions associated with 3D objects in immersive virtual worlds based on a novel dataset generated natively in virtual reality containing speech, image, gaze, and acceleration data. We will discuss the design and execution of a user study in virtual reality that enabled the collection and the development of this dataset. We will also discuss the development of a hybrid machine learning algorithm linking vision data with environmental affordances in natural language. Our findings demonstrate that it is possible to develop a model which can generate interpretable verbal descriptions of possible actions associated with recognized 3D objects within immersive VR environments. This suggests promising applications for more intuitive user interfaces through voice interaction within 3D environments. It also demonstrates the potential to apply vast bodies of embodied and semantic knowledge to enrich user interaction within VR environments. This technology would allow for applications such as expert knowledge annotation of 3D environments, complex verbal data querying and object manipulation in virtual spaces, and computer-generated, dynamic 3D object affordances and functionality during simulations.
Brief Report: A Pilot Study of the Use of a Virtual Reality Headset in Autism Populations
ERIC Educational Resources Information Center
Newbutt, Nigel; Sung, Connie; Kuo, Hung-Jen; Leahy, Michael J.; Lin, Chien-Chun; Tong, Boyang
2016-01-01
The application of virtual reality technologies (VRTs) for users with autism spectrum disorder (ASD) has been studied for decades. However, a gap remains in our understanding surrounding VRT head-mounted displays (HMDs). As newly designed HMDs have become commercially available (in this study the Oculus Rift[superscript TM]) the need to…
Virtual Reality Educational Tool for Human Anatomy.
Izard, Santiago González; Juanes Méndez, Juan A; Palomera, Pablo Ruisoto
2017-05-01
Virtual Reality is becoming widespread in our society within very different areas, from industry to entertainment. It has many advantages in education as well, since it allows visualizing almost any object or going anywhere in a unique way. We focus on medical education, and more specifically anatomy, where its use is especially interesting because it allows studying any structure of the human body by placing the user inside each one. By allowing virtual immersion in a body structure such as the interior of the cranium, stereoscopic vision goggles make these innovative teaching technologies a powerful tool for training in all areas of the health sciences. The aim of this study is to illustrate the teaching potential of applying Virtual Reality in the field of human anatomy, where it can be used as a tool for education in medicine. A Virtual Reality software application was developed as an educational tool. This technological procedure is based entirely on software that runs in stereoscopic goggles to give users the sensation of being in a virtual environment, clearly showing the different bones and foramina that make up the cranium, accompanied by audio explanations. In the results, the structure of the cranium is described in detail from both the inside and the outside. The importance of exhaustive morphological knowledge of the cranial fossae is further discussed, and its application to the design of microsurgery is also commented on.
Virtual reality and the unfolding of higher dimensions
NASA Astrophysics Data System (ADS)
Aguilera, Julieta C.
2006-02-01
As virtual/augmented reality evolves, the need for spaces that are responsive to structures independent of three-dimensional spatial constraints becomes apparent. The visual medium of computer graphics may also challenge these self-imposed constraints. If one can get used to how projections affect 3D objects in two dimensions, it may also be possible to compose a situation in which to get used to the variations that occur while moving through higher dimensions. The presented application is an enveloping landscape of concave and convex forms, which are determined by the orientation and displacement of the user in relation to a grid made of tesseracts (cubes in four dimensions). The interface accepts input from three-dimensional and four-dimensional transformations and smoothly displays such interactions in real time. The motion of the user becomes the graphic element, whereas the higher-dimensional grid references his/her position relative to it. The user learns how motion inputs affect the grid, recognizing a correlation between the input and the transformations. Mapping information to complex grids in virtual reality is valuable for engineers, artists, and users in general because navigation can be internalized like a dance pattern and further engage us to maneuver space in order to know and experience it.
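The 4D-to-3D rendering mentioned above can be illustrated with a small sketch (not the author's implementation): a tesseract vertex is rotated in one 4D plane and then perspective-projected down to three dimensions.

```python
# Illustrative sketch: rotate tesseract vertices in the x-w plane and
# project them from four dimensions to three. Parameters are arbitrary.
import numpy as np
from itertools import product

def rotate_xw(p, theta):
    """Rotate a 4D point in the x-w plane by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    x, y, z, w = p
    return np.array([c*x - s*w, y, z, s*x + c*w])

def project_4d_to_3d(p, camera_w=3.0):
    """Simple perspective projection along the w axis."""
    x, y, z, w = p
    scale = camera_w / (camera_w - w)
    return np.array([x, y, z]) * scale

# The 16 vertices of a unit tesseract, rotated and projected.
vertices = [np.array(v, dtype=float) for v in product((-0.5, 0.5), repeat=4)]
projected = [project_4d_to_3d(rotate_xw(v, np.pi / 6)) for v in vertices]
print(projected[0])
```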
The Input-Interface of Webcam Applied in 3D Virtual Reality Systems
ERIC Educational Resources Information Center
Sun, Huey-Min; Cheng, Wen-Lin
2009-01-01
Our research explores a virtual reality application based on a Web camera (Webcam) input interface. The interface can replace the mouse for controlling a user's directional intention by means of the frame-difference method. We divide each Webcam frame into nine grids and make use of background registration to compute the moving object. In order to…
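A rough sketch of the frame-difference idea described above, assuming OpenCV and a default webcam; the nine-grid mapping to directional intent is illustrative.

```python
# Frame-difference sketch: changed pixels between consecutive frames mark the
# moving object; the most active cell of a 3x3 grid stands in for direction.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)          # assumes a default webcam is available
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Pixels that changed between consecutive frames mark the moving object.
diff = cv2.absdiff(gray, prev_gray)
_, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

# Divide the frame into a 3x3 grid and pick the cell with the most motion.
h, w = motion.shape
cells = [motion[r*h//3:(r+1)*h//3, c*w//3:(c+1)*w//3].sum()
         for r in range(3) for c in range(3)]
print("most active grid cell:", int(np.argmax(cells)))
cap.release()
```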
NASA Astrophysics Data System (ADS)
Kersten, T. P.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A. P.
2017-05-01
Recent advances in contemporary Virtual Reality (VR) technologies are going to have a significant impact on everyday life. Through VR it is possible to virtually explore a computer-generated environment as a different reality, and to immerse oneself in the past or in a virtual museum without leaving the current real-life situation. For the ultimate VR experience, the user should see only the virtual world. Currently, the user must wear a VR headset which fits around the head and over the eyes to visually separate themselves from the physical world. Via the headset, images are fed to the eyes through two small lenses. Cultural heritage monuments are ideally suited both for thorough multi-dimensional geometric documentation and for realistic interactive visualisation in immersive VR applications. Additionally, the game industry offers tools for interactive visualisation of objects to motivate users to virtually visit objects and places. In this paper the generation of a virtual 3D model of the Selimiye mosque in the city of Edirne, Turkey, and its processing for data integration into the game engine Unity is presented. The project has been carried out as a co-operation between BİMTAŞ, a company of the Greater Municipality of Istanbul, Turkey, and the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg, Germany, to demonstrate an immersive and interactive visualisation using the new VR system HTC Vive. The workflow from data acquisition to VR visualisation, including the necessary programming for navigation, is described. Furthermore, the possible use (including simultaneous multi-user environments) of such a VR visualisation for a CH monument is discussed in this contribution.
A 3-D mixed-reality system for stereoscopic visualization of medical dataset.
Ferrari, Vincenzo; Megali, Giuseppe; Troia, Elena; Pietrabissa, Andrea; Mosca, Franco
2009-11-01
We developed a simple, light, and cheap 3-D visualization device based on mixed reality that can be used by physicians to see preoperative radiological exams in a natural way. The system allows the user to see stereoscopic "augmented images," which are created by mixing 3-D virtual models of anatomies obtained by processing preoperative volumetric radiological images (computed tomography or MRI) with real patient live images, grabbed by means of cameras. The interface of the system consists of a head-mounted display equipped with two high-definition cameras. The cameras are mounted in correspondence with the user's eyes and allow live images of the patient to be grabbed from the same point of view as the user. The system does not use any external tracker to detect movements of the user or the patient. The movements of the user's head and the alignment of the virtual patient with the real one are handled using machine vision methods applied to pairs of live images. Experimental results, concerning frame rate and alignment precision between virtual and real patient, demonstrate that the machine vision methods used for localization are appropriate for the specific application and that systems based on stereoscopic mixed reality are feasible and can be proficiently adopted in clinical practice.
A computer-based training system combining virtual reality and multimedia
NASA Technical Reports Server (NTRS)
Stansfield, Sharon A.
1993-01-01
Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment. The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.
3D Flow visualization in virtual reality
NASA Astrophysics Data System (ADS)
Pietraszewski, Noah; Dhillon, Ranbir; Green, Melissa
2017-11-01
By viewing fluid dynamic isosurfaces in virtual reality (VR), many of the issues associated with the rendering of three-dimensional objects on a two-dimensional screen can be addressed. In addition, viewing a variety of unsteady 3D data sets in VR opens up novel opportunities for education and community outreach. In this work, the vortex wake of a bio-inspired pitching panel was visualized using a three-dimensional structural model of Q-criterion isosurfaces rendered in virtual reality using the HTC Vive. Utilizing the Unity cross-platform gaming engine, a program was developed to allow the user to control and change this model's position and orientation in three-dimensional space. In addition to controlling the model's position and orientation, the user can "scroll" forward and backward in time to analyze the formation and shedding of vortices in the wake. Finally, the user can toggle between different quantities, while keeping the time step constant, to analyze flow parameter relationships at specific times during flow development. The information, data, or work presented herein was funded in part by an award from NYS Department of Economic Development (DED) through the Syracuse Center of Excellence.
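The Q-criterion behind these isosurfaces is a standard vortex-identification quantity, Q = 0.5 * (||Omega||^2 - ||S||^2); the following NumPy sketch (not the authors' code) evaluates it for a single velocity gradient tensor rather than a full flow field.

```python
# Worked sketch of the Q-criterion quantity behind the isosurfaces above.
import numpy as np

def q_criterion(grad_u):
    """Q-criterion from a 3x3 velocity gradient tensor du_i/dx_j."""
    S = 0.5 * (grad_u + grad_u.T)        # strain-rate tensor (symmetric part)
    Omega = 0.5 * (grad_u - grad_u.T)    # rotation tensor (antisymmetric part)
    return 0.5 * (np.sum(Omega**2) - np.sum(S**2))

# Example gradient tensor for a simple shear-plus-rotation flow.
grad_u = np.array([[0.0, 1.0, 0.0],
                   [-0.5, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])
print(q_criterion(grad_u))   # positive values indicate vortex-dominated regions
```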
Educational MOO: Text-Based Virtual Reality for Learning in Community. ERIC Digest.
ERIC Educational Resources Information Center
Turbee, Lonnie
MOO stands for "Multi-user domain, Object-Oriented." Early multi-user domains, or "MUDs," began as net-based dungeons-and-dragons type games, but MOOs have evolved from these origins to become some of cyberspace's most fascinating and engaging online communities. MOOs are social environments in a text-based virtual reality…
Mixed Methods for Mixed Reality: Understanding Users' Avatar Activities in Virtual Worlds
ERIC Educational Resources Information Center
Feldon, David F.; Kafai, Yasmin B.
2008-01-01
This paper examines the use of mixed methods for analyzing users' avatar-related activities in a virtual world. Server logs recorded keystroke-level activity for 595 participants over a six-month period in Whyville.net, an informal science website. Participants also completed surveys and participated in interviews regarding their experiences.…
NASA Astrophysics Data System (ADS)
Thoma, George R.
1996-03-01
The virtual digital library, a concept that is quickly becoming a reality, offers rapid and geography-independent access to stores of text, images, graphics, motion video and other datatypes. Furthermore, a user may move from one information source to another through hypertext linkages. The projects described here further the notion of such an information paradigm from an end user viewpoint.
A pilot study of surgical training using a virtual robotic surgery simulator.
Tergas, Ana I; Sheth, Sangini B; Green, Isabel C; Giuntoli, Robert L; Winder, Abigail D; Fader, Amanda N
2013-01-01
Our objectives were to compare the utility of learning a suturing task on the virtual reality da Vinci Skills Simulator versus the da Vinci Surgical System dry laboratory platform and to assess user satisfaction among novice robotic surgeons. Medical trainees were enrolled prospectively; one group trained on the virtual reality simulator, and the other group trained on the da Vinci dry laboratory platform. Trainees received pretesting and post-testing on the dry laboratory platform. Participants then completed an anonymous online user experience and satisfaction survey. We enrolled 20 participants. Mean pretest completion times did not significantly differ between the 2 groups. Training with either platform was associated with a similar decrease in mean time to completion (simulator platform group, 64.9 seconds [P = .04]; dry laboratory platform group, 63.9 seconds [P < .01]). Most participants (58%) preferred the virtual reality platform. The majority found the training "definitely useful" in improving robotic surgical skills (mean, 4.6) and would attend future training sessions (mean, 4.5). Training on the virtual reality robotic simulator or the dry laboratory robotic surgery platform resulted in significant improvements in time to completion and economy of motion for novice robotic surgeons. Although there was a perception that both simulators improved performance, there was a preference for the virtual reality simulator. Benefits unique to the simulator platform include autonomy of use, computerized performance feedback, and ease of setup. These features may facilitate more efficient and sophisticated simulation training above that of the conventional dry laboratory platform, without loss of efficacy.
Vision-based augmented reality system
NASA Astrophysics Data System (ADS)
Chen, Jing; Wang, Yongtian; Shi, Qi; Yan, Dayuan
2003-04-01
The most promising aspect of augmented reality lies in its ability to integrate the virtual world of the computer with the real world of the user. Namely, users can interact with the real world subjects and objects directly. This paper presents an experimental augmented reality system with a video see-through head-mounted device to display visual objects, as if they were lying on the table together with real objects. In order to overlay virtual objects on the real world at the right position and orientation, the accurate calibration and registration are most important. A vision-based method is used to estimate CCD external parameters by tracking 4 known points with different colors. It achieves sufficient accuracy for non-critical applications such as gaming, annotation and so on.
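A hedged sketch of this kind of pose estimation from four known points, assuming OpenCV's solvePnP; the 3D marker coordinates, pixel positions, and camera intrinsics below are made-up values for illustration.

```python
# Estimate camera external parameters from four known colored markers.
# All coordinates and intrinsics here are illustrative placeholders.
import cv2
import numpy as np

# Known 3D positions of the four colored markers on the table (metres).
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.2, 0.0, 0.0],
                          [0.2, 0.15, 0.0],
                          [0.0, 0.15, 0.0]], dtype=np.float32)

# Their detected pixel locations in the camera image.
image_points = np.array([[320, 240], [420, 238], [424, 318], [318, 322]],
                        dtype=np.float32)

# Approximate camera intrinsics (focal length in pixels, principal point).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
print("camera rotation (Rodrigues):", rvec.ravel())
print("camera translation:", tvec.ravel())
```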
Ambient Intelligence in Multimedia and Virtual Reality Environments for Rehabilitation
NASA Astrophysics Data System (ADS)
Benko, Attila; Cecilia, Sik Lanyi
This chapter presents a general overview of the use of multimedia and virtual reality in rehabilitation and in assistive and preventive healthcare. It deals with AI-based multimedia and virtual reality applications intended for use by medical doctors, nurses, special-education teachers, and other interested persons, and describes how multimedia and virtual reality can assist their work, including the ways they can help patients' everyday lives and their rehabilitation. In the second part of the chapter we present the Virtual Therapy Room (VTR), a realized application for aphasic patients that was created for practicing communication and expressing emotions in a group therapy setting. The VTR shows a room that contains a virtual therapist and four virtual patients (avatars). The avatars utilize their knowledge base in order to answer the questions of the user, providing an AI environment for rehabilitation. The user of the VTR is the aphasic patient, who has to solve the exercises. The picture relevant to the current task appears on the virtual blackboard. The patient answers the questions of the virtual therapist; the questions are about pictures describing an activity or an object at different levels. The patient can ask an avatar for the answer. If the avatar knows the answer, its emotion changes to happy instead of sad. The avatar expresses its emotions in different dimensions: its behavior, facial expression, voice tone, and response also change. The emotion system can be described as a deterministic finite automaton in which the states are emotions and the transition function is derived from the input-response reaction of an avatar. Natural-language-processing techniques were also implemented in order to establish high-quality human-computer interface windows for each of the avatars; aphasic patients are able to interact with the avatars via these interfaces. At the end of the chapter we outline possible future research directions.
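A minimal sketch of such an emotion automaton follows; the state names and input events are illustrative, not the VTR's actual vocabulary.

```python
# Deterministic finite automaton sketch: states are emotions, transitions are
# driven by whether an avatar knows the answer. Names are illustrative.
EMOTION_TRANSITIONS = {
    ("neutral", "knows_answer"): "happy",
    ("neutral", "unknown_answer"): "sad",
    ("sad", "knows_answer"): "happy",
    ("happy", "unknown_answer"): "sad",
}

def next_emotion(state, event):
    """Deterministic transition: the state is unchanged if no rule matches."""
    return EMOTION_TRANSITIONS.get((state, event), state)

state = "neutral"
for event in ("unknown_answer", "knows_answer"):
    state = next_emotion(state, event)
    print(event, "->", state)
```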
Development of a low-cost virtual reality workstation for training and education
NASA Technical Reports Server (NTRS)
Phillips, James A.
1996-01-01
Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, but the high cost of VR technology has limited its practical application to fields with big budgets, such as military combat simulation, commercial pilot training, and certain projects within the space program. However, in the last year there has been a revolution in the cost of VR technology. The speed of inexpensive personal computers has increased dramatically, especially with the introduction of the Pentium processor and the PCI bus for IBM-compatibles, and the cost of high-quality virtual reality peripherals has plummeted. The result is that many public schools, colleges, and universities can afford a PC-based workstation capable of running immersive virtual reality applications. My goal this summer was to assemble and evaluate such a system.
On the Usability and Likeability of Virtual Reality Games for Education: The Case of VR-ENGAGE
ERIC Educational Resources Information Center
Virvou, Maria; Katsionis, George
2008-01-01
Educational software games aim at increasing the students' motivation and engagement while they learn. However, if software games are targeted to school classrooms they have to be usable and likeable by all students. Usability of virtual reality games may be a problem because these games tend to have complex user interfaces so that they are more…
ERIC Educational Resources Information Center
Chen, Yu-Hsuan; Wang, Chang-Hwa
2018-01-01
Although research has indicated that augmented reality (AR)-facilitated instruction improves learning performance, further investigation of the usefulness of AR from a psychological perspective has been recommended. Researchers consider presence a major psychological effect when users are immersed in virtual reality environments. However, most…
Riva, Giuseppe; Carelli, Laura; Gaggioli, Andrea; Gorini, Alessandra; Vigna, Cinzia; Corsi, Riccardo; Faletti, Gianluca; Vezzadini, Luca
2009-01-01
At MMVR 2007 we presented NeuroVR (http://www.neurovr.org), a free virtual reality platform based on open-source software. The software allows non-expert users to adapt the content of 14 pre-designed virtual environments to the specific needs of the clinical or experimental setting. Following the feedback of the 700 users who downloaded the first version, we developed a new version, NeuroVR 1.5, that improves the possibility for the therapist to enhance the patient's feeling of familiarity and intimacy with the virtual scene by using external sounds, photos or videos. Specifically, the new version now includes full sound support and the ability to trigger external sounds and videos using the keyboard. The outcomes of different trials made using NeuroVR will be presented and discussed.
ERIC Educational Resources Information Center
Ehrlich, Justin
2010-01-01
The application of virtual reality is becoming ever more important as technology reaches new heights allowing virtual environments (VE) complete with global illumination. One successful application of virtual environments is educational interventions meant to treat individuals with autism spectrum disorder (ASD). VEs are effective with these…
Manually locating physical and virtual reality objects.
Chen, Karen B; Kimmel, Ryan A; Bartholomew, Aaron; Ponto, Kevin; Gleicher, Michael L; Radwin, Robert G
2014-09-01
In this study, we compared how users locate physical and equivalent three-dimensional images of virtual objects in a cave automatic virtual environment (CAVE) using the hand to examine how human performance (accuracy, time, and approach) is affected by object size, location, and distance. Virtual reality (VR) offers the promise to flexibly simulate arbitrary environments for studying human performance. Previously, VR researchers primarily considered differences between virtual and physical distance estimation rather than reaching for close-up objects. Fourteen participants completed manual targeting tasks that involved reaching for corners on equivalent physical and virtual boxes of three different sizes. Predicted errors were calculated from a geometric model based on user interpupillary distance, eye location, distance from the eyes to the projector screen, and object. Users were 1.64 times less accurate (p < .001) and spent 1.49 times more time (p = .01) targeting virtual versus physical box corners using the hands. Predicted virtual targeting errors were on average 1.53 times (p < .05) greater than the observed errors for farther virtual targets but not significantly different for close-up virtual targets. Target size, location, and distance, in addition to binocular disparity, affected virtual object targeting inaccuracy. Observed virtual box inaccuracy was less than predicted for farther locations, suggesting a possible influence of cues other than binocular vision. Human physical interaction with objects in VR for simulation, training, and prototyping involving reaching and manually handling virtual objects in a CAVE is more accurate than predicted when locating farther objects.
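The kind of geometric prediction described above can be sketched as follows: where a stereoscopically rendered point is perceived when the viewer's interpupillary distance (IPD) differs from the one assumed for rendering. All values are illustrative, not the study's parameters or model.

```python
# Geometric sketch in a 2D (x, z) slice: project a virtual point onto the
# screen for each rendering eye, then find where a viewer with a different
# IPD perceives the point. Numbers are placeholders.
import numpy as np

def screen_point(eye_x, point, screen_z):
    """x-coordinate where the ray from an eye at (eye_x, 0) through the
    point (px, pz) crosses the screen plane z = screen_z."""
    px, pz = point
    t = screen_z / pz
    return eye_x + t * (px - eye_x)

def perceived_depth(ipd_render, ipd_user, point, screen_z):
    """Depth at which the user's two viewing rays through the rendered
    screen points converge."""
    xl = screen_point(-ipd_render / 2, point, screen_z)
    xr = screen_point(+ipd_render / 2, point, screen_z)
    disparity = xr - xl
    # Similar triangles: convergence depth of the user's rays.
    return ipd_user * screen_z / (ipd_user - disparity)

# Virtual corner 0.8 m away, screen 1.2 m away, rendering IPD 0.065 m,
# user IPD 0.060 m.
print(perceived_depth(0.065, 0.060, (0.1, 0.8), 1.2))
```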
NASA Technical Reports Server (NTRS)
Benford, Steve; Bowers, John; Fahlen, Lennart E.; Greenhalgh, Chris; Snowdon, Dave
1994-01-01
This paper explores the issue of user embodiment within collaborative virtual environments. By user embodiment we mean the provision of users with appropriate body images so as to represent them to others and also to themselves. By collaborative virtual environments we mean multi-user virtual reality systems which support cooperative work (although we argue that the results of our exploration may also be applied to other kinds of collaborative systems). The main part of the paper identifies a list of embodiment design issues including: presence, location, identity, activity, availability, history of activity, viewpoint, action point, gesture, facial expression, voluntary versus involuntary expression, degree of presence, reflecting capabilities, manipulating the user's view of others, representation across multiple media, autonomous and distributed body parts, truthfulness and efficiency. Following this, we show how these issues are reflected in our own DIVE and MASSIVE prototype collaborative virtual environments.
Practical system for generating digital mixed reality video holograms.
Song, Joongseok; Kim, Changseob; Park, Hanhoon; Park, Jong-Il
2016-07-10
We propose a practical system that can effectively mix the depth data of real and virtual objects by using a Z buffer and can quickly generate digital mixed reality video holograms by using multiple graphic processing units (GPUs). In an experiment, we verify that real objects and virtual objects can be merged naturally in free viewing angles, and the occlusion problem is well handled. Furthermore, we demonstrate that the proposed system can generate mixed reality video holograms at 7.6 frames per second. Finally, the system performance is objectively verified by users' subjective evaluations.
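A small sketch of the per-pixel depth test implied by this Z-buffer mixing; array shapes and values are toy examples, and the multi-GPU hologram generation is not modeled here.

```python
# Z-buffer style composition of real and virtual layers: at each pixel,
# keep whichever surface is nearer to the camera. Toy data only.
import numpy as np

def mix_by_depth(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel depth test: take the layer with the smaller depth value."""
    nearer_virtual = virt_depth < real_depth
    return np.where(nearer_virtual[..., None], virt_rgb, real_rgb)

# Toy 2x2 example: the virtual object occludes the real scene in one pixel.
real_rgb = np.zeros((2, 2, 3), dtype=np.uint8)            # black real scene
virt_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)         # white virtual object
real_depth = np.array([[1.0, 1.0], [1.0, 1.0]])
virt_depth = np.array([[0.5, 2.0], [2.0, 2.0]])            # nearer only at (0, 0)
print(mix_by_depth(real_rgb, real_depth, virt_rgb, virt_depth)[:, :, 0])
```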
NASA Astrophysics Data System (ADS)
Lin, Chien-Liang; Su, Yu-Zheng; Hung, Min-Wei; Huang, Kuo-Cheng
2010-08-01
In recent years, Augmented Reality (AR)[1][2][3] has become very popular in universities and research organizations. AR technology has been widely used in Virtual Reality (VR) fields, such as sophisticated weapons, flight vehicle development, data model visualization, virtual training, entertainment, and the arts. AR can enhance the displayed output of a real environment with specific user-interaction functions or specific object recognition. It can be used in medical treatment, anatomy training, precision instrument casting, warplane guidance, engineering, and remote robot control. AR has many advantages over VR. The system developed here combines sensors, software, and imaging algorithms to make the experience feel real and tangible to users. The imaging algorithms include a gray-level method, image binarization, and a white-balance method in order to achieve accurate image recognition and overcome the effects of lighting.
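As a simple illustration of the binarization step listed among the imaging algorithms, assuming OpenCV; the threshold value and file names are placeholders.

```python
# Gray-level conversion followed by binarization of a captured marker frame.
# File names and threshold are illustrative placeholders.
import cv2

marker = cv2.imread("marker_frame.png")
gray = cv2.cvtColor(marker, cv2.COLOR_BGR2GRAY)               # gray-level method
_, binary = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY)  # image binarization
cv2.imwrite("marker_binary.png", binary)
```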
NASA Astrophysics Data System (ADS)
Chien, Shao-Chi; Chung, Yu-Wei; Lin, Yi-Hsuan; Huang, Jun-Yi; Chang, Jhih-Ting; He, Cai-Ying; Cheng, Yi-Wen
2012-04-01
This study uses 3D virtual reality technology to create the "Mackay campus of the environmental education and digital cultural 3D navigation system" for local historical sites in the Tamsui (Hoba) area, in hopes of providing tourism information and navigation through historical sites using a 3D navigation system. We used Auto CAD, Sketch Up, and SpaceEyes 3D software to construct the virtual reality scenes and create the school's historical sites, such as the House of Reverends, the House of Maidens, the Residence of Mackay, and the Education Hall. We used this technology to complete the environmental education and digital cultural Mackay campus. The platform we established can indeed achieve the desired function of providing tourism information and historical site navigation. The interactive multimedia style and the presentation of the information will allow users to obtain a direct information response. In addition to showing the external appearances of buildings, the navigation platform can also allow users to enter the buildings to view lifelike scenes and textual information related to the historical sites. The historical sites are designed according to their actual size, which gives users a more realistic feel. In terms of the navigation route, the navigation system does not force users along a fixed route, but instead allows users to freely control the route they would like to take to view the historical sites on the platform.
Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life
NASA Astrophysics Data System (ADS)
Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia
2011-03-01
Users of immersive virtual reality environments have reported a wide variety of side and after effects including the confusion of characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and real can be turned around to explore the possibilities for immersion with minimal technological support in virtual world group training simulations. This paper will describe observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and Veterans Administration San Diego Healthcare System (VASDHS) to develop trainings for nurses, doctors and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, moments of the confusion of boundaries between real and virtual, we can better understand methods for creating immersion. I will use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will examine moments of training to learn the software interface, moments within the drill and interviews after the drill.
Mobile devices, Virtual Reality, Augmented Reality, and Digital Geoscience Education.
NASA Astrophysics Data System (ADS)
Crompton, H.; De Paor, D. G.; Whitmeyer, S. J.; Bentley, C.
2016-12-01
Mobile devices are playing an increasing role in geoscience education. Affordances include instructor-student communication and class management in large classrooms, virtual and augmented reality applications, digital mapping, and crowd-sourcing. Mobile technologies have spawned the sub field of mobile learning or m-learning, which is defined as learning across multiple contexts, through social and content interactions. Geoscientists have traditionally engaged in non-digital mobile learning via fieldwork, but digital devices are greatly extending the possibilities, especially for non-traditional students. Smartphones and tablets are the most common devices but smart glasses such as Pivothead enable live streaming of a first-person view (see for example, https://youtu.be/gWrDaYP5w58). Virtual reality headsets such as Google Cardboard create an immersive virtual field experience and digital imagery such as GigaPan and Structure from Motion enables instructors and/or students to create virtual specimens and outcrops that are sharable across the globe. Whereas virtual reality (VR) replaces the real world with a virtual representation, augmented reality (AR) overlays digital data on the live scene visible to the user in real time. We have previously reported on our use of the AR application called FreshAiR for geoscientific "egg hunts." The popularity of Pokémon Go demonstrates the potential of AR for mobile learning in the geosciences.
A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills
NASA Astrophysics Data System (ADS)
Choi, Kup-Sze
This paper presents a virtual reality based simulator prototype for learning phacoemulsification in cataract surgery, with a focus on the skills required for making a cross-shaped trench in the cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools. A haptic device is also used as the 3D user interface. Phaco-sculpting is simulated by interactively deleting the constituent tetrahedrons of the lens model. Collisions between the virtual probe and the lens are effectively identified by partitioning the space containing the lens hierarchically with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing a trainee's performance in an objective manner. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
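The octree-based collision culling mentioned above might be sketched as follows; for brevity this uses a flat, fixed-depth spatial partition over tetrahedron centroids rather than a full hierarchical octree, and all data are synthetic.

```python
# Spatial-partition sketch: bin tetrahedron centroids into cells so only
# nearby cells are tested against the probe tip. Data are synthetic.
import numpy as np
from collections import defaultdict

def cell_index(point, origin, size, depth):
    """Index of the leaf cell (at the given subdivision depth) containing a point."""
    n = 2 ** depth
    ijk = np.clip(((np.asarray(point) - origin) / size * n).astype(int), 0, n - 1)
    return tuple(ijk)

def build_partition(centroids, origin, size, depth=3):
    """Map each leaf cell to the indices of tetrahedra whose centroid lies in it."""
    cells = defaultdict(list)
    for idx, c in enumerate(centroids):
        cells[cell_index(c, origin, size, depth)].append(idx)
    return cells

def candidates_near_probe(cells, probe_tip, origin, size, depth=3):
    """Return tetrahedra in the probe's cell and its 26 neighbouring cells."""
    n = 2 ** depth
    i, j, k = cell_index(probe_tip, origin, size, depth)
    hits = []
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            for dk in (-1, 0, 1):
                cell = (i + di, j + dj, k + dk)
                if all(0 <= c < n for c in cell):
                    hits.extend(cells.get(cell, []))
    return hits

# Toy lens: random tetrahedron centroids in a 1x1x1 box.
rng = np.random.default_rng(0)
centroids = rng.random((500, 3))
tree = build_partition(centroids, origin=np.zeros(3), size=1.0)
print(len(candidates_near_probe(tree, probe_tip=(0.5, 0.5, 0.5),
                                origin=np.zeros(3), size=1.0)))
```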
Nunnerley, Joanne; Gupta, Swati; Snell, Deborah; King, Marcus
2017-05-01
A user-centred design was used to develop and test the feasibility of an immersive 3D virtual reality wheelchair training tool for people with spinal cord injury (SCI). A Wheelchair Training System was designed and modelled using the Oculus Rift headset and a Dynamic Control wheelchair joystick. The system was tested by clinicians and expert wheelchair users with SCI. Data from focus groups and individual interviews were analysed using a general inductive approach to thematic analysis. Four themes emerged: Realistic System, which described the advantages of a realistic virtual environment; a Wheelchair Training System, which described participants' thoughts on the wheelchair training applications; Overcoming Resistance to Technology, the obstacles to introducing technology within the clinical setting; and Working outside the Rehabilitation Bubble which described the protective hospital environment. The Oculus Rift Wheelchair Training System has the potential to provide a virtual rehabilitation setting which could allow wheelchair users to learn valuable community wheelchair use in a safe environment. Nausea appears to be a side effect of the system, which will need to be resolved before this can be a viable clinical tool. Implications for Rehabilitation Immersive virtual reality shows promising benefit for wheelchair training in a rehabilitation setting. Early engagement with consumers can improve product development.
Meal-Maker: A Virtual Meal Preparation Environment for Children with Cerebral Palsy
ERIC Educational Resources Information Center
Kirshner, Sharon; Weiss, Patrice L.; Tirosh, Emanuel
2011-01-01
Virtual reality (VR) technology enables evaluation and practice of specific skills in a motivating, user-friendly and safe way. The implementation of virtual game environments within clinical settings has increased substantially in recent years. However, the psychometric properties and feasibility of many applications have not been fully…
Seamless 3D interaction for virtual tables, projection planes, and CAVEs
NASA Astrophysics Data System (ADS)
Encarnacao, L. M.; Bimber, Oliver; Schmalstieg, Dieter; Barton, Robert J., III
2000-08-01
The Virtual Table presents stereoscopic graphics to a user in a workbench-like setting. This device shares with other large-screen display technologies (such as data walls and surround-screen projection systems) the lack of human-centered, unencumbered user interfaces and 3D interaction technologies. Such shortcomings present severe limitations to the application of virtual reality (VR) technology to time-critical applications as well as to employment scenarios that involve heterogeneous groups of end-users without high levels of computer familiarity and expertise. Traditionally, such employment scenarios are common in planning-related application areas such as mission rehearsal and command and control. For these applications, a high degree of flexibility with respect to the system requirements (display and I/O devices), as well as the ability to seamlessly and intuitively switch between different interaction modalities and devices, is sought. Conventional VR techniques may be insufficient to meet this challenge. This paper presents novel approaches for human-centered interfaces to Virtual Environments, focusing on the Virtual Table display device. It introduces new paradigms for 3D interaction in virtual environments (VE) for a variety of application areas based on pen-and-clipboard, mirror-in-hand, and magic-lens metaphors, and introduces new concepts for combining VR and augmented reality (AR) techniques. It finally describes approaches toward hybrid and distributed multi-user interaction environments and concludes by hypothesizing on possible use cases for defense applications.
NASA Technical Reports Server (NTRS)
2002-01-01
Ames Research Center granted Reality Capture Technologies (RCT), Inc., a license to further develop NASA's Mars Map software platform. The company incorporated NASA's innovation into software that uses the Virtual Plant Model (VPM)(TM) to structure, modify, and implement the construction sites of industrial facilities, as well as to develop, validate, and train operators on procedures. The VPM orchestrates the exchange of information between engineering, production, and business transaction systems. This enables users to simulate, control, and optimize work processes while increasing the reliability of critical business decisions. Engineers can complete the construction process and test various aspects of it in virtual reality before building the actual structure. With virtual access to and simulation of the construction site, project personnel can manage, control access to, and respond to changes on complex construction projects more effectively. Engineers can also create operating procedures, training, and documentation. Virtual Plant Model(TM) is a trademark of Reality Capture Technologies, Inc.
Towards Gesture-Based Multi-User Interactions in Collaborative Virtual Environments
NASA Astrophysics Data System (ADS)
Pretto, N.; Poiesi, F.
2017-11-01
We present a virtual reality (VR) setup that enables multiple users to participate in collaborative virtual environments and interact via gestures. A collaborative VR session is established through a network of users that is composed of a server and a set of clients. The server manages the communication amongst clients and is created by one of the users. Each user's VR setup consists of a Head Mounted Display (HMD) for immersive visualisation, a hand tracking system to interact with virtual objects and a single-hand joypad to move in the virtual environment. We use Google Cardboard as an HMD for the VR experience and a Leap Motion for hand tracking, thus making our solution low cost. We evaluate our VR setup through a forensics use case, where real-world objects pertaining to a simulated crime scene are included in a VR environment, acquired using a smartphone-based 3D reconstruction pipeline. Users can interact using virtual gesture-based tools such as pointers and rulers.
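The session architecture described above (one user hosts a server that manages communication amongst the clients) can be illustrated with a minimal relay pattern. The sketch below is a hedged Python approximation: the port, transport (plain TCP sockets) and message format are assumptions, not the authors' protocol.

```python
# Hedged sketch of the relay pattern: one user hosts the server, other
# clients connect, and every pose/gesture update a client sends is
# forwarded to all other clients. Port and payload format are assumed.
import socket
import threading

HOST, PORT = "0.0.0.0", 5000  # assumed values
clients = []
lock = threading.Lock()

def handle(conn):
    with lock:
        clients.append(conn)
    try:
        while True:
            data = conn.recv(1024)     # e.g. a serialized head/hand pose update
            if not data:
                break
            with lock:
                for other in clients:
                    if other is not conn:
                        other.sendall(data)   # relay to every other client
    finally:
        with lock:
            clients.remove(conn)
        conn.close()

def serve():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((HOST, PORT))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    serve()
```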
Augmented Reality for Close Quarters Combat
None
2018-01-16
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). The system uses wearable augmented reality to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.
Learning Anatomy via Mobile Augmented Reality: Effects on Achievement and Cognitive Load
ERIC Educational Resources Information Center
Küçük, Sevda; Kapakin, Samet; Göktas, Yüksel
2016-01-01
Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allows users to interact with the…
An intelligent virtual human system for providing healthcare information and support.
Rizzo, Albert A; Lange, Belinda; Buckwalter, John G; Forbell, Eric; Kim, Julia; Sagae, Kenji; Williams, Josh; Rothbaum, Barbara O; Difede, JoAnn; Reger, Greg; Parsons, Thomas; Kenny, Patrick
2011-01-01
Over the last 15 years, a virtual revolution has taken place in the use of Virtual Reality simulation technology for clinical purposes. Shifts in the social and scientific landscape have now set the stage for the next major movement in Clinical Virtual Reality with the "birth" of intelligent virtual humans. Seminal research and development has appeared in the creation of highly interactive, artificially intelligent and natural language capable virtual human agents that can engage real human users in a credible fashion. No longer at the level of a prop to add context or minimal faux interaction in a virtual world, virtual humans can be designed to perceive and act in a 3D virtual world, engage in spoken dialogues with real users and can be capable of exhibiting human-like emotional reactions. This paper will present an overview of the SimCoach project that aims to develop virtual human support agents to serve as online guides for promoting access to psychological healthcare information and for assisting military personnel and family members in breaking down barriers to initiating care. The SimCoach experience is being designed to attract and engage military Service Members, Veterans and their significant others who might not otherwise seek help with a live healthcare provider. It is expected that this experience will motivate users to take the first step--to empower themselves to seek advice and information regarding their healthcare and general personal welfare and encourage them to take the next step towards seeking more formal resources if needed.
Spyridonis, Fotios; Gronli, Tor-Morten; Hansen, Jarle; Ghinea, Gheorghita
2012-01-01
Pain constitutes an important medical concern that can have severe implications for a wheelchair user's quality of life. Results from studies indicate that pain is a common problem in this group of individuals, with reported frequencies of "always" (12%) and "every day" (33%). This incidence signifies the need for more applicable and effective clinical pain management tools. As a result, in this paper we present an Android application (PainDroid) that has been enhanced with Virtual Reality (VR) technology for the purpose of improving the management of pain. Our evaluation with a group of wheelchair users revealed that PainDroid demonstrated high usability among this population, and it is foreseen that it can make an important contribution to research on the assessment and management of pain.
Usability evaluation of low-cost virtual reality hand and arm rehabilitation games.
Seo, Na Jin; Arun Kumar, Jayashree; Hur, Pilwon; Crocher, Vincent; Motawar, Binal; Lakshminarayanan, Kishor
2016-01-01
The emergence of lower-cost motion tracking devices enables home-based virtual reality rehabilitation activities and increased accessibility to patients. Currently, little documentation on patients' expectations for virtual reality rehabilitation is available. This study surveyed 10 people with stroke for their expectations of virtual reality rehabilitation games. This study also evaluated the usability of three lower-cost virtual reality rehabilitation games using a survey and House of Quality analysis. The games (kitchen, archery, and puzzle) were developed in the laboratory to encourage coordinated finger and arm movements. Lower-cost motion tracking devices, the P5 Glove and Microsoft Kinect, were used to record the movements. People with stroke were found to desire motivating and easy-to-use games with clinical insights and encouragement from therapists. The House of Quality analysis revealed that the games should be improved by obtaining evidence for clinical effectiveness, including clinical feedback regarding improving functional abilities, adapting the games to the user's changing functional ability, and improving usability of the motion-tracking devices. This study reports the expectations of people with stroke for rehabilitation games and usability analysis that can help guide development of future games.
NASA Astrophysics Data System (ADS)
Ribeiro, Allan; Santos, Helen
With the advent of new information and communication technologies (ICTs), communicative interaction changes people's ways of being and acting, and at the same time changes the nature of work activities related to education. Among the possibilities provided by the advancement of computational resources are virtual reality (VR) and augmented reality (AR), which stand out as new forms of information visualization in computer applications. While VR allows user interaction with a virtual environment that is entirely computer generated, in AR virtual images are inserted into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies are able to express representations of reality or of the imagination, such as systems at the nanoscale and in low dimensionality, making it imperative to explore, in the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents computer applications of virtual and augmented reality developed with the use of modeling and simulation in computational approaches to topics related to nanoscience and nanotechnology, articulated with innovative pedagogical practices.
NASA Technical Reports Server (NTRS)
Dumas, Joseph D., II
1998-01-01
Several virtual reality I/O peripherals were successfully configured and integrated as part of the author's 1997 Summer Faculty Fellowship work. These devices, which were not supported by the developers of VR software packages, use new software drivers and configuration files developed by the author to allow them to be used with simulations developed using those software packages. The successful integration of these devices has added significant capability to the ANVIL lab at MSFC. In addition, the author was able to complete the integration of a networked virtual reality simulation of the Space Shuttle Remote Manipulator System docking Space Station modules which was begun as part of his 1996 Fellowship. The successful integration of this simulation demonstrates the feasibility of using VR technology for ground-based training as well as on-orbit operations.
Active Learning through the Use of Virtual Environments
ERIC Educational Resources Information Center
Mayrose, James
2012-01-01
Immersive Virtual Reality (VR) has seen explosive growth over the last decade. Immersive VR attempts to give users the sensation of being fully immersed in a synthetic environment by providing them with 3D hardware, and allowing them to interact with objects in virtual worlds. The technology is extremely effective for learning and exploration, and…
Virtual Environments Supporting Learning and Communication in Special Needs Education
ERIC Educational Resources Information Center
Cobb, Sue V. G.
2007-01-01
Virtual reality (VR) describes a set of technologies that allow users to explore and experience 3-dimensional computer-generated "worlds" or "environments." These virtual environments can contain representations of real or imaginary objects on a small or large scale (from modeling of molecular structures to buildings, streets, and scenery of a…
ERIC Educational Resources Information Center
Panettieri, Joseph C.
2007-01-01
Across the globe, progressive universities are embracing any number of MUVEs (multi-user virtual environments), 3D environments, and "immersive" virtual reality tools. And within the next few months, several universities are expected to test so-called "telepresence" videoconferencing systems from Cisco Systems and other leading…
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various visualization and interaction modes.
Intelligent Motion and Interaction Within Virtual Environments
NASA Technical Reports Server (NTRS)
Ellis, Stephen R. (Editor); Slater, Mel (Editor); Alexander, Thomas (Editor)
2007-01-01
What makes virtual actors and objects in virtual environments seem real? How can the illusion of their reality be supported? What sorts of training or user-interface applications benefit from realistic user-environment interactions? These are some of the central questions that designers of virtual environments face. To be sure, simulation realism is not necessarily the major, or even a required, goal of a virtual environment intended to communicate specific information. But for some applications in entertainment, marketing, or aspects of vehicle simulation training, realism is essential. The following chapters will examine how a sense of truly interacting with dynamic, intelligent agents may arise in users of virtual environments. These chapters are based on presentations at the London conference on Intelligent Motion and Interaction within Virtual Environments, which was held at University College London, U.K., 15-17 September 2003.
Virtual Application of Darul Arif Palace from Serdang Sultanate using Virtual Reality
NASA Astrophysics Data System (ADS)
Syahputra, M. F.; Annisa, T.; Rahmat, R. F.; Muchtar, M. A.
2017-01-01
Serdang Sultanate is one of the Malay sultanates in Sumatera Utara. In the 18th century, many Malay aristocracies developed in Sumatera Utara. A social revolution took place in 1946: many sultanates were overthrown and members of the PKI (Communist Party of Indonesia) carried out mass killings of members of the sultanate families. As a result of this incident, much cultural and historical heritage was destroyed. The integration of heritage preservation and digital technology has become a recent trend. Digital technology is not only able to record and preserve detailed documents and information about heritage, but can also effectively bring added value. In this research, polygonal modelling techniques from 3D modelling technology are used to reconstruct the Darul Arif Palace of the Serdang Sultanate. After modelling the palace, it is combined with virtual reality technology to allow users to explore the palace and its surrounding environment. Virtual reality technology is the simulation of real objects in a virtual world. The result of this research is a virtual reality application that can run using a head-mounted display.
A game based virtual campus tour
NASA Astrophysics Data System (ADS)
Razia Sulthana, A.; Arokiaraj Jovith, A.; Saveetha, D.; Jaithunbi, A. K.
2018-04-01
The aim of this application is to create a virtual reality game whose purpose is to showcase the facilities of SRM University in an entertaining manner. The virtual prototype of the institution is deployed in a game engine, which makes it easy for students to look over the infrastructure, thereby reducing resource utilization. Time and money are the resources of concern today. The virtual campus application assists the end user even from a remote location. The virtual world simulates the exact location, and thus the application virtually transports the user to the university with the help of a VR headset. It is a dynamic application wherein the user can move in any direction. The VR headset provides an interface for gyroscope input, which is used to start and stop movement. Virtual Campus is size efficient, occupies minimal space, and scales across mobile devices. This gaming application helps the end user explore the campus while having fun. It is a user-friendly application that supports users worldwide.
Virtual reality and robotics for stroke rehabilitation: where do we go from here?
Wade, Eric; Winstein, Carolee J
2011-01-01
Promoting functional recovery after stroke requires collaborative and innovative approaches to neurorehabilitation research. Task-oriented training (TOT) approaches that include challenging, adaptable, and meaningful activities have led to successful outcomes in several large-scale multisite definitive trials. This, along with recent technological advances of virtual reality and robotics, provides a fertile environment for furthering clinical research in neurorehabilitation. Both virtual reality and robotics make use of multimodal sensory interfaces to affect human behavior. In the therapeutic setting, these systems can be used to quantitatively monitor, manipulate, and augment the users' interaction with their environment, with the goal of promoting functional recovery. This article describes recent advances in virtual reality and robotics and the synergy with best clinical practice. Additionally, we describe the promise shown for automated assessments and in-home activity-based interventions. Finally, we propose a broader approach to ensuring that technology-based assessment and intervention complement evidence-based practice and maintain a patient-centered perspective.
Optoelectronics technologies for Virtual Reality systems
NASA Astrophysics Data System (ADS)
Piszczek, Marek; Maciejewski, Marcin; Pomianek, Mateusz; Szustakowski, Mieczysław
2017-08-01
Solutions in the field of virtual reality are strongly associated with optoelectronic technologies. This applies both to the design process and to the operation of VR applications. Technologies such as 360-degree cameras and 3D scanners significantly improve the design work. Moreover, HMD displays with a high field of view, optoelectronic motion capture systems, and 3D cameras guarantee an extraordinary experience in immersive VR applications. This article reviews selected technologies from the perspective of their use in a broadly defined process of creating and implementing solutions for virtual reality. It also covers the ability to create, modify and adapt new approaches developed in the team's own work (a SteamVR tracker). Most of the presented examples are effectively used by the authors to create different VR applications. The use of optoelectronic technology in virtual reality is presented in terms of the design and operation of the system as well as with reference to specific applications. Designers and users of VR systems should take a close look at new optoelectronic solutions, as they can significantly contribute to increased work efficiency and offer completely new opportunities for experiencing the virtual world.
Levy
1996-08-01
New interactive computer technologies are having a significant influence on medical education, training, and practice. The newest innovation in computer technology, virtual reality, allows an individual to be immersed in a dynamic computer-generated, three-dimensional environment and can provide realistic simulations of surgical procedures. A new virtual reality hysteroscope passes through a sensing device that synchronizes movements with a three-dimensional model of a uterus. Force feedback is incorporated into this model, so the user actually experiences the collision of an instrument against the uterine wall or the sensation of the resistance or drag of a resectoscope as it cuts through a myoma in a virtual environment. A variety of intrauterine pathologies and procedures are simulated, including hyperplasia, cancer, resection of a uterine septum, polyp, or myoma, and endometrial ablation. This technology will be incorporated into comprehensive training programs that will objectively assess hand-eye coordination and procedural skills. It is possible that by incorporating virtual reality into hysteroscopic training programs, a decrease in the learning curve and the number of complications presently associated with the procedures may be realized. Prospective studies are required to assess these potential benefits.
Using Virtual Reality to Improve Walking Post-Stroke: Translation to Individuals with Diabetes
Deutsch, Judith E
2011-01-01
Use of virtual reality (VR) technology to improve walking for people post-stroke has been studied for its clinical application since 2004. The hardware and software used to create these systems has varied but has predominantly been constituted by projected environments with users walking on treadmills. Transfer of training from the virtual environment to real-world walking has modest but positive research support. Translation of the research findings to clinical practice has been hampered by commercial availability and costs of the VR systems. Suggestions for how the work for individuals post-stroke might be applied and adapted for individuals with diabetes and other impaired ambulatory conditions include involvement of the target user groups (both practitioners and clients) early in the design and integration of activity and education into the systems. PMID:21527098
Multi-modal virtual environment research at Armstrong Laboratory
NASA Technical Reports Server (NTRS)
Eggleston, Robert G.
1995-01-01
One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.
Zibrek, Katja; Kokkinara, Elena; Mcdonnell, Rachel
2018-04-01
Virtual characters that appear almost photo-realistic have been shown to induce negative responses from viewers in traditional media, such as film and video games. This effect, described as the uncanny valley, is the reason why realism is often avoided when the aim is to create an appealing virtual character. In Virtual Reality, there have been few attempts to investigate this phenomenon and the implications of rendering virtual characters with high levels of realism on user enjoyment. In this paper, we conducted a large-scale experiment on over one thousand members of the public in order to gather information on how virtual characters are perceived in interactive virtual reality games. We were particularly interested in whether different render styles (realistic, cartoon, etc.) would directly influence appeal, or if a character's personality was the most important indicator of appeal. We used a number of perceptual metrics such as subjective ratings, proximity, and attribution bias in order to test our hypothesis. Our main result shows that affinity towards virtual characters is a complex interaction between the character's appearance and personality, and that realism is in fact a positive choice for virtual characters in virtual reality.
Research on virtual Guzheng based on Kinect
NASA Astrophysics Data System (ADS)
Li, Shuyao; Xu, Kuangyi; Zhang, Heng
2018-05-01
There is much research on virtual instruments, but little on classical Chinese instruments, and the techniques used are very limited. This paper uses Unity 3D and a Kinect camera, combined with virtual reality technology and a gesture recognition method, to design a virtual playing system for the Guzheng, a traditional Chinese musical instrument, with a demonstration function. In this paper, the real scene obtained by the Kinect camera is fused with the virtual Guzheng in Unity 3D. The depth data obtained by the Kinect and the Suzuki85 algorithm are used to recognize the relative position of the user's right hand and the virtual Guzheng, and the hand gesture of the user is recognized by the Kinect.
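The "Suzuki85" algorithm referenced above is the Suzuki-Abe (1985) border-following method, which is what OpenCV's findContours implements. The hedged Python sketch below shows one plausible way to localise the playing hand from a Kinect depth frame and map it to the nearest virtual string; the depth thresholds, string positions, and synthetic test frame are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of hand localisation from a depth frame via contour detection.
import cv2
import numpy as np

def hand_position(depth_mm: np.ndarray, near=500, far=900):
    """Return (x, y) pixel centroid of the largest blob in the near depth range."""
    mask = cv2.inRange(depth_mm.astype(np.uint16), near, far)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

# Example: map the hand centroid to the nearest virtual string
# (21 strings, x-coordinates in pixel space are assumed).
string_x = np.linspace(100, 540, 21)
depth = np.full((480, 640), 1200, np.uint16)       # synthetic background
cv2.circle(depth, (320, 240), 30, 700, -1)          # fake hand blob for the demo
pos = hand_position(depth)
if pos is not None:
    plucked = int(np.argmin(np.abs(string_x - pos[0])))
    print("nearest string index:", plucked)
```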
Mobile viewer system for virtual 3D space using infrared LED point markers and camera
NASA Astrophysics Data System (ADS)
Sakamoto, Kunio; Taneji, Shoto
2006-09-01
The authors have developed a 3D workspace system using collaborative imaging devices. A stereoscopic display enables this system to project 3D information. In this paper, we describe the position detecting system for a see-through 3D viewer. A 3D display system is a useful technology for virtual reality, mixed reality and augmented reality. We have researched spatial imaging and interaction systems, and have previously proposed 3D displays using a slit parallax barrier, a lenticular screen and holographic optical elements (HOEs) for displaying active images 1)2)3)4). The purpose of this paper is to propose an interactive system using these 3D imaging technologies. The observer can view virtual images in the real world when the user watches the screen of a see-through 3D viewer. The goal of our research is to build a display system as follows: when users see the real world through the mobile viewer, the display system gives them virtual 3D images that float in the air, and the observers can touch these floating images and interact with them, for example so that children can play with virtual modelling clay. The key technologies of this system are the position recognition system and the spatial imaging display. The 3D images are presented by the improved parallax barrier 3D display. Here the authors discuss the measuring method for the mobile viewer using infrared LED point markers and a camera in the 3D workspace (augmented reality world). The authors show the geometric analysis of the proposed measuring method, which is the simplest method using a single camera rather than a stereo camera, and the results of our viewer system.
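The paper's own geometric analysis of the single-camera measurement is not reproduced in the abstract; as a stand-in, the hedged sketch below recovers a viewer pose from known infrared LED marker positions using a standard perspective-n-point (PnP) solution. All marker coordinates and camera intrinsics are illustrative assumptions.

```python
# Hedged sketch: recovering the mobile viewer's pose from a single camera
# observing infrared LED point markers at known positions in the workspace.
import numpy as np
import cv2

# Known 3D positions of the LED markers in the workspace frame (metres), assumed.
markers_3d = np.array([[0.0, 0.0, 0.0],
                       [0.2, 0.0, 0.0],
                       [0.2, 0.1, 0.0],
                       [0.0, 0.1, 0.0]], dtype=np.float64)

# Their detected 2D image coordinates (pixels), e.g. from blob detection
# on an IR-filtered frame; values invented for the example.
markers_2d = np.array([[310.0, 250.0],
                       [420.0, 252.0],
                       [421.0, 190.0],
                       [311.0, 188.0]], dtype=np.float64)

# Assumed pinhole intrinsics of the camera (fx, fy, cx, cy).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(markers_3d, markers_2d, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    camera_pos = (-R.T @ tvec).ravel()   # viewer (camera) position in the workspace
    print("viewer position:", camera_pos)
```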
The Use of Virtual Reality in Patients with Eating Disorders: Systematic Review
Clus, Damien; Larsen, Mark Erik; Lemey, Christophe
2018-01-01
Background Patients with eating disorders are characterized by pathological eating habits and a tendency to overestimate their weight and body shape. Virtual reality shows promise for the evaluation and management of patients with eating disorders. This technology, when accepted by this population, allows immersion in virtual environments, assessment, and therapeutic approaches, by exposing users to high-calorie foods or changes in body shape. Objective To better understand the value of virtual reality, we conducted a review of the literature, including clinical studies proposing the use of virtual reality for the evaluation and management of patients with eating disorders. Methods We searched PubMed, PsycINFO, ScienceDirect, the Cochrane Library, Scopus, and Web of Science up to April 2017. We created the list of keywords based on two domains: virtual reality and eating disorders. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify, select, and critically appraise relevant research while minimizing bias. Results The initial database searches identified 311 articles, 149 of which we removed as duplicates. We analyzed the resulting set of 26 unique studies that met the inclusion criteria. Of these, 8 studies were randomized controlled trials, 13 were nonrandomized studies, and 5 were clinical trials with only 1 participant. Most articles focused on clinical populations (19/26, 73%), with the remainder reporting case-control studies (7/26, 27%). Most of the studies used visual immersive equipment (16/26, 62%) with a head-mounted display (15/16, 94%). Two main areas of interest emerged from these studies: virtual work on patients’ body image (7/26, 27%) and exposure to virtual food stimuli (10/26, 38%). Conclusions We conducted a broad analysis of studies on the use of virtual reality in patients with eating disorders. This review of the literature showed that virtual reality is an acceptable and promising therapeutic tool for patients with eating disorders. PMID:29703715
Using virtual reality to assess user experience.
Rebelo, Francisco; Noriega, Paulo; Duarte, Emília; Soares, Marcelo
2012-12-01
The aim of this article is to discuss how user experience (UX) evaluation can benefit from the use of virtual reality (VR). UX is usually evaluated in laboratory settings. However, considering that UX occurs as a consequence of the interaction between the product, the user, and the context of use, the assessment of UX can benefit from a more ecological test setting. VR provides the means to develop realistic-looking virtual environments with the advantage of allowing greater control of the experimental conditions while granting good ecological validity. The methods used to evaluate UX, as well as their main limitations, are identified. The current VR equipment and its potential applications (as well as its limitations and drawbacks) to overcome some of the limitations in the assessment of UX are highlighted. The relevance of VR for UX studies is discussed, and a VR-based framework for evaluating UX is presented. UX research may benefit from a VR-based methodology in the scopes of user research (e.g., assessment of users' expectations derived from their lifestyles) and human-product interaction (e.g., assessment of users' emotions since the first moment of contact with the product and then during the interaction). This article provides knowledge to researchers and professionals engaged in the design of technological interfaces about the usefulness of VR in the evaluation of UX.
Design Virtual Reality Scene Roam for Tour Animations Base on VRML and Java
NASA Astrophysics Data System (ADS)
Cao, Zaihui; hu, Zhongyan
Virtual reality has been involved in a wide range of academic and commercial applications. It can give users a natural feeling of the environment by creating realistic virtual worlds. Implementing a virtual tour through a model of a tourist area on the web has become fashionable. In this paper, we present a web-based application that allows a user to walk through, see, and interact with a fully three-dimensional model of the tourist area. Issues regarding navigation and disorientation are addressed, and we suggest a combination of the metro map and an intuitive navigation system. Finally, we present a prototype which implements our ideas. The application of VR techniques integrates the visualization and animation of three-dimensional modelling into landscape analysis. The use of the VRML format makes it possible to obtain views of the 3D model and to explore it in real time. This is an important goal for the spatial information sciences.
Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments
NASA Astrophysics Data System (ADS)
Portalés, Cristina; Lerma, José Luis; Navarro, Santiago
2010-01-01
Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In very recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to boost human interactions and real-life navigations. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction far beyond the traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application to integrate buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated in real (physical) urban worlds. The augmented environment presented herein requires a video see-through head-mounted display (HMD) for visualization, whereas the user's navigation by movement in the real world is achieved with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper deals with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. There are, however, some complex software issues, which are discussed in the paper.
Virtual environments simulation in research reactor
NASA Astrophysics Data System (ADS)
Muhamad, Shalina Bt. Sheik; Bahrin, Muhammad Hannan Bin
2017-01-01
Virtual reality based simulations are interactive and engaging, and have useful potential for improving safety training. Virtual reality technology can be used to train workers who are unfamiliar with the physical layout of an area. In this study, a simulation program based on a virtual environment of a research reactor was developed. The platform used for the virtual simulation is 3DVia software, whose rendering capabilities, movement and collision physics, and interactive navigation features have been taken advantage of. A real research reactor was virtually modelled and simulated, with avatar models adopted to simulate walking. Collision detection algorithms were developed for various parts of the 3D building and the avatars, to restrain the avatars to certain regions of the virtual environment. A user can control the avatar to move around inside the virtual environment. Thus, this work can assist in the training of personnel, for example in evaluating the radiological safety of the research reactor facility.
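The region-restriction behaviour described above (collision logic keeping avatars within permitted parts of the facility) can be sketched very simply; the Python fragment below is an assumption-laden illustration, with region extents invented for the example.

```python
# Hedged sketch: an avatar's proposed position is accepted only if it lies
# inside one of the permitted axis-aligned floor regions; otherwise the
# move is blocked. Region extents are illustrative assumptions.
ALLOWED_REGIONS = [
    ((0.0, 0.0), (10.0, 6.0)),    # e.g. main hall floor area, (x, z) bounds
    ((10.0, 2.0), (14.0, 4.0)),   # e.g. connecting corridor
]

def inside(pos, region):
    (x0, z0), (x1, z1) = region
    x, z = pos
    return x0 <= x <= x1 and z0 <= z <= z1

def step_avatar(current, proposed):
    """Return the proposed position if allowed, otherwise stay put."""
    if any(inside(proposed, r) for r in ALLOWED_REGIONS):
        return proposed
    return current    # collision with a restricted region: movement blocked

print(step_avatar((1.0, 1.0), (2.0, 1.5)))    # accepted
print(step_avatar((1.0, 1.0), (20.0, 1.5)))   # rejected: outside all regions
```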
Baus, Oliver; Bouchard, Stéphane
2014-01-01
This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements and then merging them into the view of the physical world. Although the general public may only have become aware of AR in the last few years, AR-type applications have been around since the beginning of the twentieth century. Since then, technological developments have enabled an ever increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows exposure to stimuli which, for various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed "safely" to the object(s) of their fear, without the costs associated with programming complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET-related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user's experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, as well as the potential use of ARET to treat phobias other than small-animal phobia, such as social phobia.
Fortmeier, Dirk; Mastmeyer, Andre; Schröder, Julian; Handels, Heinz
2016-01-01
This study presents a new visuo-haptic virtual reality (VR) training and planning system for percutaneous transhepatic cholangio-drainage (PTCD) based on partially segmented virtual patient models. We only use partially segmented image data instead of a full segmentation and circumvent the necessity of surface or volume mesh models. Haptic interaction with the virtual patient during virtual palpation, ultrasound probing and needle insertion is provided. Furthermore, the VR simulator includes X-ray and ultrasound simulation for image-guided training. The visualization techniques are GPU-accelerated by implementation in Cuda and include real-time volume deformations computed on the grid of the image data. Computation on the image grid enables straightforward integration of the deformed image data into the visualization components. To provide shorter rendering times, the performance of the volume deformation algorithm is improved by a multigrid approach. To evaluate the VR training system, a user evaluation has been performed and deformation algorithms are analyzed in terms of convergence speed with respect to a fully converged solution. The user evaluation shows positive results with increased user confidence after a training session. It is shown that using partially segmented patient data and direct volume rendering is suitable for the simulation of needle insertion procedures such as PTCD.
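The abstract reports that convergence of the volume deformation solver is accelerated with a multigrid approach. The sketch below illustrates the multigrid idea only, as a recursive V-cycle for a 1D Poisson-type problem with weighted-Jacobi smoothing; it is not the simulator's GPU-based deformation solver, and all parameters are assumptions.

```python
# Hedged sketch of a multigrid V-cycle for -u'' = f with zero boundary values.
import numpy as np

def smooth(u, f, h, sweeps=3, w=2/3):
    """Weighted-Jacobi sweeps on the interior points."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def restrict(r):
    """Full-weighting restriction to the next coarser grid."""
    rc = np.zeros((len(r) - 1) // 2 + 1)
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    return rc

def prolong(ec, n_fine):
    """Linear-interpolation prolongation back to the finer grid."""
    ef = np.zeros(n_fine)
    ef[::2] = ec
    ef[1::2] = 0.5 * (ec[:-1] + ec[1:])
    return ef

def v_cycle(u, f, h):
    u = smooth(u, f, h)                        # pre-smoothing
    if len(u) <= 3:
        return u
    rc = restrict(residual(u, f, h))           # coarse-grid correction
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h)
    u = u + prolong(ec, len(u))
    return smooth(u, f, h)                     # post-smoothing

# Demo: a few V-cycles converge much faster than plain Jacobi iteration.
n = 64
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = np.pi ** 2 * np.sin(np.pi * x)             # exact solution is sin(pi x)
u = np.zeros(n + 1)
for _ in range(10):
    u = v_cycle(u, f, h)
print("max error vs exact solution:", np.max(np.abs(u - np.sin(np.pi * x))))
```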
Klapan, Ivica; Vranjes, Zeljko; Prgomet, Drago; Lukinović, Juraj
2008-03-01
The real-time requirement means that the simulation should be able to follow the actions of the user, who may be moving in the virtual environment. The computer system should also store in its memory a three-dimensional (3D) model of the virtual environment. In that case a real-time virtual reality system will update the 3D graphic visualization as the user moves, so that an up-to-date visualization is always shown on the computer screen. Upon completion of the tele-operation, the surgeon compares the preoperative and postoperative images and models of the operative field, and studies video records of the procedure itself. Using intraoperative records, animated images of the real tele-procedure performed can be designed. Virtual surgery offers the possibility of preoperative planning in rhinology. The intraoperative use of the computer in real time requires the development of appropriate hardware and software to connect the medical instrumentarium with the computer and to operate the computer via the connected instrumentarium and sophisticated multimedia interfaces.
Development and application of virtual reality for man/systems integration
NASA Technical Reports Server (NTRS)
Brown, Marcus
1991-01-01
While the graphical presentation of computer models signified a quantum leap over presentations limited to text and numbers, it still presents an interface barrier between the human user and the computer model. The user must learn a command language in order to orient themselves in the model. For example, to move left from the current viewpoint of the model, they might be required to type 'LEFT' at a keyboard. This command is fairly intuitive, but if the viewpoint moves far enough that there are no visual cues overlapping with the first view, the user does not know whether the viewpoint has moved inches, feet, or miles to the left, or perhaps remained in the same position but rotated to the left. Until the user becomes quite familiar with the interface language of the computer model presentation, they will be prone to losing their bearings frequently. Even a highly skilled user will occasionally get lost in the model. A new approach to presenting this type of information is to directly interpret the user's body motions as the input language for determining what view to present. When the user's head turns 45 degrees to the left, the viewpoint should be rotated 45 degrees to the left. Since the head moves through several intermediate angles between the original view and the final one, several intermediate views should be presented, providing the user with a sense of continuity between the original view and the final one. Since the primary way a human physically interacts with their environment is through their hands, the system should monitor the movements of the user's hands and alter objects in the virtual model in a way consistent with the way an actual object would move when manipulated using the same hand movements. Since this approach to the man-computer interface closely models the same type of interface that humans have with the physical world, this type of interface is often called virtual reality, and the model is referred to as a virtual world. The task of this summer fellowship was to set up a virtual reality system at MSFC and begin applying it to some of the questions which concern scientists and engineers involved in space flight. A brief discussion of this work is presented.
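The head-tracking idea described above (a 45-degree head turn yields a 45-degree viewpoint rotation, with intermediate views along the way) can be sketched as a direct mapping from tracked yaw to a view transform. The tracker read-out, renderer placeholder, and sign convention below are assumptions for illustration.

```python
# Hedged sketch: tracked head yaw is mapped one-to-one onto the viewpoint,
# and every intermediate tracker sample yields an intermediate view.
import numpy as np

def view_matrix_from_yaw(yaw_rad, eye=np.zeros(3)):
    """4x4 view transform for a viewer at `eye` whose head has turned
    `yaw_rad` about the vertical axis (sign convention depends on the
    renderer's handedness)."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    rotate = np.array([[  c, 0.0,  -s, 0.0],
                       [0.0, 1.0, 0.0, 0.0],
                       [  s, 0.0,   c, 0.0],
                       [0.0, 0.0, 0.0, 1.0]])
    translate = np.eye(4)
    translate[:3, 3] = -np.asarray(eye, dtype=float)
    return rotate @ translate

# As the head sweeps from 0 to 45 degrees, each sample produces an
# intermediate view, preserving the visual continuity described above.
for yaw_deg in np.linspace(0.0, 45.0, 10):
    view = view_matrix_from_yaw(np.radians(yaw_deg))
    # render(scene, view)   # placeholder for the actual renderer
```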
Berryman, Donna R
2012-01-01
Augmented reality is a technology that overlays digital information on objects or places in the real world for the purpose of enhancing the user experience. It is not virtual reality, that is, the technology that creates a totally digital or computer-created environment. Augmented reality, with its ability to combine reality and digital information, is being studied and implemented in medicine, marketing, museums, fashion, and numerous other areas. This article presents an overview of augmented reality, discussing what it is, how it works, its current implementations, and its potential impact on libraries.
Bao, Xiao; Mao, Yurong; Lin, Qiang; Qiu, Yunhai; Chen, Shaozhen; Li, Le; Cates, Ryan S; Zhou, Shufeng; Huang, Dongfeng
2013-11-05
The Kinect-based virtual reality system for the Xbox 360 enables users to control and interact with the game console without the need to touch a game controller, and provides rehabilitation training for stroke patients with lower limb dysfunctions. However, the underlying mechanism remains unclear. In this study, 18 healthy subjects and five patients after subacute stroke were included. The five patients were scanned using functional MRI prior to training, 3 weeks after training and at a 12-week follow-up, and then compared with healthy subjects. The Fugl-Meyer Assessment and Wolf Motor Function Test scores of the hemiplegic upper limbs of stroke patients were significantly increased 3 weeks after training and at the 12-week follow-up. Functional MRI results showed that contralateral primary sensorimotor cortex was activated after Kinect-based virtual reality training in the stroke patients compared with the healthy subjects. Contralateral primary sensorimotor cortex, the bilateral supplementary motor area and the ipsilateral cerebellum were also activated during hand-clenching in all 18 healthy subjects. Our findings indicate that Kinect-based virtual reality training could promote the recovery of upper limb motor function in subacute stroke patients, and brain reorganization by Kinect-based virtual reality training may be linked to the contralateral sensorimotor cortex.
WeaVR: a self-contained and wearable immersive virtual environment simulation system.
Hodgson, Eric; Bachmann, Eric R; Vincent, David; Zmuda, Michael; Waller, David; Calusdian, James
2015-03-01
We describe WeaVR, a computer simulation system that takes virtual reality technology beyond specialized laboratories and research sites and makes it available in any open space, such as a gymnasium or a public park. Novel hardware and software systems enable HMD-based immersive virtual reality simulations to be conducted in any arbitrary location, with no external infrastructure and little-to-no setup or site preparation. The ability of the WeaVR system to provide realistic motion-tracked navigation for users, to improve the study of large-scale navigation, and to generate usable behavioral data is shown in three demonstrations. First, participants navigated through a full-scale virtual grocery store while physically situated in an open grass field. Trajectory data are presented for both normal tracking and for tracking during the use of redirected walking that constrained users to a predefined area. Second, users followed a straight path within a virtual world for distances of up to 2 km while walking naturally and being redirected to stay within the field, demonstrating the ability of the system to study large-scale navigation by simulating virtual worlds that are potentially unlimited in extent. Finally, the portability and pedagogical implications of this system were demonstrated by taking it to a regional high school for live use by a computer science class on their own school campus.
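Redirected walking, mentioned above as the mechanism that kept users inside the field during the 2 km straight-line walk, injects a small rotational gain between physical and virtual heading so the user imperceptibly walks an arc. The sketch below is a loose Python illustration with an assumed curvature gain, not the WeaVR algorithm.

```python
# Hedged sketch of the redirected-walking idea: a small curvature gain is
# injected per metre walked, steering the physical path along an arc while
# the virtual path stays straight. The gain value is an assumption.
import math

CURVATURE_GAIN = math.radians(3.0)   # injected rotation per metre walked (assumed)

def redirect(phys_heading, virt_heading, step_len, centre_bearing):
    """Update headings for one walking step.

    centre_bearing: +1 or -1, which way to curve the user back toward the
    centre of the tracked area.
    """
    injected = centre_bearing * CURVATURE_GAIN * step_len
    # The virtual heading stays on the straight path; the physical heading
    # drifts by the injected amount, kept below the user's detection threshold.
    return phys_heading + injected, virt_heading

# One simulated 2 km straight virtual walk in 0.7 m steps.
phys, virt = 0.0, 0.0
for _ in range(int(2000 / 0.7)):
    phys, virt = redirect(phys, virt, 0.7, +1)
print("physical heading drift (deg, mod 360):", math.degrees(phys) % 360.0)
```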
Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.
Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico
2017-01-01
Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" or the Google Translate real-time sign interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved the performance of all the considered tasks. Moreover, subjects significantly preferred conditions providing wearable haptic feedback.
A virtual reality environment for telescope operation
NASA Astrophysics Data System (ADS)
Martínez, Luis A.; Villarreal, José L.; Ángeles, Fernando; Bernal, Abel
2010-07-01
Astronomical observatories and telescopes are becoming increasingly large and complex systems, demanding that any potential user acquire a great amount of information before accessing them. At present, the most common way to cope with that information is through the implementation of larger graphical user interfaces and computer monitors to increase the display area. Tonantzintla Observatory has a 1-m telescope with a remote observing system. As a step forward in the improvement of the telescope software, we have designed a Virtual Reality (VR) environment that works as an extension of the remote system and allows us to operate the telescope. In this work we explore this alternative technology, which is suggested here as a software platform for the operation of the 1-m telescope.
Multi-degree of freedom joystick for virtual reality simulation.
Head, M J; Nelson, C A; Siu, K C
2013-11-01
A modular control interface and simulated virtual reality environment were designed and created in order to determine how the kinematic architecture of a control interface affects minimally invasive surgery training. A user is able to selectively determine the kinematic configuration of an input device (number, type and location of degrees of freedom) for a specific surgical simulation through the use of modular joints and constraint components. Furthermore, passive locking was designed and implemented through the use of inflated latex tubing around rotational joints in order to allow a user to step away from a simulation without unwanted tool motion. It is believed that these features will facilitate improved simulation of a variety of surgical procedures and, thus, improve surgical skills training.
Cooper, Natalia; Milella, Ferdinando; Pinto, Carlo; Cant, Iain; White, Mark; Meyer, Georg
2018-01-01
Objective and subjective measures of performance in virtual reality environments increase as more sensory cues are delivered and as simulation fidelity increases. Some cues (colour or sound) are easier to present than others (object weight, vestibular cues) so that substitute cues can be used to enhance informational content in a simulation at the expense of simulation fidelity. This study evaluates how substituting cues in one modality by alternative cues in another modality affects subjective and objective performance measures in a highly immersive virtual reality environment. Participants performed a wheel change in a virtual reality (VR) environment. Auditory, haptic and visual cues, signalling critical events in the simulation, were manipulated in a factorial design. Subjective ratings were recorded via questionnaires. The time taken to complete the task was used as an objective performance measure. The results show that participants performed best and felt an increased sense of immersion and involvement, collectively referred to as 'presence', when substitute multimodal sensory feedback was provided. Significant main effects of audio and tactile cues on task performance and on participants' subjective ratings were found. A significant negative relationship was found between the objective (overall completion times) and subjective (ratings of presence) performance measures. We conclude that increasing informational content, even if it disrupts fidelity, enhances performance and user's overall experience. On this basis we advocate the use of substitute cues in VR environments as an efficient method to enhance performance and user experience.
Future Evolution of Virtual Worlds as Communication Environments
NASA Astrophysics Data System (ADS)
Prisco, Giulio
Extensive experience creating locations and activities inside virtual worlds provides the basis for contemplating their future. Users of virtual worlds are diverse in their goals for these online environments; for example, immersionists want them to be alternative realities disconnected from real life, whereas augmentationists want them to be communication media supporting real-life activities. As the technology improves, the diversity of virtual worlds will increase along with their significance. Many will incorporate more advanced virtual reality, or serve as major media for long-distance collaboration, or become the venues for futurist social movements. Key issues are how people can create their own virtual worlds, travel across worlds, and experience a variety of multimedia immersive environments. This chapter concludes by noting the view among some computer scientists that future technologies will permit uploading human personalities to artificial intelligence avatars, thereby enhancing human beings and rendering the virtual worlds entirely real.
Virtual reality applied to teletesting
NASA Astrophysics Data System (ADS)
van den Berg, Thomas J.; Smeenk, Roland J. M.; Mazy, Alain; Jacques, Patrick; Arguello, Luis; Mills, Simon
2003-05-01
The activity "Virtual Reality applied to Teletesting" is related to a wider European Space Agency (ESA) initiative of cost reduction, in particular the reduction of test costs. Reduction of costs of space related projects have to be performed on test centre operating costs and customer company costs. This can accomplished by increasing the automation and remote testing ("teletesting") capabilities of the test centre. Main problems related to teletesting are a lack of situational awareness and the separation of control over the test environment. The objective of the activity is to evaluate the use of distributed computing and Virtual Reality technology to support the teletesting of a payload under vacuum conditions, and to provide a unified man-machine interface for the monitoring and control of payload, vacuum chamber and robotics equipment. The activity includes the development and testing of a "Virtual Reality Teletesting System" (VRTS). The VRTS is deployed at one of the ESA certified test centres to perform an evaluation and test campaign using a real payload. The VRTS is entirely written in the Java programming language, using the J2EE application model. The Graphical User Interface runs as an applet in a Web browser, enabling easy access from virtually any place.
[Virtual reality in the treatment of mental disorders].
Malbos, Eric; Boyer, Laurent; Lançon, Christophe
2013-11-01
Virtual reality is a medium allowing users to interact in real time with computerized virtual environments. The application of this immersive technology to cognitive behavioral therapies is increasingly exploited for the treatment of mental disorders. The present study is a review of the literature spanning from 1992 to 2012. It depicts the utility of this new tool for assessment and therapy through the various clinical studies carried out on subjects exhibiting diverse mental disorders. Most of the studies conducted on tested subjects attest to the significant efficacy of Virtual Reality Exposure Therapy (VRET) for the treatment of distinct mental disorders. Comparative studies of VRET with the treatment of reference (the in vivo exposure component of cognitive behavioral therapy) document an equal efficacy of the two methods and, in some cases, a superior therapeutic effect in favor of VRET. Even though larger-scale clinical experiments, extended follow-up and studies of the factors influencing presence are needed, virtual reality exposure represents an efficacious, confidential, affordable, flexible, interactive therapeutic method whose application will progressively widen in the field of mental health. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
ConfocalVR: Immersive Visualization Applied to Confocal Microscopy.
Stefani, Caroline; Lacy-Hulbert, Adam; Skillman, Thomas
2018-06-24
ConfocalVR is a virtual reality (VR) application created to improve the ability of researchers to study the complexity of cell architecture. Confocal microscopes take pictures of fluorescently labeled proteins or molecules at different focal planes to create a stack of 2D images throughout the specimen. Current software applications reconstruct the 3D image and render it as a 2D projection onto a computer screen where users need to rotate the image to expose the full 3D structure. This process is mentally taxing, breaks down if you stop the rotation, and does not take advantage of the eye's full field of view. ConfocalVR exploits consumer-grade virtual reality (VR) systems to fully immerse the user in the 3D cellular image. In this virtual environment the user can: 1) adjust image viewing parameters without leaving the virtual space, 2) reach out and grab the image to quickly rotate and scale the image to focus on key features, and 3) interact with other users in a shared virtual space enabling real-time collaborative exploration and discussion. We found that immersive VR technology allows the user to rapidly understand cellular architecture and protein or molecule distribution. We note that it is impossible to understand the value of immersive visualization without experiencing it first hand, so we encourage readers to get access to a VR system, download this software, and evaluate it for yourself. The ConfocalVR software is available for download at http://www.confocalvr.com, and is free for nonprofits. Copyright © 2018. Published by Elsevier Ltd.
EduMOOs: Virtual Learning Centers.
ERIC Educational Resources Information Center
Woods, Judy C.
1998-01-01
Multi-user Object Oriented Internet activities (MOOs) permit real time interaction in a text-based virtual reality via the Internet. This article explains EduMOOs (educational MOOs) and provides brief descriptions, World Wide Web addresses, and telnet addresses for selected EduMOOs. Instructions for connecting to a MOO and a list of related Web…
Virtual reality environments for post-stroke arm rehabilitation.
Subramanian, Sandeep; Knaut, Luiz A; Beaudoin, Christian; McFadyen, Bradford J; Feldman, Anatol G; Levin, Mindy F
2007-06-22
Optimal practice and feedback elements are essential requirements for maximal motor recovery in patients with motor deficits due to central nervous system lesions. A virtual environment (VE) was created that incorporates practice and feedback elements necessary for maximal motor recovery. It permits varied and challenging practice in a motivating environment that provides salient feedback. The VE gives the user knowledge of results feedback about motor behavior and knowledge of performance feedback about the quality of pointing movements made in a virtual elevator. Movement distances are related to length of body segments. We describe an immersive and interactive experimental protocol developed in a virtual reality environment using the CAREN system. The VE can be used as a training environment for the upper limb in patients with motor impairments.
NASA Astrophysics Data System (ADS)
Chittaro, Luca; Zangrando, Nicola
Although virtual reality (VR) is a powerful simulation tool that can allow users to experience the effects of their actions in vivid and memorable ways, explorations of VR as a persuasive technology are rare. In this paper, we focus on different ways of providing negative feedback for persuasive purposes through simulated experiences in VR. The persuasive goal we consider concerns awareness of personal fire safety issues and the experiment we describe focuses on attitudes towards smoke in evacuating buildings. We test two techniques: the first technique simulates the damaging effects of smoke on the user through a visualization that should not evoke strong emotions, while the second is aimed at partially reproducing the anxiety of an emergency situation. The results of the study show that the second technique is able to increase user's anxiety as well as producing better results in attitude change.
Prototype of haptic device for sole of foot using magnetic field sensitive elastomer
NASA Astrophysics Data System (ADS)
Kikuchi, T.; Masuda, Y.; Sugiyama, M.; Mitsumata, T.; Ohori, S.
2013-02-01
Walking is one of the most popular activities and a healthy aerobic exercise for the elderly. However, if they have physical and/or cognitive disabilities, it can be challenging to go somewhere they do not know well. The final goal of this study is to develop a virtual reality walking system that allows users to walk in virtual worlds fabricated with computer graphics. We focus on a haptic device that can apply various plantar pressures to the user's sole as an additional sense in virtual reality walking. In this study, we discuss the use of a magnetic field sensitive elastomer (MSE) as a working material for the haptic interface on the sole. The first prototype with MSE was developed and evaluated in this work. According to the measurement of plantar pressures, it was found that this device can apply different pressures to the sole of a lightweight user by applying a magnetic field to the MSE. The results also implied the need to improve the magnetic circuit and the basic structure of the device's mechanism.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.
Zenner, Andre; Kruger, Antonio
2017-04-01
We define the concept of Dynamic Passive Haptic Feedback (DPHF) for virtual reality by introducing the weight-shifting physical DPHF proxy object Shifty. This concept combines actuators known from active haptics and physical proxies known from passive haptics to construct proxies that automatically adapt their passive haptic feedback. We describe the concept behind our ungrounded weight-shifting DPHF proxy Shifty and the implementation of our prototype. We then investigate how Shifty can, by automatically changing its internal weight distribution, enhance the user's perception of virtual objects interacted with in two experiments. In a first experiment, we show that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness. Here, Shifty was shown to increase the user's fun and perceived realism significantly, compared to an equivalent passive haptic proxy. In a second experiment, Shifty is used to pick up virtual objects of different virtual weights. The results show that Shifty enhances the perception of weight and thus the perceived realism by adapting its kinesthetic feedback to the picked-up virtual object. In the same experiment, we additionally show that specific combinations of haptic, visual and auditory feedback during the pick-up interaction help to compensate for visual-haptic mismatch perceived during the shifting process.
NASA Astrophysics Data System (ADS)
Martin, P.; Tseu, A.; Férey, N.; Touraine, D.; Bourdot, P.
2014-02-01
Most advanced immersive devices provide a collaborative environment within which several users have their own distinct head-tracked stereoscopic point of view. Combined with commonly used interactive features such as voice and gesture recognition, 3D mice, haptic feedback, and spatialized audio rendering, these environments should faithfully reproduce a real context. However, even if many studies have been carried out on multimodal systems, we are far from definitively solving the issue of multimodal fusion, which consists in merging multimodal events coming from users and devices into interpretable commands performed by the application. Multimodality and collaboration have often been studied separately, despite the fact that these two aspects share interesting similarities. We discuss how we address this problem through the design and implementation of a supervisor that is able to deal with both multimodal fusion and collaborative aspects. The aim of this supervisor is to ensure the merging of user input from virtual reality devices in order to control immersive multi-user applications. We deal with this problem from a practical point of view, because the main requirements of this supervisor were defined according to an industrial task proposed by our automotive partner, which has to be performed with multimodal and collaborative interactions in a co-located multi-user environment. In this task, two co-located workers on a virtual assembly chain have to cooperate to insert a seat into the bodywork of a car, using haptic devices to feel collisions and to manipulate objects, and combining speech recognition and two-handed gesture recognition as multimodal instructions. Besides the architectural aspects of this supervisor, we describe how we ensure the modularity of our solution so that it can be applied to different virtual reality platforms, interactive contexts and virtual contents. A virtual context observer included in this supervisor was especially designed to be independent of the content of the virtual scene of the targeted application, and is used to report high-level interactive and collaborative events. This context observer allows the supervisor to merge these interactive and collaborative events, but it is also used to deal with new issues arising from our observation of two co-located users performing this assembly task in an immersive device. We highlight the fact that when speech recognition features are provided to the two users, it is necessary to detect automatically, according to the interactive context, whether the vocal instructions must be translated into commands to be performed by the machine, or whether they are part of the natural communication necessary for collaboration. Information from the context observer indicating that a user is looking at his or her collaborator is important for detecting whether the user is talking to the partner. Moreover, as the users are physically co-located and head-tracking is used to provide high-fidelity stereoscopic rendering and natural walking navigation in the virtual scene, we have to deal with collisions and screen occlusion between the co-located users in the physical work space. The working area and focus of each user, computed and reported by the context observer, are necessary to prevent or avoid these situations.
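To make the fusion logic above concrete, here is a minimal sketch (an assumed structure, not the authors' supervisor): speech and gesture events are merged into commands, and a context-observer flag reporting that the speaker is looking at the collaborator routes the utterance to natural conversation rather than to the command interpreter. All class and field names are illustrative.

```python
# Toy sketch of a multimodal-fusion supervisor with a context observer.
from dataclasses import dataclass

@dataclass
class Event:
    user: str
    modality: str      # "speech" or "gesture"
    payload: str       # recognized word or gesture label

class Supervisor:
    def __init__(self):
        self.pending_gesture = {}      # last gesture per user, awaiting speech

    def handle(self, event, context):
        if event.modality == "gesture":
            self.pending_gesture[event.user] = event.payload
            return None
        # Speech: ignore as a command if the context observer reports the
        # speaker is looking at the collaborator (natural communication).
        if context.get(f"{event.user}_looking_at_partner"):
            return None
        gesture = self.pending_gesture.pop(event.user, None)
        if gesture and event.payload == "insert":
            return f"{event.user}: insert seat at {gesture}"
        return None

sup = Supervisor()
ctx = {"worker1_looking_at_partner": False}
sup.handle(Event("worker1", "gesture", "left-bodywork-slot"), ctx)
print(sup.handle(Event("worker1", "speech", "insert"), ctx))
```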
Demers, Marika; Chan Chun Kong, Daniel; Levin, Mindy F
2018-03-11
To determine user satisfaction and safety of incorporating a low-cost virtual rehabilitation intervention as an adjunctive therapeutic option for cognitive-motor upper limb rehabilitation in individuals with sub-acute stroke. A low-cost upper limb virtual rehabilitation application incorporating realistic functionally-relevant unimanual and bimanual tasks, specifically designed for cognitive-motor rehabilitation was developed for patients with sub-acute stroke. Clinicians and individuals with stroke interacted with the intervention for 15-20 or 20-45 minutes, respectively. The study had a mixed-methods convergent parallel design that included a focus group interview with clinicians working in a stroke program and semi-structured interviews and standardized assessments (Borg Perceived Exertion Scale, Short Feedback Questionnaire) for participants with sub-acute stroke undergoing rehabilitation. The occurrence of adverse events was also noted. Three main themes emerged from the clinician focus group and patient interviews: Perceived usefulness in rehabilitation, satisfaction with the virtual reality intervention and aspects to improve. All clinicians and the majority of participants with stroke were highly satisfied with the intervention and perceived its usefulness to decrease arm motor impairment during functional tasks. No participants experienced major adverse events. Incorporation of this type of functional activity game-based virtual reality intervention in the sub-acute phase of rehabilitation represents a way to transfer skills learned early in the clinical setting to real world situations. This type of intervention may lead to better integration of the upper limb into everyday activities. Implications for Rehabilitation • Use of a cognitive-motor low-cost virtual reality intervention designed to remediate arm motor impairments in sub-acute stroke is feasible, safe and perceived as useful by therapists and patients for stroke rehabilitation. • Input from end-users (therapists and individuals with stroke) is critical for the development and implementation of a virtual reality intervention.
Ng, Ivan; Hwang, Peter Y K; Kumar, Dinesh; Lee, Cheng Kiang; Kockro, Ralf A; Sitoh, Y Y
2009-05-01
To evaluate the feasibility of surgical planning using a virtual reality platform workstation in the treatment of cerebral arterio-venous malformations (AVMs), patient-specific data of multiple imaging modalities were co-registered, fused and displayed as a 3D stereoscopic object on the Dextroscope, a virtual reality surgical planning platform. This system allows for manipulation of 3D data and lets the user evaluate and appreciate the angio-architecture of the nidus with regard to the position and spatial relationships of critical feeders and draining veins. We evaluated the ability of the Dextroscope to influence surgical planning by providing a better understanding of the angio-architecture, as well as its impact on the surgeon's pre- and intra-operative confidence and ability to tackle these lesions. Twenty-four patients were studied. The mean age was 29.65 years. Following pre-surgical planning on the Dextroscope, 23 patients underwent microsurgical resection, during which all had documented complete resection of the AVM. Planning on the virtual reality platform allowed for identification of critical feeders and draining vessels in all patients. The appreciation of the complex patient-specific angio-architecture to establish a surgical plan was found to be invaluable in the conduct of the procedure and was found to enhance the surgeon's confidence significantly. Surgical planning of resection of an AVM with a virtual reality system allowed detailed and comprehensive analysis of 3D multi-modality imaging data and, in our experience, proved very helpful in establishing a good surgical strategy, enhancing intra-operative spatial orientation and increasing the surgeon's confidence.
The RoboCup Mixed Reality League - A Case Study
NASA Astrophysics Data System (ADS)
Gerndt, Reinhard; Bohnen, Matthias; da Silva Guerra, Rodrigo; Asada, Minoru
In typical mixed reality systems there is only a one-way interaction from real to virtual. A human user or the physics of a real object may influence the behavior of virtual objects, but real objects usually cannot be influenced by the virtual world. By introducing real robots into the mixed reality system, we allow a true two-way interaction between virtual and real worlds. Our system has been used since 2007 to implement the RoboCup mixed reality soccer games and other applications for research and edutainment. Our framework system is freely programmable to generate any virtual environment, which may then be further supplemented with virtual and real objects. The system allows for control of any real object based on differential drive robots. The robots may be adapted for different applications, e.g., with markers for identification or with covers to change shape and appearance. They may also be “equipped” with virtual tools. In this chapter we present the hardware and software architecture of our system and some applications. The authors believe this can be seen as a first implementation of Ivan Sutherland’s 1965 idea of the ultimate display: “The ultimate display would, of course, be a room within which the computer can control the existence of matter …” (Sutherland, 1965, Proceedings of IFIPS Congress 2:506-508).
Development of excavator training simulator using leap motion controller
NASA Astrophysics Data System (ADS)
Fahmi, F.; Nainggolan, F.; Andayani, U.; Siregar, B.
2018-03-01
The excavator is a piece of heavy machinery used for many industrial purposes. Controlling an excavator is not easy: its operator has to be well trained in many skills to ensure that the machine is used safely, effectively, and efficiently. In this research, we propose a virtual reality excavator simulator supported by a device called the Leap Motion Controller, which captures finger and hand motions as input. This prototype will then be developed further in the virtual reality environment to give the user a more realistic sensation.
A virtual reality browser for Space Station models
NASA Technical Reports Server (NTRS)
Goldsby, Michael; Pandya, Abhilash; Aldridge, Ann; Maida, James
1993-01-01
The Graphics Analysis Facility at NASA/JSC has created a visualization and learning tool by merging its database of detailed geometric models with a virtual reality system. The system allows an interactive walk-through of models of the Space Station and other structures, providing detailed realistic stereo images. The user can activate audio messages describing the function and connectivity of selected components within his field of view. This paper presents the issues and trade-offs involved in the implementation of the VR system and discusses its suitability for its intended purposes.
Magic cards: a new augmented-reality approach.
Demuynck, Olivier; Menendez, José Manuel
2013-01-01
Augmented reality (AR) commonly uses markers for detection and tracking. Such multimedia applications associate each marker with a virtual 3D model stored in the memory of the camera-equipped device running the application. Application users are limited in their interactions, which require knowing how to design and program 3D objects. This generally prevents them from developing their own entertainment AR applications. The Magic Cards application solves this problem by offering an easy way to create and manage an unlimited number of virtual objects that are encoded on special markers.
Development of virtual environment for treating acrophobia.
Ku, J; Jang, D; Shin, M; Jo, H; Ahn, H; Lee, J; Cho, B; Kim, S I
2001-01-01
Virtual Reality (VR) is a new technology that enables humans to communicate with computers. It allows the user to see, hear, feel and interact in a three-dimensional virtual world created graphically. Virtual Reality Therapy (VRT), based on this sophisticated technology, has recently been used in the treatment of subjects diagnosed with acrophobia, a disorder characterized by marked anxiety upon exposure to heights, avoidance of heights, and a resulting interference in functioning. Conventional virtual reality systems for the treatment of acrophobia are limited in that they are based on overly costly devices or somewhat unrealistic graphic scenes. The goal of this study was to develop an inexpensive and more realistic virtual environment for the exposure therapy of acrophobia. We constructed two types of virtual environment. One consists of a bungee-jump tower in the middle of a city; it includes an open lift, surrounded by props beside the tower, that allows the patient to feel a sense of height. The other is composed of diving boards of various heights; it provides a view of a lower diving board and people swimming in the pool to present the patient with stimuli upon exposure to heights.
Schuster-Amft, Corina; Eng, Kynan; Lehmann, Isabelle; Schmid, Ludwig; Kobashi, Nagisa; Thaler, Irène; Verra, Martin L; Henneke, Andrea; Signer, Sandra; McCaskey, Michael; Kiper, Daniel
2014-09-06
In recent years, virtual reality has been introduced to neurorehabilitation, in particular with the intention of improving upper-limb training options and facilitating motor function recovery. The proposed study incorporates a quantitative part and a qualitative part, termed a mixed-methods approach: (1) a quantitative investigation of the efficacy of virtual reality training compared to conventional therapy for upper-limb motor function, (2a) a qualitative investigation of patients' experiences and expectations of virtual reality training, and (2b) a qualitative investigation of therapists' experiences using the virtual reality training system in the therapy setting. At three participating clinics, 60 patients at least 6 months after stroke onset will be randomly allocated to an experimental virtual reality group (EG) or to a control group that will receive conventional physiotherapy or occupational therapy (16 sessions, 45 minutes each, over the course of 4 weeks). Using custom data gloves, patients' finger and arm movements will be displayed in real time on a monitor, and they will move and manipulate objects in various virtual environments. A blinded assessor will test patients' motor and cognitive performance twice before, once during, and twice after the 4-week intervention. The primary outcome measure is the Box and Block Test. Secondary outcome measures are the Chedoke-McMaster Stroke Assessments (hand, arm and shoulder pain subscales), the Chedoke-McMaster Arm and Hand Activity Inventory, the Line Bisection Test, the Stroke Impact Scale, the Mini-Mental State Examination and the Extended Barthel Index. Semistructured face-to-face interviews will be conducted with patients in the EG after intervention finalization, with a focus on the patients' expectations and experiences regarding the virtual reality training. Therapists' perspectives on virtual reality training will be reviewed in three focus groups comprising four to six occupational therapists and physiotherapists. The interviews will help to gain a deeper understanding of the phenomena under investigation and to provide sound recommendations for the implementation of the virtual reality training system for routine use in neurorehabilitation, complementing the quantitative clinical assessments. Clinicaltrials.gov Identifier: NCT01774669 (15 January 2013).
Mass production of holographic transparent components for augmented and virtual reality applications
NASA Astrophysics Data System (ADS)
Russo, Juan Manuel; Dimov, Fedor; Padiyar, Joy; Coe-Sullivan, Seth
2017-06-01
Diffractive optics such as holographic optical elements (HOEs) can provide transparent and narrow band components with arbitrary incident and diffracted angles for near-to-eye commercial electronic products for augmented reality (AR), virtual reality (VR), and smart glass applications. In this paper, we will summarize the operational parameters and general optical geometries relevant for near-to-eye displays, the holographic substrates available for these applications, and their performance characteristics and ease of manufacture. We will compare the holographic substrates available in terms of fabrication, manufacturability, and end-user performance characteristics. Luminit is currently emplacing the manufacturing capacity to serve this market, and this paper will discuss the capabilities and limitations of this unique facility.
Higuera-Trujillo, Juan Luis; López-Tarruella Maldonado, Juan; Llinares Millán, Carmen
2017-11-01
Psychological research into human factors frequently uses simulations to study the relationship between human behaviour and the environment. Their validity depends on their similarity to the physical environments. This paper aims to validate three environmental-simulation display formats: photographs, 360° panoramas, and virtual reality. To do this we compared the psychological and physiological responses evoked by simulated-environment set-ups to those from a physical environment set-up; we also assessed the users' sense of presence. Analyses show that 360° panoramas offer the closest-to-reality results according to the participants' psychological responses, and virtual reality according to the physiological responses. Correlations between the feeling of presence and physiological and other psychological responses were also observed. These results may be of interest to researchers using currently available environmental-simulation technologies to replicate the experience of physical environments. Copyright © 2017 Elsevier Ltd. All rights reserved.
Thomas, J Graham; Spitalnick, Josh S; Hadley, Wendy; Bond, Dale S; Wing, Rena R
2015-01-01
Virtual reality (VR) technology can provide a safe environment for observing, learning, and practicing use of behavioral weight management skills, which could be particularly useful in enhancing minimal contact online weight management programs. The Experience Success (ES) project developed a system for creating and deploying VR scenarios for online weight management skills training. Virtual environments populated with virtual actors allow users to experiment with implementing behavioral skills via a PC-based point and click interface. A culturally sensitive virtual coach guides the experience, including planning for real-world skill use. Thirty-seven overweight/obese women provided feedback on a test scenario focused on social eating situations. They reported that the scenario gave them greater skills, confidence, and commitment for controlling eating in social situations. © 2014 Diabetes Technology Society.
[Application of virtual reality in the motor aspects of neurorehabilitation].
Peñasco-Martín, Benito; de los Reyes-Guzmán, Ana; Gil-Agudo, Ángel; Bernal-Sahún, Alberto; Pérez-Aguilar, Beatriz; de la Peña-González, Ana Isabel
2010-10-16
Virtual reality allows the user to interact with elements within a simulated scene. In recent times we have been witness to the introduction of virtual reality-based devices as one of the most significant novelties in neurorehabilitation. To review the clinical applications of the developments based on virtual reality for the neurorehabilitation treatment of the motor aspects of the most frequent disabling processes with a neurological origin. A review was carried out of the Medline, Physiotherapy Evidence Database, Ovid and Cochrane Library databases up until April 2009. This was completed with a web search using Google. No clinical trial conducted on its effectiveness has been found to date. The information that was collected is based on the description of the various prototypes produced by the different groups involved in their development. In most cases they are clinical trials conducted with a small number of patients, which have focused more on testing the validity of the device and checking whether it works correctly than on attempting to prove its clinical effectiveness. Although most of the clinical applications refer to patients with stroke, there were also several applications for patients with spinal cord injuries, multiple sclerosis, Parkinson's disease or balance disorders. Virtual reality is a novel tool with a promising future in neurorehabilitation. Further studies are needed to demonstrate its clinical effectiveness as compared to the traditional techniques.
The Use of Virtual Reality in Patients with Eating Disorders: Systematic Review.
Clus, Damien; Larsen, Mark Erik; Lemey, Christophe; Berrouiguet, Sofian
2018-04-27
Patients with eating disorders are characterized by pathological eating habits and a tendency to overestimate their weight and body shape. Virtual reality shows promise for the evaluation and management of patients with eating disorders. This technology, when accepted by this population, allows immersion in virtual environments, assessment, and therapeutic approaches, by exposing users to high-calorie foods or changes in body shape. To better understand the value of virtual reality, we conducted a review of the literature, including clinical studies proposing the use of virtual reality for the evaluation and management of patients with eating disorders. We searched PubMed, PsycINFO, ScienceDirect, the Cochrane Library, Scopus, and Web of Science up to April 2017. We created the list of keywords based on two domains: virtual reality and eating disorders. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify, select, and critically appraise relevant research while minimizing bias. The initial database searches identified 311 articles, 149 of which we removed as duplicates. We analyzed the resulting set of 26 unique studies that met the inclusion criteria. Of these, 8 studies were randomized controlled trials, 13 were nonrandomized studies, and 5 were clinical trials with only 1 participant. Most articles focused on clinical populations (19/26, 73%), with the remainder reporting case-control studies (7/26, 27%). Most of the studies used visual immersive equipment (16/26, 62%) with a head-mounted display (15/16, 94%). Two main areas of interest emerged from these studies: virtual work on patients’ body image (7/26, 27%) and exposure to virtual food stimuli (10/26, 38%). We conducted a broad analysis of studies on the use of virtual reality in patients with eating disorders. This review of the literature showed that virtual reality is an acceptable and promising therapeutic tool for patients with eating disorders. ©Damien Clus, Mark Erik Larsen, Christophe Lemey, Sofian Berrouiguet. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 27.04.2018.
Poeschl, Sandra; Doering, Nicola
2014-01-01
Realistic models in virtual reality training applications are considered to positively influence presence and performance. The experimental study presented here analyzed the effect of simulation fidelity (static vs. animated audience) on presence as a prerequisite for performance in a prototype virtual fear-of-public-speaking application, with a sample of N = 40 academic non-phobic users. Contrary to the state of research, no influence was shown on virtual presence and perceived realism, but an animated audience led to significantly higher anxiety while giving a talk. Although these findings could be explained by an application that might not have been realistic enough, they still question the role of presence as a mediating factor in virtual exposure applications.
VirSSPA- a virtual reality tool for surgical planning workflow.
Suárez, C; Acha, B; Serrano, C; Parra, C; Gómez, T
2009-03-01
A virtual reality tool, called VirSSPA, was developed to optimize the planning of surgical processes. Segmentation algorithms were implemented for Computed Tomography (CT) images: a region growing procedure was used for soft tissues and a thresholding algorithm was implemented to segment bones. The algorithms operate semiautomatically, since they only need seed selection with the mouse on each tissue segmented by the user. The novelty of the paper is the adaptation of an enhancement method based on histogram thresholding applied to CT images for surgical planning, which simplifies subsequent segmentation. A substantial improvement of the virtual reality tool VirSSPA was obtained with these algorithms. VirSSPA was used to optimize surgical planning, to decrease the time spent on surgical planning and to improve operative results. The success rate increases because surgeons are able to see the exact extent of the patient's ailment. This tool can decrease operating room time, thus resulting in reduced costs. Virtual simulation was effective for optimizing surgical planning, which could, consequently, result in improved outcomes with reduced costs.
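As a rough illustration of the two segmentation steps described above (not VirSSPA's actual code), the sketch below applies a Hounsfield-unit threshold for bone and a seeded, BFS-style region growing for soft tissue on a synthetic 2D slice; the threshold, tolerance, and test image are assumptions.

```python
# Illustrative sketch only: thresholding for bone, seeded region growing
# for soft tissue, on a single CT-like 2D slice.
from collections import deque
import numpy as np

def segment_bone(slice_hu, threshold=300):
    """Binary bone mask: pixels at or above a Hounsfield-unit threshold."""
    return slice_hu >= threshold

def region_grow(slice_hu, seed, tolerance=60):
    """Grow a region from a user-picked seed (BFS), accepting neighbours whose
    intensity stays within `tolerance` of the seed intensity."""
    h, w = slice_hu.shape
    seed_val = float(slice_hu[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(float(slice_hu[nr, nc]) - seed_val) <= tolerance:
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask

# Toy slice: soft tissue (~40 HU) with a bright "bone" block (~700 HU).
slice_hu = np.full((64, 64), 40, dtype=np.int16)
slice_hu[20:30, 20:30] = 700
print(segment_bone(slice_hu).sum(), "bone pixels")
print(region_grow(slice_hu, seed=(5, 5)).sum(), "soft-tissue pixels")
```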
Virtual reality-based cognitive training for drug abusers: A randomised controlled trial.
Man, David W K
2018-05-08
Non-pharmacological means are being developed to enhance cognitive abilities in drug abusers. This study evaluated virtual reality (VR) as an intervention tool for enhancing cognitive and vocational outcomes in 90 young ketamine users (KU) randomly assigned to a treatment group (virtual reality group, VRG; tutor-administered group, TAG) or a wait-listed control group (CG). Two training programmes with similar content but different delivery modes (VR-based and manual-based) were applied using a virtual boutique as the training scenario. Outcome assessments comprised the Digit Vigilance Test, the Rivermead Behavioural Memory Test, the Wisconsin Card Sorting Test, a work-site test and self-efficacy, administered at pre- and post-test and during 3- and 6-month follow-ups. The VRG exhibited significant improvements in attention and improvements in memory that were maintained after 3 months. Both the VRG and TAG exhibited significantly improved vocational skills after training, which were maintained during follow-up, and improved self-efficacy. VR-based cognitive training might target cognitive problems in KU.
Measuring Reduction Methods for VR Sickness in Virtual Environments
ERIC Educational Resources Information Center
Magaki, Takurou; Vallance, Michael
2017-01-01
Recently, virtual reality (VR) technologies have developed remarkably. However, some users have negative symptoms during VR experiences or post-experiences. Consequently, alleviating VR sickness is a major challenge, but an effective reduction method has not yet been discovered. The purpose of this article is to compare and evaluate VR sickness in…
ERIC Educational Resources Information Center
Brown, Abbie; Sugar, William
2009-01-01
Second Life is a three-dimensional, multi-user virtual environment that has attracted particular attention for its instructional potential in professional development and higher education settings. This article describes Second Life in general and explores the benefits and challenges of using it for teaching and learning.
Ketelhut, Diane Jass; Niemi, Steven M
2007-01-01
This article examines several new and exciting communication technologies. Many of the technologies were developed by the entertainment industry; however, other industries are adopting and modifying them for their own needs. These new technologies allow people to collaborate across distance and time and to learn in simulated work contexts. The article explores the potential utility of these technologies for advancing laboratory animal care and use through better education and training. Descriptions include emerging technologies such as augmented reality and multi-user virtual environments, which offer new approaches with different capabilities. Augmented reality interfaces, characterized by the use of handheld computers to infuse the virtual world into the real one, result in deeply immersive simulations. In these simulations, users can access virtual resources and communicate with real and virtual participants. Multi-user virtual environments enable multiple participants to simultaneously access computer-based three-dimensional virtual spaces, called "worlds," and to interact with digital tools. They allow for authentic experiences that promote collaboration, mentoring, and communication. Because individuals may learn or train differently, it is advantageous to combine the capabilities of these technologies and applications with more traditional methods to increase the number of students who are served by using current methods alone. The use of these technologies in animal care and use programs can create detailed training and education environments that allow students to learn the procedures more effectively, teachers to assess their progress more objectively, and researchers to gain insights into animal care.
Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T
2007-07-01
Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical resident operative exposure. There is need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient safety constraints. Computer-based, virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic, three-dimensional stereoscopic visualization, and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof-of-concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.
Huber, Tobias; Paschold, Markus; Hansen, Christian; Lang, Hauke; Kneist, Werner
2018-06-01
Immersive virtual reality (VR) laparoscopy simulation connects VR simulation with head-mounted displays to increase presence during VR training. The goal of the present study was the comparison of 2 different surroundings according to performance and users' preference. With a custom immersive virtual reality laparoscopy simulator, an artificially created VR operating room (AVR) and a highly immersive VR operating room (IVR) were compared. Participants (n = 30) performed 3 tasks (peg transfer, fine dissection, and cholecystectomy) in AVR and IVR in a crossover study design. No overall difference in virtual laparoscopic performance was obtained when comparing results from AVR with IVR. Most participants preferred the IVR surrounding (n = 24). Experienced participants (n = 10) performed significantly better than novices (n = 10) in all tasks regardless of the surrounding ( P < .05). Participants with limited experience (n = 10) showed differing results. Presence, immersion, and exhilaration were significantly higher in IVR. Two thirds assumed that IVR would have a positive influence on their laparoscopic simulator use. This first study comparing AVR and IVR did not reveal differences in virtual laparoscopic performance. IVR is considered the more realistic surrounding and is therefore preferred by the participants.
ERIC Educational Resources Information Center
Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca
2009-01-01
The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…
The impact of self-avatars on trust and collaboration in shared virtual environments.
Pan, Ye; Steed, Anthony
2017-01-01
A self-avatar is known to have a potentially significant impact on the user's experience of the immersive content but it can also affect how users interact with each other in a shared virtual environment (SVE). We implemented an SVE for a consumer virtual reality system where each user's body could be represented by a jointed self-avatar that was dynamically controlled by head and hand controllers. We investigated the impact of a self-avatar on collaborative outcomes such as completion time and trust formation during competitive and cooperative tasks. We used two different embodiment levels: no self-avatar and self-avatar, and compared these to an in-person face to face version of the tasks. We found that participants could finish the task more quickly when they cooperated than when they competed, for both the self-avatar condition and the face to face condition, but not for the no self-avatar condition. In terms of trust formation, both the self-avatar condition and the face to face condition led to higher scores than the no self-avatar condition; however, collaboration style had no significant effect on trust built between partners. The results are further evidence of the importance of a self-avatar representation in immersive virtual reality.
An efficient and scalable deformable model for virtual reality-based medical applications.
Choi, Kup-Sze; Sun, Hanqiu; Heng, Pheng-Ann
2004-09-01
Modeling of tissue deformation is of great importance to virtual reality (VR)-based medical simulations. Considerable effort has been dedicated to the development of interactively deformable virtual tissues. In this paper, an efficient and scalable deformable model is presented for virtual-reality-based medical applications. It considers deformation as a localized force transmittal process which is governed by algorithms based on breadth-first search (BFS). The computational speed is scalable to facilitate real-time interaction by adjusting the penetration depth. Simulated annealing (SA) algorithms are developed to optimize the model parameters by using the reference data generated with the linear static finite element method (FEM). The mechanical behavior and timing performance of the model have been evaluated. The model has been applied to simulate the typical behavior of living tissues and anisotropic materials. Integration with a haptic device has also been achieved on a generic personal computer (PC) platform. The proposed technique provides a feasible solution for VR-based medical simulations and has the potential for multi-user collaborative work in virtual environment.
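The following is a minimal sketch of the localized, BFS-governed force-transmittal idea described above, not the published model: displacement spreads outward from the contact node and is cut off at an adjustable penetration depth, so cost scales with the size of the touched region. The geometric falloff factor is an assumption for illustration.

```python
# Toy breadth-first propagation of deformation over a mesh adjacency graph.
from collections import deque

def propagate_deformation(adjacency, contact_node, force, max_depth, falloff=0.5):
    """Return {node: displacement} for nodes within `max_depth` hops of the
    contact node; displacement decays geometrically with hop count."""
    displacement = {contact_node: force}
    visited = {contact_node}
    queue = deque([(contact_node, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == max_depth:
            continue
        for neighbour in adjacency[node]:
            if neighbour not in visited:
                visited.add(neighbour)
                displacement[neighbour] = force * (falloff ** (depth + 1))
                queue.append((neighbour, depth + 1))
    return displacement

# Toy 1D chain of 6 mesh nodes; touching node 0 with unit force, depth limit 3.
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
print(propagate_deformation(chain, contact_node=0, force=1.0, max_depth=3))
```

Raising or lowering `max_depth` is the scalability knob the abstract mentions: a shallower cut-off trades mechanical accuracy for interactive frame rates.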
A Survey of Mobile and Wireless Technologies for Augmented Reality Systems (Preprint)
2008-02-01
Windows XP. A number of researchers have started employing them in AR simulations such as Wagner et al [25], Newman et al [46] and specifically the Sony ...different music clubs and styles of music according to the selection and tastes of the listeners. In the intro sequence the user can select an animated...3-D character (avatar) as his or her virtual persona and visit the different music rooms in the virtual disco. Users can download or stream music in
Virtual reality 3D headset based on DMD light modulators
NASA Astrophysics Data System (ADS)
Bernacki, Bruce E.; Evans, Allan; Tang, Edward
2014-06-01
We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micromirror devices (DMD). Current methods for presenting information for virtual reality are focused on either polarization-based modulators such as liquid crystal on silicon (LCoS) devices, or miniature LCD or LED displays, often using lenses to place the image at infinity. LCoS modulators are an area of active research and development, and reduce the amount of viewing light by 50% due to the use of polarization. Viewable LCD or LED screens may suffer low resolution, cause eye fatigue, and exhibit a "screen door" or pixelation effect due to the low pixel fill factor. Our approach leverages a mature technology based on silicon micromirrors delivering 720p-resolution displays in a small form factor with high fill factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high-definition resolution and low power consumption, and many of the design methods developed for DMD projector applications can be adapted to display use. Potential applications include night driving with natural depth perception, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. Our design concept is described in which light from the DMD is imaged to infinity and the user's own eye lens forms a real image on the user's retina, resulting in a virtual retinal display.
Assessment of wheelchair driving performance in a virtual reality-based simulator
Mahajan, Harshal P.; Dicianno, Brad E.; Cooper, Rory A.; Ding, Dan
2013-01-01
Objective: To develop a virtual reality (VR)-based simulator that can assist clinicians in performing standardized wheelchair driving assessments. Design: A completely within-subjects repeated measures design. Methods: Participants drove their wheelchairs along a virtual driving circuit modeled after the Power Mobility Road Test (PMRT) and in a hallway with decreasing width. The virtual simulator was displayed on a computer screen and on VR screens, and participants interacted with it using a set of instrumented rollers and a wheelchair joystick. Driving performances of participants were estimated and compared using quantitative metrics from the simulator. Qualitative ratings from two experienced clinicians were used to estimate intra- and inter-rater reliability. Results: Ten regular wheelchair users (seven men, three women; mean age ± SD, 39.5 ± 15.39 years) participated. The virtual PMRT scores from the two clinicians show high inter-rater reliability (78–90%) and high intra-rater reliability (71–90%) for all test conditions. More research is required to explore user preferences and the effectiveness of the two control methods (rollers and mathematical model) and the display screens. Conclusions: The virtual driving simulator seems to be a promising tool for wheelchair driving assessment that clinicians can use to supplement their real-world evaluations. PMID:23820148
Direct Manipulation in Virtual Reality
NASA Technical Reports Server (NTRS)
Bryson, Steve
2003-01-01
Virtual Reality interfaces offer several advantages for scientific visualization such as the ability to perceive three-dimensional data structures in a natural way. The focus of this chapter is direct manipulation, the ability for a user in virtual reality to control objects in the virtual environment in a direct and natural way, much as objects are manipulated in the real world. Direct manipulation provides many advantages for the exploration of complex, multi-dimensional data sets, by allowing the investigator the ability to intuitively explore the data environment. Because direct manipulation is essentially a control interface, it is better suited for the exploration and analysis of a data set than for the publishing or communication of features found in that data set. Thus direct manipulation is most relevant to the analysis of complex data that fills a volume of three-dimensional space, such as a fluid flow data set. Direct manipulation allows the intuitive exploration of that data, which facilitates the discovery of data features that would be difficult to find using more conventional visualization methods. Using a direct manipulation interface in virtual reality, an investigator can, for example, move a data probe about in space, watching the results and getting a sense of how the data varies within its spatial volume.
Interfacing modeling suite Physics Of Eclipsing Binaries 2.0 with a Virtual Reality Platform
NASA Astrophysics Data System (ADS)
Harriett, Edward; Conroy, Kyle; Prša, Andrej; Klassner, Frank
2018-01-01
To explore alternate methods for modeling eclipsing binary stars, we extrapolate upon PHOEBE's (PHysics Of Eclipsing BinariEs) capabilities in a virtual reality (VR) environment to create an immersive and interactive experience for users. The application used is Vizard, a python-scripted VR development platform for environments such as the Cave Automatic Virtual Environment (CAVE) and other off-the-shelf VR headsets. Vizard allows all modeling to be precompiled without compromising functionality or usage. The system requires five arguments to be precomputed using PHOEBE's python front-end: the effective temperature, flux, relative intensity, vertex coordinates, and orbits; the user can opt to implement other features from PHOEBE to be accessed within the simulation as well. Here we present the method for making the data observables accessible in real time. An Oculus Rift will be available for a live showcase of various cases of VR rendering of PHOEBE binary systems, including detached and contact binary stars.
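Since the pipeline above precomputes five quantities with PHOEBE's python front-end and hands them to the Vizard scene, a hypothetical packaging step might look like the sketch below; the array shapes, file name, and field names are assumptions, and the actual extraction from PHOEBE is not shown.

```python
# Hypothetical bundling of the five precomputed quantities into one file that
# a Vizard scene script could load at startup (illustrative only).
import numpy as np

n_vertices, n_phases = 4, 3  # tiny placeholder mesh and orbit sampling
frame = {
    "teff": np.full(n_vertices, 5800.0),        # effective temperature per vertex [K]
    "flux": np.linspace(0.9, 1.0, n_phases),    # synthetic light-curve flux per phase
    "intensity": np.ones(n_vertices),           # relative intensity per vertex
    "vertices": np.zeros((n_vertices, 3)),      # vertex coordinates of the stellar surfaces
    "orbit": np.zeros((n_phases, 2, 3)),        # positions of both components per phase
}
np.savez("phoebe_vr_frame.npz", **frame)        # the Vizard side would np.load() this
print({name: array.shape for name, array in frame.items()})
```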
Subjective visual vertical assessment with mobile virtual reality system.
Ulozienė, Ingrida; Totilienė, Milda; Paulauskas, Andrius; Blažauskas, Tomas; Marozas, Vaidotas; Kaski, Diego; Ulozas, Virgilijus
2017-01-01
The subjective visual vertical (SVV) is a measure of a subject's perceived verticality, and a sensitive test of vestibular dysfunction. Despite this, and consequent upon technical and logistical limitations, SVV has not entered mainstream clinical practice. The aim of the study was to develop a mobile virtual reality based system for the SVV test, evaluate the suitability of different controllers and assess the system's usability in practical settings. In this study, we describe a novel virtual reality based system that has been developed to test SVV using integrated software and hardware, and report normative values across a healthy population. Participants wore a mobile virtual reality headset in order to observe a 3D stimulus presented in separate conditions - static, dynamic and an immersive real-world ("boat in the sea") SVV test. The virtual reality environment was controlled by the tester using Bluetooth-connected controllers. Participants controlled the movement of a vertical arrow using either a gesture-control armband or a general-purpose gamepad to indicate perceived verticality. We aimed to compare two different methods for object control in the system, determine normal values and compare them with literature data, evaluate the developed system with the help of the System Usability Scale questionnaire, and evaluate possible virtually induced dizziness with the help of a subjective visual analog scale. There were no statistically significant differences in SVV values during static, dynamic and virtual reality stimulus conditions obtained using the two different controllers, and the results are comparable to those previously reported in the literature using alternative methodologies. The SUS scores for the system were high, with a median of 82.5 for the Myo controller and of 95.0 for the gamepad controller, representing a statistically significant difference between the two controllers (P<0.01). The median of virtual reality-induced dizziness for both devices was 0.7. The mobile virtual reality based system for implementation of the subjective visual vertical test is accurate and applicable in the clinical environment. The gamepad-based virtual object control method was preferred by the users. The tests were well tolerated, with low dizziness scores in the majority of patients. Copyright © 2018 The Lithuanian University of Health Sciences. Production and hosting by Elsevier Sp. z o.o. All rights reserved.
SmallTool - a toolkit for realizing shared virtual environments on the Internet
NASA Astrophysics Data System (ADS)
Broll, Wolfgang
1998-09-01
With increasing graphics capabilities of computers and higher network communication speed, networked virtual environments have become available to a large number of people. While the virtual reality modelling language (VRML) provides users with the ability to exchange 3D data, there is still a lack of appropriate support to realize large-scale multi-user applications on the Internet. In this paper we will present SmallTool, a toolkit to support shared virtual environments on the Internet. The toolkit consists of a VRML-based parsing and rendering library, a device library, and a network library. This paper will focus on the networking architecture, provided by the network library - the distributed worlds transfer and communication protocol (DWTP). DWTP provides an application-independent network architecture to support large-scale multi-user environments on the Internet.
Paolini, Gabriele; Peruzzi, Agnese; Mirelman, Anat; Cereatti, Andrea; Gaukrodger, Stephen; Hausdorff, Jeffrey M; Della Croce, Ugo
2014-09-01
The use of virtual reality for the provision of motor-cognitive gait training has been shown to be effective for a variety of patient populations. The interaction between the user and the virtual environment is achieved by tracking the motion of the body parts and replicating it in the virtual environment in real time. In this paper, we present the validation of a novel method for tracking foot position and orientation in real time, based on the Microsoft Kinect technology, to be used for gait training combined with virtual reality. The validation of the motion tracking method was performed by comparing the tracking performance of the new system against a stereo-photogrammetric system used as gold standard. Foot position errors were in the order of a few millimeters (average RMSD from 4.9 to 12.1 mm in the medio-lateral and vertical directions, from 19.4 to 26.5 mm in the anterior-posterior direction); the foot orientation errors were also small (average %RMSD from 5.6% to 8.8% in the medio-lateral and vertical directions, from 15.5% to 18.6% in the anterior-posterior direction). The results suggest that the proposed method can be effectively used to track feet motion in virtual reality and treadmill-based gait training programs.
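A sketch of the validation metric implied above (not the authors' analysis script): the per-axis RMS deviation between the Kinect-derived foot trajectory and the stereo-photogrammetric gold standard, assuming the two streams have already been time-aligned and resampled to a common rate.

```python
# Per-axis RMSD between a tracked trajectory and a gold-standard trajectory.
import numpy as np

def rmsd_per_axis(kinect_xyz, gold_xyz):
    """Both inputs are (n_samples, 3) arrays in millimetres; returns RMSD per axis."""
    diff = np.asarray(kinect_xyz) - np.asarray(gold_xyz)
    return np.sqrt(np.mean(diff ** 2, axis=0))

# Toy data: gold-standard trajectory plus a few millimetres of noise.
rng = np.random.default_rng(0)
gold = np.cumsum(rng.normal(size=(200, 3)), axis=0)
kinect = gold + rng.normal(scale=5.0, size=gold.shape)
print("RMSD [mm] (ML, vertical, AP):", rmsd_per_axis(kinect, gold))
```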
HyFinBall: A Two-Handed, Hybrid 2D/3D Desktop VR Interface for Visualization
2013-01-01
We describe the user interface (hardware and software), the design space, as well as preliminary results of a formal user study. This is done in the context of a rich visual analytics interface containing coordinated views with 2D and 3D visualizations. Keywords: virtual reality, user interface, two-handed interface, hybrid user interface, multi-touch, gesture.
An Interactive Augmented Reality Implementation of Hijaiyah Alphabet for Children Education
NASA Astrophysics Data System (ADS)
Rahmat, R. F.; Akbar, F.; Syahputra, M. F.; Budiman, M. A.; Hizriadi, A.
2018-03-01
The Hijaiyah alphabet comprises the letters used in the Qur'an. An attractive and exciting learning process for the Hijaiyah alphabet is necessary for children. One alternative for creating such a learning process is to develop it into a mobile application using augmented reality technology. Augmented reality is a technology that combines two-dimensional or three-dimensional virtual objects with an actual three-dimensional environment and projects them in real time. The application aims to foster children's interest in learning the Hijaiyah alphabet. This application uses a smartphone and a marker as the medium. It was built using Unity and an augmented reality library, namely Vuforia, with Blender as the 3D object modeling software. The output generated from this research is a learning application for Hijaiyah letters using augmented reality. It is used as follows: first, place a marker that has been registered and printed; second, the smartphone camera will track the marker. If the marker is invalid, the user should repeat the tracking process. If the marker is valid and identified, the application will project the Hijaiyah alphabet objects in three-dimensional form. Lastly, the user can learn and understand the shape and pronunciation of the Hijaiyah alphabet by touching the virtual button on the marker.
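The application itself is built in Unity with Vuforia, so the sketch below is only a language-neutral illustration (written in Python) of the usage flow described above; none of the function names are Vuforia API calls, and the marker ids are made up.

```python
# Illustrative control flow: track a registered marker, retry while tracking
# fails, project the letter's 3D model once identified, react to a button touch.
REGISTERED_MARKERS = {"marker_alif": "alif", "marker_ba": "ba"}

def track_marker(camera_frame):
    """Stand-in for the tracker: returns a marker id or None."""
    return camera_frame.get("visible_marker")

def run_session(camera_frames):
    for frame in camera_frames:
        marker_id = track_marker(frame)
        if marker_id not in REGISTERED_MARKERS:
            print("marker not identified, keep tracking...")
            continue
        letter = REGISTERED_MARKERS[marker_id]
        print(f"projecting 3D model of letter '{letter}' on the marker")
        if frame.get("virtual_button_touched"):
            print(f"playing pronunciation of '{letter}'")

run_session([
    {"visible_marker": None},
    {"visible_marker": "marker_alif", "virtual_button_touched": True},
])
```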
Ambient clumsiness in virtual environments
NASA Astrophysics Data System (ADS)
Ruzanka, Silvia; Behar, Katherine
2010-01-01
A fundamental pursuit of Virtual Reality is the experience of a seamless connection between the user's body and actions within the simulation. Virtual worlds often mediate the relationship between the physical and virtual body through creating an idealized representation of the self in an idealized space. This paper argues that the very ubiquity of the medium of virtual environments, such as the massively popular Second Life, has now made them mundane, and that idealized representations are no longer appropriate. In our artwork we introduce the attribute of clumsiness to Second Life by creating and distributing scripts that cause users' avatars to exhibit unpredictable stumbling, tripping, and momentary poor coordination, thus subtly and unexpectedly intervening with, rather than amplifying, a user's intent. These behaviors are publicly distributed, and manifest only occasionally - rather than intentional, conscious actions, they are involuntary and ambient. We suggest that the physical human body is itself an imperfect interface, and that the continued blurring of distinctions between the physical body and virtual representations calls for the introduction of these mundane, clumsy elements.
Virtual reality applications to automated rendezvous and capture
NASA Technical Reports Server (NTRS)
Hale, Joseph; Oneil, Daniel
1991-01-01
Virtual Reality (VR) is a rapidly developing Human/Computer Interface (HCI) technology. The evolution of high-speed graphics processors and the development of specialized anthropomorphic user interface devices, which more fully involve the human senses, have enabled VR technology. Recently, the maturity of this technology has reached a level where it can be used as a tool in a variety of applications. This paper provides an overview of VR technology, VR activities at Marshall Space Flight Center (MSFC), and applications of VR to Automated Rendezvous and Capture (AR&C), and identifies areas of VR technology that require further development.
Augmented Reality, Virtual Reality and Their Effect on Learning Style in the Creative Design Process
ERIC Educational Resources Information Center
Chandrasekera, Tilanka; Yoon, So-Yeon
2018-01-01
Research has shown that user characteristics such as preference for using an interface can result in effective use of the interface. Research has also suggested that there is a relationship between learner preference and creativity. This study uses the VARK learning styles inventory to assess students' learning styles and then explores how this learning…
Implementing virtual reality interfaces for the geosciences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, W.; Jacobsen, J.; Austin, A.
1996-06-01
For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum, ranging from command-line interfaces to completely immersive environments. In the former, one uses the keyboard to enter three- or six-dimensional parameters. In the latter, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g., attached to a head-mounted display. Rich, extensible and often complex languages are a vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is the obstacle in that typing is cumbersome, error-prone and typically slow. With immersive interfaces, the user can instead interact with these parameters by means of motor skills which are highly developed. Two specific geoscience application areas will be highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.
Immersive Collaboration Simulations: Multi-User Virtual Environments and Augmented Realities
NASA Technical Reports Server (NTRS)
Dede, Chris
2008-01-01
Emerging information technologies are driving shifts in the knowledge and skills society values, the development of new methods of teaching and learning, and changes in the characteristics of learning.
Yap, Hwa Jen; Taha, Zahari; Md Dawal, Siti Zawiah; Chang, Siow-Wee
2014-01-01
Traditional robotic work cell design and programming are considered inefficient and outdated given current industrial and market demands. In this research, virtual reality (VR) technology is used to improve the human-robot interface, so that complicated commands or programming knowledge are not required. The proposed solution, known as VR-based Programming of a Robotic Work Cell (VR-Rocell), consists of two sub-programmes: VR-Robotic Work Cell Layout (VR-RoWL) and the VR-based Robot Teaching System (VR-RoT). VR-RoWL is developed to assign the layout design for an industrial robotic work cell, while VR-RoT is developed to overcome safety issues and the lack of trained personnel in robot programming. Simple and user-friendly interfaces are designed so that inexperienced users can generate robot commands without damaging the robot or interrupting the production line. The user is able to make numerous attempts to attain an optimum solution. A case study was conducted in the Robotics Laboratory to assemble an electronics casing, and it was found that the output models are compatible with commercial software without loss of information. Furthermore, the generated KUKA commands are workable when loaded into a commercial simulator. The operation of the actual robotic work cell shows that the errors may be due to the dynamics of the KUKA robot rather than the accuracy of the generated programme. Therefore, it is concluded that the virtual reality based solution approach can be implemented in an industrial robotic work cell. PMID:25360663
An interactive VR system based on full-body tracking and gesture recognition
NASA Astrophysics Data System (ADS)
Zeng, Xia; Sang, Xinzhu; Chen, Duo; Wang, Peng; Guo, Nan; Yan, Binbin; Wang, Kuiru
2016-10-01
Most current virtual reality (VR) interactions are realized with a hand-held input device, which leads to a low degree of presence. There are other solutions that use sensors such as Leap Motion to recognize users' gestures in order to interact in a more natural way, but navigation in these systems is still a problem, because they fail to map actual walking to virtual walking when only a partial body of the user is represented in the synthetic environment. Therefore, we propose a system in which users can walk around in the virtual environment as a humanoid model, selecting menu items and manipulating virtual objects using natural hand gestures. With a Kinect depth camera, the system tracks the joints of the user, mapping them to a full virtual body which follows the movements of the tracked user. The movements of the feet can be detected to determine whether the user is in a walking state, so that the walking of the model in the virtual world can be activated and stopped by means of animation control in the Unity engine. This method frees the user's hands compared with traditional navigation using a hand-held device. We use the point cloud data obtained from the Kinect depth camera to recognize users' gestures, such as swiping, pressing and manipulating virtual objects. Combining full-body tracking and gesture recognition using Kinect, we achieve an interactive VR system in the Unity engine with a high degree of presence.
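One plausible way to make the walking-state decision described above is to threshold the recent vertical excursion of the tracked foot joints; the sketch below is illustrative only (the threshold, window length, and function names are assumptions rather than the authors' implementation), and the boolean it returns is what an animation controller would switch on or off:

    import numpy as np

    def is_walking(left_foot_y, right_foot_y, threshold=0.03):
        """Decide whether the tracked user is stepping in place.

        left_foot_y, right_foot_y: recent vertical positions (meters) of the
        two foot joints over a short sliding window. Returns True when either
        foot shows enough vertical excursion, which a game engine could use
        to start or stop the avatar's walking animation."""
        left_range = np.ptp(left_foot_y)     # peak-to-peak vertical motion
        right_range = np.ptp(right_foot_y)
        return max(left_range, right_range) > threshold

    # Hypothetical window: the left foot lifts about 6 cm while stepping.
    t = np.linspace(0.0, 1.0, 30)
    left = 0.03 * (1.0 + np.sin(2.0 * np.pi * t))
    right = np.zeros_like(t)
    print(is_walking(left, right))           # True -> play walking animation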
Cranial implant design using augmented reality immersive system.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2007-01-01
Software tools that utilize haptics for sculpting precise fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, the fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.
Immersive Earth Science: Data Visualization in Virtual Reality
NASA Astrophysics Data System (ADS)
Skolnik, S.; Ramirez-Linan, R.
2017-12-01
Utilizing next generation technology, Navteca's exploration of 3D and volumetric temporal data in Virtual Reality (VR) takes advantage of immersive user experiences where stakeholders are literally inside the data. No longer restricted by the edges of a screen, VR provides an innovative way of viewing spatially distributed 2D and 3D data that leverages a 360° field of view and positional-tracking input, allowing users to see and experience data differently. These concepts are relevant to many sectors, industries, and fields of study, as real-time collaboration in VR can enhance understanding and support the mission with VR visualizations that display temporally-aware 3D, meteorological, and other volumetric datasets. The ability to view data that is traditionally "difficult" to visualize, such as subsurface features or air columns, is a particularly compelling use of the technology. Various development iterations have resulted in Navteca's proof of concept that imports and renders volumetric point-cloud data in the virtual reality environment by interfacing PC-based VR hardware with a back-end server and popular GIS software. The integration of the geo-located data in VR and the subsequent display of changeable basemaps, overlaid datasets, and the ability to zoom, navigate, and select specific areas show the potential for immersive VR to revolutionize the way Earth data is viewed, analyzed, and communicated.
Virtual Reality Simulation of the International Space Welding Experiment
NASA Technical Reports Server (NTRS)
Phillips, James A.
1996-01-01
Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) It involves 3-dimensional computer graphics; (2) It includes real-time feedback and response to user actions; and (3) It must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment. My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) In collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) In collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted-display and tested software filters to correct the problem; (5) In collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR Workstation, described below.
Virtual reality as a new trend in mechanical and electrical engineering education
NASA Astrophysics Data System (ADS)
Kamińska, Dorota; Sapiński, Tomasz; Aitken, Nicola; Rocca, Andreas Della; Barańska, Maja; Wietsma, Remco
2017-12-01
In their daily practice, academics frequently face a lack of access to modern equipment and devices that are currently in use on the market. Moreover, many students have problems understanding issues connected to mechanical and electrical engineering due to their complexity, the necessity of abstract thinking, and the fact that those concepts are not fully tangible. Many studies indicate that virtual reality can be successfully used as a training tool in various domains, such as development, health care, the military or school education. In this paper, an interactive training strategy for mechanical and electrical engineering education is proposed. The software prototype consists of a simple interface, making it easy to comprehend and use. Additionally, the main part of the prototype allows the user to virtually manipulate a 3D object that should be analyzed and studied. Initial studies indicate that the use of virtual reality can contribute to improving the quality and efficiency of higher education, as well as the qualifications, competencies and skills of graduates, and increase their competitiveness in the labour market.
Can hazard risk be communicated through a virtual experience?
Mitchell, J T
1997-09-01
Cyberspace, defined by William Gibson as a consensual hallucination, now refers to all computer-generated interactive environments. Virtual reality, one of a class of interactive cyberspaces, allows us to create and interact directly with objects not available in the everyday world. Despite successes in the entertainment and aviation industries, this technology has been called a 'solution in search of a problem'. The purpose of this commentary is to suggest such a problem: the inability to acquire experience with a hazard to motivate mitigation. Direct experience with a hazard has been demonstrated to be a powerful incentive to adopt mitigation measures. While we lack the ability to summon hazard events at will in order to gain access to that experience, a virtual environment can provide an arena where potential victims are exposed to a hazard's effects. Immersion as an active participant within the hazard event through virtual reality may stimulate users to undertake mitigation steps that might otherwise remain undone. This paper details the possible directions in which virtual reality may be applied to hazards mitigation through a discussion of the technology, the role of hazard experience, the creation of a hazard simulation and the issues constraining implementation.
Can walking motions improve visually induced rotational self-motion illusions in virtual reality?
Riecke, Bernhard E; Freiberg, Jacob B; Grechkin, Timofey Y
2015-02-04
Illusions of self-motion (vection) can provide compelling sensations of moving through virtual environments without the need for complex motion simulators or large tracked physical walking spaces. Here we explore the interaction between biomechanical cues (stepping along a rotating circular treadmill) and visual cues (viewing simulated self-rotation) for providing stationary users a compelling sensation of rotational self-motion (circular vection). When tested individually, biomechanical and visual cues were similarly effective in eliciting self-motion illusions. However, in combination they yielded significantly more intense self-motion illusions. These findings provide the first compelling evidence that walking motions can be used to significantly enhance visually induced rotational self-motion perception in virtual environments (and vice versa) without having to provide for physical self-motion or motion platforms. This is noteworthy, as linear treadmills have been found to actually impair visually induced translational self-motion perception (Ash, Palmisano, Apthorp, & Allison, 2013). Given the predominant focus on linear walking interfaces for virtual-reality locomotion, our findings suggest that investigating circular and curvilinear walking interfaces offers a promising direction for future research and development and can help to enhance self-motion illusions, presence and immersion in virtual-reality systems. © 2015 ARVO.
An artificial reality environment for remote factory control and monitoring
NASA Technical Reports Server (NTRS)
Kosta, Charles Paul; Krolak, Patrick D.
1993-01-01
Work has begun on the merger of two well known systems, VEOS (HITLab) and CLIPS (NASA). In the recent past, the University of Massachusetts Lowell developed a parallel version of NASA CLIPS, called P-CLIPS. This modification allows users to create smaller expert systems which are able to communicate with each other to jointly solve problems. With the merger of a VEOS message system, PCLIPS-V can now act as a group of entities working within VEOS. To display the 3D virtual world we have been using a graphics package called HOOPS, from Ithaca Software. The artificial reality environment we have set up contains actors and objects as found in our Lincoln Logs Factory of the Future project. The environment allows us to view and control the objects within the virtual world. All communication between the separate CLIPS expert systems is done through VEOS. A graphical renderer generates camera views on X-Windows devices; Head Mounted Devices are not required. This allows more people to make use of this technology. We are experimenting with different types of virtual vehicles to give the user a sense that he or she is actually moving around inside the factory looking ahead through windows and virtual monitors.
Virtual Worlds for Virtual Organizing
NASA Astrophysics Data System (ADS)
Rhoten, Diana; Lutters, Wayne
The members and resources of a virtual organization are dispersed across time and space, yet they function as a coherent entity through the use of technologies, networks, and alliances. As virtual organizations proliferate and become increasingly important in society, many may exploit the technical architectures of virtual worlds, which are the confluence of computer-mediated communication, telepresence, and virtual reality originally created for gaming. A brief socio-technical history describes their early origins and the waves of progress followed by stasis that brought us to the current period of renewed enthusiasm. Examination of contemporary examples demonstrates how three genres of virtual worlds have enabled new arenas for virtual organizing: developer-defined closed worlds, user-modifiable quasi-open worlds, and user-generated open worlds. Among expected future trends are an increase in collaboration born virtually rather than imported from existing organizations, a tension between high-fidelity recreations of the physical world and hyper-stylized imaginations of fantasy worlds, and the growth of specialized worlds optimized for particular sectors, companies, or cultures.
Learning Application of Astronomy Based Augmented Reality using Android Platform
NASA Astrophysics Data System (ADS)
Maleke, B.; Paseru, D.; Padang, R.
2018-02-01
Astronomy is a branch of science involving observations of celestial bodies such as stars, planets, comets, nebulae, star clusters, and galaxies, as well as natural phenomena occurring outside the Earth's atmosphere. Ways of learning astronomy are quite varied, such as using a book or observing directly with a telescope. Both approaches have shortcomings: learning through books presents material only in the form of 2D drawings, while learning with a telescope requires fairly expensive equipment. This study presents a more interesting way of learning, namely an Augmented Reality (AR) application built on the Android platform. Augmented Reality is a computer-generated combination of the virtual world and the real world. Virtual objects can be text, animation, 3D models or videos that are combined with the actual environment so that the user feels the virtual objects are in his or her surroundings. With the use of the Android platform, this application makes the learning method more engaging because it can be used on various Android smartphones, so learning can be done anytime and anywhere. The methodology used in building the application is the Multimedia Lifecycle, along with the C# language for AR programming and flowcharts as a modeling tool. Testing with several users showed that the application runs well and can be used as a more interesting alternative way of learning astronomy.
Model of Illusions and Virtual Reality
Gonzalez-Franco, Mar; Lanier, Jaron
2017-01-01
In Virtual Reality (VR) it is possible to induce illusions in which users report and behave as if they have entered into altered situations and identities. The effect can be robust enough for participants to respond "realistically," meaning behaviors are altered as if subjects had been exposed to the scenarios in reality. The circumstances in which such VR illusions take place were first introduced in the 1980s. Since then, rigorous empirical evidence has explored a wide set of illusory experiences in VR. Here, we compile this research and propose a neuroscientific model explaining the underlying perceptual and cognitive mechanisms that enable illusions in VR. Furthermore, we describe the minimum instrumentation requirements to support illusory experiences in VR, and discuss the importance and shortcomings of the generic model. PMID:28713323
NASA Astrophysics Data System (ADS)
Krum, David M.; Sadek, Ramy; Kohli, Luv; Olson, Logan; Bolas, Mark
2010-01-01
As part of the Institute for Creative Technologies and the School of Cinematic Arts at the University of Southern California, the Mixed Reality lab develops technologies and techniques for presenting realistic immersive training experiences. Such experiences typically place users within a complex ecology of social actors, physical objects, and collections of intents, motivations, relationships, and other psychological constructs. Currently, it remains infeasible to completely synthesize the interactivity and sensory signatures of such ecologies. For this reason, the lab advocates mixed reality methods for training and conducts experiments exploring such methods. Currently, the lab focuses on understanding and exploiting the elasticity of human perception with respect to representational differences between real and virtual environments. This paper presents an overview of three projects: techniques for redirected walking, displays for the representation of virtual humans, and audio processing to increase stress.
Web-based three-dimensional Virtual Body Structures: W3D-VBS.
Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex
2002-01-01
Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user's progress through evaluation tools helps customize lesson plans. A self-guided "virtual tour" of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Markidis, S.; Rizwan, U.
The use of a virtual nuclear control room can be an effective and powerful tool for training personnel working in nuclear power plants. Operators can experience and simulate the functioning of the plant, even in critical situations, without being in a real power plant or running any risk. 3D models can be exported to Virtual Reality formats and then displayed in the Virtual Reality environment, providing an immersive 3D experience. However, two major limitations of this approach are that the 3D models exhibit static textures and that they are not fully interactive, and therefore cannot be used effectively in training personnel. In this paper we first describe a possible solution for embedding the output of a computer application in a 3D virtual scene, coupling real-world applications and VR systems. The VR system reported here grabs the output of an application running on an X server, creates a texture with the output, and then displays it on a screen or a wall in the virtual reality environment. We then propose a simple model for providing interaction between the user in the VR system and the running simulator. This approach is based on the use of an internet-based application that can be commanded by a laptop or tablet PC added to the virtual environment. (authors)
Comparative study on collaborative interaction in non-immersive and immersive systems
NASA Astrophysics Data System (ADS)
Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki
2007-09-01
This research studies Virtual Reality simulation for collaborative interaction so that different people in different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where an object's behavior is determined by the combination of the multiple inputs. Issues addressed in this research are: 1) the effects of using haptics on collaborative interaction, and 2) the possibilities of collaboration between users in different environments. We conducted user tests on our system in several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. The case studies are the interaction of users in two settings: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse from existing building blocks under gravity effects. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.
Codd, Anthony M; Choudhury, Bipasha
2011-01-01
The use of cadavers to teach anatomy is well established, but limitations with this approach have led to the introduction of alternative teaching methods. One such method is the use of three-dimensional virtual reality computer models. An interactive, three-dimensional computer model of human forearm anterior compartment musculoskeletal anatomy was produced using the open source 3D imaging program "Blender." The aim was to evaluate the use of 3D virtual reality when compared with traditional anatomy teaching methods. Three groups were identified from the University of Manchester second year Human Anatomy Research Skills Module class: a "control" group (no prior knowledge of forearm anatomy), a "traditional methods" group (taught using dissection and textbooks), and a "model" group (taught solely using the e-resource). The groups were assessed on the anatomy of the forearm by a ten-question practical examination. ANOVA analysis showed the model group's mean test score to be significantly higher than the control group's (mean 7.25 vs. 1.46, P < 0.001) and not significantly different from the traditional methods group's (mean 6.87, P > 0.5). Feedback from all users of the e-resource was positive. Virtual reality anatomy learning can be used to complement traditional teaching methods effectively. Copyright © 2011 American Association of Anatomists.
Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder
NASA Technical Reports Server (NTRS)
Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.
1999-01-01
We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated on Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display, which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information in a way that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to interactively perform measurements from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.
Hybrid 2-D and 3-D Immersive and Interactive User Interface for Scientific Data Visualization
2017-08-01
Keywords: visualization, 3-D interactive visualization, scientific visualization, virtual reality, real-time ray tracing. ...scientists to employ in the real world. Other than user-friendly software and hardware setup, scientists also need to be able to perform their usual... ...and scientific visualization communities mostly have different research priorities. For the VR community, the ability to support real-time user...
Dissociation in virtual reality: depersonalization and derealization
NASA Astrophysics Data System (ADS)
Garvey, Gregory P.
2010-01-01
This paper looks at virtual worlds such as Second Life (SL) as possible incubators of dissociation disorders as classified by the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (also known as the DSM-IV). Depersonalization is where "a person feels that he or she has changed in some way or is somehow unreal"; derealization is when "the same beliefs are held about one's surroundings." Dissociative Identity Disorder (DID), previously known as multiple personality disorder, fits users of Second Life who adopt "in-world" avatars and, in effect, enact multiple distinct identities or personalities (known as alter egos or alters). Select questions from the Structured Clinical Interview for Depersonalization (SCI-DER) will be discussed as they might apply to the user's experience in Second Life. Finally, I would like to consider the hypothesis that, rather than a pathological disorder, dissociation is a normal response to the "artificial reality" of Second Life.
Practical low-cost stereo head-mounted display
NASA Astrophysics Data System (ADS)
Pausch, Randy; Dwivedi, Pramod; Long, Allan C., Jr.
1991-08-01
A high-resolution head-mounted display has been developed from substantially cheaper components than previous systems. Monochrome displays provide 720 by 280 pixels to each eye in a one-inch-square region positioned approximately one inch from each eye. The display hardware is the Private Eye, manufactured by Reflection Technologies, Inc. The tracking system uses the Polhemus Isotrak, providing (x, y, z, azimuth, elevation and roll) information on the user's head position and orientation 60 times per second. In combination with a modified Nintendo Power Glove, this system provides a full-functionality virtual reality/simulation system. Using two host 80386 computers, real-time wireframe images can be produced. Other virtual reality systems require roughly $250,000 in hardware, while this one requires only $5,000. Stereo is particularly useful for this system because shading or occlusion cannot be used as depth cues.
Registration of an on-axis see-through head-mounted display and camera system
NASA Astrophysics Data System (ADS)
Luo, Gang; Rensing, Noa M.; Weststrate, Evan; Peli, Eli
2005-02-01
An optical see-through head-mounted display (HMD) system integrating a miniature camera that is aligned with the user's pupil is developed and tested. Such an HMD system has potential value in many augmented reality applications, in which registration of the virtual display to the real scene is one of the critical aspects. The camera alignment to the user's pupil results in a simple yet accurate calibration and a low registration error across a wide range of depths. In practice, a small camera-eye misalignment may still occur in such a system due to the inevitable variations of HMD wearing position with respect to the eye. The effects of such errors are measured. Calculation further shows that the registration error as a function of viewing distance behaves nearly the same for different virtual image distances, except for a shift. The impact of the prismatic effect of the display lens on registration is also discussed.
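The distance dependence of the registration error can be illustrated with a simple small-angle parallax estimate (a back-of-the-envelope model, not the calculation reported in the paper): a residual lateral camera-eye offset produces an angular error that shrinks with target distance, shifted by whichever distance the one-point calibration was performed at. All numbers below are hypothetical:

    import numpy as np

    def registration_error_deg(offset_mm, target_dist_mm, calib_dist_mm):
        """Small-angle estimate of angular misregistration (degrees) caused by
        a lateral camera-eye offset, for a target at target_dist_mm and a
        one-point calibration performed at calib_dist_mm."""
        offset = np.asarray(offset_mm, dtype=float)
        err_rad = offset * (1.0 / np.asarray(target_dist_mm, dtype=float)
                            - 1.0 / calib_dist_mm)
        return np.degrees(err_rad)

    # Hypothetical case: 3 mm residual offset, calibration done at 1 m,
    # targets placed from 0.5 m to 4 m.
    for d in (500.0, 1000.0, 2000.0, 4000.0):
        print(d, registration_error_deg(3.0, d, 1000.0))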
The Ethics of Virtual Reality Technology: Social Hazards and Public Policy Recommendations.
Spiegel, James S
2017-09-23
This article explores four major areas of moral concern regarding virtual reality (VR) technologies. First, VR poses potential mental health risks, including Depersonalization/Derealization Disorder. Second, VR technology raises serious concerns related to personal neglect of users' own actual bodies and real physical environments. Third, VR technologies may be used to record personal data which could be deployed in ways that threaten personal privacy and present a danger related to manipulation of users' beliefs, emotions, and behaviors. Finally, there are other moral and social risks associated with the way VR blurs the distinction between the real and illusory. These concerns regarding VR naturally raise questions about public policy. The article makes several recommendations for legal regulations of VR that together address each of the above concerns. It is argued that these regulations would not seriously threaten personal liberty but rather would protect and enhance the autonomy of VR consumers.
A Case-Based Study with Radiologists Performing Diagnosis Tasks in Virtual Reality.
Venson, José Eduardo; Albiero Berni, Jean Carlo; Edmilson da Silva Maia, Carlos; Marques da Silva, Ana Maria; Cordeiro d'Ornellas, Marcos; Maciel, Anderson
2017-01-01
In radiology diagnosis, medical images are most often visualized slice by slice. At the same time, visualization based on 3D volumetric rendering of the data is considered useful and has broadened its field of application. In this work, we present a case-based study with 16 medical specialists to assess the diagnostic effectiveness of a Virtual Reality interface in fracture identification over 3D volumetric reconstructions. We developed a VR volume viewer compatible with both the Oculus Rift and handheld-based head-mounted displays (HMDs). We then performed user experiments to validate the approach in a diagnosis environment. In addition, we assessed the subjects' perception of the 3D reconstruction quality, ease of interaction and ergonomics, and also the users' opinions on how VR applications can be useful in healthcare. Among other results, we found a high level of effectiveness of the VR interface in identifying superficial fractures on head CTs.
Virtual reality based experiential cognitive treatment of anorexia nervosa.
Riva, G; Bacchetta, M; Baruffi, M; Rinaldi, S; Molinari, E
1999-09-01
The treatment of a 22-year-old female university student diagnosed with Anorexia Nervosa is described. In the study, Experiential Cognitive Therapy (ECT) was used: a relatively short-term, integrated, patient-oriented approach that focuses on individual discovery. The main characteristic of this approach is the use of Virtual Reality, a new technology that allows the user to be immersed in a computer-generated virtual world. At the end of the in-patient treatment, the subject showed increased bodily awareness together with a reduction in her level of body dissatisfaction. Moreover, the patient presented a high degree of motivation to change. The results are discussed with regard to Vitousek, Watson and Wilson's (1998, Clinical Psychology Review, 18(4), 391-420) proposal of using the Socratic method to address the denial and resistance of anorectic patients.
a Low-Cost and Lightweight 3d Interactive Real Estate-Purposed Indoor Virtual Reality Application
NASA Astrophysics Data System (ADS)
Ozacar, K.; Ortakci, Y.; Kahraman, I.; Durgut, R.; Karas, I. R.
2017-11-01
Interactive 3D architectural indoor design has become more popular since it began to benefit from Virtual Reality (VR) technologies. VR brings computer-generated 3D content to real-life scale and enables users to observe immersive indoor environments and modify them directly. This makes it possible for buyers to purchase a property off the plan more cheaply through virtual models: instead of showing the property through 2D plans or renders, the visualized interior architecture of an unbuilt property that is on sale is demonstrated beforehand, so that investors have an impression of being in the physical building. However, current applications either use highly resource-consuming software, are non-interactive, or require specialists to create such environments. In this study, we have created a low-cost, high-quality, fully interactive real-estate-purposed VR application that provides a realistic interior architecture of the property by using free and lightweight software: Sweet Home 3D and Unity. A preliminary study showed that participants generally liked the proposed real-estate-purposed VR application and that it satisfied the expectations of property buyers.
An optical tracking system for virtual reality
NASA Astrophysics Data System (ADS)
Hrimech, Hamid; Merienne, Frederic
2009-03-01
In this paper we present a low-cost 3D tracking system which we have developed and tested in order to move away from traditional 2D interaction techniques (keyboard and mouse) in an attempt to improve the user's experience while using a collaborative virtual environment (CVE). Such a tracking system is used to implement 3D interaction techniques that augment the user experience, promote the user's sense of transportation into the virtual world, and increase the user's awareness of their partners. The tracking system is a passive optical tracking system using stereoscopy, a technique allowing the reconstruction of three-dimensional information from a pair of images. We have currently deployed our 3D tracking system on a collaborative research platform for investigating 3D interaction techniques in CVEs.
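At the core of such a passive stereo tracker is the triangulation of a target's 3D position from its pixel coordinates in two calibrated cameras. The following is a generic linear (DLT) triangulation sketch, not the authors' implementation; the projection matrices and test point are made up for illustration:

    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """Linear (DLT) triangulation of one 3D point from a pair of
        calibrated views. P1, P2 are 3x4 projection matrices; x1, x2 are the
        corresponding pixel coordinates (u, v) of the tracked target."""
        A = np.vstack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]              # inhomogeneous 3D coordinates

    # Hypothetical rig: identity intrinsics, 100 mm baseline along x.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])
    point = np.array([50.0, 20.0, 600.0, 1.0])
    x1 = (P1 @ point)[:2] / (P1 @ point)[2]
    x2 = (P2 @ point)[:2] / (P2 @ point)[2]
    print(triangulate(P1, P2, x1, x2))   # recovers approximately [50, 20, 600]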
Virtual reality games for rehabilitation of people with stroke: perspectives from the users.
Lewis, Gwyn N; Woods, Claire; Rosie, Juliet A; McPherson, Kathryn M
2011-01-01
PURPOSE. The purpose of this study is to evaluate the feasibility and users' perspectives of a novel virtual reality (VR) game-based rehabilitation intervention for people with stroke. METHOD. Six people with upper limb hemiplegia participated in a 6-week intervention that involved VR games. A series of eight progressively complex games was developed that required participants to navigate a submarine in a virtual ocean environment. Movement of the submarine was directed by forces applied to an arm interface by the affected limb. Outcome measures included assessments of arm function, questionnaires evaluating the intervention and a semi-structured interview concerning the participants' opinion of the intervention. RESULTS. All participants improved their performance on the games, although there were limited changes in clinical measures of arm function. All participants reported that they enjoyed the intervention with a wide range of overall perceptions of the experience of using VR. Three themes emerging from the interview data were: stretching myself, purpose and expectations of the intervention and future improvements. CONCLUSIONS. Participants found that taking part in this pilot study was enjoyable and challenging. Participants' feedback suggested that the games may be motivating and engaging for future users and have provided a basis for further development of the intervention.
Ali, Saad; Qandeel, Monther; Ramakrishna, Rishi; Yang, Carina W
2018-02-01
Fluoroscopy-guided lumbar puncture (FGLP) is a basic procedural component of radiology residency and neuroradiology fellowship training. Performance of the procedure with limited experience is associated with increased patient discomfort as well as increased radiation dose, puncture attempts, and complication rate. Simulation in health care is a developing field that has potential for enhancing procedural training. We demonstrate the design and utility of a virtual reality simulator for performing FGLP. An FGLP module was developed on an ImmersiveTouch platform, which digitally reproduces the procedural environment with a hologram-like projection. From computed tomography datasets of healthy adult spines, we constructed a 3-D model of the lumbar spine and overlying soft tissues. We assigned different physical characteristics to each tissue type, which the user can experience through haptic feedback while advancing a virtual spinal needle. Virtual fluoroscopy as well as 3-D images can be obtained for procedural planning and guidance. The number of puncture attempts, the distance to the target, the number of fluoroscopic shots, and the approximate radiation dose can be calculated. Preliminary data from users who participated in the simulation were obtained in a post-simulation survey. All users found the simulation to be a realistic replication of the anatomy and procedure and would recommend it to a colleague. On a scale of 1-5 (lowest to highest) rating the virtual simulator training overall, the mean score was 4.3 (range 3-5). We describe the design of a virtual reality simulator for performing FGLP and present the initial experience with this new technique. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Virtual Reality Visualization of Permafrost Dynamics Along a Transect Through Northern Alaska
NASA Astrophysics Data System (ADS)
Chappell, G. G.; Brody, B.; Webb, P.; Chord, J.; Romanovsky, V.; Tipenko, G.
2004-12-01
Understanding permafrost dynamics poses a significant challenge for researchers and planners. Our project uses nontraditional visualization tools to create a 3-D interactive virtual-reality environment in which permafrost dynamics can be explored and experimented with. We have incorporated a numerical soil temperature model by Gennadiy Tipenko and Vladimir Romanovsky of the Geophysical Institute at the University of Alaska Fairbanks into an animated tour in space and time in the virtual reality facility of the Arctic Region Supercomputing Center at the University of Alaska Fairbanks. The software is being written by undergraduate interns Patrick Webb and Jordanna Chord under the direction of Professors Chappell and Brody. When using our software, the user appears to be surrounded by a 3-D computer-generated model of the state of Alaska. The eastern portion of the state is displaced upward from the western portion. The data are represented on an animated vertical strip running between the two parts, as if eastern Alaska were raised up and the soil at the cut could be viewed. We use coloring to highlight significant properties and features of the soil: temperature, the active layer, etc. The user can view data from various parts of the state simply by walking to the appropriate location in the model, or by using a flying-style interface to cover longer distances. Using a control panel, the user can also alter the time, viewing the data for a particular date, or watching the data change with time: a high-speed movie in which long-term changes in permafrost are readily apparent. In the second phase of the project, we connect the visualization directly to the model, running in real time. We allow the user to manipulate the input data and get immediate visual feedback. For example, the user might specify the kind and placement of ground cover by "painting" snowpack, plant species, or fire damage, and see the effect on permafrost stability with no significant time lag.
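A minimal 2-D stand-in for the colored transect strip described above might map a soil-temperature field to a color scale and mark the 0 °C isotherm as a rough proxy for the base of the active layer. The temperature field below is synthetic and purely illustrative, not output from the Tipenko-Romanovsky model:

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic soil-temperature field: rows are depths (m),
    # columns are positions along the transect (km).
    depth = np.linspace(0.0, 20.0, 80)
    dist = np.linspace(0.0, 1000.0, 200)
    D, X = np.meshgrid(depth, dist, indexing="ij")
    temp = -6.0 + 5.0 * np.exp(-D / 5.0) + 4.0 * (X / 1000.0)   # degrees C

    fig, ax = plt.subplots(figsize=(8, 3))
    im = ax.pcolormesh(dist, depth, temp, cmap="coolwarm", shading="auto")
    ax.contour(dist, depth, temp, levels=[0.0], colors="k")      # 0 degC isotherm
    ax.invert_yaxis()                                            # depth increases downward
    ax.set_xlabel("Distance along transect (km)")
    ax.set_ylabel("Depth (m)")
    fig.colorbar(im, ax=ax, label="Soil temperature (degC)")
    plt.show()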
Using smartphone technology to deliver a virtual pedestrian environment: usability and validation.
Schwebel, David C; Severson, Joan; He, Yefei
2017-09-01
Various programs effectively teach children to cross streets more safely, but all are labor- and cost-intensive. Recent developments in mobile phone technology offer opportunity to deliver virtual reality pedestrian environments to mobile smartphone platforms. Such an environment may offer a cost- and labor-effective strategy to teach children to cross streets safely. This study evaluated usability, feasibility, and validity of a smartphone-based virtual pedestrian environment. A total of 68 adults completed 12 virtual crossings within each of two virtual pedestrian environments, one delivered by smartphone and the other a semi-immersive kiosk virtual environment. Participants completed self-report measures of perceived realism and simulator sickness experienced in each virtual environment, plus self-reported demographic and personality characteristics. All participants followed system instructions and used the smartphone-based virtual environment without difficulty. No significant simulator sickness was reported or observed. Users rated the smartphone virtual environment as highly realistic. Convergent validity was detected, with many aspects of pedestrian behavior in the smartphone-based virtual environment matching behavior in the kiosk virtual environment. Anticipated correlations between personality and kiosk virtual reality pedestrian behavior emerged for the smartphone-based system. A smartphone-based virtual environment can be usable and valid. Future research should develop and evaluate such a training system.
Direct manipulation of virtual objects
NASA Astrophysics Data System (ADS)
Nguyen, Long K.
Interacting with a Virtual Environment (VE) generally requires the user to correctly perceive the relative position and orientation of virtual objects. For applications requiring interaction in personal space, the user may also need to accurately judge the position of the virtual object relative to that of a real object, for example, a virtual button and the user's real hand. This is difficult since VEs generally only provide a subset of the cues experienced in the real world. Complicating matters further, VEs presented by currently available visual displays may be inaccurate or distorted due to technological limitations. Fundamental physiological and psychological aspects of vision as they pertain to the task of object manipulation were thoroughly reviewed. Other sensory modalities -- proprioception, haptics, and audition -- and their cross-interactions with each other and with vision are briefly discussed. Visual display technologies, the primary component of any VE, were canvassed and compared. Current applications and research were gathered and categorized by different VE types and object interaction techniques. While object interaction research abounds in the literature, pockets of research gaps remain. Direct, dexterous, manual interaction with virtual objects in Mixed Reality (MR), where the real, seen hand accurately and effectively interacts with virtual objects, has not yet been fully quantified. An experimental test bed was designed to provide the highest accuracy attainable for salient visual cues in personal space. Optical alignment and user calibration were carefully performed. The test bed accommodated the full continuum of VE types and sensory modalities for comprehensive comparison studies. Experimental designs included two sets, each measuring depth perception and object interaction. The first set addressed the extreme end points of the Reality-Virtuality (R-V) continuum -- Immersive Virtual Environment (IVE) and Reality Environment (RE). This validated, linked, and extended several previous research findings, using one common test bed and participant pool. The results provided a proven method and solid reference points for further research. The second set of experiments leveraged the first to explore the full R-V spectrum and included additional, relevant sensory modalities. It consisted of two full-factorial experiments providing for rich data and key insights into the effect of each type of environment and each modality on accuracy and timeliness of virtual object interaction. The empirical results clearly showed that mean depth perception error in personal space was less than four millimeters whether the stimuli presented were real, virtual, or mixed. Likewise, mean error for the simple task of pushing a button was less than four millimeters whether the button was real or virtual. Mean task completion time was less than one second. Key to the high accuracy and quick task performance time observed was the correct presentation of the visual cues, including occlusion, stereoscopy, accommodation, and convergence. With performance results already near optimal level with accurate visual cues presented, adding proprioception, audio, and haptic cues did not significantly improve performance. Recommendations for future research include enhancement of the visual display and further experiments with more complex tasks and additional control variables.
Transforming an educational virtual reality simulation into a work of fine art.
Panaiotis; Addison, Laura; Vergara, Víctor M; Hakamata, Takeshi; Alverson, Dale C; Saiki, Stanley M; Caudell, Thomas Preston
2008-01-01
This paper outlines user interface and interaction issues, technical considerations, and problems encountered in transforming an educational VR simulation of a reified kidney nephron into an interactive artwork appropriate for a fine arts museum.
ERIC Educational Resources Information Center
Auld, Lawrence W. S.; Pantelidis, Veronica S.
1994-01-01
Describes the Virtual Reality and Education Lab (VREL) established at East Carolina University to study the implications of virtual reality for elementary and secondary education. Highlights include virtual reality software evaluation; hardware evaluation; computer-based curriculum objectives which could use virtual reality; and keeping current…
Ranky, Richard G; Sivak, Mark L; Lewis, Jeffrey A; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos
2014-06-05
Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirements for efficacious cycling, specifically recruitment of both extremities and exercising at a high intensity. In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called the Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handlebars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handlebars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the riders' lower extremities. The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders.
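The force-sensor bench calibration mentioned above typically amounts to fitting a linear map from raw sensor readings to applied load; the sketch below shows such a least-squares fit with made-up numbers, and is a generic illustration rather than the VRACK calibration procedure itself:

    import numpy as np

    def fit_linear_calibration(raw_counts, applied_force_n):
        """Least-squares fit of force = gain * counts + offset from bench
        measurements taken under known applied loads."""
        A = np.vstack([raw_counts, np.ones_like(raw_counts)]).T
        gain, offset = np.linalg.lstsq(A, applied_force_n, rcond=None)[0]
        return gain, offset

    # Hypothetical bench data: sensor counts recorded under known loads (N).
    counts = np.array([102.0, 257.0, 411.0, 560.0, 718.0])
    force = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
    gain, offset = fit_linear_calibration(counts, force)
    print(gain, offset)               # calibration coefficients
    print(gain * 350.0 + offset)      # force estimate for a new raw reading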
Wearable computer for mobile augmented-reality-based controlling of an intelligent robot
NASA Astrophysics Data System (ADS)
Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino
2000-10-01
An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans will be needed. This requires means for controlling the robot from somewhere else, i.e. teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects into the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated the existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal- the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustained in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.
Measuring the Usability of Augmented Reality e-Learning Systems: A User-Centered Evaluation Approach
NASA Astrophysics Data System (ADS)
Pribeanu, Costin; Balog, Alexandru; Iordache, Dragoş Daniel
The development of Augmented Reality (AR) systems is creating new challenges and opportunities for the designers of e-learning systems. The mix of real and virtual requires appropriate interaction techniques that have to be evaluated with users in order to avoid usability problems. Formative usability aims at finding usability problems as early as possible in the development life cycle and is suitable to support the development of such novel interactive systems. This work presents an approach to the user-centered usability evaluation of an e-learning scenario for Biology developed on an Augmented Reality educational platform. The evaluation was carried out during and after a summer school held within the ARiSE research project. The basic idea was to perform usability evaluation twice. In this respect, we conducted user testing with a small number of students during the summer school in order to get fast feedback from users having good knowledge of Biology. Then, we repeated the user testing in different conditions and with a relatively larger number of representative users. In this paper we describe both experiments and compare the usability evaluation results.
Poeschl, Sandra; Doering, Nicola
2015-01-01
Virtual reality exposure therapy (VRET) applications use high levels of fidelity in order to produce high levels of presence and thereby elicit an emotional response from the user (such as fear for phobia treatment). The state of research shows mixed results for the correlation between anxiety and presence in virtual reality exposure, with differing results depending on specific anxiety disorders. A positive correlation between anxiety and presence for social anxiety disorder has not been demonstrated to date. One reason might be that the plausibility of the simulation, namely the inclusion of key triggers for social anxiety (for example, verbal and non-verbal behavior of virtual agents that reflects potentially negative human evaluation), is not captured by current presence questionnaires. A German scale for measuring co-presence and social presence in virtual reality (VR) fear of public speaking scenarios was developed based on a translation and adaptation of existing co-presence and social presence questionnaires. A sample of N = 151 students rated co-presence and social presence after using a fear of public speaking application. Four correlated factors were derived by item analysis and principal axis factor analysis (Promax rotation), representing the presenter's reaction to virtual agents, the reactions of the virtual agents as perceived by the presenter, the impression of interaction possibilities, and the (co-)presence of other people in the virtual environment. The scale developed can be used as a starting point for future research and test construction for VR applications with a social context.
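For readers unfamiliar with the reported analysis, the sketch below shows how a four-factor principal axis factoring with Promax rotation might be run in Python using the third-party factor_analyzer package. The package choice, the 20-item placeholder scale and the random responses are assumptions for illustration only; they are not the authors' data or software.

```python
# Illustrative sketch of the reported analysis pipeline (principal axis
# factoring with Promax rotation); the authors' actual software and item
# set are not specified here, and the random data below is a placeholder.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package, assumed available

rng = np.random.default_rng(0)
# Placeholder: N = 151 respondents answering a hypothetical 20-item scale.
items = pd.DataFrame(rng.integers(1, 6, size=(151, 20)),
                     columns=[f"item_{i:02d}" for i in range(1, 21)])

fa = FactorAnalyzer(n_factors=4, method="principal", rotation="promax")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=[f"F{k}" for k in range(1, 5)])
print(loadings.round(2))            # pattern matrix after oblique rotation
print(fa.get_factor_variance()[1])  # proportion of variance per factor
```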
Corrêa, Ana Grasielle Dionísio; de Assis, Gilda Aparecida; do Nascimento, Marilena; de Deus Lopes, Roseli
2017-04-01
Augmented Reality musical software (GenVirtual) is a technology that primarily allows users to develop music activities for rehabilitation. This study aimed to analyse the perceptions of health care professionals regarding the clinical utility of GenVirtual. A second objective was to identify improvements to the GenVirtual software and similar technologies. Music therapists, occupational therapists, physiotherapists and speech and language therapists who assist people with physical and cognitive disabilities were enrolled in three focus groups. The quantitative and qualitative data were analysed through inductive thematic analysis. Three main themes were identified: the use of GenVirtual in health care areas; opportunities for realistic application of GenVirtual; and limitations in the use of GenVirtual. The registration units identified were: motor stimulation, cognitive stimulation, verbal learning, recreation activity, musicality, accessibility, motivation, sonic accuracy, interference of lighting, poor sound, children and adults. This research suggested that GenVirtual is a complementary tool to conventional clinical practice and has great potential for the motor and cognitive rehabilitation of children and adults. Implications for Rehabilitation: Gaining health professionals' perceptions of the Augmented Reality musical game (GenVirtual) gives valuable information as to the clinical utility of the software. GenVirtual was perceived as a tool that could enhance the motor and cognitive rehabilitation process. GenVirtual was viewed as a tool that could enhance clinical practice and communication among various agencies, but it was suggested that it should be used with caution to avoid confusion and replacement of important services.
Fisher, J Brian; Porter, Susan M
2002-01-01
This paper describes an application of a display approach which uses chromakey techniques to composite real and computer-generated images allowing a user to see his hands and medical instruments collocated with the display of virtual objects during a medical training simulation. Haptic feedback is provided through the use of a PHANTOM force feedback device in addition to tactile augmentation, which allows the user to touch virtual objects by introducing corresponding real objects in the workspace. A simplified catheter introducer insertion simulation was developed to demonstrate the capabilities of this approach.
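The chroma-key compositing step described above can be illustrated with a minimal sketch: pixels in the camera frame matching a key colour are replaced by the rendered virtual scene, so the real hand and instruments remain visible in front of virtual objects. This is a generic OpenCV example under assumed colour bounds and synthetic images, not the authors' display pipeline.

```python
# Minimal chroma-key compositing sketch (not the authors' pipeline): pixels in
# the camera frame that match the key colour are replaced by the rendered
# virtual scene, so the real hand appears in front of virtual objects.
import numpy as np
import cv2  # OpenCV, assumed available

def composite(camera_bgr: np.ndarray, virtual_bgr: np.ndarray) -> np.ndarray:
    hsv = cv2.cvtColor(camera_bgr, cv2.COLOR_BGR2HSV)
    # Hue/saturation/value bounds for a green backdrop; tune for the real setup.
    key_mask = cv2.inRange(hsv, (40, 60, 60), (80, 255, 255))
    key_mask = cv2.medianBlur(key_mask, 5)          # suppress speckle at mask edges
    foreground = cv2.bitwise_and(camera_bgr, camera_bgr, mask=cv2.bitwise_not(key_mask))
    background = cv2.bitwise_and(virtual_bgr, virtual_bgr, mask=key_mask)
    return cv2.add(foreground, background)

if __name__ == "__main__":
    cam = np.full((480, 640, 3), (60, 200, 60), dtype=np.uint8)   # synthetic green frame
    cam[200:280, 280:360] = (180, 140, 120)                       # stand-in for a real hand
    virt = np.zeros((480, 640, 3), dtype=np.uint8)
    cv2.circle(virt, (320, 240), 100, (0, 0, 255), -1)            # a virtual object
    cv2.imwrite("composite.png", composite(cam, virt))
```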
Graafland, Maurits; Bok, Kiki; Schreuder, Henk W R; Schijven, Marlies P
2014-06-01
Untrained laparoscopic camera assistants in minimally invasive surgery (MIS) may cause suboptimal view of the operating field, thereby increasing risk for errors. Camera navigation is often performed by the least experienced member of the operating team, such as inexperienced surgical residents, operating room nurses, and medical students. The operating room nurses and medical students are currently not included as key user groups in structured laparoscopic training programs. A new virtual reality laparoscopic camera navigation (LCN) module was specifically developed for these key user groups. This multicenter prospective cohort study assesses face validity and construct validity of the LCN module on the Simendo virtual reality simulator. Face validity was assessed through a questionnaire on resemblance to reality and perceived usability of the instrument among experts and trainees. Construct validity was assessed by comparing scores of groups with different levels of experience on outcome parameters of speed and movement proficiency. The results obtained show uniform and positive evaluation of the LCN module among expert users and trainees, signifying face validity. Experts and intermediate experience groups performed significantly better in task time and camera stability during three repetitions, compared to the less experienced user groups (P < .007). Comparison of learning curves showed significant improvement of proficiency in time and camera stability for all groups during three repetitions (P < .007). The results of this study show face validity and construct validity of the LCN module. The module is suitable for use in training curricula for operating room nurses and novice surgical trainees, aimed at improving team performance in minimally invasive surgery. © The Author(s) 2013.
See-through 3D technology for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Lee, Seungjae; Li, Gang; Jang, Changwon; Hong, Jong-Young
2017-06-01
Augmented reality has recently been attracting a lot of attention as one of the most spotlighted next-generation technologies. To move toward the realization of ideal augmented reality, 3D virtual information must be integrated into the real world. This integration should not be noticed by users, blurring the boundary between the virtual and real worlds. Thus, the ultimate device for augmented reality can reconstruct and superimpose 3D virtual information on the real world so that they are not distinguishable, which is referred to as see-through 3D technology. Here, we introduce our previous research combining see-through displays and 3D technologies using emerging optical combiners: holographic optical elements and index-matched optical elements. Holographic optical elements are volume gratings that have angular and wavelength selectivity. Index-matched optical elements are partially reflective elements using a compensation element for index matching. Using these optical combiners, we could implement see-through 3D displays based on typical methodologies including integral imaging, digital holographic displays, multi-layer displays, and retinal projection. Some of these methods are expected to be optimized and customized for head-mounted or wearable displays. We conclude with demonstration and analysis of fundamental research on head-mounted see-through 3D displays.
Cheng, Yufang; Huang, Ruowen
2012-01-01
The focus of this study is using a data glove to practice joint attention skills in a virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe setting for people with PDD; in particular, when they make errors during practice, there are no painful or dangerous consequences to deal with. Joint attention is a critical skill among the disorder characteristics of children with PDD, and its absence is a deficit that frequently affects their social relationships in daily life. Therefore, this study designed the Joint Attention Skills Learning (JASL) system with a data glove tool to help children with PDD practice joint attention behavior skills. The JASL specifically focuses on the skills of pointing, showing, sharing things and behavioral interaction with other children with PDD. The system is designed as a playroom scene and presented from the first-person perspective of the user. The functions include pointing and showing, moving virtual objects, 3D animation, text, speech sounds, and feedback. The study employed a single-subject multiple-probe design across subjects, with visual inspection analysis. The experimental phase took three months to complete. Surprisingly, the experimental results reveal that the participants extended the improved joint attention skills to their daily lives after using the JASL system. The significant potential of this particular treatment of joint attention for each participant is discussed in detail in this paper. Copyright © 2012 Elsevier Ltd. All rights reserved.
Chuang, Shih-Chyueh; Hwang, Fu-Kwun; Tsai, Chin-Chung
2008-04-01
The purpose of this study was to investigate the perceptions of Internet users of a physics virtual laboratory, Demolab, in Taiwan. Learners' perceptions of Internet-based learning environments were explored and the role of gender was examined by using preferred and actual forms of a revised Constructivist Internet-based Learning Environment Survey (CILES). The students expressed a clear gap between ideal and reality, and they showed higher preferences for many features of constructivist Internet-based learning environments than for features they had actually learned in Demolab. The results further suggested that male users prefer to be involved in the process of discussion and to show critical judgments. In addition, male users indicated they enjoyed the process of negotiation and discussion with others and were able to engage in reflective thoughts while learning in Demolab. In light of these findings, male users seemed to demonstrate better adaptability to the constructivist Internet-based learning approach than female users did. Although this study indicated certain differences between males and females in their responses to Internet-based learning environments, they also shared numerous similarities. A well-established constructivist Internet-based learning environment may encourage more female learners to participate in the science community.
Educational Systems Design Implications of Electronic Publishing.
ERIC Educational Resources Information Center
Romiszowski, Alexander J.
1994-01-01
Discussion of electronic publishing focuses on the four main purposes of media in general: communication, entertainment, motivation, and education. Highlights include electronic journals and books; hypertext; user control; computer graphics and animation; electronic games; virtual reality; multimedia; electronic performance support;…
Computer Based Training: Field Deployable Trainer and Shared Virtual Reality
NASA Technical Reports Server (NTRS)
Mullen, Terence J.
1997-01-01
Astronaut training has traditionally been conducted at specific sites with specialized facilities. Because of its size and nature the training equipment is generally not portable. Efforts are now under way to develop training tools that can be taken to remote locations, including into orbit. Two of these efforts are the Field Deployable Trainer and Shared Virtual Reality projects. Field Deployable Trainer: NASA has used the recent shuttle mission by astronaut Shannon Lucid to the Russian space station, Mir, as an opportunity to develop and test a prototype of an on-orbit computer training system. A laptop computer with a customized user interface, a set of specially prepared CD's, and video tapes were taken to the Mir by Ms. Lucid. Based upon the feedback following the launch of the Lucid flight, our team prepared materials for the next Mir visitor. Astronaut John Blaha will fly on NASA/MIR Long Duration Mission 3, set to launch in mid September. He will take with him a customized hard disk drive and a package of compact disks containing training videos, references and maps. The FDT team continues to explore and develop new and innovative ways to conduct offsite astronaut training using personal computers. Shared Virtual Reality Training: NASA's Space Flight Training Division has been investigating the use of virtual reality environments for astronaut training. Recent efforts have focused on activities requiring interaction by two or more people, called shared VR. Dr. Bowen Loftin, from the University of Houston, directs a virtual reality laboratory that conducts much of the NASA sponsored research. I worked on a project involving the development of a virtual environment that can be used to train astronauts and others to operate a science unit called a Biological Technology Facility (BTF). Facilities like this will be used to house and control microgravity experiments on the space station. It is hoped that astronauts and instructors will ultimately be able to share common virtual environments and, using telephone links, conduct interactive training from separate locations.
Multiaccommodative stimuli in VR systems: problems & solutions.
Marran, L; Schor, C
1997-09-01
Virtual reality environments can introduce multiple and sometimes conflicting accommodative stimuli. For instance, with the high-powered lenses commonly used in head-mounted displays, small discrepancies in screen lens placement, caused by manufacturer error or user adjustment focus error, can change the focal depths of the image by a couple of diopters. This can introduce a binocular accommodative stimulus or, if the displacement between the two screens is unequal, an unequal (anisometropic) accommodative stimulus for the two eyes. Systems that allow simultaneous viewing of virtual and real images can also introduce a conflict in accommodative stimuli: When real and virtual images are at different focal planes, both cannot be in focus at the same time, though they may appear to be in similar locations in space. In this paper four unique designs are described that minimize the range of accommodative stimuli and maximize the visual system's ability to cope efficiently with the focus conflicts that remain: pinhole optics, monocular lens addition combined with aniso-accommodation, chromatic bifocal, and bifocal lens system. The advantages and disadvantages of each design are described and recommendation for design choice is given after consideration of the end use of the virtual reality system (e.g., low or high end, entertainment, technical, or medical use). The appropriate design modifications should allow greater user comfort and better performance.
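The "couple of diopters" figure quoted above follows from the thin-lens relation: for a screen displaced by a small distance delta from the eyepiece focal plane, the accommodative demand shifts by roughly delta times the lens power squared. The sketch below works through this first-order approximation with illustrative numbers; it is not the authors' derivation.

```python
# First-order sketch (not the authors' derivation): how a small error in
# screen placement behind an HMD eyepiece shifts the accommodative demand.
# For a screen displaced by delta from the lens focal plane, the image
# vergence changes by roughly delta * P**2 (thin-lens, small-displacement
# approximation). Numbers below are illustrative, not from the paper.

def vergence_shift_diopters(lens_power_d: float, displacement_m: float) -> float:
    """Approximate change in accommodative demand (diopters)."""
    return displacement_m * lens_power_d ** 2

if __name__ == "__main__":
    lens_power = 20.0          # a 20 D eyepiece (50 mm focal length)
    for delta_mm in (1.0, 2.0, 5.0):
        shift = vergence_shift_diopters(lens_power, delta_mm / 1000.0)
        print(f"{delta_mm:.0f} mm screen displacement -> ~{shift:.1f} D change")
```

With a 20 D eyepiece, a 5 mm placement error already amounts to roughly 2 D of accommodative shift, consistent with the magnitude described in the abstract.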
DWTP: a basis for networked VR on the Internet
NASA Astrophysics Data System (ADS)
Broll, Wolfgang; Schick, Daniel
1998-04-01
Shared virtual worlds are one of today's major research topics. While limited to particular application areas and high-speed networks in the past, they are becoming available to an increasingly large number of users. One reason for this development was the introduction of VRML (the Virtual Reality Modeling Language), which has been established as a standard for the exchange of 3D worlds on the Internet. Although a number of prototype systems have been developed to realize shared multi-user worlds based on VRML, no suitable network protocol to support the demands of such environments has yet been established. In this paper we will introduce our approach to a network protocol for shared virtual environments: DWTP--the Distributed Worlds Transfer and communication Protocol. We will show how DWTP meets the demands of shared virtual environments on the Internet. We will further present SmallView, our prototype of a distributed multi-user VR system, to show how DWTP can be used to realize shared worlds.
Collaborative voxel-based surgical virtual environments.
Acosta, Eric; Muniz, Gilbert; Armonda, Rocco; Bowyer, Mark; Liu, Alan
2008-01-01
Virtual Reality-based surgical simulators can utilize Collaborative Virtual Environments (C-VEs) to provide team-based training. To support real-time interactions, C-VEs are typically replicated on each user's local computer and a synchronization method helps keep all local copies consistent. This approach does not work well for voxel-based C-VEs since large and frequent volumetric updates make synchronization difficult. This paper describes a method that allows multiple users to interact within a voxel-based C-VE for a craniotomy simulator being developed. Our C-VE method requires smaller update sizes and provides faster synchronization update rates than volumetric-based methods. Additionally, we address network bandwidth/latency issues to simulate networked haptic and bone drilling tool interactions with a voxel-based skull C-VE.
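The general idea behind sending smaller updates for a voxel-based C-VE can be sketched as follows: after a drilling interaction, only the changed voxels are packed as (index, value) records and applied to the remote copy, rather than resending the whole volume. This is an illustrative Python example of the technique, not the paper's actual synchronization protocol or message format.

```python
# Illustrative sketch of the general idea behind voxel-delta synchronization
# (not the paper's protocol): after a drilling interaction, only the voxels
# that changed are packed and sent, rather than the whole volume.
import struct
import numpy as np

def pack_voxel_updates(before: np.ndarray, after: np.ndarray) -> bytes:
    """Encode changed voxels as (flat_index: uint32, new_value: uint8) records."""
    changed = np.flatnonzero(before != after)
    payload = struct.pack("<I", changed.size)
    for idx in changed:
        payload += struct.pack("<IB", int(idx), int(after.flat[idx]))
    return payload

def apply_voxel_updates(volume: np.ndarray, payload: bytes) -> None:
    """Apply an update message to a local copy of the volume in place."""
    (count,) = struct.unpack_from("<I", payload, 0)
    offset = 4
    for _ in range(count):
        idx, value = struct.unpack_from("<IB", payload, offset)
        volume.flat[idx] = value
        offset += 5

if __name__ == "__main__":
    skull = np.ones((64, 64, 64), dtype=np.uint8)           # toy skull volume
    drilled = skull.copy()
    drilled[30:34, 30:34, 30:34] = 0                         # simulated drill bite
    msg = pack_voxel_updates(skull, drilled)
    print(f"update message: {len(msg)} bytes vs full volume: {skull.nbytes} bytes")
    remote = skull.copy()
    apply_voxel_updates(remote, msg)
    assert np.array_equal(remote, drilled)
```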
Virtual reality for health care: a survey.
Moline, J
1997-01-01
This report surveys the state of the art in applications of virtual environments and related technologies for health care. Applications of these technologies are being developed for health care in the following areas: surgical procedures (remote surgery or telepresence, augmented or enhanced surgery, and planning and simulation of procedures before surgery); medical therapy; preventive medicine and patient education; medical education and training; visualization of massive medical databases; skill enhancement and rehabilitation; and architectural design for health-care facilities. To date, such applications have improved the quality of health care, and in the future they will result in substantial cost savings. Tools that respond to the needs of present virtual environment systems are being refined or developed. However, additional large-scale research is necessary in the following areas: user studies, use of robots for telepresence procedures, enhanced system reality, and improved system functionality.
[Use of virtual reality in forensic psychiatry. A new paradigm?].
Fromberger, P; Jordan, K; Müller, J L
2014-03-01
For more than 20 years virtual realities (VR) have been successfully used in the assessment and treatment of psychiatric disorders. The most important advantages of VR are the high ecological validity of virtual environments, the entire controllability of virtual stimuli in the virtual environment and the capability to induce the sensation of being in the virtual environment instead of the physical environment. VRs provide the opportunity to face the user with stimuli and situations which are not available or too risky in reality. Despite these advantages VR-based applications have not yet been applied in forensic psychiatry. On the basis of an overview of the recent state-of-the-art in VR-based applications in general psychiatry, the article demonstrates the advantages and possibilities of VR-based applications in forensic psychiatry. Up to now only preliminary studies regarding the VR-based assessment of pedophilic interests exist. These studies demonstrate the potential of ecologically valid VR-based applications for the assessment of forensically relevant disorders. One of the most important advantages is the possibility of VR to assess the behavior of forensic inpatients in crime-related situations without endangering others. This provides completely new possibilities not only regarding the assessment but also for the treatment of forensic inpatients. Before utilizing these possibilities in the clinical practice exhaustive research and development will be necessary. Given the high potential of VR-based applications, this effort would be worth it.
Visualizing the process of interaction in a 3D environment
NASA Astrophysics Data System (ADS)
Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh
2007-03-01
As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze these data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we attempt to show some methods by which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.
NASA's Hybrid Reality Lab: One Giant Leap for Full Dive
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2017-01-01
This presentation demonstrates how NASA is using consumer VR headsets, game engine technology and NVIDIA's GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; the integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.
Glegg, Stephanie M N; Holsti, Liisa; Stanton, Sue; Hanna, Steven; Velikonja, Diana; Ansley, Barbara; Sartor, Denise; Brum, Christine
2017-04-01
To evaluate the impact of knowledge translation (KT) on factors influencing virtual reality (VR) adoption and to identify support needs of therapists. It was hypothesized that the intervention would be associated with improvements in therapists' perceived ease of use and self-efficacy, and an associated increase in intentions to use VR. Single group mixed-methods pre-test-post-test evaluation of a convenience sample of physical, occupational and rehabilitation therapists (n=37) from two brain injury rehabilitation centres. The ADOPT-VR was administered pre/post a KT intervention consisting of interactive education, a clinical manual, and technical and clinical support. Increases in perceived ease of use (p=0.000) and self-efficacy (p=0.001), but not behavioural intention to use VR (p=0.158), were found following KT, along with decreases in the frequency of perceived barriers. Post-test changes in the frequency and nature of perceived facilitators and barriers were evident, with increased emphasis on peer influence, organisational-level supports and client factors. Additional support needs were related to clinical reasoning, treatment programme development, technology selection and troubleshooting. KT strategies hold potential for targeting therapists' perceptions of low self-efficacy and ease of use of this technology. Changes in perceived barriers, facilitators and support needs at post-test demonstrated support for repeated evaluation and multi-phased training initiatives to address therapists' needs over time. Implications for Rehabilitation: Therapists' learning and support needs in integrating virtual reality extend beyond technical proficiency to include clinical decision-making and application competencies spanning the entire rehabilitation process. Phased, multi-faceted strategies may be valuable in addressing therapists' changing needs as they progress from novice to experienced virtual reality users. The ADOPT-VR is a sensitive measure to re-evaluate the personal, social, environmental, technology-specific and system-level factors influencing virtual reality adoption over time.
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
A virtual museum displays and simulates the functions of a real museum on the Internet in the form of three-dimensional (3D) virtual reality through interactive programs. Based on the Virtual Reality Modeling Language, building a virtual museum that interacts effectively with the offline museum depends on making full use of 3D panorama, virtual reality and augmented reality techniques, and on innovatively applying dynamic environment modeling, real-time 3D graphics generation, system integration and other key virtual reality techniques in the overall design of the virtual museum. The 3D panorama technique, also known as panoramic photography or virtual reality, is based on static images of reality. Virtual reality technique is a computer simulation system that creates, and lets users experience, an interactive 3D dynamic visual world. Augmented reality, also known as mixed reality, simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience in reality. These technologies make the virtual museum possible. It will not only bring better experience and convenience to the public, but also help improve the influence and cultural functions of the real museum.
Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders
Chicchi Giglioli, Irene Alice; Pedroli, Elisa
2015-01-01
Augmented Reality is a new technological system that allows virtual contents to be introduced into the real world so that they run in the same representation and, in real time, enhance the user's sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to the physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has provided preliminary evidence of being a useful tool due to its adaptability to patient needs and therapeutic purposes and its interactivity. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that make Augmented Reality a new technique useful for psychology. PMID:26339283
Hybrid diffractive-refractive optical system design of head-mounted display for augmented reality
NASA Astrophysics Data System (ADS)
Zhang, Huijuan
2005-02-01
An optical see-through head-mounted display for augmented reality is designed in this paper. Considering factors such as optical performance, the energy utilization ratios of the real and virtual worlds, and the comfort of the user wearing the device, an optical see-through structure is adopted. With its particular negative dispersion and its ability to realize arbitrary phase modulation, a diffractive surface helps the optical system reduce weight and simplify its structure, so a diffractive surface is introduced into our optical system. An optical system with a 25 mm eye relief, a 12 mm exit pupil and a 20° (H) x 15.4° (V) field of view is designed. The energy utilization ratios of the real world and the virtual world are 1/4 and 1/2, respectively. The angular resolution of the display is 0.27 mrad, which is below the resolution limit of the human eye. The diameter of the system is less than 46 mm, and the design is binocular. This diffractive-refractive optical system for a see-through head-mounted display not only satisfies user-factor demands in its structure, but also offers high resolution with very small chromatic aberration and distortion, satisfying the needs of augmented reality. Finally, the parameters of the diffractive surface are discussed.
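A quick consistency check on the figures quoted above: dividing the field of view (in radians) by the 0.27 mrad angular resolution gives the pixel count the design implies along each axis. The sketch below performs this back-of-the-envelope arithmetic; it is illustrative only and not taken from the paper.

```python
# Back-of-the-envelope check (illustrative, not from the paper): the pixel
# count implied by the quoted field of view and 0.27 mrad angular resolution.
import math

def implied_pixels(fov_deg: float, angular_resolution_mrad: float) -> float:
    """Pixels needed along one axis for a given field of view and resolution."""
    return math.radians(fov_deg) / (angular_resolution_mrad / 1000.0)

if __name__ == "__main__":
    for axis, fov in (("horizontal", 20.0), ("vertical", 15.4)):
        print(f"{axis}: {fov:.1f} deg / 0.27 mrad -> ~{implied_pixels(fov, 0.27):.0f} pixels")
```

This gives roughly 1290 by 1000 pixels, i.e. a microdisplay in that class would be consistent with the quoted numbers.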
ERIC Educational Resources Information Center
Pantelidis, Veronica S.
2009-01-01
Many studies have been conducted on the use of virtual reality in education and training. This article lists examples of such research. Reasons to use virtual reality are discussed. Advantages and disadvantages of using virtual reality are presented, as well as suggestions on when to use and when not to use virtual reality. A model that can be…
Shono, Naoyuki; Kin, Taichi; Nomura, Seiji; Miyawaki, Satoru; Saito, Toki; Imai, Hideaki; Nakatomi, Hirofumi; Oyama, Hiroshi; Saito, Nobuhito
2018-05-01
A virtual reality simulator for aneurysmal clipping surgery is an attractive research target for neurosurgeons. Brain deformation is one of the most important functionalities necessary for an accurate clipping simulator and is vastly affected by the status of the supporting tissue, such as the arachnoid membrane. However, no virtual reality simulator implementing the supporting tissue of the brain has yet been developed. To develop a virtual reality clipping simulator possessing interactive brain deforming capability closely dependent on arachnoid dissection and apply it to clinical cases. Three-dimensional computer graphics models of cerebral tissue and surrounding structures were extracted from medical images. We developed a new method for modifiable cerebral tissue complex deformation by incorporating a nonmedical image-derived virtual arachnoid/trabecula in a process called multitissue integrated interactive deformation (MTIID). MTIID made it possible for cerebral tissue complexes to selectively deform at the site of dissection. Simulations for 8 cases of actual clipping surgery were performed before surgery and evaluated for their usefulness in surgical approach planning. Preoperatively, each operative field was precisely reproduced and visualized with the virtual brain retraction defined by users. The clear visualization of the optimal approach to treating the aneurysm via an appropriate arachnoid incision was possible with MTIID. A virtual clipping simulator mainly focusing on supporting tissues and less on physical properties seemed to be useful in the surgical simulation of cerebral aneurysm clipping. To our knowledge, this article is the first to report brain deformation based on supporting tissues.
Suitability of virtual prototypes to support human factors/ergonomics evaluation during the design.
Aromaa, Susanna; Väänänen, Kaisa
2016-09-01
In recent years, the use of virtual prototyping has increased in product development processes, especially in the assessment of complex systems targeted at end-users. The purpose of this study was to evaluate the suitability of virtual prototyping to support human factors/ergonomics evaluation (HFE) during the design phase. Two different virtual prototypes were used: augmented reality (AR) and virtual environment (VE) prototypes of a maintenance platform of a rock crushing machine. Nineteen designers and other stakeholders were asked to assess the suitability of the prototype for HFE evaluation. Results indicate that the system model characteristics and user interface affect the experienced suitability. The VE system was valued as being more suitable to support the assessment of visibility, reach, and the use of tools than the AR system. The findings of this study can be used as guidance for implementing virtual prototypes in the product development process. Copyright © 2016 Elsevier Ltd. All rights reserved.
Tuning self-motion perception in virtual reality with visual illusions.
Bruder, Gerd; Steinicke, Frank; Wieland, Phil; Lappe, Markus
2012-07-01
Motion perception in immersive virtual environments significantly differs from the real world. For example, previous work has shown that users tend to underestimate travel distances in virtual environments (VEs). As a solution to this problem, researchers proposed to scale the mapped virtual camera motion relative to the tracked real-world movement of a user until real and virtual motion are perceived as equal, i.e., real-world movements could be mapped with a larger gain to the VE in order to compensate for the underestimation. However, introducing discrepancies between real and virtual motion can become a problem, in particular, due to misalignments of both worlds and distorted space cognition. In this paper, we describe a different approach that introduces apparent self-motion illusions by manipulating optic flow fields during movements in VEs. These manipulations can affect self-motion perception in VEs, but omit a quantitative discrepancy between real and virtual motions. In particular, we consider to which regions of the virtual view these apparent self-motion illusions can be applied, i.e., the ground plane or peripheral vision. Therefore, we introduce four illusions and show in experiments that optic flow manipulation can significantly affect users' self-motion judgments. Furthermore, we show that with such manipulations of optic flow fields the underestimation of travel distances can be compensated.
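The gain-based mapping that this work contrasts with can be sketched in a few lines: each frame, the tracked real-world head displacement is multiplied by a translation gain before being applied to the virtual camera, so virtual travel exceeds real travel. The example below is a minimal illustration with assumed positions and gain, not the authors' implementation.

```python
# Minimal sketch of a translation gain (the "scale the mapped camera motion"
# idea described above), not the authors' implementation: each frame, the
# tracked head displacement is multiplied by a gain before being applied to
# the virtual camera, so virtual travel exceeds real travel.
import numpy as np

def apply_translation_gain(prev_real: np.ndarray, curr_real: np.ndarray,
                           camera_pos: np.ndarray, gain: float) -> np.ndarray:
    """Advance the virtual camera by the gained real-world displacement."""
    return camera_pos + gain * (curr_real - prev_real)

if __name__ == "__main__":
    gain = 1.4                                   # example: virtual motion 40% larger
    real_path = np.array([[0.0, 0.0, 0.0],
                          [0.5, 0.0, 0.0],
                          [1.0, 0.0, 0.1]])      # tracked head positions (metres)
    cam = real_path[0].copy()
    for prev, curr in zip(real_path, real_path[1:]):
        cam = apply_translation_gain(prev, curr, cam, gain)
    print("real displacement:   ", real_path[-1] - real_path[0])
    print("virtual displacement:", cam - real_path[0])
```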
Developing an Augmented Reality Environment for Earth Science Education
NASA Astrophysics Data System (ADS)
Pratt, M. J.; Skemer, P. A.; Arvidson, R. E.
2017-12-01
The emerging field of augmented reality (AR) provides new and exciting ways to explore geologic phenomena for research and education. The primary advantage of AR is that it allows users to physically explore complex three-dimensional structures that were previously inaccessible, for example a remote geologic outcrop or a mineral structure at the atomic scale. It is used, for example, with OnSight software during tactical operations to plan the Mars Curiosity rover's traverses by providing virtual views to walk through terrain and the rover at true scales. This mode of physical exploration allows users more freedom to investigate and understand the 3D structure than is possible on a flat computer screen, or within a static PowerPoint presentation during a classroom lecture. The Microsoft HoloLens headset provides the most-advanced, mobile AR platform currently available to developers. The Fossett Laboratory for Virtual Planetary Exploration at Washington University in St. Louis has applied this technology, coupled with photogrammetric software and the Unity 3D gaming engine, to develop photorealistic environments of 3D geologic outcrops from around the world. The untethered HoloLens provides an ideal platform for a classroom setting as it allows for shared experiences of the holograms of interest, projecting them in the same location for all users to explore. Furthermore, the HoloLens allows for face-to-face communication during use that is important in teaching, a feature that virtual reality does not allow. Our development of an AR application includes the design of an online database of photogrammetric outcrop models curated for the current limitations of AR technology. This database will be accessible to both those wishing to submit models, and is free to those wishing to use the application for teaching, outreach or research purposes.
Evaluating the use of augmented reality to support undergraduate student learning in geomorphology
NASA Astrophysics Data System (ADS)
Ockelford, A.; Bullard, J. E.; Burton, E.; Hackney, C. R.
2016-12-01
Augmented Reality (AR) supports the understanding of complex phenomena by providing unique visual and interactive experiences that combine real and virtual information and help communicate abstract problems to learners. With AR, designers can superimpose virtual graphics over real objects, allowing users to interact with digital content through physical manipulation. One of the most significant pedagogic features of AR is that it provides an essentially student-centred and flexible space in which students can learn. By actively engaging participants using a design-thinking approach, this technology has the potential to provide a more productive and engaging learning environment than real or virtual learning environments alone. AR is increasingly being used in support of undergraduate learning and public engagement activities across engineering, medical and humanities disciplines but it is not widely used across the geosciences disciplines despite the obvious applicability. This paper presents preliminary results from a multi-institutional project which seeks to evaluate the benefits and challenges of using an augmented reality sandbox to support undergraduate learning in geomorphology. The sandbox enables users to create and visualise topography. As the sand is sculpted, contours are projected onto the miniature landscape. By hovering a hand over the box, users can make it 'rain' over the landscape and the water 'flows' down into rivers and valleys. At undergraduate level, the sandbox is an ideal focus for problem-solving exercises, for example exploring how geomorphology controls hydrological processes, how such processes can be altered and the subsequent impacts of the changes for environmental risk. It is particularly valuable for students who favour a visual or kinesthetic learning style. Results presented in this paper discuss how the sandbox provides a complex interactive environment that encourages communication, collaboration and co-design.
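The projected water flow described above ultimately rests on routing water downhill over the sensed height field. The toy sketch below follows a single 'raindrop' by steepest descent on a synthetic height grid; the real sandbox software is typically GPU-based and far more sophisticated, so this is only an illustration of the underlying idea.

```python
# Toy illustration (not the sandbox's actual software): routing a "raindrop"
# downhill on a height grid by repeated steepest descent, the kind of
# computation behind the projected water flow described above.
import numpy as np

def route_raindrop(height: np.ndarray, start: tuple, max_steps: int = 500):
    """Follow the steepest downhill neighbour until a local minimum is reached."""
    path = [start]
    r, c = start
    for _ in range(max_steps):
        neighbours = [(r + dr, c + dc)
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                      if not (dr == 0 and dc == 0)
                      and 0 <= r + dr < height.shape[0]
                      and 0 <= c + dc < height.shape[1]]
        lowest = min(neighbours, key=lambda rc: height[rc])
        if height[lowest] >= height[r, c]:
            break                                # local minimum: water pools here
        r, c = lowest
        path.append((r, c))
    return path

if __name__ == "__main__":
    y, x = np.mgrid[0:50, 0:50]
    terrain = 0.02 * ((x - 10) ** 2 + (y - 40) ** 2)    # bowl-shaped valley
    print(route_raindrop(terrain, start=(5, 45))[:10])  # first steps toward the valley
```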
Training software using virtual-reality technology and pre-calculated effective dose data.
Ding, Aiping; Zhang, Di; Xu, X George
2009-05-01
This paper describes the development of a software package, called VR Dose Simulator, which aims to provide interactive radiation safety and ALARA training to radiation workers using virtual-reality (VR) simulations. Combined with a pre-calculated effective dose equivalent (EDE) database, a virtual radiation environment was constructed in VR authoring software, EON Studio, using 3-D models of a real nuclear power plant building. Models of avatars representing two workers were adopted with arms and legs of the avatar being controlled in the software to simulate walking and other postures. Collision detection algorithms were developed for various parts of the 3-D power plant building and avatars to confine the avatars to certain regions of the virtual environment. Ten different camera viewpoints were assigned to conveniently cover the entire virtual scenery in different viewing angles. A user can control the avatar to carry out radiological engineering tasks using two modes of avatar navigation. A user can also specify two types of radiation source: Cs and Co. The location of the avatar inside the virtual environment during the course of the avatar's movement is linked to the EDE database. The accumulative dose is calculated and displayed on the screen in real-time. Based on the final accumulated dose and the completion status of all virtual tasks, a score is given to evaluate the performance of the user. The paper concludes that VR-based simulation technologies are interactive and engaging, thus potentially useful in improving the quality of radiation safety training. The paper also summarizes several challenges: more streamlined data conversion, realistic avatar movement and posture, more intuitive implementation of the data communication between EON Studio and VB.NET, and more versatile utilization of EDE data such as a source near the body, etc., all of which needs to be addressed in future efforts to develop this type of software.
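The real-time dose accumulation described above can be illustrated with a small sketch that looks up a pre-calculated dose rate for the avatar's current zone and integrates it over the time spent there. The zone names and dose rates are hypothetical placeholders, not values from the EDE database used in the paper.

```python
# Illustrative sketch (not the VR Dose Simulator code): accumulating effective
# dose in real time from a pre-calculated dose-rate table keyed by the
# avatar's current zone. Zone names and dose rates below are hypothetical.

# Pre-calculated dose rates (millisievert per hour) for regions of the plant.
DOSE_RATE_MSV_PER_H = {
    "corridor": 0.002,
    "pump_room": 0.050,
    "near_source": 0.800,
}

def accumulate_dose(trajectory) -> float:
    """Sum dose over a sequence of (zone, seconds_spent_in_zone) pairs."""
    total_msv = 0.0
    for zone, seconds in trajectory:
        rate_msv_per_h = DOSE_RATE_MSV_PER_H.get(zone, 0.0)
        total_msv += rate_msv_per_h * (seconds / 3600.0)
    return total_msv

if __name__ == "__main__":
    walk = [("corridor", 120), ("pump_room", 300), ("near_source", 45), ("corridor", 60)]
    print(f"accumulated dose for this virtual task: {accumulate_dose(walk) * 1000:.1f} microsievert")
```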
Modulation of thermal pain-related brain activity with virtual reality: evidence from fMRI.
Hoffman, Hunter G; Richards, Todd L; Coda, Barbara; Bills, Aric R; Blough, David; Richards, Anne L; Sharar, Sam R
2004-06-07
This study investigated the neural correlates of virtual reality analgesia. Virtual reality significantly reduced subjective pain ratings (i.e. analgesia). Using fMRI, pain-related brain activity was measured for each participant during conditions of no virtual reality and during virtual reality (order randomized). As predicted, virtual reality significantly reduced pain-related brain activity in all five regions of interest; the anterior cingulate cortex, primary and secondary somatosensory cortex, insula, and thalamus (p<0.002, corrected). Results showed direct modulation of human brain pain responses by virtual reality distraction. Copyright 2004 Lippincott Williams and Wilkins
3D virtual environment of Taman Mini Indonesia Indah in a web
NASA Astrophysics Data System (ADS)
Wardijono, B. A.; Wardhani, I. P.; Chandra, Y. I.; Pamungkas, B. U. G.
2018-05-01
Taman Mini Indonesia Indah, known as TMII, is the largest culture-based recreational park in Indonesia. The park covers 250 acres and contains traditional houses from the provinces of Indonesia. The official TMII website describes the traditional houses, but the information available to the public is limited. To provide the public with more detailed information about TMII, this research aims to create and develop virtual traditional houses as 3D graphics models and present them on a website. Virtual Reality (VR) technology was used to display the visualization of TMII and its surrounding environment. This research used Blender software to create the 3D models and Unity3D software to build virtual reality models that can be shown on the web. The research successfully created 33 virtual traditional houses representing the provinces of Indonesia. The textures of the traditional houses were taken from the originals to make them realistic. The result of this research is the TMII website, including virtual culture houses that can be displayed through a web browser. The website consists of virtual environment scenes through which Internet users can walk and navigate.
An augmented reality system validation for the treatment of cockroach phobia.
Bretón-López, Juani; Quero, Soledad; Botella, Cristina; García-Palacios, Azucena; Baños, Rosa Maria; Alcañiz, Mariano
2010-12-01
Augmented reality (AR) is a new technology in which various virtual elements are incorporated into the user's perception of the real world. The most significant aspect of AR is that the virtual elements add relevant and helpful information to the real scene. AR shares some important characteristics with virtual reality as applied in clinical psychology. However, AR offers additional features that might be crucial for treating certain problems. An AR system designed to treat insect phobia has been used for treating phobia of small animals, and positive preliminary data about the global efficacy of the system have been obtained. However, it is necessary to determine the capacity of similar AR systems and their elements that are designed to evoke anxiety in participants; this is achieved by testing the correspondence between the inclusion of feared stimuli and the induction of anxiety. The objective of the present work is to validate whether the stimuli included in the AR-Insect Phobia system are capable of inducing anxiety in six participants diagnosed with cockroach phobia. Results support the adequacy of each element of the system in inducing anxiety in all participants.
Kim, Hyun K; Park, Jaehyun; Choi, Yeongcheol; Choe, Mungyeong
2018-05-01
This study aims to develop a motion sickness measurement index in a virtual reality (VR) environment. The VR market is in an early stage of market formation and technological development, and thus, research on the side effects of VR devices such as simulator motion sickness is lacking. In this study, we used the simulator sickness questionnaire (SSQ), which has been traditionally used for simulator motion sickness measurement. To measure the motion sickness in a VR environment, 24 users performed target selection tasks using a VR device. The SSQ was administered immediately after each task, and the order of work was determined using the Latin square design. The existing SSQ was revised to develop a VR sickness questionnaire, which is used as the measurement index in a VR environment. In addition, the target selection method and button size were found to be significant factors that affect motion sickness in a VR environment. The results of this study are expected to be used for measuring and designing simulator sickness using VR devices in future studies. Copyright © 2018 Elsevier Ltd. All rights reserved.
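For context, the conventional SSQ from which the study starts is scored by summing symptom ratings into three subscales and applying fixed conversion weights; the commonly cited weights from Kennedy et al. (1993) are used in the sketch below. Item-to-subscale assignments are omitted and the raw sums are assumed to be computed already; this is not the revised VR sickness questionnaire developed in the paper.

```python
# Sketch of conventional SSQ scoring (the instrument the study started from),
# not the revised VR sickness questionnaire developed in the paper. The
# conversion weights are those commonly cited from Kennedy et al. (1993);
# item-to-subscale assignments are omitted and raw subscale sums are assumed
# to be computed already.

def ssq_scores(nausea_raw: float, oculomotor_raw: float, disorientation_raw: float):
    """Return (nausea, oculomotor, disorientation, total) weighted SSQ scores."""
    nausea = nausea_raw * 9.54
    oculomotor = oculomotor_raw * 7.58
    disorientation = disorientation_raw * 13.92
    total = (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74
    return nausea, oculomotor, disorientation, total

if __name__ == "__main__":
    # Hypothetical raw subscale sums for one participant after one VR task.
    n, o, d, t = ssq_scores(nausea_raw=3, oculomotor_raw=5, disorientation_raw=2)
    print(f"Nausea={n:.1f}  Oculomotor={o:.1f}  Disorientation={d:.1f}  Total={t:.1f}")
```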
Roberts, Amy Restorick; Schutter, Bob De; Franks, Kelley; Radina, M Elise
2018-02-21
This study explores how older adults respond to audiovisual virtual reality (VR) and perceive its usefulness to their lives. Focus groups were conducted with residents of a retirement community after they viewed two audiovisual VR simulations (n = 41). Thematic analysis was used to identify patterns in responses. Older adults described positive and negative emotional reactions to aspects of the VR experience, articulated content preferences, shared ideas to improve the usability of the equipment, and identified facilitators and barriers that influenced perceived usefulness. Recommendations for improving this technology include maximizing the positive aspects of VR through increasing interactivity, facilitating socializing with friends or family, and enhancing older adults' ease of use. Desired content of simulations involved travel, continuing education, reminiscence, and self-care/therapy. Virtual reality was reviewed positively, yet modifications are necessary to facilitate optimal user experience and potential benefit for this population. As older adults are interested in using VR, especially if poor health prevents the continuation of desirable activities or new experiences, it is important to respond to older adults' preferences and remove barriers that limit use and enjoyment.
Mirelman, Anat; Rochester, Lynn; Reelick, Miriam; Nieuwhof, Freek; Pelosin, Elisa; Abbruzzese, Giovanni; Dockx, Kim; Nieuwboer, Alice; Hausdorff, Jeffrey M
2013-02-06
Recent work has demonstrated that fall risk can be attributed to cognitive as well as motor deficits. Indeed, everyday walking in complex environments utilizes executive function, dual tasking, planning and scanning, all while walking forward. Pilot studies suggest that a multi-modal intervention that combines treadmill training to target motor function and a virtual reality obstacle course to address the cognitive components of fall risk may be used to successfully address the motor-cognitive interactions that are fundamental for fall risk reduction. The proposed randomized controlled trial will evaluate the effects of treadmill training augmented with virtual reality on fall risk. Three hundred older adults with a history of falls will be recruited to participate in this study. This will include older adults (n=100), patients with mild cognitive impairment (n=100), and patients with Parkinson's disease (n=100). These three sub-groups will be recruited in order to evaluate the effects of the intervention in people with a range of motor and cognitive deficits. Subjects will be randomly assigned to the intervention group (treadmill training with virtual reality) or to the active-control group (treadmill training without virtual reality). Each person will participate in a training program set in an outpatient setting 3 times per week for 6 weeks. Assessments will take place before, after, and 1 month and 6 months after the completion of the training. A falls calendar will be kept by each participant for 6 months after completing the training to assess fall incidence (i.e., the number of falls, multiple falls and falls rate). In addition, we will measure gait under usual and dual task conditions, balance, community mobility, health related quality of life, user satisfaction and cognitive function. This randomized controlled trial will demonstrate the extent to which an intervention that combines treadmill training augmented by virtual reality reduces fall risk, improves mobility and enhances cognitive function in a diverse group of older adults. In addition, the comparison to an active control group that undergoes treadmill training without virtual reality will provide evidence as to the added value of addressing motor cognitive interactions as an integrated unit. (NIH)-NCT01732653.
Designing informed game-based rehabilitation tasks leveraging advances in virtual reality.
Lange, Belinda; Koenig, Sebastian; Chang, Chien-Yen; McConnell, Eric; Suma, Evan; Bolas, Mark; Rizzo, Albert
2012-01-01
This paper details a brief history and rationale for the use of virtual reality (VR) technology for clinical research and intervention, and then focuses on game-based VR applications in the area of rehabilitation. An analysis of the match between rehabilitation task requirements and the assets available with VR technology is presented. Low-cost camera-based systems capable of tracking user behavior at sufficient levels for game-based virtual rehabilitation activities are currently available for in-home use. Authoring software is now being developed that aims to provide clinicians with a usable toolkit for leveraging this technology. This will facilitate informed professional input on software design, development and application to ensure safe and effective use in the rehabilitation context. The field of rehabilitation generally stands to benefit from the continual advances in VR technology, concomitant system cost reductions and an expanding clinical research literature and knowledge base. Home-based activity within VR systems that are low-cost, easy to deploy and maintain, and meet the requirements for "good" interactive rehabilitation tasks could radically improve users' access to care, adherence to prescribed training and subsequently enhance functional activity in everyday life in clinical populations.
NASA Technical Reports Server (NTRS)
Orr, Joel N.
1995-01-01
This reflection on the human-computer interface, and on its requirements as virtual technology advances, proposes a new term: 'Pezonomics'. The term replaces ergonomics ('the law of work') with a definition pointing to 'the law of play.' The necessity of this term, the author reasons, comes from the need to 'capture the essence of play and calibrate our computer systems to its cadences.' Pezonomics will ensure that artificial environments, in particular virtual reality, are user-friendly.
The Optokinetic Cervical Reflex (OKCR) in Pilots of High-Performance Aircraft.
1997-04-01
Coupled System virtual reality - the attempt to create a realistic, three-dimensional environment or synthetic immersive environment in which the user ...factors interface between the pilot and the flight environment. The final section is a case study of head- and helmet-mounted displays (HMD) and the impact...themselves as actually moving (flying) through a virtual environment. However, in the studies of Held, et al. (1975) and Young, et al. (1975) the
An applications-oriented approach to the development of virtual environments
NASA Technical Reports Server (NTRS)
Crowe, Michael X.
1994-01-01
The field of Virtual Reality (VR) is diverse, ranging in scope from research into fundamental enabling technologies to the building of full-scale entertainment facilities. However, the concept of virtual reality means many things to many people. Ideally, a definition of VR should derive from how it can provide solutions to existing challenges in building advanced human computer interfaces. The measure of success for VR lies in its ability to enhance the assimilation of complex information, whether to aid in difficult decision making processes, or to recreate real experiences in a compelling way. This philosophy is described using an example from a VR-based advertising project. The common and unique elements of this example are explained, though the fundamental development process is the same for all virtual environments that support information transfer. In short, this is an applications-oriented approach that begins by establishing and prioritizing user requirements and seeks to add value to the information transfer process through the appropriate use of VR technology.
Freeman, Daniel; Bradley, Jonathan; Antley, Angus; Bourke, Emilie; DeWeever, Natalie; Evans, Nicole; Černis, Emma; Sheaves, Bryony; Waite, Felicity; Dunn, Graham; Slater, Mel; Clark, David M
2016-07-01
Persecutory delusions may be unfounded threat beliefs maintained by safety-seeking behaviours that prevent disconfirmatory evidence being successfully processed. Use of virtual reality could facilitate new learning. To test the hypothesis that enabling patients to test the threat predictions of persecutory delusions in virtual reality social environments with the dropping of safety-seeking behaviours (virtual reality cognitive therapy) would lead to greater delusion reduction than exposure alone (virtual reality exposure). Conviction in delusions and distress in a real-world situation were assessed in 30 patients with persecutory delusions. Patients were then randomised to virtual reality cognitive therapy or virtual reality exposure, both with 30 min in graded virtual reality social environments. Delusion conviction and real-world distress were then reassessed. In comparison with exposure, virtual reality cognitive therapy led to large reductions in delusional conviction (reduction 22.0%, P = 0.024, Cohen's d = 1.3) and real-world distress (reduction 19.6%, P = 0.020, Cohen's d = 0.8). Cognitive therapy using virtual reality could prove highly effective in treating delusions. © The Royal College of Psychiatrists 2016.
Freeman, Daniel; Bradley, Jonathan; Antley, Angus; Bourke, Emilie; DeWeever, Natalie; Evans, Nicole; Černis, Emma; Sheaves, Bryony; Waite, Felicity; Dunn, Graham; Slater, Mel; Clark, David M.
2016-01-01
Background Persecutory delusions may be unfounded threat beliefs maintained by safety-seeking behaviours that prevent disconfirmatory evidence being successfully processed. Use of virtual reality could facilitate new learning. Aims To test the hypothesis that enabling patients to test the threat predictions of persecutory delusions in virtual reality social environments with the dropping of safety-seeking behaviours (virtual reality cognitive therapy) would lead to greater delusion reduction than exposure alone (virtual reality exposure). Method Conviction in delusions and distress in a real-world situation were assessed in 30 patients with persecutory delusions. Patients were then randomised to virtual reality cognitive therapy or virtual reality exposure, both with 30 min in graded virtual reality social environments. Delusion conviction and real-world distress were then reassessed. Results In comparison with exposure, virtual reality cognitive therapy led to large reductions in delusional conviction (reduction 22.0%, P = 0.024, Cohen's d = 1.3) and real-world distress (reduction 19.6%, P = 0.020, Cohen's d = 0.8). Conclusion Cognitive therapy using virtual reality could prove highly effective in treating delusions. PMID:27151071
Virtual reality and paranoid ideations in people with an 'at-risk mental state' for psychosis.
Valmaggia, Lucia R; Freeman, Daniel; Green, Catherine; Garety, Philippa; Swapp, David; Antley, Angus; Prescott, Corinne; Fowler, David; Kuipers, Elizabeth; Bebbington, Paul; Slater, Mel; Broome, Matthew; McGuire, Philip K
2007-12-01
Virtual reality provides a means of studying paranoid thinking in controlled laboratory conditions. However, this method has not been used with a clinical group. To establish the feasibility and safety of using virtual reality methodology in people with an at-risk mental state and to investigate the applicability of a cognitive model of paranoia to this group. Twenty-one participants with an at-risk mental state were assessed before and after entering a virtual reality environment depicting the inside of an underground train. Virtual reality did not raise levels of distress at the time of testing or cause adverse experiences over the subsequent week. Individuals attributed mental states to virtual reality characters including hostile intent. Persecutory ideation in virtual reality was predicted by higher levels of trait paranoia, anxiety, stress, immersion in virtual reality, perseveration and interpersonal sensitivity. Virtual reality is an acceptable experimental technique for use with individuals with at-risk mental states. Paranoia in virtual reality was understandable in terms of the cognitive model of persecutory delusions.
Augmented Reality Based Doppler Lidar Data Visualization: Promises and Challenges
NASA Astrophysics Data System (ADS)
Cherukuru, N. W.; Calhoun, R.
2016-06-01
Augmented reality (AR) is a technology that enables the user to view virtual content as if it existed in the real world. We are exploring the possibility of using this technology to view radial velocities or processed wind vectors from a Doppler wind lidar, thus giving the user the ability to see the wind in a literal sense. This approach could find applications in aviation safety and atmospheric data visualization, as well as in weather education and public outreach. As a proof of concept, we used lidar data from a recent field campaign and developed a smartphone application to view the lidar scan in augmented reality. In this paper, we give a brief methodology of this feasibility study and present the challenges and promises of using AR technology in conjunction with Doppler wind lidars.
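The abstract does not describe the data pipeline behind the smartphone application. A plausible first step, sketched below, is to convert a plan-position-indicator (PPI) sweep of radial velocities into coloured 3D points that an AR renderer could draw; the function name, ranges, azimuths, and the crude blue/red velocity ramp are all assumptions, not taken from the paper.

```python
# Hedged sketch: converts one Doppler lidar PPI sweep (range gates x azimuths of
# radial velocity) into 3D points plus a simple diverging colour ramp.
import numpy as np

def ppi_to_points(ranges_m, azimuths_deg, elevation_deg, radial_vel, vmax=10.0):
    """Return (N, 3) positions and (N, 3) RGB colours for one PPI sweep."""
    az = np.deg2rad(azimuths_deg)[:, None]        # (n_az, 1)
    el = np.deg2rad(elevation_deg)
    r = np.asarray(ranges_m)[None, :]             # (1, n_gates)

    # Spherical -> local Cartesian (x east, y north, z up).
    x = r * np.cos(el) * np.sin(az)
    y = r * np.cos(el) * np.cos(az)
    z = r * np.sin(el) * np.ones_like(az)
    pos = np.stack([x, y, z], axis=-1).reshape(-1, 3)

    # Map radial velocity onto a blue (toward) / red (away) ramp.
    v = np.clip(np.asarray(radial_vel) / vmax, -1.0, 1.0).reshape(-1)
    rgb = np.stack([0.5 + 0.5 * v, 0.2 + 0.0 * v, 0.5 - 0.5 * v], axis=-1)
    return pos, rgb

if __name__ == "__main__":
    gates = np.arange(100.0, 3000.0, 50.0)
    azimuths = np.arange(0.0, 360.0, 2.0)
    vel = np.random.uniform(-8, 8, size=(azimuths.size, gates.size))
    pts, colours = ppi_to_points(gates, azimuths, elevation_deg=3.0, radial_vel=vel)
    print(pts.shape, colours.shape)
```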
NASA Astrophysics Data System (ADS)
Homainejad, Amir S.; Satari, Mehran
2000-05-01
VR brings users a computer-generated reality, and a VE is a simulated world that takes users to any point and viewing direction of the object. VR and VE can be very useful if accurate and precise data are used, allowing users to work with realistic models. Photogrammetry is a technique able to collect and provide accurate and precise data for building a 3D model in a computer. Data can be collected from various sensors and cameras, and methods of data collection vary based on the method of image acquisition. Indeed, VR comprises real-time graphics, 3D models, and display, and it has applications in the entertainment industry, flight simulators, and industrial design.
NASA Technical Reports Server (NTRS)
Gutensohn, Michael
2018-01-01
The task for this project was to design, develop, test, and deploy a facial recognition system for the Kennedy Space Center Augmented/Virtual Reality Lab. This system will serve as a means of user authentication as part of the NUI of the lab. The overarching goal is to create a seamless user interface that will allow the user to initiate and interact with AR and VR experiences without ever needing to use a mouse or keyboard at any step in the process.
Virtual Reality: Developing a VR space for Academic activities
NASA Astrophysics Data System (ADS)
Kaimaris, D.; Stylianidis, E.; Karanikolas, N.
2014-05-01
Virtual reality (VR) is extensively used in various applications in industry, academia, and business, and is becoming increasingly affordable for end users. At the same time, more and more applications are being developed in academia and higher education, for example in medicine and engineering, and students need to be well prepared for their professional life after their studies. Moreover, VR offers the possibility not only to improve skills but also to understand space. This paper presents the methodology used during a course, namely "Geoinformatics applications" at the School of Spatial Planning and Development (Eng.), Aristotle University of Thessaloniki, to create a virtual School space. The course design focuses on the methods and techniques used to develop the virtual environment. In addition, the project aspires to be increasingly useful for the students and to provide a realistic virtual environment with useful information not only for the students but also for any citizen interested in academic life at the School.
Navarro-Haro, María V; López-Del-Hoyo, Yolanda; Campos, Daniel; Linehan, Marsha M; Hoffman, Hunter G; García-Palacios, Azucena; Modrego-Alarcón, Marta; Borao, Luis; García-Campayo, Javier
2017-01-01
Regular mindfulness practice benefits people both mentally and physically, but many populations who could benefit do not practice mindfulness. Virtual Reality (VR) is a new technology that helps capture participants' attention and gives users the illusion of "being there" in the 3D computer generated environment, facilitating sense of presence. By limiting distractions from the real world, increasing sense of presence and giving people an interesting place to go to practice mindfulness, Virtual Reality may facilitate mindfulness practice. Traditional Dialectical Behavioral Therapy (DBT®) mindfulness skills training was specifically designed for clinical treatment of people who have trouble focusing attention, however severe patients often show difficulties or lack of motivation to practice mindfulness during the training. The present pilot study explored whether a sample of mindfulness experts would find useful and recommend a new VR Dialectical Behavioral Therapy (DBT®) mindfulness skills training technique and whether they would show any benefit. Forty four participants attending a mindfulness conference put on an Oculus Rift DK2 Virtual Reality helmet and floated down a calm 3D computer generated virtual river while listening to digitized DBT® mindfulness skills training instructions. On subjective questionnaires completed by the participants before and after the VR DBT® mindfulness skills training session, participants reported increases/improvements in state of mindfulness, and reductions in negative emotional states. After VR, participants reported significantly less sadness, anger, and anxiety, and reported being significantly more relaxed. Participants reported a moderate to strong illusion of going inside the 3D computer generated world (i.e., moderate to high "presence" in VR) and showed high acceptance of VR as a technique to practice mindfulness. These results show encouraging preliminary evidence of the feasibility and acceptability of using VR to practice mindfulness based on clinical expert feedback. VR is a technology with potential to increase computerized dissemination of DBT® skills training modules. Future research is warranted.
Navarro-Haro, María V.; López-del-Hoyo, Yolanda; Campos, Daniel; Linehan, Marsha M.; Hoffman, Hunter G.; García-Palacios, Azucena; Modrego-Alarcón, Marta; Borao, Luis; García-Campayo, Javier
2017-01-01
Regular mindfulness practice benefits people both mentally and physically, but many populations who could benefit do not practice mindfulness. Virtual Reality (VR) is a new technology that helps capture participants’ attention and gives users the illusion of “being there” in the 3D computer generated environment, facilitating sense of presence. By limiting distractions from the real world, increasing sense of presence and giving people an interesting place to go to practice mindfulness, Virtual Reality may facilitate mindfulness practice. Traditional Dialectical Behavioral Therapy (DBT®) mindfulness skills training was specifically designed for clinical treatment of people who have trouble focusing attention, however severe patients often show difficulties or lack of motivation to practice mindfulness during the training. The present pilot study explored whether a sample of mindfulness experts would find useful and recommend a new VR Dialectical Behavioral Therapy (DBT®) mindfulness skills training technique and whether they would show any benefit. Forty four participants attending a mindfulness conference put on an Oculus Rift DK2 Virtual Reality helmet and floated down a calm 3D computer generated virtual river while listening to digitized DBT® mindfulness skills training instructions. On subjective questionnaires completed by the participants before and after the VR DBT® mindfulness skills training session, participants reported increases/improvements in state of mindfulness, and reductions in negative emotional states. After VR, participants reported significantly less sadness, anger, and anxiety, and reported being significantly more relaxed. Participants reported a moderate to strong illusion of going inside the 3D computer generated world (i.e., moderate to high “presence” in VR) and showed high acceptance of VR as a technique to practice mindfulness. These results show encouraging preliminary evidence of the feasibility and acceptability of using VR to practice mindfulness based on clinical expert feedback. VR is a technology with potential to increase computerized dissemination of DBT® skills training modules. Future research is warranted. PMID:29166665
The CAVE (TM) automatic virtual environment: Characteristics and applications
NASA Technical Reports Server (NTRS)
Kenyon, Robert V.
1995-01-01
Virtual reality may best be defined as the wide-field presentation of computer-generated, multi-sensory information that tracks a user in real time. In addition to the more well-known modes of virtual reality -- head-mounted displays and boom-mounted displays -- the Electronic Visualization Laboratory at the University of Illinois at Chicago recently introduced a third mode: a room constructed from large screens on which the graphics are projected on to three walls and the floor. The CAVE is a multi-person, room sized, high resolution, 3D video and audio environment. Graphics are rear projected in stereo onto three walls and the floor, and viewed with stereo glasses. As a viewer wearing a location sensor moves within its display boundaries, the correct perspective and stereo projections of the environment are updated, and the image moves with and surrounds the viewer. The other viewers in the CAVE are like passengers in a bus, along for the ride. 'CAVE,' the name selected for the virtual reality theater, is both a recursive acronym (Cave Automatic Virtual Environment) and a reference to 'The Simile of the Cave' found in Plato's 'Republic,' in which the philosopher explores the ideas of perception, reality, and illusion. Plato used the analogy of a person facing the back of a cave alive with shadows that are his/her only basis for ideas of what real objects are. Rather than having evolved from video games or flight simulation, the CAVE has its motivation rooted in scientific visualization and the SIGGRAPH 92 Showcase effort. The CAVE was designed to be a useful tool for scientific visualization. The Showcase event was an experiment; the Showcase chair and committee advocated an environment for computational scientists to interactively present their research at a major professional conference in a one-to-many format on high-end workstations attached to large projection screens. The CAVE was developed as a 'virtual reality theater' with scientific content and projection that met the criteria of Showcase.
Virtual Reality-Enhanced Extinction of Phobias and Post-Traumatic Stress.
Maples-Keller, Jessica L; Yasinski, Carly; Manjin, Nicole; Rothbaum, Barbara Olasov
2017-07-01
Virtual reality (VR) refers to an advanced technological communication interface in which the user is actively participating in a computer-generated 3-dimensional virtual world that includes computer sensory input devices used to simulate real-world interactive experiences. VR has been used within psychiatric treatment for anxiety disorders, particularly specific phobias and post-traumatic stress disorder, given several advantages that VR provides for use within treatment for these disorders. Exposure therapy for anxiety disorder is grounded in fear-conditioning models, in which extinction learning involves the process through which conditioned fear responses decrease or are inhibited. The present review will provide an overview of extinction training and anxiety disorder treatment, advantages for using VR within extinction training, a review of the literature regarding the effectiveness of VR within exposure therapy for specific phobias and post-traumatic stress disorder, and limitations and future directions of the extant empirical literature.
Three-dimensional user interfaces for scientific visualization
NASA Technical Reports Server (NTRS)
VanDam, Andries (Principal Investigator)
1996-01-01
The focus of this grant was to experiment with novel user interfaces for scientific visualization applications using both desktop and virtual reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past three years, and subsumes all prior reports.
Laparoscopic assistance by operating room nurses: Results of a virtual-reality study.
Paschold, M; Huber, T; Maedge, S; Zeissig, S R; Lang, H; Kneist, W
2017-04-01
Laparoscopic assistance is often entrusted to a less experienced resident, medical student, or operating room nurse. Data regarding laparoscopic training for operating room nurses are not available. The aim of the study was to analyse the initial performance level and learning curves of operating room nurses in basic laparoscopic surgery compared with medical students and surgical residents to determine their ability to assist with this type of procedure. The study was designed to compare the initial virtual reality performance level and learning curves of user groups to analyse competence in laparoscopic assistance. The study subjects were operating room nurses, medical students, and first year residents. Participants performed three validated tasks (camera navigation, peg transfer, fine dissection) on a virtual reality laparoscopic simulator three times in 3 consecutive days. Laparoscopic experts were enrolled as a control group. Participants filled out questionnaires before and after the course. Nurses and students were comparable in their initial performance (p>0.05). Residents performed better in camera navigation than students and nurses and reached the expert level for this task. Residents, students, and nurses had comparable bimanual skills throughout the study; while, experts performed significantly better in bimanual manoeuvres at all times (p<0.05). The included user groups had comparable skills for bimanual tasks. Residents with limited experience reached the expert level in camera navigation. With training, nurses, students, and first year residents are equally capable of assisting in basic laparoscopic procedures. Copyright © 2017 Elsevier Ltd. All rights reserved.
Virtual reality simulators and training in laparoscopic surgery.
Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos
2015-01-01
Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, relevant evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance in advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to be different from the effect of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual-spatial perception and stress-coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research efforts should focus on the effect of virtual reality simulation on performance in the context of advanced surgical procedures, on standardization of training, on the possible synergistic effect of virtual reality simulation training combined with mental training, and on personalized training. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
Virtual reality and 3D visualizations in heart surgery education.
Friedl, Reinhard; Preisack, Melitta B; Klas, Wolfgang; Rose, Thomas; Stracke, Sylvia; Quast, Klaus J; Hannekum, Andreas; Gödje, Oliver
2002-01-01
Computer-assisted teaching plays an increasing role in surgical education. The presented paper describes the development of virtual reality (VR) and 3D visualizations for educational purposes concerning aortocoronary bypass grafting and their prototypical implementation into a database-driven and internet-based educational system in heart surgery. A multimedia storyboard has been written and digital video has been encoded. Understanding of these videos was not always satisfactory; therefore, additional 3D and VR visualizations have been modelled as VRML, QuickTime, QuickTime Virtual Reality and MPEG-1 applications. An authoring process, in terms of integration and orchestration of different multimedia components into educational units, has been started. A virtual model of the heart has been designed. It is highly interactive and the user is able to rotate it, move it, zoom in for details or even fly through it. It can be explored during the cardiac cycle, and a transparency mode demonstrates the coronary arteries, movement of the heart valves, and simultaneous blood flow. Myocardial ischemia and the effect of an IMA graft on myocardial perfusion are simulated. Coronary artery stenoses and bypass grafts can be interactively added. 3D models of anastomotic techniques and closed thrombendarterectomy have been developed. Different visualizations have been prototypically implemented into a teaching application about operative techniques. Interactive virtual reality and 3D teaching applications can be used and distributed via the World Wide Web and have the power to describe surgical anatomy and principles of surgical techniques, where temporal and spatial events play an important role, in a way superior to traditional teaching methods.
Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Tewfik, Marc A
2014-01-01
The technical challenges of endoscopic sinus surgery (ESS) and the high risk of complications support the development of alternative modalities to train residents in these procedures. Virtual reality simulation is becoming a useful tool for training the skills necessary for minimally invasive surgery; however, there are currently no ESS virtual reality simulators available with valid evidence supporting their use in resident education. Our aim was to develop a new rhinology simulator, as well as to define potential performance metrics for trainee assessment. The McGill simulator for endoscopic sinus surgery (MSESS), a new sinus surgery virtual reality simulator with haptic feedback, was developed (a collaboration between the McGill University Department of Otolaryngology-Head and Neck Surgery, the Montreal Neurologic Institute Simulation Lab, and the National Research Council of Canada). A panel of experts in education, performance assessment, rhinology, and skull base surgery convened to identify core technical abilities that would need to be taught by the simulator, as well as performance metrics to be developed and captured. The MSESS allows the user to perform basic sinus surgery skills, such as an ethmoidectomy and sphenoidotomy, through the use of endoscopic tools in a virtual nasal model. The performance metrics were developed by an expert panel and include measurements of safety, quality, and efficiency of the procedure. The MSESS incorporates novel technological advancements to create a realistic platform for trainees. To our knowledge, this is the first simulator to combine novel tools such as the endonasal wash and elaborate anatomic deformity with advanced performance metrics for ESS.
Advanced 3-dimensional planning in neurosurgery.
Ferroli, Paolo; Tringali, Giovanni; Acerbi, Francesco; Schiariti, Marco; Broggi, Morgan; Aquino, Domenico; Broggi, Giovanni
2013-01-01
During the past decades, medical applications of virtual reality technology have been developing rapidly, ranging from a research curiosity to a commercially and clinically important area of medical informatics and technology. With the aid of new technologies, the user is able to process large data sets to create accurate and almost realistic reconstructions of anatomic structures and related pathologies. As a result, a 3-dimensional (3-D) representation is obtained, and surgeons can explore the brain for planning or training. Further improvements such as a feedback system increase the interaction between users and models by creating a virtual environment. Its use for advanced 3-D planning in neurosurgery is described. Different systems of medical image volume rendering have been used and analyzed for advanced 3-D planning: one is a commercial "ready-to-go" system (Dextroscope, Bracco, Volume Interaction, Singapore), whereas the others are open-source-based software (3-D Slicer, FSL, and FreeSurfer). Different neurosurgeons at our institution experienced how advanced 3-D planning before surgery allowed them to facilitate and increase their understanding of the complex anatomic and pathological relationships of the lesion. They all agreed that the preoperative experience of virtually planning the approach was helpful during the operative procedure. Virtual reality for advanced 3-D planning in neurosurgery has achieved considerable realism as a result of the available processing power of modern computers. Although it has been found useful for facilitating the understanding of complex anatomic relationships, further effort is needed to increase the quality of the interaction between the user and the model.
Building the Joint Battlespace Infosphere. Volume 2: Interactive Information Technologies
1999-12-17
G. A. Vouros, "A Knowledge-Based Methodology for Supporting Multilingual and User-Tailored Interfaces," Interacting With Computers, Vol. 9 (1998), p...project is to develop a two-handed user interface to the stereoscopic field analyzer, an interactive 3-D scientific visualization system. The...62 See http://www.hitl.washington.edu/research/vrd/. 63 R. Baumann and R. Clavel, "Haptic Interface for Virtual Reality Based
Learning Rationales and Virtual Reality Technology in Education.
ERIC Educational Resources Information Center
Chiou, Guey-Fa
1995-01-01
Defines and describes virtual reality technology and differentiates between virtual learning environment, learning material, and learning tools. Links learning rationales to virtual reality technology to pave conceptual foundations for the application of virtual reality technology in education. Constructivism, case-based learning, problem-based learning,…
Virtual reality for emergency training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altinkemer, K.
1995-12-31
Virtual reality is a sequence of scenes generated by a computer as a response to the five different senses: sight, sound, taste, touch, and smell. Other senses that can be used in virtual reality include balance, pheromonal, and immunological senses. Application areas include leisure and entertainment, medicine, architecture, engineering, manufacturing, and training. Virtual reality is especially important when it is used for emergency training and for the management of natural disasters, including earthquakes, floods, tornados and other situations which are hard to emulate. Classical training methods for these extraordinary environments lack the realistic surroundings that virtual reality can provide. In order for virtual reality to be a successful training tool, the design needs to address certain aspects, such as how real the virtual reality should be and how much fixed cost is entailed in setting up the virtual reality trainer. There are also pricing questions regarding the price per training session on the virtual reality trainer and the appropriate training time length(s).
Creating a Vision Channel for Observing Deep-Seated Anatomy in Medical Augmented Reality
NASA Astrophysics Data System (ADS)
Wimmer, Felix; Bichlmeier, Christoph; Heining, Sandro M.; Navab, Nassir
The intent of medical Augmented Reality (AR) is to augment the surgeon's real view on the patient with the patient's interior anatomy resulting from a suitable visualization of medical imaging data. This paper presents a fast and user-defined clipping technique for medical AR allowing for cutting away any parts of the virtual anatomy, and images of the real part of the AR scene, that hinder the surgeon's view onto the deep-seated region of interest. Modeled on cut-away techniques from scientific illustrations and computer graphics, the method creates a fixed vision channel to the inside of the patient. It enables a clear view of the focussed virtual anatomy and moreover improves the perception of spatial depth.
Gelsomini, Mirko; Garzotto, Franca; Montesano, Daniele; Occhiuto, Daniele
2016-08-01
Our research aims at supporting existing therapies for children with intellectual and developmental disorders (IDD). Personal and social autonomy is the desired end state to be achieved to enable a smooth integration into the real world. We developed and tested a framework for storytelling and learning activities that exploits an immersive virtual reality viewer to interact with target users. We co-designed our system with experts from the medical sector, identifying features that allow patients to stay focused on the exercises to perform. Our approach triggers a learning process for a seamless assimilation of common behavioral skills useful in everyday life. This paper highlights the technological challenges in healthcare and discusses cutting-edge interaction paradigms.
Chuah, Joon Hao; Lok, Benjamin; Black, Erik
2013-04-01
Health sciences students often practice and are evaluated on interview and exam skills by working with standardized patients (people that role play having a disease or condition). However, standardized patients do not exist for certain vulnerable populations such as children and the intellectually disabled. As a result, students receive little to no exposure to vulnerable populations before becoming working professionals. To address this problem and thereby increase exposure to vulnerable populations, we propose using virtual humans to simulate members of vulnerable populations. We created a mixed reality pediatric patient that allowed students to practice pediatric developmental exams. Practicing several exams is necessary for students to understand how to properly interact with and correctly assess a variety of children. Practice also increases a student's confidence in performing the exam. Effective practice requires students to treat the virtual child realistically. Treating the child realistically might be affected by how the student and virtual child physically interact, so we created two object interaction interfaces - a natural interface and a mouse-based interface. We tested the complete mixed reality exam and also compared the two object interaction interfaces in a within-subjects user study with 22 participants. Our results showed that the participants accepted the virtual child as a child and treated it realistically. Participants also preferred the natural interface, but the interface did not affect how realistically participants treated the virtual child.
CLEW: A Cooperative Learning Environment for the Web.
ERIC Educational Resources Information Center
Ribeiro, Marcelo Blois; Noya, Ricardo Choren; Fuks, Hugo
This paper outlines CLEW (collaborative learning environment for the Web). The project combines MUD (Multi-User Dimension), workflow, VRML (Virtual Reality Modeling Language) and educational concepts like constructivism in a learning environment where students actively participate in the learning process. The MUD shapes the environment structure.…
Virtual Reality and the Virtual Library.
ERIC Educational Resources Information Center
Oppenheim, Charles
1993-01-01
Explains virtual reality, including proper and improper uses of the term, and suggests ways that libraries might be affected by it. Highlights include elements of virtual reality systems; possible virtual reality applications, including architecture, the chemical industry, transport planning, armed forces, and entertainment; and the virtual…
Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.
Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E
2007-01-01
This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
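The paper does not publish its measurement algorithms. As an illustration of the kind of quantitative output such a virtual measurement tool can produce, the sketch below computes the length of a polyline and the angle at a vertex from points a user might pick in the immersive environment; the function names and example points are hypothetical.

```python
# Hedged sketch: simple geometric measurements from user-picked 3D points.
import numpy as np

def polyline_length(points):
    """Total length of a polyline given as an (N, 3) sequence of picked points."""
    p = np.asarray(points, dtype=float)
    return float(np.linalg.norm(np.diff(p, axis=0), axis=1).sum())

def vertex_angle(a, b, c):
    """Angle (degrees) at point b formed by picked points a-b-c."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

if __name__ == "__main__":
    picks = [(0, 0, 0), (1, 0, 0), (1, 1, 0)]
    print(polyline_length(picks))   # 2.0
    print(vertex_angle(*picks))     # 90.0
```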
1993-04-01
VIRTUAL REALITY. JAMES F. DAILEY, LIEUTENANT COLONEL...US. This paper reviews the exciting field of virtual reality. The author describes the basic concepts of virtual reality and finds that its numerous...potential benefits to society could revolutionize everyday life. The various components that make up a virtual reality system are described in detail
Mobile Virtual Reality : A Solution for Big Data Visualization
NASA Astrophysics Data System (ADS)
Marshall, E.; Seichter, N. D.; D'sa, A.; Werner, L. A.; Yuen, D. A.
2015-12-01
Pursuits in the geological sciences and other branches of quantitative science often require data visualization frameworks that are in continual need of improvement and new ideas. Virtual reality is a visualization medium with a large audience, originally designed for gaming purposes; virtual reality can also be delivered in CAVE-like environments, but these are unwieldy and expensive to maintain. Recent efforts by major companies such as Facebook have focused on a larger market, and the Oculus is the first of this kind of mobile device. The Unity engine makes it possible for us to convert data files into a mesh of isosurfaces to be rendered in 3D. A user is immersed inside the virtual reality and is able to move within and around the data using arrow keys and other steering devices, similar to those employed with an Xbox. With the introduction of products like the Oculus Rift and HoloLens, combined with ever-increasing mobile computing strength, mobile virtual reality data visualization can be implemented for better analysis of 3D geological and mineralogical data sets. As more new products like the Surface Pro 4 and other high-power yet very mobile computers are introduced to the market, the RAM and graphics card capacity necessary to run these models is more widely available, opening doors to this new reality. The computing requirements needed to run these models are a mere 8 GB of RAM and 2 GHz of CPU speed, which many mobile computers are starting to exceed. Using the Unity 3D software to create a virtual environment containing a visual representation of the data, any data set converted into FBX or OBJ format can be traversed by wearing the Oculus Rift device. This new method for analysis, in conjunction with 3D scanning, has potential applications in many fields, including the analysis of precious stones or jewelry. Using hologram technology to capture in high resolution the 3D shape, color, and imperfections of minerals and stones, detailed review and analysis of the stone can be done remotely without ever seeing the real thing. This strategy can be a game-changer for shoppers, who would not have to go to the store.
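The abstract only states that data files are converted into an isosurface mesh and imported into Unity as FBX or OBJ; the exact tooling is not given. As one plausible route, the following sketch runs marching cubes (via scikit-image, an assumed dependency) over a scalar field and writes a Wavefront OBJ that a game engine could load.

```python
# Hedged sketch: scalar volume -> isosurface mesh -> OBJ file for a game engine.
import numpy as np
from skimage import measure  # assumed dependency, not named in the paper

def scalar_field_to_obj(volume, iso_level, path):
    """Extract an isosurface with marching cubes and write it as a Wavefront OBJ."""
    verts, faces, _normals, _values = measure.marching_cubes(volume, level=iso_level)
    with open(path, "w") as f:
        for v in verts:
            f.write(f"v {v[0]:.6f} {v[1]:.6f} {v[2]:.6f}\n")
        for tri in faces:
            # OBJ vertex indices are 1-based.
            f.write(f"f {tri[0] + 1} {tri[1] + 1} {tri[2] + 1}\n")

if __name__ == "__main__":
    # Synthetic example: a spherical field standing in for geological data.
    x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
    field = x**2 + y**2 + z**2
    scalar_field_to_obj(field, iso_level=0.5, path="isosurface.obj")
```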
Jensen, Katrine; Bjerrum, Flemming; Hansen, Henrik Jessen; Petersen, René Horsleben; Pedersen, Jesper Holst; Konge, Lars
2015-10-01
The aims of this study were to develop virtual reality simulation software for video-assisted thoracic surgery (VATS) lobectomy, to explore the opinions of thoracic surgeons concerning the VATS lobectomy simulator and to test the validity of the simulator metrics. Experienced VATS surgeons worked with computer specialists to develop a VATS lobectomy software for a virtual reality simulator. Thoracic surgeons with different degrees of experience in VATS were enrolled at the 22nd meeting of the European Society of Thoracic Surgeons (ESTS) held in Copenhagen in June 2014. The surgeons were divided according to the number of performed VATS lobectomies: novices (0 VATS lobectomies), intermediates (1-49 VATS lobectomies) and experienced (>50 VATS lobectomies). The participants all performed a lobectomy of a right upper lobe on the simulator and answered a questionnaire regarding content validity. Metrics were compared between the three groups. We succeeded in developing the first version of a virtual reality VATS lobectomy simulator. A total of 103 thoracic surgeons completed the simulated lobectomy and were distributed as follows: novices n = 32, intermediates n = 45 and experienced n = 26. All groups rated the overall user realism of the VATS lobectomy scenario to a median of 5 on a scale 1-7, with 7 being the best score. The experienced surgeons found the graphics and movements realistic and rated the scenario high in terms of usefulness as a training tool for novice and intermediate experienced thoracic surgeons, but not very useful as a training tool for experienced surgeons. The metric scores were not statistically significant between groups. This is the first study to describe a commercially available virtual reality simulator for a VATS lobectomy. More than 100 thoracic surgeons found the simulator realistic, and hence it showed good content validity. However, none of the built-in simulator metrics could significantly distinguish between novice, intermediate experienced and experienced surgeons, and further development of the simulator software is necessary to develop valid metrics. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
2014-01-01
Background Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirements for efficacious cycling; specifically, recruitment of both extremities and exercising at a high intensity. Methods In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called the Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. Results The modular mechatronic kit for exercise bicycles was tested in bench and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider’s lower extremities. Conclusions The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system was successful in demonstrating that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders. PMID:24902780
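The paper reports bench calibration of the sensorized handles and pedals but does not spell out the procedure. A common approach, sketched below with made-up readings, is a least-squares fit of known applied loads against raw sensor output; the fitted gain and offset then convert live readings to engineering units.

```python
# Hedged sketch: linear calibration of a force sensor from bench-test data.
import numpy as np

def fit_linear_calibration(raw, reference):
    """Return (gain, offset) such that reference ~= gain * raw + offset."""
    gain, offset = np.polyfit(np.asarray(raw, float), np.asarray(reference, float), deg=1)
    return gain, offset

def apply_calibration(raw, gain, offset):
    """Convert a raw reading (e.g. ADC counts) into calibrated units."""
    return gain * np.asarray(raw, float) + offset

if __name__ == "__main__":
    # Hypothetical bench test: ADC counts vs. applied pedal force in newtons.
    adc_counts = [102, 215, 330, 447, 561]
    applied_n = [0.0, 50.0, 100.0, 150.0, 200.0]
    gain, offset = fit_linear_calibration(adc_counts, applied_n)
    print(f"force ~= {gain:.3f} * counts + {offset:.1f} N")
    print(apply_calibration(400, gain, offset))
```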
VR-Planets : a 3D immersive application for real-time flythrough images of planetary surfaces
NASA Astrophysics Data System (ADS)
Civet, François; Le Mouélic, Stéphane
2015-04-01
During the last two decades, a fleet of planetary probes has acquired several hundred gigabytes of images of planetary surfaces. Mars has been particularly well covered thanks to the Mars Global Surveyor, Mars Express and Mars Reconnaissance Orbiter spacecraft. The HRSC, CTX and HiRISE instruments allowed the computation of Digital Elevation Models with a resolution from hundreds of meters up to 1 meter per pixel, and corresponding orthoimages with a resolution from a few hundred meters up to 25 centimeters per pixel. The integration of such huge data sets into a system allowing user-friendly manipulation, either for scientific investigation or for public outreach, can represent a real challenge. We are investigating how innovative tools can be used to freely fly over reconstructed landscapes in real time, using technologies derived from the game industry and virtual reality. We have developed an application based on a game engine, using planetary data, to immerse users in real Martian landscapes. The user can freely navigate in each scene at full spatial resolution using a game controller. The current rendering is compatible with several visualization devices such as 3D active screens, virtual reality headsets (Oculus Rift), and Android devices.
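The abstract does not detail how the DEMs and orthoimages are turned into navigable scenes; a generic version of that step is sketched below: a regular grid of vertices at DEM heights, UV coordinates for draping the orthoimage, and two triangles per grid cell. The resolution, exaggeration factor, and function names are illustrative assumptions.

```python
# Hedged sketch: build a draped terrain mesh from a DEM height array.
import numpy as np

def dem_to_mesh(dem, pixel_size_m, z_exaggeration=1.0):
    """Return (vertices, uvs, triangles) for a DEM given as a 2D height array."""
    rows, cols = dem.shape
    jj, ii = np.meshgrid(np.arange(cols), np.arange(rows))
    verts = np.stack([jj * pixel_size_m,
                      ii * pixel_size_m,
                      dem * z_exaggeration], axis=-1).reshape(-1, 3)
    # UVs for draping the orthoimage over the terrain.
    uvs = np.stack([jj / (cols - 1), ii / (rows - 1)], axis=-1).reshape(-1, 2)

    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            a = r * cols + c
            b, d, e = a + 1, a + cols, a + cols + 1
            tris += [(a, d, b), (b, d, e)]   # two triangles per grid cell
    return verts, uvs, np.array(tris)

if __name__ == "__main__":
    dem = np.random.rand(4, 5) * 10.0        # toy 4x5 height field, metres
    v, uv, t = dem_to_mesh(dem, pixel_size_m=25.0)
    print(v.shape, uv.shape, t.shape)        # (20, 3) (20, 2) (24, 3)
```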
An experimental study on CHVE's performance evaluation.
Paiva, Paulo V F; Machado, Liliane S; Oliveira, Jauvane C
2012-01-01
Virtual reality-based training simulators, with collaborative capabilities, are known to improve the way users interact with one another while learning or improving skills on a given medical procedure. Performance evaluation of Collaborative Haptic Virtual Environments (CHVE) allows us to understand how such systems can work in the Internet, as well as the requirements for multisensorial and real-time data. This work discloses new performance evaluation results for the collaborative module of the CyberMed VR framework.
Virtual reality hardware for use in interactive 3D data fusion and visualization
NASA Astrophysics Data System (ADS)
Gourley, Christopher S.; Abidi, Mongi A.
1997-09-01
Virtual reality has become a tool for use in many areas of research. We have designed and built a VR system for use in range data fusion and visualization. One major VR tool is the CAVE. This is the ultimate visualization tool, but comes with a large price tag. Our design uses a unique CAVE whose graphics are powered by a desktop computer instead of a larger rack machine making it much less costly. The system consists of a screen eight feet tall by twenty-seven feet wide giving a variable field-of-view currently set at 160 degrees. A silicon graphics Indigo2 MaxImpact with the impact channel option is used for display. This gives the capability to drive three projectors at a resolution of 640 by 480 for use in displaying the virtual environment and one 640 by 480 display for a user control interface. This machine is also the first desktop package which has built-in hardware texture mapping. This feature allows us to quickly fuse the range and intensity data and other multi-sensory data. The final goal is a complete 3D texture mapped model of the environment. A dataglove, magnetic tracker, and spaceball are to be used for manipulation of the data and navigation through the virtual environment. This system gives several users the ability to interactively create 3D models from multiple range images.
Modeling of luminance distribution in CAVE-type virtual reality systems
NASA Astrophysics Data System (ADS)
Meironke, Michał; Mazikowski, Adam
2017-08-01
At present, among the most advanced virtual reality systems are CAVE-type (Cave Automatic Virtual Environment) installations. Such systems usually consist of four, five or six projection screens, which in the case of six screens are arranged in the form of a cube. Providing the user with a high level of immersion in such systems depends largely on the optical properties of the system. The modeling of physical phenomena nowadays plays a huge role in most fields of science and technology. It allows the operation of a device to be simulated without any changes to the physical construction. In this paper, the distribution of luminance in CAVE-type virtual reality systems was modelled. Calculations were performed for a model of a 6-walled CAVE-type installation, based on the Immersive 3D Visualization Laboratory situated at the Faculty of Electronics, Telecommunications and Informatics at the Gdańsk University of Technology. Tests have been carried out for two different scattering distributions of the screen material in order to check how these characteristics influence the luminance distribution of the whole CAVE. The basic assumptions and simplifications of the modelled CAVE-type installation and the results are presented. A brief discussion of the results and the usefulness of the developed model is also given.
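The abstract gives no equations, so the sketch below is only a toy stand-in for such a luminance model: it treats the projector as a point source of uniform intensity and the screen as an ideal Lambertian diffuser, so wall illuminance falls off with the cube of the cosine of the off-axis angle and the luminance seen by the user is L = rho * E / pi. All numbers and names are invented and do not reflect the paper's actual scattering measurements.

```python
# Hedged sketch: toy luminance map of one CAVE wall lit by a point projector.
import numpy as np

def wall_luminance(flux_lm, throw_m, half_width_m, rho=0.8, n=101):
    """Luminance map (cd/m^2) of a square wall lit by a point projector on its normal."""
    intensity = flux_lm / (4.0 * np.pi)                  # cd, crude isotropic assumption
    xs = np.linspace(-half_width_m, half_width_m, n)
    xx, yy = np.meshgrid(xs, xs)
    cos_theta = throw_m / np.sqrt(xx**2 + yy**2 + throw_m**2)
    illuminance = intensity * cos_theta**3 / throw_m**2  # lux, cos^3 falloff on a flat wall
    return rho * illuminance / np.pi                     # ideal Lambertian screen

if __name__ == "__main__":
    L = wall_luminance(flux_lm=5000.0, throw_m=3.0, half_width_m=1.7)
    centre = L[L.shape[0] // 2, L.shape[1] // 2]
    print(f"centre {centre:.1f} cd/m^2, corner {L[0, 0]:.1f} cd/m^2")
```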
Milella, Ferdinando; Pinto, Carlo; Cant, Iain; White, Mark; Meyer, Georg
2018-01-01
Objective and subjective measures of performance in virtual reality environments increase as more sensory cues are delivered and as simulation fidelity increases. Some cues (colour or sound) are easier to present than others (object weight, vestibular cues) so that substitute cues can be used to enhance informational content in a simulation at the expense of simulation fidelity. This study evaluates how substituting cues in one modality by alternative cues in another modality affects subjective and objective performance measures in a highly immersive virtual reality environment. Participants performed a wheel change in a virtual reality (VR) environment. Auditory, haptic and visual cues, signalling critical events in the simulation, were manipulated in a factorial design. Subjective ratings were recorded via questionnaires. The time taken to complete the task was used as an objective performance measure. The results show that participants performed best and felt an increased sense of immersion and involvement, collectively referred to as ‘presence’, when substitute multimodal sensory feedback was provided. Significant main effects of audio and tactile cues on task performance and on participants' subjective ratings were found. A significant negative relationship was found between the objective (overall completion times) and subjective (ratings of presence) performance measures. We conclude that increasing informational content, even if it disrupts fidelity, enhances performance and user’s overall experience. On this basis we advocate the use of substitute cues in VR environments as an efficient method to enhance performance and user experience. PMID:29390023
Virtual reality for stroke rehabilitation.
Laver, Kate E; Lange, Belinda; George, Stacey; Deutsch, Judith E; Saposnik, Gustavo; Crotty, Maria
2017-11-20
Virtual reality and interactive video gaming have emerged as recent treatment approaches in stroke rehabilitation, with commercial gaming consoles in particular being rapidly adopted in clinical settings. This is an update of a Cochrane Review published first in 2011 and then again in 2015. Primary objective: to determine the efficacy of virtual reality compared with an alternative intervention or no intervention on upper limb function and activity. Secondary objectives: to determine the efficacy of virtual reality compared with an alternative intervention or no intervention on: gait and balance, global motor function, cognitive function, activity limitation, participation restriction, quality of life, and adverse events. We searched the Cochrane Stroke Group Trials Register (April 2017), CENTRAL, MEDLINE, Embase, and seven additional databases. We also searched trials registries and reference lists. Randomised and quasi-randomised trials of virtual reality ("an advanced form of human-computer interface that allows the user to 'interact' with and become 'immersed' in a computer-generated environment in a naturalistic fashion") in adults after stroke. The primary outcome of interest was upper limb function and activity. Secondary outcomes included gait and balance and global motor function. Two review authors independently selected trials based on pre-defined inclusion criteria, extracted data, and assessed risk of bias. A third review author moderated disagreements when required. The review authors contacted investigators to obtain missing information. We included 72 trials that involved 2470 participants. This review includes 35 new studies in addition to the studies included in the previous version of this review. Study sample sizes were generally small and interventions varied in terms of both the goals of treatment and the virtual reality devices used. The risk of bias present in many studies was unclear due to poor reporting. Thus, while there are a large number of randomised controlled trials, the evidence remains mostly low quality when rated using the GRADE system. Control groups usually received no intervention or therapy based on a standard-care approach. Results were not statistically significant for upper limb function (standardised mean difference (SMD) 0.07, 95% confidence interval (CI) -0.05 to 0.20, 22 studies, 1038 participants, low-quality evidence) when comparing virtual reality to conventional therapy. However, when virtual reality was used in addition to usual care (providing a higher dose of therapy for those in the intervention group) there was a statistically significant difference between groups (SMD 0.49, 0.21 to 0.77, 10 studies, 210 participants, low-quality evidence). When compared to conventional therapy approaches, there were no statistically significant effects for gait speed or balance. Results were statistically significant for the activities of daily living (ADL) outcome (SMD 0.25, 95% CI 0.06 to 0.43, 10 studies, 466 participants, moderate-quality evidence); however, we were unable to pool results for cognitive function, participation restriction, or quality of life. Twenty-three studies reported that they monitored for adverse events; across these studies there were few adverse events and those reported were relatively mild. We found evidence that the use of virtual reality and interactive video gaming was not more beneficial than conventional therapy approaches in improving upper limb function.
Virtual reality may be beneficial in improving upper limb function and activities of daily living function when used as an adjunct to usual care (to increase overall therapy time). There was insufficient evidence to reach conclusions about the effect of virtual reality and interactive video gaming on gait speed, balance, participation, or quality of life. This review found that time since onset of stroke, severity of impairment, and the type of device (commercial or customised) were not strong influencers of outcome. There was a trend suggesting that higher dose (more than 15 hours of total intervention) was preferable as were customised virtual reality programs; however, these findings were not statistically significant.
Novel interactive virtual showcase based on 3D multitouch technology
NASA Astrophysics Data System (ADS)
Yang, Tao; Liu, Yue; Lu, You; Wang, Yongtian
2009-11-01
A new interactive virtual showcase is proposed in this paper. With the help of virtual reality technology, the user of the proposed system can watch virtual objects floating in the air from all four sides and interact with them by touching the four surfaces of the virtual showcase. Unlike traditional multitouch systems, this system can not only realize multi-touch on a plane to implement 2D translation, 2D scaling, and 2D rotation of the objects; it can also realize 3D interaction with the virtual objects by recognizing and analyzing the multi-touch input that is simultaneously captured from the four planes. Experimental results show the potential of the proposed system to be applied in the exhibition of historical relics and other precious goods.
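The paper's gesture recogniser is not described; the usual way a two-finger touch pair is converted into the 2D translation, scaling and rotation mentioned above is sketched here, with illustrative coordinates and function names.

```python
# Hedged sketch: derive translate / scale / rotate from two touch points over one frame.
import math

def two_finger_transform(prev, curr):
    """prev, curr: pairs of (x, y) touch points. Returns (dx, dy, scale, dtheta_rad)."""
    (p1, p2), (c1, c2) = prev, curr
    prev_mid = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
    curr_mid = ((c1[0] + c2[0]) / 2, (c1[1] + c2[1]) / 2)
    dx, dy = curr_mid[0] - prev_mid[0], curr_mid[1] - prev_mid[1]

    prev_vec = (p2[0] - p1[0], p2[1] - p1[1])
    curr_vec = (c2[0] - c1[0], c2[1] - c1[1])
    scale = math.hypot(*curr_vec) / math.hypot(*prev_vec)          # pinch zoom
    dtheta = math.atan2(curr_vec[1], curr_vec[0]) - math.atan2(prev_vec[1], prev_vec[0])
    return dx, dy, scale, dtheta

if __name__ == "__main__":
    prev = ((100, 100), (200, 100))
    curr = ((110, 110), (110, 210))   # moved, same spread, rotated 90 degrees
    print(two_finger_transform(prev, curr))
```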
The expert surgical assistant. An intelligent virtual environment with multimodal input.
Billinghurst, M; Savage, J; Oppenheimer, P; Edmond, C
1996-01-01
Virtual Reality has made computer interfaces more intuitive but not more intelligent. This paper shows how an expert system can be coupled with multimodal input in a virtual environment to provide an intelligent simulation tool or surgical assistant. This is accomplished in three steps. First, voice and gestural input is interpreted and represented in a common semantic form. Second, a rule-based expert system is used to infer context and user actions from this semantic representation. Finally, the inferred user actions are matched against steps in a surgical procedure to monitor the user's progress and provide automatic feedback. In addition, the system can respond immediately to multimodal commands for navigational assistance and/or identification of critical anatomical structures. To show how these methods are used we present a prototype sinus surgery interface. The approach described here may easily be extended to a wide variety of medical and non-medical training applications by making simple changes to the expert system database and virtual environment models. Successful implementation of an expert system in both simulated and real surgery has enormous potential for the surgeon both in training and clinical practice.
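The rules, semantic representation and surgical procedure used in the prototype are not reproduced in the abstract. The toy sketch below only illustrates the described pattern: spoken commands and pointing gestures fused into a common semantic frame, then matched against an ordered list of procedure steps to track progress and generate feedback. The procedure steps, frames, and messages are invented.

```python
# Hedged sketch: multimodal input fused into a frame and matched against procedure steps.
PROCEDURE = [
    {"action": "identify", "object": "middle turbinate"},
    {"action": "inject",   "object": "lateral nasal wall"},
    {"action": "incise",   "object": "uncinate process"},
]

def semantic_frame(utterance, pointed_object):
    """Fuse a spoken command and the currently pointed-at object into one frame."""
    verb = utterance.strip().lower().split()[0]
    return {"action": verb, "object": pointed_object}

def advance(step_index, frame):
    """Return (new_step_index, feedback) for one fused input frame."""
    if step_index >= len(PROCEDURE):
        return step_index, "Procedure already complete."
    expected = PROCEDURE[step_index]
    if frame == expected:
        return step_index + 1, f"Step {step_index + 1} done: {expected['action']} {expected['object']}."
    return step_index, (f"Expected to {expected['action']} the {expected['object']}, "
                        f"but observed {frame['action']} on {frame['object']}.")

if __name__ == "__main__":
    state = 0
    state, msg = advance(state, semantic_frame("identify this structure", "middle turbinate"))
    print(msg)
    state, msg = advance(state, semantic_frame("incise here", "uncinate process"))
    print(msg)   # out-of-order action triggers corrective feedback
```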
Virtual Reality and Its Potential Application in Education and Training.
ERIC Educational Resources Information Center
Milheim, William D.
1995-01-01
An overview is provided of current trends in virtual reality research and development, including discussion of hardware, types of virtual reality, and potential problems with virtual reality. Implications for education and training are explored. (Author/JKP)
3D multiplayer virtual pets game using Google Card Board
NASA Astrophysics Data System (ADS)
Herumurti, Darlis; Riskahadi, Dimas; Kuswardayan, Imam
2017-08-01
Virtual Reality (VR) is a technology which allows the user to interact with a virtual environment generated and simulated by computer. This technology can make users feel the sensation of actually being in the virtual environment. VR provides a real view of the virtual environment rather than a view on a screen, but it requires an additional device to show that view, known as a Head Mounted Device (HMD). The Oculus Rift and Microsoft HoloLens are among the most famous HMD devices used in VR. In 2014, Google Cardboard was introduced at the Google I/O developers conference; it is a VR platform which allows users to enjoy VR in a simple and cheap way. In this research, we explore Google Cardboard to develop a simulation game of raising a pet. Google Cardboard is used to create the view of the VR environment, while the view and control in the VR environment are built using the Unity game engine. The simulation process is designed using a Finite State Machine (FSM), which helps structure the process clearly so that the simulation of raising a pet is described well. Raising a pet is a fun activity, but many conditions can make it difficult, e.g., environmental conditions, disease, and high cost. This research therefore aims to explore and implement Google Cardboard in a simulation of raising a pet.
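The abstract names a Finite State Machine as the core of the simulation but does not publish its states. A minimal sketch of how such a pet FSM might be expressed, with hypothetical states and events:

```python
# Hypothetical states and transitions; the paper's actual FSM is not published here.
TRANSITIONS = {
    ("idle",     "feed"):  "eating",
    ("idle",     "play"):  "playing",
    ("eating",   "done"):  "idle",
    ("playing",  "done"):  "idle",
    ("idle",     "tired"): "sleeping",
    ("sleeping", "wake"):  "idle",
}

class VirtualPet:
    """Minimal finite state machine driving a virtual pet's behaviour."""
    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

pet = VirtualPet()
for event in ["feed", "done", "tired", "wake", "play"]:
    print(event, "->", pet.handle(event))
```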
MOOs for Teaching and Learning.
ERIC Educational Resources Information Center
Furst-Bowe, Julie
1996-01-01
Discusses the use of MOOs (Multi-User Dimension/Dungeon Object Oriented), text-based virtual reality environments, in education. Highlights include connecting to a network; exploring several MOOs to determine which is most appropriate; and familiarizing students with the MOO's interaction and behavior policies, as well as how to operate in the…
Rapid prototyping 3D virtual world interfaces within a virtual factory environment
NASA Technical Reports Server (NTRS)
Kosta, Charles Paul; Krolak, Patrick D.
1993-01-01
On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.
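The VEOS protocol layer itself is not reproduced here, but the three-entity split the authors describe (a display/user-event entity, a factory simulator, and an interface expert) can be illustrated with a toy publish/subscribe bus. Topic names and payloads below are hypothetical.

```python
from collections import defaultdict

class MessageBus:
    """Toy publish/subscribe bus standing in for the VEOS messaging layer."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers[topic]:
            handler(payload)

bus = MessageBus()

# Interface-expert entity: maps raw user input to factory commands.
bus.subscribe("user_input", lambda e: bus.publish(
    "factory_cmd", {"command": "move_crane", "target": e["object"]}))
# Factory entity: simulates (or forwards to) the real factory.
bus.subscribe("factory_cmd", lambda c: bus.publish(
    "world_update", {"object": c["target"], "state": "moving"}))
# Display entity: renders the 3D virtual world.
bus.subscribe("world_update", lambda u: print("render:", u))

bus.publish("user_input", {"object": "log_block_7"})
```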
A Virtual Reality-Based Simulation of Abdominal Surgery
1994-06-30
SHORT TITLE: A Virtual Reality-Based Simulation of Abdominal Surgery. REPORTING PERIOD: October 31, 1993 - June 30, 1994. TECHNICAL REPORT SUMMARY (recoverable fragments): Virtual Reality is a marriage between... applications of this technology. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and...
Virtual Reality Cue Refusal Video Game for Alcohol and Cigarette Recovery Support: Summative Study.
Metcalf, Mary; Rossie, Karen; Stokes, Katie; Tallman, Christina; Tanner, Bradley
2018-04-16
New technologies such as virtual reality, augmented reality, and video games hold promise to support and enhance individuals in addiction treatment and recovery. Quitting or decreasing cigarette or alcohol use can lead to significant health improvements for individuals, decreasing heart disease risk and cancer risks (for both nicotine and alcohol use), among others. However, remaining in recovery from use is a significant challenge for most individuals. We developed and assessed the Take Control game, a partially immersive Kinect for Windows platform game that allows users to counter substance cues through active movements (hitting, kicking, etc). Formative analysis during phase I and phase II guided development. We conducted a small wait-list control trial using a quasi-random sampling technique (systematic) with 61 participants in recovery from addiction to alcohol or tobacco. Participants used the game 3 times and reported on substance use, cravings, satisfaction with the game experience, self-efficacy related to recovery, and side effects from exposure to a virtual reality intervention and substance cues. Participants found the game engaging and fun and felt playing the game would support recovery efforts. On average, reported substance use decreased for participants during the intervention period. Participants in recovery for alcohol use saw more benefit than those in recovery for tobacco use, with a statistically significant increase in self-efficacy, attitude, and behavior during the intervention. Side effects from the use of a virtual reality intervention were minor and decreased over time; cravings and side effects also decreased during the study. The preliminary results suggest the intervention holds promise as an adjunct to standard treatment for those in recovery, particularly from alcohol use. ©Mary Metcalf, Karen Rossie, Katie Stokes, Christina Tallman, Bradley Tanner. Originally published in JMIR Serious Games (http://games.jmir.org), 16.04.2018.
Virtual community centre for power wheelchair training: Experience of children and clinicians.
Torkia, Caryne; Ryan, Stephen E; Reid, Denise; Boissy, Patrick; Lemay, Martin; Routhier, François; Contardo, Resi; Woodhouse, Janet; Archambault, Phillipe S
2017-11-02
To: 1) characterize the overall experience in using the McGill immersive wheelchair - community centre (miWe-CC) simulator; and 2) investigate the experience of presence (i.e., sense of being in the virtual rather than in the real, physical environment) while driving a power wheelchair (PW) in the miWe-CC. A qualitative research design with structured interviews was used. Fifteen clinicians and 11 children were interviewed after driving a PW in the miWe-CC simulator. Data were analyzed using the conventional and directed content analysis approaches. Overall, participants enjoyed using the simulator and experienced a sense of presence in the virtual space. They felt a sense of being in the virtual environment, involved and focused on driving the virtual PW rather than on the surroundings of the actual room where they were. Participants reported several similarities between the virtual community centre layout and activities of the miWe-CC and the day-to-day reality of paediatric PW users. The simulator replicated participants' expectations of real-life PW use and promises to have an effect on improving the driving skills of new PW users. Implications for rehabilitation: Among young users, the McGill immersive wheelchair (miWe) simulator provides an experience of presence within the virtual environment. This experience of presence is generated by a sense of being in the virtual scene, a sense of being involved, engaged, and focused on interacting within the virtual environment, and by the perception that the virtual environment is consistent with the real world. The miWe is a relevant and accessible approach, complementary to real world power wheelchair training for young users.
Virtual Reality as Innovative Approach to the Interior Designing
NASA Astrophysics Data System (ADS)
Kaleja, Pavol; Kozlovská, Mária
2017-06-01
We can observe significant potential of information and communication technologies (ICT) in the interior design field, thanks to the development of software and hardware virtual reality tools. Using ICT tools offers a realistic perception of a proposal at its initial stage (the study). Real-time visualization tools, supported by hardware such as the Oculus Rift and HTC Vive, provide free walkthrough and movement in a virtual interior with the possibility of virtual designing. As ICT software tools for designing in virtual reality improve, still more realistic virtual environments can be achieved. This contribution presents a proposal for an innovative approach to interior design in virtual reality, using the latest software and hardware virtual reality technologies.
The virtues of virtual reality in exposure therapy.
Gega, Lina
2017-04-01
Virtual reality can be more effective and less burdensome than real-life exposure. Optimal virtual reality delivery should incorporate in situ direct dialogues with a therapist, discourage safety behaviours, allow for a mismatch between virtual and real exposure tasks, and encourage self-directed real-life practice between and beyond virtual reality sessions. © The Royal College of Psychiatrists 2017.
Virtual Reality in the Classroom.
ERIC Educational Resources Information Center
Pantelidis, Veronica S.
1993-01-01
Considers the concept of virtual reality; reviews its history; describes general uses of virtual reality, including entertainment, medicine, and design applications; discusses classroom uses of virtual reality, including a software program called Virtus WalkThrough for use with a computer monitor; and suggests future possibilities. (34 references)…
Freeman, K M; Thompson, S F; Allely, E B; Sobel, A L; Stansfield, S A; Pugh, W M
2001-01-01
Rapid and effective medical intervention in response to civil and military-related disasters is crucial for saving lives and limiting long-term disability. Inexperienced providers may suffer in performance when faced with limited supplies and the demands of stabilizing casualties not generally encountered in the comparatively resource-rich hospital setting. Head trauma and multiple injury cases are particularly complex to diagnose and treat, requiring the integration and processing of complex multimodal data. In this project, collaborators adapted and merged existing technologies to produce a flexible, modular patient simulation system with both three-dimensional virtual reality and two-dimensional flat screen user interfaces for teaching cognitive assessment and treatment skills. This experiential, problem-based training approach engages the user in a stress-filled, high fidelity world, providing multiple learning opportunities within a compressed period of time and without risk. The system simulates both the dynamic state of the patient and the results of user intervention, enabling trainees to watch the virtual patient deteriorate or stabilize as a result of their decision-making speed and accuracy. Systems can be deployed to the field enabling trainees to practice repeatedly until their skills are mastered and to maintain those skills once acquired. This paper describes the technologies and the process used to develop the trainers, the clinical algorithms, and the incorporation of teaching points. We also characterize aspects of the actual simulation exercise through the lens of the trainee.
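The abstract describes a virtual patient whose state deteriorates or stabilises depending on the trainee's decision-making speed. A toy sketch of that kind of state loop, with purely illustrative vital-sign numbers and a single hypothetical intervention:

```python
def simulate_patient(intervention_minute, minutes=30):
    """Toy model of a deteriorating virtual trauma patient.

    Systolic blood pressure falls each minute until an intervention
    (e.g. haemorrhage control) is applied; all values are illustrative only.
    """
    systolic = 110.0
    for minute in range(minutes):
        if minute >= intervention_minute:
            systolic = min(110.0, systolic + 3.0)    # recovery after treatment
        else:
            systolic -= 4.0                          # untreated deterioration
        if systolic < 60.0:
            return f"patient lost consciousness at minute {minute}"
    return f"patient stabilised, systolic {systolic:.0f} mmHg"

print(simulate_patient(intervention_minute=5))     # fast decision
print(simulate_patient(intervention_minute=20))    # slow decision
```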
Ascending and Descending in Virtual Reality: Simple and Safe System Using Passive Haptics.
Nagao, Ryohei; Matsumoto, Keigo; Narumi, Takuji; Tanikawa, Tomohiro; Hirose, Michitaka
2018-04-01
This paper presents a novel interactive system that provides users with virtual reality (VR) experiences, wherein users feel as if they are ascending/descending stairs through passive haptic feedback. The passive haptic stimuli are provided by small bumps under the feet of users; these stimuli are provided to represent the edges of the stairs in the virtual environment. The visual stimuli of the stairs and shoes, provided by head-mounted displays, evoke a visuo-haptic interaction that modifies a user's perception of the floor shape. Our system enables users to experience all types of stairs, such as half-turn and spiral stairs, in a VR setting. We conducted a preliminary user study and two experiments to evaluate the proposed technique. The preliminary user study investigated the effectiveness of the basic idea associated with the proposed technique for the case of a user ascending stairs. The results demonstrated that the passive haptic feedback produced by the small bumps enhanced the user's feeling of presence and sense of ascending. We subsequently performed an experiment to investigate an improved viewpoint manipulation method and the interaction of the manipulation and haptics for both the ascending and descending cases. The experimental results demonstrated that the participants had a feeling of presence and felt a steep stair gradient under the condition of haptic feedback and viewpoint manipulation based on the characteristics of actual stair walking data. However, these results also indicated that the proposed system may not be as effective in providing a sense of descending stairs without an optimization of the haptic stimuli. We then redesigned the shape of the small bumps, and evaluated the design in a second experiment. The results indicated that the best shape to present haptic stimuli is a right triangle cross section in both the ascending and descending cases. Although it is necessary to install small protrusions in the determined direction, by using this optimized shape the user's feeling of presence on the stairs and the sensation of walking up and down were enhanced.
Borrel, Alexandre; Fourches, Denis
2017-12-01
There is a growing interest in the broad use of Augmented Reality (AR) and Virtual Reality (VR) in the fields of bioinformatics and cheminformatics to visualize complex biological and chemical structures. AR and VR technologies allow for stunning and immersive experiences, offering untapped opportunities for both research and education purposes. However, preparing 3D models ready to use for AR and VR is time-consuming and requires technical expertise that severely limits the development of new contents of potential interest for structural biologists, medicinal chemists, molecular modellers and teachers. Herein we present the RealityConvert software tool and associated website, which allow users to easily convert molecular objects to high quality 3D models directly compatible with AR and VR applications. For chemical structures, in addition to the 3D model generation, RealityConvert also generates image trackers, useful to universally call and anchor that particular 3D model when used in AR applications. The ultimate goal of RealityConvert is to facilitate and boost the development and accessibility of AR and VR contents for bioinformatics and cheminformatics applications. Availability and implementation: http://www.realityconvert.com. Contact: dfourch@ncsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved.
Training for percutaneous renal access on a virtual reality simulator.
Zhang, Yi; Yu, Cheng-fan; Liu, Jin-shun; Wang, Gang; Zhu, He; Na, Yan-qun
2013-01-01
The need to develop new methods of surgical training combined with advances in computing has led to the development of virtual reality surgical simulators. The PERC Mentor(TM) is designed to train the user in percutaneous renal collecting system access puncture. This study aimed to validate the use of this kind of simulator in percutaneous renal access training. Twenty-one urologists were enrolled as trainees to learn a fluoroscopy-guided percutaneous renal access technique. An assigned percutaneous renal access procedure was performed on the PERC Mentor(TM) immediately after watching an instruction video and an analog operation. Objective parameters were recorded by the simulator and subjective global rating scale (GRS) scores were determined. Simulation training followed and consisted of 2-hour daily training sessions for 2 consecutive days. Twenty-four hours after the training session, trainees were evaluated performing the same procedure. The post-training evaluation was compared to the evaluation of the initial attempt. During the initial attempt, none of the trainees could complete the appointed procedure due to the lack of experience in fluoroscopy-guided percutaneous renal access. After the short-term training, all trainees were able to independently complete the procedure. Of the 21 trainees, 10 had prior experience in ultrasound-guided percutaneous nephrolithotomy. Trainees were thus categorized into prior-experience and no-experience groups. The total operating time and amount of contrast material used were significantly lower in the prior-experience group than in the no-experience group (P = 0.03 and 0.02, respectively). Training on the virtual reality simulator, PERC Mentor(TM), can help trainees with no previous experience of fluoroscopy-guided percutaneous renal access to complete the virtual manipulation of the procedure independently. This virtual reality simulator may become an important training and evaluation tool in teaching fluoroscopy-guided percutaneous renal access.
Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays.
Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Cooper, Emily A; Wetzstein, Gordon
2017-02-28
From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one.
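At the heart of such a display is simple optics: the tunable lens is driven toward the dioptric distance of the fixated object, optionally offset by the user's spherical refractive error. The sketch below shows that arithmetic only; the prototypes' actual calibration, fixed optics and gaze-estimation pipeline are not modelled here.

```python
def required_lens_power(fixation_distance_m, refractive_error_d=0.0):
    """Focal power (in diopters) a tunable lens would target so that the
    display's focus matches where the user is looking.

    fixation_distance_m : distance of the fixated virtual object, in metres.
    refractive_error_d  : user's spherical correction (e.g. -2.0 for a myope).

    Illustrative arithmetic only; a real near-eye display also accounts for
    the headset's fixed optics and eye relief.
    """
    accommodation_demand = 1.0 / fixation_distance_m    # diopters of vergence
    return accommodation_demand + refractive_error_d

for d in (0.5, 1.0, 4.0):
    print(f"{d} m fixation -> {required_lens_power(d, refractive_error_d=-2.0):+.2f} D")
```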
Guided exploration in virtual environments
NASA Astrophysics Data System (ADS)
Beckhaus, Steffi; Eckel, Gerhard; Strothotte, Thomas
2001-06-01
We describe an application supporting alternating interaction and animation for the purpose of exploration in a surround-screen projection-based virtual reality system. The exploration of an environment is a highly interactive and dynamic process in which the presentation of objects of interest can give the user guidance while exploring the scene. Previous systems for automatic presentation of models or scenes need either cinematographic rules, direct human interaction, framesets or precalculation (e.g. precalculation of paths to a predefined goal). We report on the development of a system that can deal with rapidly changing user interest in objects of a scene or model as well as with dynamic models and changes of the camera position introduced interactively by the user. It is implemented as a potential-field based camera data generating system. In this paper we describe the implementation of our approach in a virtual art museum on the CyberStage, our surround-screen projection-based stereoscopic display. The paradigm of guided exploration is introduced describing the freedom of the user to explore the museum autonomously. At the same time, if requested by the user, guided exploration provides just-in-time navigational support. The user controls this support by specifying the current field of interest in high-level search criteria. We also present an informal user study evaluating this approach.
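The paper describes the guidance engine as a potential-field based camera data generator. The toy 2D sketch below conveys the idea, with objects of interest acting as attractors and obstacles as repulsors while the camera follows the field; the paper's actual field definition and 3D implementation are not reproduced here.

```python
import math

ATTRACTORS = [(5.0, 5.0)]       # objects of current user interest (x, y)
REPULSORS  = [(3.0, 3.0)]       # e.g. walls or exhibits to keep distance from

def gradient(pos):
    """Direction of a toy potential field: attractive wells plus repulsive hills."""
    gx = gy = 0.0
    for ax, ay in ATTRACTORS:
        gx += ax - pos[0]
        gy += ay - pos[1]
    for rx, ry in REPULSORS:
        d2 = (pos[0] - rx) ** 2 + (pos[1] - ry) ** 2 + 1e-6
        gx += (pos[0] - rx) / d2
        gy += (pos[1] - ry) / d2
    return gx, gy

pos, step = (0.0, 0.0), 0.1
for _ in range(80):                        # camera path toward the exhibit
    gx, gy = gradient(pos)
    norm = math.hypot(gx, gy) or 1.0
    pos = (pos[0] + step * gx / norm, pos[1] + step * gy / norm)
print(f"camera ends near ({pos[0]:.2f}, {pos[1]:.2f})")
```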
Development of a virtual reality training system for endoscope-assisted submandibular gland removal.
Miki, Takehiro; Iwai, Toshinori; Kotani, Kazunori; Dang, Jianwu; Sawada, Hideyuki; Miyake, Minoru
2016-11-01
Endoscope-assisted surgery has been widely adopted as a basic surgical procedure, with various training systems using virtual reality developed for this procedure. In the present study, a basic virtual reality training system for the removal of submandibular glands under endoscope assistance was developed, and its efficacy was verified in novice oral surgeons. The virtual reality training system was developed using existing haptic devices, and virtual reality models were constructed from computed tomography data to ensure anatomical accuracy. The developed training system included models of the submandibular gland and of the surrounding connective tissues and blood vessels entering the submandibular gland. Cutting or abrasion of the connective tissue and manipulations such as elevation of blood vessels were reproduced by the virtual reality system. A training program using the developed system was devised, and novice oral surgeons were trained in accordance with it. Our virtual reality training system for endoscope-assisted removal of the submandibular gland is effective in training novice oral surgeons in endoscope-assisted surgery. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
An innovative virtual reality training tool for orthognathic surgery.
Pulijala, Y; Ma, M; Pears, M; Peebles, D; Ayoub, A
2018-02-01
Virtual reality (VR) surgery using Oculus Rift and Leap Motion devices is a multi-sensory, holistic surgical training experience. A multimedia combination including 360° videos, three-dimensional interaction, and stereoscopic videos in VR has been developed to enable trainees to experience a realistic surgery environment. The innovation allows trainees to interact with the individual components of the maxillofacial anatomy and apply surgical instruments while watching close-up stereoscopic three-dimensional videos of the surgery. In this study, a novel training tool for Le Fort I osteotomy based on immersive virtual reality (iVR) was developed and validated. Seven consultant oral and maxillofacial surgeons evaluated the application for face and content validity. Using a structured assessment process, the surgeons commented on the content of the developed training tool, its realism and usability, and the applicability of VR surgery for orthognathic surgical training. The results confirmed the clinical applicability of VR for delivering training in orthognathic surgery. Modifications were suggested to improve the user experience and interactions with the surgical instruments. This training tool is ready for testing with surgical trainees. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Harrison, C S; Grant, P M; Conway, B A
2010-01-01
The increasing importance of inclusive design and in particular accessibility guidelines established in the U.K. 1996 Disability Discrimination Act (DDA) has been a prime motivation for the work on wheelchair access, a subset of the DDA guidelines, described in this article. The development of these guidelines mirrors the long-standing provisions developed in the U.S. In order to raise awareness of these guidelines and in particular to give architects, building designers, and users a physical sensation of how a planned development could be experienced, a wheelchair virtual reality system was developed. This compares with conventional methods of measuring against drawings and comparing dimensions against building regulations, established in the U.K. under British standards. Features of this approach include the marriage of an electromechanical force-feedback system with high-quality immersive graphics as well as the potential ability to generate a physiological rating of buildings that do not yet exist. The provision of this sense of "feel" augments immersion within the virtual reality environment and also provides the basis from which both qualitative and quantitative measures of a building's access performance can be gained.
Virtual Reality: Toward Fundamental Improvements in Simulation-Based Training.
ERIC Educational Resources Information Center
Thurman, Richard A.; Mattoon, Joseph S.
1994-01-01
Considers the role and effectiveness of virtual reality in simulation-based training. The theoretical and practical implications of verity, integration, and natural versus artificial interface are discussed; a three-dimensional classification scheme for virtual reality is described; and the relationship between virtual reality and other…
Virtual Reality in Schools: The Ultimate Educational Technology.
ERIC Educational Resources Information Center
Reid, Robert D.; Sykes, Wylmarie
1999-01-01
Discusses the use of virtual reality as an educational tool. Highlights include examples of virtual reality in public schools that lead to a more active learning process, simulated environments, integrating virtual reality into any curriculum, benefits to teachers and students, and overcoming barriers to implementation. (LRW)
Video capture virtual reality as a flexible and effective rehabilitation tool
Weiss, Patrice L; Rand, Debbie; Katz, Noomi; Kizony, Rachel
2004-01-01
Video capture virtual reality (VR) uses a video camera and software to track movement in a single plane without the need to place markers on specific bodily locations. The user's image is thereby embedded within a simulated environment such that it is possible to interact with animated graphics in a completely natural manner. Although this technology first became available more than 25 years ago, it is only within the past five years that it has been applied in rehabilitation. The objective of this article is to describe the way this technology works, to review its assets relative to other VR platforms, and to provide an overview of some of the major studies that have evaluated the use of video capture technologies for rehabilitation. PMID:15679949
Survey on multisensory feedback virtual reality dental training systems.
Wang, D; Li, T; Zhang, Y; Hou, J
2016-11-01
Compared with traditional dental training methods, virtual reality training systems integrated with multisensory feedback possess potential advantages. However, there are still many technical challenges in developing a satisfactory simulator. In this manuscript, we systematically survey several current dental training systems to identify the gaps between the capabilities of these systems and clinical training requirements. After briefly summarising the components, functions and unique features of each system, we discuss the technical challenges behind them, including the software, hardware and user evaluation methods. Finally, the clinical requirements of an ideal dental training system are proposed. Future research and development areas are identified based on an analysis of the gaps between current systems and clinical training requirements. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Virtual reality simulation for construction safety promotion.
Zhao, Dong; Lucas, Jason
2015-01-01
Safety is a critical issue for the construction industry. The literature argues that human error contributes to more than half of occupational incidents and could be directly addressed by effective training programs. This paper reviews the current safety training status in the US construction industry. Results from the review highlight the gap between this status and industry expectations on safety. To narrow this gap, this paper demonstrates the development and utilisation of a training program based on virtual reality (VR) simulation. The VR-based safety training program can offer a safe working environment where users can effectively rehearse tasks with electrical hazards and ultimately promote their abilities for electrical hazard cognition and intervention. Its visualisation and simulation can also remove the training barriers caused by the invisible and dangerous nature of electricity.
The need for virtual reality simulators in dental education: A review.
Roy, Elby; Bakr, Mahmoud M; George, Roy
2017-04-01
Virtual reality simulators are becoming an essential part of modern education. The benefits of virtual reality in dentistry are constantly being assessed, as a method or an adjunct to improve fine motor skills and hand-eye coordination in pre-clinical settings and to overcome the monetary and intellectual challenges involved with such training. This article, while providing an overview of virtual reality dental simulators, also looks at the link between virtual reality simulation and current pedagogical knowledge.
Rodrigues-Baroni, Juliana M; Nascimento, Lucas R; Ada, Louise; Teixeira-Salmela, Luci F
2014-01-01
To systematically review the available evidence on the efficacy of walking training associated with virtual reality-based training in patients with stroke. The specific questions were: Is walking training associated with virtual reality-based training effective in increasing walking speed after stroke? Is this type of intervention more effective in increasing walking speed than non-virtual reality-based walking interventions? A systematic review with meta-analysis of randomized clinical trials was conducted. Participants were adults with chronic stroke, and the experimental intervention was walking training associated with virtual reality-based training to increase walking speed. The outcome data regarding walking speed were extracted from the eligible trials and were combined using a meta-analysis approach. Seven trials representing eight comparisons were included in this systematic review. Overall, the virtual reality-based training increased walking speed by 0.17 m/s (95% CI 0.08 to 0.26), compared with placebo/nothing or non-walking interventions. In addition, the virtual reality-based training increased walking speed by 0.15 m/s (95% CI 0.05 to 0.24), compared with non-virtual reality walking interventions. This review provided evidence that walking training associated with virtual reality-based training was effective in increasing walking speed after stroke and produced better results than non-virtual reality interventions.
Gibby, Jacob T; Swenson, Samuel A; Cvetko, Steve; Rao, Raj; Javan, Ramin
2018-06-22
Augmented reality has potential to enhance surgical navigation and visualization. We determined whether head-mounted display augmented reality (HMD-AR) with superimposed computed tomography (CT) data could allow the wearer to percutaneously guide pedicle screw placement in an opaque lumbar model with no real-time fluoroscopic guidance. CT imaging was obtained of a phantom composed of L1-L3 Sawbones vertebrae in opaque silicone. Preprocedural planning was performed by creating virtual trajectories of appropriate angle and depth for ideal approach into the pedicle, and these data were integrated into the Microsoft HoloLens using the Novarad OpenSight application, allowing the user to view the virtual trajectory guides and CT images superimposed on the phantom in two and three dimensions. Spinal needles were inserted following the virtual trajectories to the point of contact with bone. Repeat CT revealed the actual needle trajectories, allowing comparison with the ideal preprocedural paths. Registration of AR to the phantom showed a roughly circular deviation with maximum average radius of 2.5 mm. Users took an average of 200 s to place a needle. Extrapolation of needle trajectory into the pedicle showed that of 36 needles placed, 35 (97%) would have remained within the pedicles. Placed needles lay a mean distance of 4.69 mm in the mediolateral direction and 4.48 mm in the craniocaudal direction from the pedicle bone edge. To our knowledge, this is the first peer-reviewed report and evaluation of HMD-AR with superimposed 3D guidance utilizing CT for spinal pedicle guide placement for the purpose of cannulation without the use of fluoroscopy.
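The reported mediolateral and craniocaudal deviations come from comparing each actual needle trajectory, extrapolated to pedicle depth, against the planned virtual trajectory. A minimal sketch of that geometry with hypothetical coordinates (not the study's measurements):

```python
import numpy as np

def point_at_depth(entry, direction, depth):
    """Point reached along a (unit-normalised) trajectory at a given depth."""
    d = np.asarray(direction, dtype=float)
    return np.asarray(entry, dtype=float) + depth * d / np.linalg.norm(d)

# Hypothetical planned vs. actual trajectories (mm); not the study's data.
planned_tip = point_at_depth(entry=[0.0, 0.0, 0.0], direction=[0.0, 0.2, 1.0], depth=45.0)
actual_tip  = point_at_depth(entry=[1.0, 0.5, 0.0], direction=[0.05, 0.25, 1.0], depth=45.0)

deviation = actual_tip - planned_tip
print(f"mediolateral deviation: {abs(deviation[0]):.2f} mm")
print(f"craniocaudal deviation: {abs(deviation[1]):.2f} mm")
print(f"total in-plane deviation: {np.linalg.norm(deviation[:2]):.2f} mm")
```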
A collaborative molecular modeling environment using a virtual tunneling service.
Lee, Jun; Kim, Jee-In; Kang, Lin-Woo
2012-01-01
Collaborative research on three-dimensional molecular modeling can be limited by different time zones and locations. A networked virtual environment can be utilized to overcome the problems caused by these temporal and spatial differences. However, traditional approaches did not sufficiently consider the integration of different computing environments, which are characterized by types of applications, roles of users, and so on. We propose a collaborative molecular modeling environment that integrates different molecular modeling systems using a virtual tunneling service. We integrated Co-Coot, a collaborative crystallographic object-oriented toolkit, with VRMMS, a virtual reality molecular modeling system, through a collaborative tunneling system. The proposed system showed reliable quantitative and qualitative results in pilot experiments.
Therapists' perception of benefits and costs of using virtual reality treatments.
Segal, Robert; Bhatia, Maneet; Drapeau, Martin
2011-01-01
Research indicates that virtual reality is effective in the treatment of many psychological difficulties and is being used more frequently. However, little is known about therapists' perception of the benefits and costs related to the use of virtual therapy in treatment delivery. In the present study, 271 therapists completed an online questionnaire that assessed their perceptions about the potential benefits and costs of using virtual reality in psychotherapy. Results indicated that therapists perceived the potential benefits as outweighing the potential costs. Therapists' self-reported knowledge of virtual reality, theoretical orientation, and interest in using virtual reality were found to be associated with perceptual measures. These findings contribute to the current knowledge of the perception of virtual reality amongst psychotherapists.
ERIC Educational Resources Information Center
Taçgin, Zeynep; Arslan, Ahmet
2017-01-01
The purpose of this study is to determine perception of postgraduate Computer Education and Instructional Technologies (CEIT) students regarding the concepts of Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), Augmented Virtuality (AV) and Mirror Reality; and to offer a table that includes differences and similarities between…
Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing.
Invitto, Sara; Faggiano, Chiara; Sammarco, Silvia; De Luca, Valerio; De Paolis, Lucio T
2016-03-18
In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user's hand and fingers, which are reproduced on a computer screen by the proper software applications. For our experiment, we employed a sample of 10 subjects matched by age and sex and chosen among university students. The subjects took part in motor imagery training and an immersive affordance condition (a virtual training with Leap Motion and a haptic training with real objects). After each training session the subjects performed a recognition task, in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training. During the Leap Motion session, latencies increased in the occipital lobes, which handle visual sensory processing; in contrast, latencies decreased in the frontal lobe, where the brain is mainly activated for attention and action planning.
Applying Augmented Reality in practical classes for engineering students
NASA Astrophysics Data System (ADS)
Bazarov, S. E.; Kholodilin, I. Yu; Nesterov, A. S.; Sokhina, A. V.
2017-10-01
In this article an Augmented Reality application for teaching engineering students of electrical and technological specialties is introduced. In order to increase students' motivation for learning and their independence, new practical guidelines based on Augmented Reality were developed for use in practical classes. During the application development, the authors used software such as Unity 3D and Vuforia. The Augmented Reality content consists of 3D models, images and animations, which are superimposed on real objects, helping students to study specific tasks. A user who has a smartphone, a tablet PC, or Augmented Reality glasses can visualize on-screen virtual objects added to a real environment. To analyse the current situation in higher education, including learners' interest in studying, their satisfaction with the educational process, and the impact of the Augmented Reality application on students, a questionnaire was developed and offered to students; the study involved 24 learners.
Interactive Immersive Virtual Museum: Digital Documentation for Virtual Interaction
NASA Astrophysics Data System (ADS)
Clini, P.; Ruggeri, L.; Angeloni, R.; Sasso, M.
2018-05-01
Thanks to their playful and educational approach, Virtual Museum systems are very effective for the communication of Cultural Heritage. Among the latest technologies, Immersive Virtual Reality is probably the most appealing and potentially effective to serve this purpose; nevertheless, due to poor user-system interaction, caused by an incomplete maturity of specific technology for museum applications, it is still quite uncommon to find immersive installations in museums. This paper explores the possibilities offered by this technology and presents a workflow that, starting from digital documentation, makes possible interaction with archaeological finds or any other cultural heritage inside different kinds of immersive virtual reality spaces. Two case studies are presented: the National Archaeological Museum of Marche in Ancona and the 3D reconstruction of the Roman Forum of Fanum Fortunae. The two approaches differ not only conceptually but also in content: while the Archaeological Museum is represented in the application simply using spherical panoramas to give the perception of the third dimension, the Roman Forum is a 3D model that allows visitors to move in the virtual space as in the real one. In both cases, the acquisition phase of the artefacts is central; artefacts are digitized with the photogrammetric Structure from Motion technique and then integrated inside the immersive virtual space using a PC with an HTC Vive system that allows the user to interact with the 3D models, turning the manipulation of objects into a fun and exciting experience. The challenge, taking advantage of the latest opportunities made available by photogrammetry and ICT, is to enrich visitors' experience in the real museum by making possible interaction with perishable, damaged or lost objects and public access to inaccessible or no longer existing places, promoting in this way the preservation of fragile sites.
NASA Technical Reports Server (NTRS)
Hale, Joseph P.
1994-01-01
A virtual reality (VR) Applications Program has been under development at MSFC since 1989. Its objectives are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training, and science training. A variety of activities are under way within many of these areas. One ongoing macro-ergonomic application of VR relates to the design of the Space Station Freedom Payload Control Area (PCA), the control room from which onboard payload operations are managed. Several preliminary conceptual PCA layouts have been developed and modeled in VR. Various managers and potential end users have virtually 'entered' these rooms and provided valuable feedback. Before VR can be used with confidence in a particular application, it must be validated, or calibrated, for that class of applications. Two associated validation studies for macro-ergonomic applications are under way to help characterize possible distortions or filtering of relevant perceptions in a virtual world. In both studies, existing control rooms and their virtual counterparts will be empirically compared using distance and heading estimations to objects and subjective assessments. Approaches and findings of the PCA activities and details of the studies are presented.
Orientation and metacognition in virtual space.
Tenbrink, Thora; Salwiczek, Lucie H
2016-05-01
Cognitive scientists increasingly use virtual reality scenarios to address spatial perception, orientation, and navigation. If based on desktops rather than mobile immersive environments, this involves a discrepancy between the physically experienced static position and the visually perceived dynamic scene, leading to cognitive challenges that users of virtual worlds may or may not be aware of. The frequently reported loss of orientation and worse performance in point-to-origin tasks relate to the difficulty of establishing a consistent reference system on an allocentric or egocentric basis. We address the verbalizability of spatial concepts relevant in this regard, along with the conscious strategies reported by participants. Behavioral and verbal data were collected using a perceptually sparse virtual tunnel scenario that has frequently been used to differentiate between humans' preferred reference systems. Surprisingly, the linguistic data we collected relate to reference system verbalizations known from the earlier literature only to a limited extent, but instead reveal complex cognitive mechanisms and strategies. Orientation in desktop virtual reality appears to pose considerable challenges, which participants react to by conceptualizing the task in individual ways that do not systematically relate to the generic concepts of egocentric and allocentric reference frames. (c) 2016 APA, all rights reserved.
Vourvopoulos, Athanasios; Bermúdez I Badia, Sergi
2016-08-09
The use of Brain-Computer Interface (BCI) technology in neurorehabilitation provides new strategies to overcome stroke-related motor limitations. Recent studies demonstrated the brain's capacity for functional and structural plasticity through BCI. However, it is not fully clear how we can take full advantage of the neurobiological mechanisms underlying recovery and how to maximize restoration through BCI. In this study we investigate the role of multimodal virtual reality (VR) simulations and motor priming (MP) in an upper limb motor-imagery BCI task in order to maximize the engagement of sensory-motor networks in a broad range of patients who can benefit from virtual rehabilitation training. In order to investigate how different BCI paradigms impact brain activation, we designed 3 experimental conditions in a within-subject design, including an immersive Multimodal Virtual Reality with Motor Priming (VRMP) condition where users had to perform motor-execution before BCI training, an immersive Multimodal VR condition, and a control condition with standard 2D feedback. Further, these were also compared to overt motor-execution. Finally, a set of questionnaires were used to gather subjective data on Workload, Kinesthetic Imagery and Presence. Our findings show increased capacity to modulate and enhance brain activity patterns in all extracted EEG rhythms matching more closely those present during motor-execution and also a strong relationship between electrophysiological data and subjective experience. Our data suggest that both VR and particularly MP can enhance the activation of brain patterns present during overt motor-execution. Further, we show changes in the interhemispheric EEG balance, which might play an important role in the promotion of neural activation and neuroplastic changes in stroke patients in a motor-imagery neurofeedback paradigm. In addition, electrophysiological correlates of psychophysiological responses provide us with valuable information about the motor and affective state of the user that has the potential to be used to predict MI-BCI training outcome based on user's profile. Finally, we propose a BCI paradigm in VR, which gives the possibility of motor priming for patients with low level of motor control.
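The EEG rhythms referred to are typically quantified as band-power changes (event-related desynchronisation) relative to rest. The sketch below computes mu-band power with Welch's method on synthetic signals; it only illustrates the kind of measure involved, not the study's actual processing pipeline.

```python
import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, band=(8.0, 12.0)):
    """Average spectral power of one EEG channel within a frequency band.

    eeg : 1-D array of samples from a single channel (e.g. C3).
    fs  : sampling rate in Hz.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Synthetic example: a 10 Hz (mu) rhythm whose amplitude drops during imagery,
# mimicking event-related desynchronisation (ERD). Values are illustrative.
fs = 250
t = np.arange(0, 4, 1 / fs)
rest    = 2.0 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)
imagery = 0.8 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)

erd = 100.0 * (band_power(imagery, fs) - band_power(rest, fs)) / band_power(rest, fs)
print(f"mu-band ERD: {erd:.1f}%")   # negative value = desynchronisation
```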
From Vesalius to virtual reality: How embodied cognition facilitates the visualization of anatomy
NASA Astrophysics Data System (ADS)
Jang, Susan
This study examines the facilitative effects of embodiment of a complex internal anatomical structure through three-dimensional ("3-D") interactivity in a virtual reality ("VR") program. Since Shepard and Metzler's influential 1971 study, it has been known that 3-D objects (e.g., multiple-armed cube or external body parts) are visually and motorically embodied in our minds. For example, people take longer to rotate mentally an image of their hand not only when there is a greater degree of rotation, but also when the images are presented in a manner incompatible with their natural body movement (Parsons, 1987a, 1994; Cooper & Shepard, 1975; Sekiyama, 1983). Such findings confirm the notion that our mental images and rotations of those images are in fact confined by the laws of physics and biomechanics, because we perceive, think and reason in an embodied fashion. With the advancement of new technologies, virtual reality programs for medical education now enable users to interact directly in a 3-D environment with internal anatomical structures. Given that such structures are not readily viewable to users and thus not previously susceptible to embodiment, coupled with the VR environment also affording all possible degrees of rotation, how people learn from these programs raises new questions. If we embody external anatomical parts we can see, such as our hands and feet, can we embody internal anatomical parts we cannot see? Does manipulating the anatomical part in virtual space facilitate the user's embodiment of that structure and therefore the ability to visualize the structure mentally? Medical students grouped in yoked-pairs were tasked with mastering the spatial configuration of an internal anatomical structure; only one group was allowed to manipulate the images of this anatomical structure in a 3-D VR environment, whereas the other group could only view the manipulation. The manipulation group outperformed the visual group, suggesting that the interactivity that took place among the manipulation group promoted visual and motoric embodiment, which in turn enhanced learning. Moreover, when accounting for spatial ability, it was found that manipulation benefits students with low spatial ability more than students with high spatial ability.
Tools virtualization for command and control systems
NASA Astrophysics Data System (ADS)
Piszczek, Marek; Maciejewski, Marcin; Pomianek, Mateusz; Szustakowski, Mieczysław
2017-10-01
Information management is an inseparable part of the command process. As a result, the person making decisions at the command post interacts with data-providing devices in various ways. The tools virtualization process can introduce a number of significant modifications in the design of solutions for management and command. The general idea involves replacing physical devices' user interfaces with digital representations (so-called virtual instruments). A more advanced level of system "digitalization" is to use mixed reality environments. In solutions using augmented reality (AR), a customized HMI is displayed to the operator as he approaches each device. Devices are identified by image recognition of photo codes. Visualization is achieved by an (optical) see-through head-mounted display (HMD), and control can be done, for example, by means of a handheld touch panel. Using an immersive virtual environment, the command center can be digitally reconstructed; a workstation then requires only a VR system (HMD) and access to the information network. The operator can interact with devices just as he would in the real world (for example, with virtual hands). Thanks to their procedures (analysis of central vision, eye tracking), MR systems offer the further benefit of reduced requirements for system data throughput, because at any moment the rendering can focus on a single device. Experiments carried out using the Moverio BT-200 and SteamVR systems, and the results of experimental application testing, clearly indicate the ability to create a fully functional information system with the use of mixed reality technology.
Kinematic evaluation of virtual walking trajectories.
Cirio, Gabriel; Olivier, Anne-Hélène; Marchal, Maud; Pettré, Julien
2013-04-01
Virtual walking, a fundamental task in Virtual Reality (VR), is greatly influenced by the locomotion interface being used, by the specificities of input and output devices, and by the way the virtual environment is represented. No matter how virtual walking is controlled, the generation of realistic virtual trajectories is absolutely required for some applications, especially those dedicated to the study of walking behaviors in VR, navigation through virtual places for architecture, rehabilitation, and training. Previous studies focused on evaluating the realism of locomotion trajectories have mostly considered the result of the locomotion task (efficiency, accuracy) and its subjective perception (presence, cybersickness). Few have focused on the locomotion trajectory itself, and then only in situations of geometrically constrained tasks. In this paper, we study the realism of unconstrained trajectories produced during virtual walking by addressing the following question: did the user reach his destination by virtually walking along a trajectory he would have followed in similar real conditions? To this end, we propose a comprehensive evaluation framework consisting of a set of trajectographical criteria and a locomotion model to generate reference trajectories. We consider a simple locomotion task where users walk between two oriented points in space. The travel path is analyzed both geometrically and temporally in comparison to simulated reference trajectories. In addition, we demonstrate the framework in a user study which considered an initial set of common and frequent virtual walking conditions, namely different input devices, output display devices, control laws, and visualization modalities. The study provides insight into the relative contributions of each condition to the overall realism of the resulting virtual trajectories.
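One simple instance of a trajectographical criterion is the mean geometric deviation between the recorded path and a reference path after both are resampled by arc length. The sketch below illustrates this with hypothetical coordinates; the paper's full criteria set and locomotion model are not reproduced here.

```python
import numpy as np

def resample(path, n=100):
    """Resample a 2-D path (list of (x, y)) to n points, uniformly by arc length."""
    p = np.asarray(path, dtype=float)
    seg = np.linalg.norm(np.diff(p, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s_new = np.linspace(0.0, s[-1], n)
    return np.column_stack([np.interp(s_new, s, p[:, 0]),
                            np.interp(s_new, s, p[:, 1])])

def mean_deviation(user_path, reference_path):
    """Mean point-wise distance between two paths after arc-length resampling."""
    u, r = resample(user_path), resample(reference_path)
    return np.linalg.norm(u - r, axis=1).mean()

# Hypothetical walk between two oriented points; coordinates are illustrative.
reference = [(0, 0), (1, 0.2), (2, 0.8), (3, 1.8), (4, 3.0)]
recorded  = [(0, 0), (1, 0.1), (2, 0.5), (3, 1.5), (4, 3.0)]
print(f"mean geometric deviation: {mean_deviation(recorded, reference):.2f} m")
```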
Simulators and virtual reality in surgical education.
Chou, Betty; Handa, Victoria L
2006-06-01
This article explores the pros and cons of virtual reality simulators, their abilities to train and assess surgical skills, and their potential future applications. Computer-based virtual reality simulators and more conventional box trainers are compared and contrasted. The virtual reality simulator provides objective assessment of surgical skills and immediate feedback to further enhance training. With this ability to provide standardized, unbiased assessment of surgical skills, the virtual reality trainer has the potential to be a tool for selecting, instructing, certifying, and recertifying gynecologists.
Miura, Satoshi; Kobayashi, Yo; Kawamura, Kazuya; Seki, Masatoshi; Nakashima, Yasutaka; Noguchi, Takehiko; Kasuya, Masahiro; Yokoo, Yuki; Fujie, Masakatsu G
2012-01-01
Surgical robots have improved considerably in recent years, but intuitive operability, which represents user inter-operability, has not been quantitatively evaluated. Therefore, for design of a robot with intuitive operability, we propose a method to measure brain activity to determine intuitive operability. The objective of this paper is to determine the master configuration against the monitor that allows users to perceive the manipulator as part of their own body. We assume that the master configuration produces an immersive reality experience for the user of putting his own arm into the monitor. In our experiments, as subjects controlled the hand controller to position the tip of the virtual slave manipulator on a target in a surgical simulator, we measured brain activity through brain-imaging devices. We performed our experiments for a variety of master manipulator configurations with the monitor position fixed. For all test subjects, we found that brain activity was stimulated significantly when the master manipulator was located behind the monitor. We conclude that this master configuration produces immersive reality through the body image, which is related to visual and somatic sense feedback.
Programming for Fun: MUDs as a Context for Collaborative Learning.
ERIC Educational Resources Information Center
Bruckman, Amy
Multi-User Dungeons (MUDs) are text-based virtual reality environments in which participants separated by great physical distances can communicate and collaborate in programming. Most MUDs started out as adventure games but are quickly being adapted for more "serious" endeavors. This paper presents a case study of the experiences of a…
NASA Technical Reports Server (NTRS)
Leifer, Larry; Michalowski, Stefan; Vanderloos, Machiel
1991-01-01
The Stanford/VA Interactive Robotics Laboratory set out in 1978 to test the hypothesis that industrial robotics technology could be applied to serve the manipulation needs of severely impaired individuals. Five generations of hardware, three generations of system software, and over 125 experimental subjects later, we believe that genuine utility is achievable. The experience includes development of over 65 task applications using voiced command, joystick control, natural language command and 3D object designation technology. A brief foray into virtual environments, using flight simulator technology, was instructive. If reality and virtuality come for comparable prices, you cannot beat reality. A detailed review of assistive robot anatomy and the performance specifications needed to achieve cost/beneficial utility will be used to support discussion of the future of rehabilitation telerobotics. Poised on the threshold of commercial viability, but constrained by the high cost of technically adequate manipulators, this worthy application domain flounders temporarily. In the long run, it will be the user interface that governs utility.
Evaluating display fidelity and interaction fidelity in a virtual reality game.
McMahan, Ryan P; Bowman, Doug A; Zielinski, David J; Brady, Rachael B
2012-04-01
In recent years, consumers have witnessed a technological revolution that has delivered more-realistic experiences in their own homes through high-definition, stereoscopic televisions and natural, gesture-based video game consoles. Although these experiences are more realistic, offering higher levels of fidelity, it is not clear how the increased display and interaction aspects of fidelity impact the user experience. Since immersive virtual reality (VR) allows us to achieve very high levels of fidelity, we designed and conducted a study that used a six-sided CAVE to evaluate display fidelity and interaction fidelity independently, at extremely high and low levels, for a VR first-person shooter (FPS) game. Our goal was to gain a better understanding of the effects of fidelity on the user in a complex, performance-intensive context. The results of our study indicate that both display and interaction fidelity significantly affect strategy and performance, as well as subjective judgments of presence, engagement, and usability. In particular, performance results were strongly in favor of two conditions: low-display, low-interaction fidelity (representative of traditional FPS games) and high-display, high-interaction fidelity (similar to the real world).
Manipulation of volumetric patient data in a distributed virtual reality environment.
Dech, F; Ai, Z; Silverstein, J C
2001-01-01
Due to increases in network speed and bandwidth, distributed exploration of medical data in immersive Virtual Reality (VR) environments is becoming increasingly feasible. The volumetric display of radiological data in such environments presents a unique set of challenges. The sheer size and complexity of the datasets involved not only make them difficult to transmit to remote sites, but these datasets also require extensive user interaction in order to make them understandable to the investigator and manageable to the rendering hardware. A sophisticated VR user interface is required in order for the clinician to focus on the aspects of the data that will provide educational and/or diagnostic insight. We will describe a software system of data acquisition, data display, Tele-Immersion, and data manipulation that supports interactive, collaborative investigation of large radiological datasets. The hardware required in this strategy is still at the high-end of the graphics workstation market. Future software ports to Linux and NT, along with the rapid development of PC graphics cards, open the possibility for later work with Linux or NT PCs and PC clusters.
Will Anything Useful Come Out of Virtual Reality? Examination of a Naval Application
1993-05-01
The term virtual reality can encompass varying meanings, but some generally accepted attributes of a virtual environment are that it is immersive... technology, but at present there are few practical applications which are utilizing the broad range of virtual reality technology. This paper will discuss an... Subject terms: operability, operator functions, virtual reality, man-machine interface, decision aids/decision making, decision support, ASW.
Cognitive training on stroke patients via virtual reality-based serious games.
Gamito, Pedro; Oliveira, Jorge; Coelho, Carla; Morais, Diogo; Lopes, Paulo; Pacheco, José; Brito, Rodrigo; Soares, Fabio; Santos, Nuno; Barata, Ana Filipa
2017-02-01
Use of virtual reality environments in cognitive rehabilitation offers cost benefits and other advantages. In order to test the effectiveness of a virtual reality application for neuropsychological rehabilitation, a cognitive training program using virtual reality was applied to stroke patients. A virtual reality-based serious games application for cognitive training was developed, with attention and memory tasks consisting of daily life activities. Twenty stroke patients were randomly assigned to two conditions: exposure to the intervention, and waiting list control. The results showed significant improvements in attention and memory functions in the intervention group, but not in the controls. Overall findings provide further support for the use of VR cognitive training applications in neuropsychological rehabilitation. Implications for rehabilitation: improvements in memory and attention functions following a virtual reality-based serious games intervention; training of daily-life activities using a virtual reality application; accessibility of training contents.
The Potential of Using Virtual Reality Technology in Physical Activity Settings
ERIC Educational Resources Information Center
Pasco, Denis
2013-01-01
In recent years, virtual reality technology has been successfully used for learning purposes. The purposes of the article are to examine current research on the role of virtual reality in physical activity settings and discuss potential application of using virtual reality technology to enhance learning in physical education. The article starts…
Education about Hallucinations Using an Internet Virtual Reality System: A Qualitative Survey
ERIC Educational Resources Information Center
Yellowlees, Peter M.; Cook, James N.
2006-01-01
Objective: The authors evaluate an Internet virtual reality technology as an education tool about the hallucinations of psychosis. Method: This is a pilot project using Second Life, an Internet-based virtual reality system, in which a virtual reality environment was constructed to simulate the auditory and visual hallucinations of two patients…
Virtual reality measures in neuropsychological assessment: a meta-analytic review.
Neguț, Alexandra; Matu, Silviu-Andrei; Sava, Florin Alin; David, Daniel
2016-02-01
Virtual reality-based assessment is a new paradigm for neuropsychological evaluation that might provide a more ecological assessment than paper-and-pencil or computerized neuropsychological testing. Previous research has focused on the use of virtual reality in neuropsychological assessment, but no meta-analysis has focused on the sensitivity of virtual reality-based measures of cognitive processes across various populations. We found eighteen studies that compared cognitive performance between clinical groups and healthy controls on virtual reality measures. Based on a random effects model, the results indicated a large effect size in favor of healthy controls (g = .95). For executive functions, memory and visuospatial analysis, subgroup analysis revealed moderate to large effect sizes, with superior performance in the case of healthy controls. Participants' mean age, type of clinical condition, type of exploration within virtual reality environments, and the presence of distractors were significant moderators. Our findings support the sensitivity of virtual reality-based measures in detecting cognitive impairment. They highlight the possibility of using virtual reality measures for neuropsychological assessment in research applications, as well as in clinical practice.
Lee, Su-Hyun; Kim, Yu-Mi; Lee, Byoung-Hee
2015-07-01
[Purpose] This study investigated the therapeutic effects of virtual reality-based bilateral upper-extremity training on brain activity in patients with stroke. [Subjects and Methods] Eighteen chronic stroke patients were divided into two groups: the virtual reality-based bilateral upper-extremity training group (n = 10) and the bilateral upper-limb training group (n = 8). The virtual reality-based bilateral upper-extremity training group performed bilateral upper-extremity exercises in a virtual reality environment, while the bilateral upper-limb training group performed only bilateral upper-extremity exercise. All training was conducted 30 minutes per day, three times per week for six weeks, followed by brain activity evaluation. [Results] Electroencephalography showed significant increases in concentration in the frontopolar 2 and frontal 4 areas, and significant increases in brain activity in the frontopolar 1 and frontal 3 areas in the virtual reality-based bilateral upper-extremity training group. [Conclusion] Virtual reality-based bilateral upper-extremity training can improve the brain activity of stroke patients. Thus, virtual reality-based bilateral upper-extremity training is feasible and beneficial for improving brain activation in stroke patients.
Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis.
Bergeron, Mathieu; Lortie, Catherine L; Guitton, Matthieu J
2015-01-01
Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have been repeatedly suggested as possible tools to help the rehabilitation process, no systematic study has been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and protocols involving virtual reality settings for the treatment of this pathology. Evaluation of the symptoms is often not standardized. However, our results unveil a clear effect of virtual reality settings-based rehabilitation on the patients' symptoms, assessed by objective tools such as the DHI (mean decrease of 27 points), changing the perceived symptom handicap from moderate to mild impact on life. Furthermore, we detected a relationship between the duration of the exposure to virtual reality environments and the magnitude of the therapeutic effects, suggesting that virtual reality treatments should last at least 150 minutes of cumulated exposure to ensure positive outcomes. Virtual reality offers a pleasant and safe environment for the patient. Future studies should standardize evaluation tools, document putative side effects further, compare virtual reality to conventional physical therapy, and evaluate the economic costs/benefits of such strategies.
Development and user validation of driving tasks for a power wheelchair simulator.
Archambault, Philippe S; Blackburn, Émilie; Reid, Denise; Routhier, François; Miller, William C
2017-07-01
Mobility is important for participation in daily activities, and a power wheelchair (PW) can improve the quality of life of individuals with mobility impairments. A virtual reality simulator may be helpful in complementing PW skills training, which is generally seen as insufficient by both clinicians and PW users. To this end, specific, ecologically valid activities, such as entering an elevator and navigating through a shopping mall crowd, have been added to the McGill wheelchair (miWe) simulator through a user-centred approach. The objective of this study was to validate the choice of simulated activities in a group of newly trained PW users. We recruited 17 new PW users, who practiced with the miWe simulator at home for two weeks. They then related their experience through the Short Feedback Questionnaire, the perceived Ease of Use Questionnaire, and semi-structured interviews. Participants in general greatly appreciated their experience with the simulator. During the interviews, this group made similar comments about the activities as our previous group of expert PW users had done. They also insisted on the importance of realism in the miWe activities for their use in training. A PW simulator may be helpful if it supports the practice of activities in specific contexts (such as a bathroom or supermarket), to complement the basic skills training received in the clinic (such as driving forward, backward, turning, and avoiding obstacles). Implications for rehabilitation: new power wheelchair users appreciate practicing on a virtual reality simulator and find the experience useful when the simulated driving activities are realistic and ecologically valid; user-centred development can lead to simulated power wheelchair activities that adequately capture everyday driving challenges experienced in various environmental contexts.
Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.
Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J
2011-11-01
To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners.
A teleoperation training simulator with visual and kinesthetic force virtual reality
NASA Technical Reports Server (NTRS)
Kim, Won S.; Schenker, Paul
1992-01-01
A force-reflecting teleoperation training simulator with a high-fidelity real-time graphics display has been developed for operator training. A novel feature of this simulator is that it enables the operator to feel contact forces and torques through a force-reflecting controller during the execution of the simulated peg-in-hole task, providing the operator with the feel of visual and kinesthetic force virtual reality. A peg-in-hole task is used in our simulated teleoperation trainer as a generic teleoperation task. A quasi-static analysis of a two-dimensional peg-in-hole task model has been extended to a three-dimensional model analysis to compute contact forces and torques for a virtual realization of kinesthetic force feedback. The simulator allows the user to specify force reflection gains and stiffness (compliance) values of the manipulator hand for both the three translational and the three rotational axes in Cartesian space. Three viewing modes are provided for graphics display: single view, two split views, and stereoscopic view.
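To make the kinesthetic force-feedback idea concrete, below is a minimal one-axis sketch (my own illustration, not the paper's quasi-static peg-in-hole model) of a spring-like contact force scaled by a force-reflection gain before being sent to the hand controller; the stiffness and gain values are assumptions.

```python
def contact_force(penetration_m, stiffness_n_per_m=2000.0):
    """Quasi-static spring model: contact force opposes penetration into the hole wall."""
    return -stiffness_n_per_m * max(penetration_m, 0.0)

def reflected_force(simulated_force_n, reflection_gain=0.3):
    """Scale the simulated contact force before commanding the force-reflecting controller."""
    return reflection_gain * simulated_force_n

# Example: the virtual peg overlaps the hole wall by 2 mm.
f = contact_force(0.002)
print("simulated contact force [N]:", f)
print("force felt by the operator [N]:", reflected_force(f))
```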
A Context-Aware Method for Authentically Simulating Outdoors Shadows for Mobile Augmented Reality.
Barreira, Joao; Bessa, Maximino; Barbosa, Luis; Magalhaes, Luis
2018-03-01
Visual coherence between virtual and real objects is a major issue in creating convincing augmented reality (AR) applications. To achieve this seamless integration, actual light conditions must be determined in real time to ensure that virtual objects are correctly illuminated and cast consistent shadows. In this paper, we propose a novel method to estimate daylight illumination and use this information in outdoor AR applications to render virtual objects with coherent shadows. The illumination parameters are acquired in real time from context-aware live sensor data. The method works under unprepared natural conditions. We also present a novel and rapid implementation of a state-of-the-art skylight model, from which the illumination parameters are derived. The Sun's position is calculated based on the user location and time of day, with the relative rotational differences estimated from a gyroscope, compass and accelerometer. The results illustrated that our method can generate visually credible AR scenes with consistent shadows rendered from recovered illumination.
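A rough sketch (mine, not the authors' skylight implementation) of the sun-position step mentioned above, using standard textbook approximations for solar declination and elevation from latitude, day of year, and local solar time; the latitude and time in the example are arbitrary and the accuracy is only approximate.

```python
import math

def solar_elevation_deg(latitude_deg, day_of_year, solar_time_h):
    """Approximate solar elevation angle from common textbook formulas."""
    # Solar declination (degrees), simple cosine approximation.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: 15 degrees per hour away from solar noon.
    hour_angle = 15.0 * (solar_time_h - 12.0)
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    elevation = math.asin(math.sin(lat) * math.sin(dec) +
                          math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(elevation)

# Example: mid-June, 3 pm solar time, latitude 40 N (illustrative values).
print(round(solar_elevation_deg(40.0, 170, 15.0), 1), "degrees above the horizon")
```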
Augmented Reality to Preserve Hidden Vestiges in Historical Cities. a Case Study
NASA Astrophysics Data System (ADS)
Martínez, J. L.; Álvarez, S.; Finat, J.; Delgado, F. J.; Finat, J.
2015-02-01
Mobile devices provide increasingly sophisticated support for enhanced experiences and for understanding the remote past in an interactive way. The use of augmented reality technologies makes it possible to develop mobile applications for indoor exploration of virtually reconstructed archaeological places. In our work we have built a virtual reconstruction of a Roman villa from data arising from an urgent partial excavation, which was performed in order to build a car park in the historical city of Valladolid (Spain). In its current state, the archaeological site is covered by an urban garden. Localization and tracking are performed using a combination of GPS and the inertial sensors of the mobile device. In this work we show how to perform interactive navigation around the 3D virtual model, showing an interpretation of the way it was. The user experience is enhanced by answering simple questions and performing minor tasks and puzzles, which are presented with multimedia contents linked to key features of the archaeological site.
A New ER Fluid Based Haptic Actuator System for Virtual Reality
NASA Astrophysics Data System (ADS)
Böse, H.; Baumann, M.; Monkman, G. J.; Egersdörfer, S.; Tunayar, A.; Freimuth, H.; Ermert, H.; Khaled, W.
The concept and some steps in the development of a new actuator system which enables the haptic perception of mechanically inhomogeneous virtual objects are introduced. The system consists of a two-dimensional planar array of actuator elements containing an electrorheological (ER) fluid. When a user presses his fingers onto the surface of the actuator array, he perceives locally variable resistance forces generated by vertical pistons which slide in the ER fluid through the gaps between electrode pairs. The voltage in each actuator element can be individually controlled by a novel sophisticated switching technology based on optoelectric gallium arsenide elements. The haptic information which is represented at the actuator array can be transferred from a corresponding sensor system based on ultrasonic elastography. The combined sensor-actuator system may serve as a technology platform for various applications in virtual reality, like telemedicine where the information on the consistency of tissue of a real patient is detected by the sensor part and recorded by the actuator part at a remote location.
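A minimal sketch (my own illustration, not the authors' switching technology) of how a stiffness map from the elastography sensor side might be mapped linearly to per-element control voltages of such an actuator array; the array size, stiffness range, and voltage ceiling are assumptions.

```python
import numpy as np

MAX_VOLTAGE = 3000.0  # assumed ER-fluid control voltage ceiling (V)

def stiffness_to_voltages(stiffness_map_kpa, stiff_min=1.0, stiff_max=100.0):
    """Linearly map measured tissue stiffness (kPa) to element voltages."""
    norm = np.clip((stiffness_map_kpa - stiff_min) / (stiff_max - stiff_min), 0.0, 1.0)
    return norm * MAX_VOLTAGE

# Example: a 4x4 patch of soft tissue with a stiff inclusion in the middle.
patch = np.full((4, 4), 5.0)
patch[1:3, 1:3] = 80.0  # stiffer region, e.g. a nodule
print(stiffness_to_voltages(patch))
```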
A 3D virtual reality simulator for training of minimally invasive surgery.
Mi, Shao-Hua; Hou, Zeng-Guang; Yang, Fan; Xie, Xiao-Liang; Bian, Gui-Bin
2014-01-01
For the last decade, remarkable progress has been made in the field of cardiovascular disease treatment. However, these complex medical procedures require a combination of rich experience and technical skills. In this paper, a 3D virtual reality simulator for core skills training in minimally invasive surgery is presented. The system can generate realistic 3D vascular models segmented from patient datasets, including a beating heart, and provides real-time force computation and a force feedback module for surgical simulation. Instruments, such as a catheter or guide wire, are represented by a multi-body mass-spring model. In addition, a realistic user interface with multiple windows and real-time 3D views has been developed. Moreover, the simulator is also provided with a human-machine interaction module that gives doctors the sense of touch during surgery training and enables them to control the motion of a virtual catheter/guide wire inside a complex vascular model. Experimental results show that the simulator is suitable for minimally invasive surgery training.
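The multi-body mass-spring representation of the catheter/guide wire can be illustrated with a small sketch (assumed parameters, 2D, semi-implicit Euler integration; the actual simulator is certainly more elaborate):

```python
import numpy as np

N = 20            # number of point masses along the guide wire
rest_len = 0.005  # rest length between nodes (m), assumed
k = 200.0         # spring stiffness (N/m), assumed
mass = 0.001      # node mass (kg), assumed
damping = 0.05    # velocity damping coefficient, assumed
dt = 1e-3

pos = np.cumsum(np.full((N, 2), [rest_len, 0.0]), axis=0)  # initially straight wire
vel = np.zeros((N, 2))

def step(pos, vel):
    force = np.zeros_like(pos)
    for i in range(N - 1):
        d = pos[i + 1] - pos[i]
        length = np.linalg.norm(d)
        f = k * (length - rest_len) * d / length   # linear spring between neighbors
        force[i] += f
        force[i + 1] -= f
    force[:, 1] -= mass * 9.81      # gravity makes the free end sag
    force -= damping * vel
    vel = vel + dt * force / mass   # semi-implicit Euler: velocity first,
    pos = pos + dt * vel            # then position with the updated velocity
    pos[0] = [0.0, 0.0]             # proximal end held fixed (the operator's hand)
    vel[0] = [0.0, 0.0]
    return pos, vel

for _ in range(1000):               # simulate one second
    pos, vel = step(pos, vel)
print("tip position after 1 s:", pos[-1])
```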
A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality.
Kim, Mingyu; Jeon, Changyu; Kim, Jinmo
2017-05-17
This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality.
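As a rough illustration of the tactile feedback loop described above (not the authors' code), the sketch below sends a hypothetical vibration command to an Arduino-style wristband over a serial link whenever a tracked fingertip penetrates a virtual object; the port name, baud rate, command byte, and coordinates are all assumptions.

```python
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed serial port of the wristband controller
VIBRATE_CMD = b"V"      # hypothetical one-byte command: vibrate the fingertip motor

def fingertip_in_object(fingertip_xyz, obj_center_xyz, obj_radius):
    """Simple sphere test standing in for real collision detection."""
    dx, dy, dz = (f - c for f, c in zip(fingertip_xyz, obj_center_xyz))
    return dx * dx + dy * dy + dz * dz <= obj_radius * obj_radius

with serial.Serial(PORT, 9600, timeout=0.1) as link:
    # One frame of (assumed) hand-tracking data for the index fingertip, in metres.
    fingertip = (0.02, 0.10, 0.31)
    if fingertip_in_object(fingertip, obj_center_xyz=(0.0, 0.1, 0.3), obj_radius=0.05):
        link.write(VIBRATE_CMD)  # the wristband fires the vibration motor
```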
Investigation of tracking systems properties in CAVE-type virtual reality systems
NASA Astrophysics Data System (ADS)
Szymaniak, Magda; Mazikowski, Adam; Meironke, Michał
2017-08-01
In recent years, many scientific and industrial centers around the world have developed virtual reality systems or laboratories. One of the most advanced solutions is the Immersive 3D Visualization Lab (I3DVL), a CAVE-type (Cave Automatic Virtual Environment) laboratory. It contains two CAVE-type installations: a six-screen installation arranged in the form of a cube, and a four-screen installation, a simplified version of the former. The user's feeling of "immersion" and interaction with the virtual world depend on many factors, in particular on the accuracy of the system tracking the user. In this paper, properties of the tracking systems applied in I3DVL were investigated. Two parameters were selected for analysis: the accuracy of the tracking system and the range over which markers are detected by the tracking system in the space of the CAVE. Measurements of system accuracy were performed for the six-screen installation, equipped with four tracking cameras, along the three axes X, Y, and Z. Rotation around the Y axis was also analyzed. The measured tracking system shows good linear and rotational accuracy. The biggest issue was the range over which markers were monitored inside the CAVE: it turned out that the tracking system loses sight of the markers in the corners of the installation. For comparison, in the simplified version of the CAVE (the four-screen installation), equipped with eight tracking cameras, this problem did not occur. The obtained results will allow the quality of the CAVE to be improved.
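A small sketch (illustrative only, with made-up measurements) of how per-axis tracking accuracy of the kind reported above can be summarized as mean and RMS error against reference marker positions:

```python
import numpy as np

# Hypothetical measurements: tracked vs. reference marker positions (metres).
reference = np.array([[0.0, 1.5, 0.0], [1.0, 1.5, 0.0], [1.0, 1.5, 1.0]])
tracked   = np.array([[0.003, 1.498, -0.002], [1.004, 1.503, 0.001], [0.996, 1.497, 1.005]])

error = tracked - reference
for axis, name in enumerate("XYZ"):
    mean_err = error[:, axis].mean()
    rms_err = np.sqrt((error[:, axis] ** 2).mean())
    print(f"{name}: mean error {mean_err * 1000:+.1f} mm, RMS {rms_err * 1000:.1f} mm")
```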
Multithreaded hybrid feature tracking for markerless augmented reality.
Lee, Taehee; Höllerer, Tobias
2009-01-01
We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction.
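The frame-to-frame optical-flow component of such hybrid tracking can be sketched with OpenCV's pyramidal Lucas-Kanade tracker; this is a generic illustration rather than the authors' multithreaded system, and the camera index is assumed.

```python
import cv2

cap = cv2.VideoCapture(0)  # assumed camera index
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
# Detect distinctive corner features in the first frame.
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300, qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if pts is None or len(pts) == 0:
        # Re-detect features if all tracks were lost.
        pts = cv2.goodFeaturesToTrack(gray, maxCorners=300, qualityLevel=0.01, minDistance=7)
    else:
        # Pyramidal Lucas-Kanade optical flow from the previous frame to this one.
        new_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        pts = new_pts[status.flatten() == 1].reshape(-1, 1, 2)
        for x, y in pts.reshape(-1, 2):
            cv2.circle(frame, (int(x), int(y)), 3, (0, 255, 0), -1)
    cv2.imshow("tracked features", frame)
    prev_gray = gray
    if cv2.waitKey(1) == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```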
Design of an immersive simulator for assisted power wheelchair driving.
Devigne, Louise; Babel, Marie; Nouviale, Florian; Narayanan, Vishnu K; Pasteau, Francois; Gallien, Philippe
2017-07-01
Driving a power wheelchair is a difficult and complex visual-cognitive task. As a result, some people with visual and/or cognitive disabilities cannot access the benefits of a power wheelchair because their impairments prevent them from driving safely. In order to improve their access to mobility, we have previously designed a semi-autonomous assistive wheelchair system which progressively corrects the trajectory as the user manually drives the wheelchair and smoothly avoids obstacles. Developing and testing such systems for wheelchair driving assistance requires a significant amount of material resources and clinician time. With Virtual Reality technology, prototypes can be developed and tested in a risk-free and highly flexible Virtual Environment before equipping and testing a physical prototype. Additionally, users can "virtually" test and train more easily during the development process. In this paper, we introduce a power wheelchair driving simulator allowing the user to navigate with a standard wheelchair in an immersive 3D Virtual Environment. The simulation framework is designed to be flexible so that we can use different control inputs. In order to validate the framework, we first performed tests on the simulator with able-bodied participants during which the user's Quality of Experience (QoE) was assessed through a set of questionnaires. Results show that the simulator is a promising tool for future works as it generates a good sense of presence and requires rather low cognitive effort from users.
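The shared-control idea described above, in which the trajectory is progressively corrected as the user drives, can be sketched as a simple distance-weighted blend of the user's joystick command and an avoidance command; the weighting scheme and thresholds here are illustrative assumptions, not the authors' controller.

```python
import numpy as np

def assisted_command(user_cmd, obstacle_dir, obstacle_dist,
                     safe_dist=1.5, min_dist=0.3):
    """Blend the user's joystick command with an obstacle-avoidance command.

    user_cmd, obstacle_dir: 2D vectors in the wheelchair frame.
    The closer the obstacle, the more the avoidance term dominates.
    """
    avoid_cmd = -np.asarray(obstacle_dir, dtype=float)  # push away from the obstacle
    # Weight is 0 when the obstacle is far (>= safe_dist), 1 when very close (<= min_dist).
    w = np.clip((safe_dist - obstacle_dist) / (safe_dist - min_dist), 0.0, 1.0)
    return (1.0 - w) * np.asarray(user_cmd, dtype=float) + w * avoid_cmd

# Example: the user drives straight ahead; an obstacle sits front-right at 0.8 m.
print(assisted_command([1.0, 0.0], [0.9, -0.4], 0.8))
```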
Sun, Huey-Min; Li, Shang-Phone; Zhu, Yu-Qian; Hsiao, Bo
2015-09-01
Technological advances in human-computer interaction have attracted increasing research attention, especially in the field of virtual reality (VR). Prior research has focused on examining the effects of VR on various outcomes, for example, learning and health. However, which factors affect these final outcomes? That is, what kind of VR system design will achieve higher usability? This question remains largely unanswered. Furthermore, when we look at VR system deployment through a human-computer interaction (HCI) lens, does the user's attitude play a role in achieving the final outcome? This study aims to understand the effect of immersion and involvement, as well as users' regulatory focus, on usability for a somatosensory VR learning system. This study hypothesized that regulatory focus and presence can effectively enhance users' perceived usability. Survey data from 78 students in Taiwan indicated that promotion focus is positively related to users' perceived efficiency, whereas involvement and promotion focus are positively related to users' perceived effectiveness. Promotion focus also predicts user satisfaction and overall usability perception.
Lorenz, Mario; Brade, Jennifer; Diamond, Lisa; Sjölie, Daniel; Busch, Marc; Tscheligi, Manfred; Klimant, Philipp; Heyde, Christoph-E; Hammer, Niels
2018-04-23
Virtual Reality (VR) is used for a variety of applications ranging from entertainment to psychological medicine. VR has been demonstrated to influence higher order cognitive functions and cortical plasticity, with implications for phobia and stroke treatment. An integral part of successful VR is a high sense of presence - a feeling of 'being there' in the virtual scenario. The underlying cognitive and perceptive functions causing presence in VR scenarios are, however, not completely known. It is evident that brain function is influenced by drugs, such as ethanol, potentially confounding cortical plasticity, also in VR. As ethanol is ubiquitous and forms part of daily life, understanding the effects of ethanol on presence and user experience, the attitudes and emotions about using VR applications, is important. This exploratory study aims at contributing towards an understanding of how low-dose ethanol intake influences presence, user experience and their relationship in a validated VR context. It was found that low-level ethanol consumption did influence presence and user experience, but only minimally. In contrast, correlations between presence and user experience were strongly influenced by low-dose ethanol. Ethanol consumption may consequently alter cognitive and perceptive functions related to the connections between presence and user experience.
Virtual Realities and the Future of Text.
ERIC Educational Resources Information Center
Marcus, Stephen
1992-01-01
Discusses issues surrounding virtual reality and "virtual books." Suggests that those who are exploring the territory of virtual realities are already helping to expand and enrich expectations and visions for integrating technology into reading and writing. (RS)
Using virtual reality environment to facilitate training with advanced upper-limb prosthesis.
Resnik, Linda; Etter, Katherine; Klinger, Shana Lieberman; Kambe, Charles
2011-01-01
Technological advances in upper-limb prosthetic design offer dramatically increased possibilities for powered movement. The DEKA Arm system allows users 10 powered degrees of movement. Learning to control these movements by utilizing a set of motions that, in most instances, differ from those used to obtain the desired action prior to amputation is a challenge for users. In the Department of Veterans Affairs "Study to Optimize the DEKA Arm," we attempted to facilitate motor learning by using a virtual reality environment (VRE) program. This VRE program allows users to practice controlling an avatar using the controls designed to operate the DEKA Arm in the real world. In this article, we provide highlights from our experiences implementing VRE in training amputees to use the full DEKA Arm. This article discusses the use of VRE in amputee rehabilitation, describes the VRE system used with the DEKA Arm, describes VRE training, provides qualitative data from a case study of a subject, and provides recommendations for future research and implementation of VRE in amputee rehabilitation. Our experience has led us to believe that training with VRE is particularly valuable for upper-limb amputees who must master a large number of controls and for those amputees who need a structured learning environment because of cognitive deficits.
Use of virtual reality to promote hand therapy post-stroke
NASA Astrophysics Data System (ADS)
Tsoupikova, Daria; Stoykov, Nikolay; Vick, Randy; Li, Yu; Kamper, Derek; Listenberger, Molly
2013-03-01
A novel artistic virtual reality (VR) environment was developed and tested for use as a rehabilitation protocol for post-stroke hand rehabilitation therapy. The system was developed by an interdisciplinary team of engineers, art therapists, occupational therapists, and VR artists to improve patients' motivation and engagement. Specific exercises were developed to explicitly promote the practice of therapeutic tasks requiring hand and arm coordination for upper extremity rehabilitation. Here we describe system design, development, and user testing for efficiency, subject satisfaction, and clinical feasibility. We report results of the completed qualitative, pre-clinical pilot study of the system's effectiveness for therapy. Fourteen stroke survivors with chronic hemiparesis participated in a single training session within the environment to gauge user response to the protocol through a custom survey. Results indicate that users found the system comfortable and enjoyable, though tiring, found the instructions clear, and reported a high level of satisfaction with the VR environment and with the variety and difficulty of the rehabilitation tasks. Most patients reported very positive impressions of the VR environment and rated it highly, appreciating its engagement and motivation. We are currently conducting a longitudinal intervention study over 6 weeks in stroke survivors with chronic hemiparesis. Initial results from the first subjects demonstrate that the system is operational and can facilitate therapy for post-stroke patients with upper extremity impairment.
ERIC Educational Resources Information Center
Cheng, Yufang; Huang, Ruowen
2012-01-01
The focus of this study is using data glove to practice Joint attention skill in virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe environment for PDD people. Especially, when they made errors during practice in virtual reality environment, there is no suffering or…
Naval Applications of Virtual Reality,
1993-01-01
Expert Virtual Reality Special Report, pp. 67-72. Subject terms: man-machine interface, virtual reality, decision support, collective and individual performance. By Mark Gembicki and David Rousseau.
Virtual reality and interactive 3D as effective tools for medical training.
Webb, George; Norcliffe, Alex; Cannings, Peter; Sharkey, Paul; Roberts, Dave
2003-01-01
CAVE-like displays allow a user to walk into a virtual environment and use natural movement to change the viewpoint of virtual objects, which they can manipulate with a hand-held device. This maps well to many surgical procedures, offering strong potential for training and planning. These devices may be networked together, allowing geographically remote users to share the interactive experience, which addresses the strong need for distance training and planning among surgeons. Our paper shows how the properties of a CAVE-like facility can be maximised in order to provide an ideal environment for medical training. The implementation of a large 3D eye is described. The resulting application is an eye that can be manipulated and examined by trainee medics under the guidance of a medical expert. The progression and effects of different ailments can be illustrated and corrective procedures demonstrated.
Magical Stories: Blending Virtual Reality and Artificial Intelligence.
ERIC Educational Resources Information Center
McLellan, Hilary
Artificial intelligence (AI) techniques and virtual reality (VR) make possible powerful interactive stories, and this paper focuses on examples of virtual characters in three dimensional (3-D) worlds. Waldern, a virtual reality game designer, has theorized about and implemented software design of virtual teammates and opponents that incorporate AI…
ERIC Educational Resources Information Center
Franchi, Jorge
1994-01-01
Highlights of this overview of virtual reality include optics; interface devices; virtual worlds; potential applications, including medicine and archaeology; problems, including costs; current research and development; future possibilities; and a listing of vendors and suppliers of virtual reality products. (Contains 11 references.) (LRW)
Psychological benefits of virtual reality for patients in rehabilitation therapy.
Chen, Chih-Hung; Jeng, Ming-Chang; Fung, Chin-Ping; Doong, Ji-Liang; Chuang, Tien-Yow
2009-05-01
[Background] Whether virtual rehabilitation is beneficial has not been determined. [Purpose] To investigate the psychological benefits of virtual reality in rehabilitation. [Methods] An experimental group underwent therapy with a virtual-reality-based exercise bike, and a control group underwent the therapy without virtual-reality equipment, in a hospital laboratory. The 30 participants were patients suffering from spinal-cord injury, undergoing a designed rehabilitation therapy. Outcome measures were endurance, Borg's rating-of-perceived-exertion scale, the Activation-Deactivation Adjective Check List (AD-ACL), and the Simulator Sickness Questionnaire. [Results] The differences between the experimental and control groups were significant for AD-ACL calmness and tension. [Conclusion] A virtual-reality-based rehabilitation program can ease patients' tension and induce calm.
Marketing analysis of a positive technology app for the self-management of psychological stress.
Wiederhold, Brenda K; Boyd, Chelsie; Sulea, Camelia; Gaggioli, Andrea; Riva, Giuseppe
2014-01-01
The INTERSTRESS project developed a completely new concept in the treatment of psychological stress: Interreality, a concept that combines cognitive behavioral therapy with a hybrid, closed-loop empowering experience bridging real and virtual worlds. This model provides the opportunity for individual citizens to become active participants in their own health and well-being. This article contains the results of the Marketing Trial and analysis of the opinions of individual consumers/end users of the INTERSTRESS product. The specific objective of this study was to evaluate the feasibility, efficacy and user acceptance of a novel mobile-based relaxation training tool in combination with biofeedback exercises and wearable biosensors. Relaxation was aided through immersion in a mobile virtual scenario (a virtual island) featuring pre-recorded audio narratives guiding a series of relaxation exercises. During biofeedback exercises, a wearable biosensor system provided data which directly modified the virtual reality experience in real-time. Thirty-six participants evaluated the product and overall feedback from users was positive, with some variation seen based on participant gender. A larger market study is now underway to understand if there are cultural variations in acceptability of the device.
Human Factors in Virtual Reality Development
NASA Technical Reports Server (NTRS)
Kaiser, Mary K.; Proffitt, Dennis R.; Null, Cynthia H. (Technical Monitor)
1995-01-01
This half-day tutorial will provide an overview of basic perceptual functioning as it relates to the design of virtual environment systems. The tutorial consists of three parts. First, basic issues in visual perception will be presented, including discussions of the visual sensations of brightness and color, and the visual perception of depth relationships in three-dimensional space (with a special emphasis on motion-specified depth). The second section will discuss the importance of conducting human-factors user studies and evaluations. Examples and suggestions on how best to get help with user studies will be provided. Finally, we will discuss how, by drawing on their complementary competencies, perceptual psychologists and computer engineers can work as a team to develop optimal VR systems, technologies, and techniques.
CAVE2: a hybrid reality environment for immersive simulation and information analysis
NASA Astrophysics Data System (ADS)
Febretti, Alessandro; Nishimoto, Arthur; Thigpen, Terrance; Talandis, Jonas; Long, Lance; Pirtle, J. D.; Peterka, Tom; Verlo, Alan; Brown, Maxine; Plepys, Dana; Sandin, Dan; Renambot, Luc; Johnson, Andrew; Leigh, Jason
2013-03-01
Hybrid Reality Environments represent a new kind of visualization space that blurs the line between virtual environments and high-resolution tiled display walls. This paper outlines the design and implementation of the CAVE2 Hybrid Reality Environment. CAVE2 is the world's first near-seamless flat-panel-based, surround-screen immersive system. Unique to CAVE2 is that it enables users to simultaneously view both 2D and 3D information, providing more flexibility for mixed media applications. CAVE2 is a cylindrical system 24 feet in diameter and 8 feet tall, and consists of 72 near-seamless, off-axis-optimized passive stereo LCD panels, creating an approximately 320 degree panoramic environment for displaying information at 37 Megapixels (in stereoscopic 3D) or 74 Megapixels in 2D, at a horizontal visual acuity of 20/20. Custom LCD panels with shifted polarizers were built so the images in the top and bottom rows of LCDs are optimized for vertical off-center viewing, allowing viewers to come closer to the displays while minimizing ghosting. CAVE2 is designed to support multiple operating modes. In the Fully Immersive mode, the entire room can be dedicated to one virtual simulation. In 2D mode, the room can operate like a traditional tiled display wall, enabling users to work with large numbers of documents at the same time. In the Hybrid mode, a mixture of both 2D and 3D applications can be supported simultaneously. The ability to treat immersive work spaces in this hybrid way has never been achieved before, and it leverages the special abilities of CAVE2 to enable researchers to seamlessly interact with large collections of 2D and 3D data. To realize this hybrid ability, we merged the Scalable Adaptive Graphics Environment (SAGE), a system for supporting 2D tiled displays, with Omegalib, a virtual reality middleware supporting OpenGL, OpenSceneGraph and Vtk applications.
Confessions of a Second Life: Conforming in the Virtual World?
NASA Astrophysics Data System (ADS)
Chicas, K.; Bailenson, J.; Stevenson Won, A.; Bailey, J.
2012-12-01
Virtual Worlds such as Second Life or World of Warcraft are increasingly popular, with people all over the world joining these online communities. In these virtual environments people break the barrier of reality every day when they fly, walk through walls and teleport from place to place. It is easy for people to violate the norms and behaviors of the real world in the virtual environment without real-world consequences. However, previous research has shown that users' behavior may conform to their digital self-representation (avatar). This is also known as the Proteus effect (Yee, 2007). Are people behaving in virtual worlds in ways that most people would not in the physical world? It is important to understand the behaviors that occur in the virtual world, as they may have an impact on how people act in the real world.
ERIC Educational Resources Information Center
Allison, John
2008-01-01
This paper will undertake a critical review of the impact of virtual reality tools on the teaching of history. Virtual reality is useful in several different ways. History educators, elementary and secondary school teachers and professors, can all profit from the digital environment. Challenges arise quickly however. Virtual reality technologies…
Immersive virtual reality simulations in nursing education.
Kilmon, Carol A; Brown, Leonard; Ghosh, Sumit; Mikitiuk, Artur
2010-01-01
This article explores immersive virtual reality as a potential educational strategy for nursing education and describes an immersive learning experience now being developed for nurses. This pioneering project is a virtual reality application targeting speed and accuracy of nurse response in emergency situations requiring cardiopulmonary resuscitation. Other potential uses and implications for the development of virtual reality learning programs are discussed.
Virtual reality simulation: using three-dimensional technology to teach nursing students.
Jenson, Carole E; Forsyth, Diane McNally
2012-06-01
The use of computerized technology is rapidly growing in the classroom and in healthcare. An emerging computer technology strategy for nursing education is the use of virtual reality simulation. This computer-based, three-dimensional educational tool simulates real-life patient experiences in a risk-free environment, allows for repeated practice sessions, requires clinical decision making, exposes students to diverse patient conditions, provides immediate feedback, and is portable. The purpose of this article was to review the importance of virtual reality simulation as a computerized teaching strategy. In addition, a project is described that explored the readiness of nursing faculty at one major Midwestern university to use virtual reality simulation as a computerized teaching strategy; faculty thought virtual reality simulation would increase students' knowledge of an intravenous line insertion procedure. Faculty who practiced intravenous catheter insertion via virtual reality simulation reported a wide range of learning experiences, congruent with the literature regarding barriers to student learning. Innovative teaching strategies, such as virtual reality simulation, address the barriers of increasing patient acuity, high student-to-faculty ratios, patient safety concerns among faculty, and student anxiety, and can offer rapid feedback to students.
Analysis of Mental Workload in Online Shopping: Are Augmented and Virtual Reality Consistent?
Zhao, Xiaojun; Shi, Changxiu; You, Xuqun; Zong, Chenming
2017-01-01
A market research company (Nielsen) reported that consumers in the Asia-Pacific region have become the most active group in online shopping. Focusing on augmented reality (AR), one of three major techniques expected to change how shopping is done in the future, this study used a mixed design to examine the influences of the method of online shopping, user gender, cognitive style, product value, and sensory channel on mental workload in virtual reality (VR) and AR situations. The results showed that males' mental workloads were significantly higher than females'. For males, high-value products' mental workload was significantly higher than that of low-value products. In the VR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference was reduced under audio-visual conditions. In the AR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference increased under audio-visual conditions. This study provides a psychological analysis of online shopping with AR and VR technology and its future applications. From the perspective of embodied cognition, AR online shopping may be a potential focus of research and market application. For the future design of online shopping platforms and the improvement of user experience, this study provides a reference.
Virtual Reality at the PC Level
NASA Technical Reports Server (NTRS)
Dean, John
1998-01-01
The main objective of my research has been to incorporate virtual reality at the desktop level; i.e., create virtual reality software that can be run fairly inexpensively on standard PC's. The standard language used for virtual reality on PC's is VRML (Virtual Reality Modeling Language). It is a new language so it is still undergoing a lot of changes. VRML 1.0 came out only a couple years ago and VRML 2.0 came out around last September. VRML is an interpreted language that is run by a web browser plug-in. It is fairly flexible in terms of allowing you to create different shapes and animations. Before this summer, I knew very little about virtual reality and I did not know VRML at all. I learned the VRML language by reading two books and experimenting on a PC. The following topics are presented: CAD to VRML, VRML 1.0 to VRML 2.0, VRML authoring tools, VRML browsers, finding virtual reality applications, the AXAF project, the VRML generator program, web communities and future plans.
Interpretations of virtual reality.
Voiskounsky, Alexander
2011-01-01
University students were surveyed to learn what they know about virtual realities. Two studies were administered with a half-year interval, in which the students (N=90, specializing either in mathematics and science or in social science and humanities) were asked to name particular examples of virtual realities. The second study, but not the first, was administered after the participants had had the chance to see the movie "Avatar" (we did not investigate whether they actually saw it). While the students in both studies widely believed that activities such as social networking and online gaming represent virtual realities, some other examples provided by the students in the two studies differ: in the second study the participants expressed a better understanding of the items related to virtual realities. At the same time, not a single participant reported particular psychological states (either regular or altered) as examples of virtual realities. Substantial popularization efforts are needed to acquaint the public, including college students, with virtual realities and to let the public adequately understand how such systems work.
Nesaratnam, N; Thomas, P; Vivian, A
2017-10-01
Introduction: Dissociated tests of strabismus provide valuable information for diagnosis and monitoring of ocular misalignment in patients with normal retinal correspondence. However, they are vulnerable to operator error and rely on a fixed head position. Virtual reality headsets obviate the need for head fixation, while providing other clear theoretical advantages, including complete control over the illumination and targets presented for the patient's interaction. Purpose: We compared the performance of a virtual reality-based test of ocular misalignment to that of the traditional Lees screen, to establish the feasibility of using virtual reality technology in ophthalmic settings in the future. Methods: Three patients underwent a traditional Lees screen test, and a virtual reality headset-based test of ocular motility. The virtual reality headset-based programme consisted of an initial test to measure horizontal and vertical deviation, followed by a test for torsion. Results: The pattern of deviation obtained using the virtual reality-based test showed agreement with that obtained from the Lees screen for patients with a fourth nerve palsy, comitant esotropia, and restrictive thyroid eye disease. Conclusions: This study reports the first use of a virtual reality headset in assessing ocular misalignment, and demonstrates that it is a feasible dissociative test of strabismus.
A convertor and user interface to import CAD files into worldtoolkit virtual reality systems
NASA Technical Reports Server (NTRS)
Wang, Peter Hor-Ching
1996-01-01
Virtual Reality (VR) is a rapidly developing human-to-computer interface technology. VR can be considered a three-dimensional computer-generated Virtual World (VW) which can sense particular aspects of a user's behavior, allow the user to manipulate objects interactively, and render the VW in real time accordingly. The user is totally immersed in the virtual world and feels a sense of being transported into that VW. NASA/MSFC Computer Application Virtual Environments (CAVE) has been developing space-related VR applications since 1990. The VR systems in the CAVE lab are based on the VPL RB2 system, which consists of a VPL RB2 control tower, an LX eyephone, an Isotrak polhemus sensor, two Fastrak polhemus sensors, a Flock of Birds sensor, and two VPL DG2 DataGloves. A dynamics animator called Body Electric from VPL is used as the control system to interface with all the input/output devices and to provide network communications as well as the VR programming environment. RB2 Swivel 3D is used as the modelling program to construct the VWs. A severe limitation of the VPL VR system is the use of RB2 Swivel 3D, which restricts files to a maximum of 1020 objects and does not support advanced graphics texture mapping. The other limitation is that the VPL VR system is a turn-key system which does not provide the flexibility for the user to add new sensors or a C language interface. Recently, the NASA/MSFC CAVE lab has provided VR systems built on Sense8 WorldToolKit (WTK), which is a C library for creating VR development environments. WTK provides device drivers for most of the sensors and eyephones available on the VR market. WTK accepts several CAD file formats, such as the Sense8 Neutral File Format, AutoCAD DXF and 3D Studio file formats, the Wave Front OBJ file format, the VideoScape GEO file format, and Intergraph EMS and CATIA stereolithography (STL) file formats. WTK functions are object-oriented in their naming convention, are grouped into classes, and provide an easy C language interface. When using a CAD or modelling program to build a VW for WTK VR applications, we typically construct the stationary universe with all the geometric objects except the dynamic objects, and create each dynamic object in an individual file.
A standardized set of 3-D objects for virtual reality research and applications.
Peeters, David
2018-06-01
The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.
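Norming measures such as name agreement are typically summarised by the percentage of participants who produce the modal name and by the H statistic (higher H means greater naming diversity). The sketch below uses hypothetical naming responses rather than data from the database, and shows one way these measures could be computed.

import math
from collections import Counter

def name_agreement(responses):
    """Compute percent modal-name agreement and the H statistic
    (H = sum_i p_i * log2(1/p_i)) over the names given to one object."""
    counts = Counter(responses)
    n = len(responses)
    modal_count = max(counts.values())
    percent_agreement = 100.0 * modal_count / n
    h = sum((c / n) * math.log2(n / c) for c in counts.values())
    return percent_agreement, h

# Hypothetical naming responses for a single 3-D object.
names = ["mug", "mug", "cup", "mug", "cup", "mug"]
print(name_agreement(names))  # about 66.7 % agreement, H ≈ 0.92 bits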
[Application of virtual reality in surgical treatment of complex head and neck carcinoma].
Zhou, Y Q; Li, C; Shui, C Y; Cai, Y C; Sun, R H; Zeng, D F; Wang, W; Li, Q L; Huang, L; Tu, J; Jiang, J
2018-01-07
Objective: To investigate the application of virtual reality technology in the preoperative evaluation of complex head and neck carcinoma and the value of virtual reality technology in the surgical treatment of head and neck carcinoma. Methods: The image data of eight patients with complex head and neck carcinoma treated from December 2016 to May 2017 were acquired. The data were imported into a virtual reality system to build three-dimensional anatomical models of the carcinomas and to create the surgical scenes. The process of surgery was simulated by recognizing the relationship between the tumor and the surrounding important structures. Finally, all patients were treated with surgery, and two typical cases are reported. Results: With the help of virtual reality, surgeons could adequately assess the condition of the carcinoma and the safety of the planned operation, which ensured the safety of the operations. Conclusions: Virtual reality can provide surgeons with sensory experience in virtual surgical scenes and achieve man-computer cooperation and stereoscopic assessment, which will ensure the safety of surgery. Virtual reality can have a major impact on guiding the traditional surgical procedure for head and neck carcinoma.
Role of virtual reality simulation in endoscopy training
Harpham-Lockyer, Louis; Laskaratos, Faidon-Marios; Berlingieri, Pasquale; Epstein, Owen
2015-01-01
Recent advancements in virtual reality graphics and models have allowed virtual reality simulators to be incorporated into a variety of endoscopic training programmes. Use of virtual reality simulators in training programmes is thought to improve skill acquisition amongst trainees which is reflected in improved patient comfort and safety. Several studies have already been carried out to ascertain the impact that usage of virtual reality simulators may have upon trainee learning curves and how this may translate to patient comfort. This article reviews the available literature in this area of medical education which is particularly relevant to all parties involved in endoscopy training and curriculum development. Assessment of the available evidence for an optimal exposure time with virtual reality simulators and the long-term benefits of their use are also discussed. PMID:26675895
A Collaborative Molecular Modeling Environment Using a Virtual Tunneling Service
Lee, Jun; Kim, Jee-In; Kang, Lin-Woo
2012-01-01
Collaborative research on three-dimensional molecular modeling can be limited by differences in time zones and locations. A networked virtual environment can be used to overcome the problems caused by these temporal and spatial differences. However, traditional approaches have not sufficiently considered the integration of different computing environments, which are characterized by types of applications, roles of users, and so on. We propose a collaborative molecular modeling environment that integrates different molecular modeling systems using a virtual tunneling service. We integrated Co-Coot, a collaborative crystallographic object-oriented toolkit, with VRMMS, a virtual reality molecular modeling system, through a collaborative tunneling system. The proposed system showed reliable quantitative and qualitative results in pilot experiments. PMID:22927721
Hybrid Reality Lab Capabilities - Video 2
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2016-01-01
Our Hybrid Reality and Advanced Operations Lab is developing highly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects have moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the video/glass information. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, by creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects. The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in the virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video associated with this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.
Mobile VR in Education: From the Fringe to the Mainstream
ERIC Educational Resources Information Center
Cochrane, Thomas
2016-01-01
This paper explores the development of virtual reality (VR) use in education and the emergence of mobile VR based content creation and sharing as a platform for enabling learner-generated content and learner-generated contexts. The author argues that an ecology of resources that maps the user content creation and sharing affordances of mobile…
Would You Adopt Second Life as a Training and Development Tool?
ERIC Educational Resources Information Center
Taylor, Kevin C.; Chyung, Seung Youn
2008-01-01
Due to advances in computer and network technology, virtual reality (VR) is no longer just an area of scientific research. It has also become a popular consumer product, as demonstrated by the proliferation of massive multiplayer online role-playing games. Second Life (SL), in particular, has gained popularity among casual users since it became…
Learning Protein Structure with Peers in an AR-Enhanced Learning Environment
ERIC Educational Resources Information Center
Chen, Yu-Chien
2013-01-01
Augmented reality (AR) is an interactive system that allows users to interact with virtual objects and the real world at the same time. The purpose of this dissertation was to explore how AR, as a new visualization tool, that can demonstrate spatial relationships by representing three dimensional objects and animations, facilitates students to…
Virtual Reality Exploration and Planning for Precision Colorectal Surgery.
Guerriero, Ludovica; Quero, Giuseppe; Diana, Michele; Soler, Luc; Agnus, Vincent; Marescaux, Jacques; Corcione, Francesco
2018-06-01
Medical software can build a digital clone of the patient with 3-dimensional reconstruction of Digital Imaging and Communication in Medicine images. The virtual clone can be manipulated (rotations, zooms, etc), and the various organs can be selectively displayed or hidden to facilitate virtual reality preoperative surgical exploration and planning. We present preliminary cases showing the potential value of virtual reality in colorectal surgery for cases of both diverticular disease and colonic neoplasms. This was a single-center feasibility study. The study was conducted at a tertiary care institution. Two patients underwent a laparoscopic left hemicolectomy for diverticular disease, and 1 patient underwent a laparoscopic right hemicolectomy for cancer. The 3-dimensional virtual models were obtained from preoperative CT scans. The virtual model was used to perform preoperative exploration and planning. Intraoperatively, one of the surgeons manipulated the virtual reality model using the touch screen of a tablet, which was interactively displayed to the surgical team. The main outcome was evaluation of the precision of virtual reality in colorectal surgery planning and exploration. In 1 patient undergoing laparoscopic left hemicolectomy, an abnormal origin of the left colic artery, arising as an extremely short common trunk from the inferior mesenteric artery, was clearly seen in the virtual reality model. This finding was missed by the radiologist on the CT scan. The precise identification of this vascular variant allowed a safe and adequate operation. In the remaining cases, the virtual reality model helped to precisely delineate the vascular anatomy, providing key landmarks for a safer dissection. A larger sample size would be necessary to definitively assess the efficacy of virtual reality in colorectal surgery. Virtual reality can provide an enhanced understanding of crucial anatomical details, both preoperatively and intraoperatively, which could contribute to improved safety in colorectal surgery.
Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays
Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Wetzstein, Gordon
2017-01-01
From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one. PMID:28193871
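In a gaze-contingent, focus-tunable display the lens power tracks where the user looks. A simplified thin-lens sketch is given below; the sign convention and the per-user prescription offset are assumptions for illustration, not the authors' calibration procedure.

def lens_power_diopters(gaze_distance_m, spherical_prescription_d=0.0):
    """Dioptric demand of the fixated virtual object, offset so that a user
    with an uncorrected refractive error sees it in focus.
    gaze_distance_m: distance to the fixated point in the virtual scene (m).
    spherical_prescription_d: user's spherical correction in diopters
    (negative for myopia, positive for hyperopia)."""
    demand = 1.0 / max(gaze_distance_m, 0.1)   # clamp to avoid divide-by-zero
    # Present the image "closer" in dioptric terms for myopes and "farther"
    # for hyperopes, so no eyeglasses are needed inside the headset.
    return demand - spherical_prescription_d

# A -2 D myope looking at a virtual object 4 m away:
print(lens_power_diopters(4.0, -2.0))  # 0.25 + 2.0 = 2.25 D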
A review of existing and potential computer user interfaces for modern radiology.
Iannessi, Antoine; Marcy, Pierre-Yves; Clatz, Olivier; Bertrand, Anne-Sophie; Sugimoto, Maki
2018-05-16
The digitalization of modern imaging has led radiologists to become very familiar with computers and their user interfaces (UI). New options for display and command offer expanded possibilities, but the mouse and keyboard remain the most commonly utilized, for usability reasons. In this work, we review and discuss different UI and their possible application in radiology. We consider two-dimensional and three-dimensional imaging displays in the context of interventional radiology, and discuss interest in touchscreens, kinetic sensors, eye detection, and augmented or virtual reality. We show that UI design specifically for radiologists is key for future use and adoption of such new interfaces. Next-generation UI must fulfil professional needs, while considering contextual constraints. • The mouse and keyboard remain the most utilized user interfaces for radiologists. • Touchscreen, holographic, kinetic sensors and eye tracking offer new possibilities for interaction. • 3D and 2D imaging require specific user interfaces. • Holographic display and augmented reality provide a third dimension to volume imaging. • Good usability is essential for adoption of new user interfaces by radiologists.
Virtual reality in surgical training.
Lange, T; Indelicato, D J; Rosen, J M
2000-01-01
Virtual reality in surgery and, more specifically, in surgical training, faces a number of challenges in the future. These challenges are building realistic models of the human body, creating interface tools to view, hear, touch, feel, and manipulate these human body models, and integrating virtual reality systems into medical education and treatment. A final system would encompass simulators specifically for surgery, performance machines, telemedicine, and telesurgery. Each of these areas will need significant improvement for virtual reality to impact medicine successfully in the next century. This article gives an overview of, and the challenges faced by, current systems in the fast-changing field of virtual reality technology, and provides a set of specific milestones for a truly realistic virtual human body.
Immersive Education, an Annotated Webliography
ERIC Educational Resources Information Center
Pricer, Wayne F.
2011-01-01
In this second installment of a two-part feature on immersive education a webliography will provide resources discussing the use of various types of computer simulations including: (a) augmented reality, (b) virtual reality programs, (c) gaming resources for teaching with technology, (d) virtual reality lab resources, (e) virtual reality standards…
Training Surgical Residents With a Haptic Robotic Central Venous Catheterization Simulator.
Pepley, David F; Gordon, Adam B; Yovanoff, Mary A; Mirkin, Katelin A; Miller, Scarlett R; Han, David C; Moore, Jason Z
Ultrasound-guided central venous catheterization (CVC) is a common surgical procedure with complication rates ranging from 5 to 21 percent. Training is typically performed using manikins that do not simulate anatomical variations such as obesity and abnormal vessel positioning. The goal of this study was to develop and validate the effectiveness of a new virtual reality and force haptic based simulation platform for CVC of the right internal jugular vein. A CVC simulation platform was developed using a haptic robotic arm, a 3D position tracker, and computer visualization. The haptic robotic arm simulated needle insertion force based on cadaver experiments. The 3D position tracker was used as a mock ultrasound device with realistic visualization on a computer screen. Upon completion of a practice simulation, performance feedback is given to the user through a graphical user interface, including scoring factors based on good CVC practice. The effectiveness of the system was evaluated by training 13 first-year surgical residents using the virtual reality haptic based training system over a 3-month period. The participants' performance increased from 52% to 96% on the baseline training scenario, approaching the average score of an expert surgeon: 98%. This also resulted in improvement in positive CVC practices, including a 61% decrease in the distance between final needle tip position and vein center, a decrease in mean insertion attempts from 1.92 to 1.23, and a 12% increase in time spent aspirating the syringe throughout the procedure. A virtual reality haptic robotic simulator for CVC was successfully developed. Surgical residents training on the simulator improved to near expert levels after three robotic training sessions. This suggests that this system could act as an effective training device for CVC. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
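The abstract names the practice metrics that feed the feedback score (tip-to-vein distance, insertion attempts, aspiration time) but not their weighting, so the sketch below uses purely hypothetical weights to illustrate how such metrics might be folded into a single percentage.

def cvc_practice_score(tip_to_vein_mm, insertion_attempts, aspiration_fraction):
    """Hypothetical composite score (0-100) for a simulated CVC attempt.
    tip_to_vein_mm: distance between final needle tip and vein center (mm).
    insertion_attempts: number of needle insertions.
    aspiration_fraction: fraction of the procedure spent aspirating the syringe."""
    accuracy = max(0.0, 1.0 - tip_to_vein_mm / 10.0)        # 0 mm -> 1.0, >=10 mm -> 0.0
    attempts = max(0.0, 1.0 - 0.25 * (insertion_attempts - 1))
    return 100.0 * (0.5 * accuracy + 0.3 * attempts + 0.2 * aspiration_fraction)

print(round(cvc_practice_score(2.0, 1, 0.6), 1))  # 82.0 with these placeholder weights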
Virtual reality for stroke rehabilitation.
Laver, Kate E; George, Stacey; Thomas, Susie; Deutsch, Judith E; Crotty, Maria
2015-02-12
Virtual reality and interactive video gaming have emerged as recent treatment approaches in stroke rehabilitation. In particular, commercial gaming consoles have been rapidly adopted in clinical settings. This is an update of a Cochrane Review published in 2011. To determine the efficacy of virtual reality compared with an alternative intervention or no intervention on upper limb function and activity. To determine the efficacy of virtual reality compared with an alternative intervention or no intervention on: gait and balance activity, global motor function, cognitive function, activity limitation, participation restriction and quality of life, voxels or regions of interest identified via imaging, and adverse events. Additionally, we aimed to comment on the feasibility of virtual reality for use with stroke patients by reporting on patient eligibility criteria and recruitment. We searched the Cochrane Stroke Group Trials Register (October 2013), the Cochrane Central Register of Controlled Trials (The Cochrane Library 2013, Issue 11), MEDLINE (1950 to November 2013), EMBASE (1980 to November 2013) and seven additional databases. We also searched trials registries and reference lists. Randomised and quasi-randomised trials of virtual reality ("an advanced form of human-computer interface that allows the user to 'interact' with and become 'immersed' in a computer-generated environment in a naturalistic fashion") in adults after stroke. The primary outcome of interest was upper limb function and activity. Secondary outcomes included gait and balance function and activity, and global motor function. Two review authors independently selected trials based on pre-defined inclusion criteria, extracted data and assessed risk of bias. A third review author moderated disagreements when required. The authors contacted investigators to obtain missing information. We included 37 trials that involved 1019 participants. Study sample sizes were generally small and interventions varied. The risk of bias present in many studies was unclear due to poor reporting. Thus, while there are a large number of randomised controlled trials, the evidence remains 'low' or 'very low' quality when rated using the GRADE system. Control groups received no intervention or therapy based on a standard care approach. Intervention approaches in the included studies were predominantly designed to improve motor function rather than cognitive function or activity performance. The majority of participants were relatively young and more than one year post stroke. Results were statistically significant for upper limb function (standardised mean difference (SMD) 0.28, 95% confidence intervals (CI) 0.08 to 0.49 based on 12 studies with 397 participants). There were no statistically significant effects for grip strength, gait speed or global motor function. Results were statistically significant for the activities of daily living (ADL) outcome (SMD 0.43, 95% CI 0.18 to 0.69 based on eight studies with 253 participants); however, we were unable to pool results for cognitive function, participation restriction, quality of life or imaging studies. There were few adverse events reported across studies and those reported were relatively mild. Studies that reported on eligibility rates showed that only 26% of participants screened were recruited.
We found evidence that the use of virtual reality and interactive video gaming may be beneficial in improving upper limb function and ADL function when used as an adjunct to usual care (to increase overall therapy time) or when compared with the same dose of conventional therapy. There was insufficient evidence to reach conclusions about the effect of virtual reality and interactive video gaming on grip strength, gait speed or global motor function. It is unclear at present which characteristics of virtual reality are most important and it is unknown whether effects are sustained in the longer term.
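The standardised mean differences quoted above follow the usual meta-analytic definitions. As a rough illustration (a fixed-effect sketch with invented numbers, not the review's actual random-effects computation), an SMD and an inverse-variance pooled estimate can be obtained as follows.

import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardised mean difference between intervention and control arms."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))  # approximate variance
    return d, var_d

def fixed_effect_pool(effects):
    """Inverse-variance pooled SMD with a 95% confidence interval."""
    weights = [1.0 / v for _, v in effects]
    pooled = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study (mean, SD, n) values, not data from the review.
studies = [cohens_d(12.0, 4.0, 20, 10.0, 4.0, 20),
           cohens_d(30.0, 9.0, 35, 27.0, 10.0, 33)]
print(fixed_effect_pool(studies))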
Fitzgerald, Diarmaid; Trakarnratanakul, Nanthana; Dunne, Lucy; Smyth, Barry; Caulfield, Brian
2008-01-01
We have developed a prototype virtual reality-based balance training system using a single inertial orientation sensor attached to the upper surface of a wobble board. This input device has been interfaced with Neverball, an open-source computer game, to create the balance training platform. Users can exercise with the system by standing on the wobble board and tilting it in different directions to control an on-screen environment. We have also developed a customized instruction manual for setting up the system. To evaluate the usability of our prototype system, we undertook a user evaluation study with twelve healthy novice participants. Participants were required to assemble the system using the instruction manual and then perform balance exercises with the system. Following this period of exercise, VRUSE, a usability evaluation questionnaire, was completed by the participants. Results indicated a high level of usability in all categories evaluated.
Fully Three-Dimensional Virtual-Reality System
NASA Technical Reports Server (NTRS)
Beckman, Brian C.
1994-01-01
Proposed virtual-reality system presents visual displays to simulate free flight in three-dimensional space. System, virtual space pod, is testbed for control and navigation schemes. Unlike most virtual-reality systems, virtual space pod would not depend for orientation on ground plane, which hinders free flight in three dimensions. Space pod provides comfortable seating, convenient controls, and dynamic virtual-space images for virtual traveler. Controls include buttons plus joysticks with six degrees of freedom.
Salimi, Zohreh; Ferguson-Pell, Martin
2018-06-01
Although wheelchair ergometers provide a safe and controlled environment for studying or training wheelchair users, until recently they had a major disadvantage in only being capable of simulating straight-line wheelchair propulsion. Virtual reality has helped overcome this problem and broaden the usability of wheelchair ergometers. However, for a wheelchair ergometer to be validly used in research studies, it needs to be able to simulate the biomechanics of real world wheelchair propulsion. In this paper, three versions of a wheelchair simulator were developed. They provide a sophisticated wheelchair ergometer in an immersive virtual reality environment. They are intended for manual wheelchair propulsion and all are able to simulate simple translational inertia. In addition, each of the systems reported uses a different approach to simulate wheelchair rotation and accommodate rotational inertial effects. The first system does not provide extra resistance against rotation and relies on merely linear inertia, hypothesizing that it can provide acceptable replication of biomechanics of wheelchair maneuvers. The second and third systems, however, are designed to simulate rotational inertia. System II uses mechanical compensation, and System III uses visual compensation simulating the influence that rotational inertia has on the visual perception of wheelchair movement in response to rotation at different speeds.
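Emulating inertia on a stationary ergometer means computing, at each control cycle, the resistive forces the rollers must apply so that the wheels feel as if they were moving a massive chair. A simplified rigid-body sketch is shown below; the mass, yaw inertia, track width, and sign convention are placeholder assumptions, not the authors' controller.

def roller_forces(a_linear, alpha_yaw, mass_kg=100.0, yaw_inertia_kgm2=8.0,
                  track_width_m=0.55):
    """Per-wheel resistive forces (N) needed to emulate translational and
    rotational inertia of a wheelchair-plus-user system.
    a_linear: forward acceleration of the virtual chair (m/s^2).
    alpha_yaw: yaw (turning) angular acceleration (rad/s^2)."""
    f_translation = mass_kg * a_linear        # shared equally by both wheels
    torque = yaw_inertia_kgm2 * alpha_yaw     # inertial torque about the vertical axis
    f_couple = torque / track_width_m         # equal and opposite forces on the two wheels
    f_left = 0.5 * f_translation + f_couple   # sign of the couple is an arbitrary convention
    f_right = 0.5 * f_translation - f_couple
    return f_left, f_right

# Accelerating forward at 0.5 m/s^2 while beginning a turn:
print(roller_forces(0.5, 1.0))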
Suitability of digital camcorders for virtual reality image data capture
NASA Astrophysics Data System (ADS)
D'Apuzzo, Nicola; Maas, Hans-Gerd
1998-12-01
Today's consumer-market digital camcorders offer features which make them appear to be quite interesting devices for virtual reality data capture. The paper compares a digital camcorder with an analogue camcorder and a machine vision type CCD camera and discusses the suitability of these three cameras for virtual reality applications. Besides the discussion of technical features of the cameras, this includes a detailed accuracy test in order to define the range of applications. In combination with the cameras, three different framegrabbers are tested. The geometric accuracy potential of all three cameras turned out to be surprisingly large, and no problems were noticed in the radiometric performance. On the other hand, some disadvantages have to be reported: from the photogrammetrist's point of view, the major disadvantage of most camcorders is the lack of any means to synchronize multiple devices, limiting their suitability for 3-D motion data capture. Moreover, the standard video format uses interlacing, which is also undesirable for all applications dealing with moving objects or moving cameras. A further disadvantage is that the computer interfaces offer functionality which is still suboptimal. While custom-made solutions to these problems are probably rather expensive (and would make potential users turn back to machine-vision-like equipment), this functionality could probably be included by the manufacturers at almost zero cost.
NASA Astrophysics Data System (ADS)
Latinovic, T. S.; Deaconu, S. I.; Latinović, M. T.; Malešević, N.; Barz, C.
2015-06-01
This paper presents a new system that provides distance learning and online training for engineers. The purpose of this paper is to develop and provide a web-based system for the handling and control of remote devices via the Internet. The remote devices are currently industrial or mobile robots [13]. In the future, production machines in the factory will be included in the system. This article also discusses the current use of virtual reality tools in the fields of science and engineering education. One programming tool in particular, the Virtual Reality Modeling Language (VRML), is presented in the light of its applications and capabilities in the development of computer visualization tools for education. One contribution of this paper is to present software tools and examples that can encourage educators to develop virtual reality models to improve teaching in their discipline [12]. This paper also introduces a software platform, called VALIP, where users can build, share, and manipulate 3D content and cooperate through interaction processes in a 3D context, while the participating hardware and software devices can be physically and/or logically distributed and connected via the Internet. VALIP integrates the virtual laboratories of the project partners, thereby allowing access to all laboratories from any of the partner sites. VALIP provides an advanced laboratory for training and research in robotics and production engineering, and thus offers extensive laboratory facilities while requiring only a limited investment of resources at each local partner site.
New Dimensions of GIS Data: Exploring Virtual Reality (VR) Technology for Earth Science
NASA Astrophysics Data System (ADS)
Skolnik, S.; Ramirez-Linan, R.
2016-12-01
NASA's Science Mission Directorate (SMD) Earth Science Division (ESD) Earth Science Technology Office (ESTO) and Navteca are exploring virtual reality (VR) technology as an approach and technique related to the next generation of Earth science technology information systems. Having demonstrated the value of VR in viewing pre-visualized science data encapsulated in a movie representation of a time series, further investigation has led to the additional capability of permitting the observer to interact with the data, make selections, and view volumetric data in an innovative way. The primary objective of this project has been to investigate the use of commercially available VR hardware, the Oculus Rift and the Samsung Gear VR, for scientific analysis through an interface to ArcGIS to enable the end user to order and view data from the NASA Discover-AQ mission. A virtual console is presented through the VR interface that allows the user to select various layers of data from the server in 2D, 3D, and full 4-pi-steradian views. By demonstrating the utility of VR in interacting with Discover-AQ flight mission measurements, and building on previous work done at the Atmospheric Science Data Center (ASDC) at NASA Langley supporting analysis of sources of CO2 during the Discover-AQ mission, the investigation team has shown the potential for VR as a science tool beyond simple visualization.
The Virtual Reality Roving Vehicle Project.
ERIC Educational Resources Information Center
Winn, William
1995-01-01
Describes the Virtual Reality Roving Vehicle project developed at the University of Washington to teach students in grades 4 through 12 about virtual reality. Topics include teacher workshops; virtual worlds created by students; learning outcomes compared with traditional instruction; and the effect of student characteristics, including gender, on…
Virtual Reality: An Instructional Medium for Visual-Spatial Tasks.
ERIC Educational Resources Information Center
Regian, J. Wesley; And Others
1992-01-01
Describes an empirical exploration of the instructional potential of virtual reality as an interface for simulation-based training. Shows that subjects learned spatial-procedural and spatial-navigational skills in virtual reality. (SR)
Mixed Reality with HoloLens: Where Virtual Reality Meets Augmented Reality in the Operating Room.
Tepper, Oren M; Rudy, Hayeem L; Lefkowitz, Aaron; Weimer, Katie A; Marks, Shelby M; Stern, Carrie S; Garfein, Evan S
2017-11-01
Virtual reality and augmented reality devices have recently been described in the surgical literature. The authors have previously explored various iterations of these devices, and although they show promise, it has become clear that virtual reality and/or augmented reality devices alone do not adequately meet the demands of surgeons. The solution may lie in a hybrid technology known as mixed reality, which merges many virtual reality and augmented reality features. Microsoft's HoloLens, the first commercially available mixed reality device, provides surgeons intraoperative hands-free access to complex data, the real environment, and bidirectional communication. This report describes the use of HoloLens in the operating room to improve decision-making and surgical workflow. The pace of mixed reality-related technological development will undoubtedly be rapid in the coming years, and plastic surgeons are ideally suited to both lead and benefit from this advance.
Tabanfar, Reza; Chan, Harley H L; Lin, Vincent; Le, Trung; Irish, Jonathan C
To develop and validate a smartphone-based Virtual Reality Epley Maneuver System (VREMS) for home use. A smartphone application was designed to produce stereoscopic views of a virtual reality (VR) environment which, when viewed after placing the smartphone in a virtual reality headset, guided the user step by step through the Epley maneuver in a VR environment. Twenty healthy participants were recruited and randomized to undergo either VR-assisted Epley maneuvers or self-administered Epley maneuvers after reading instructions from an instructional handout (IH). All participants were filmed, and two expert otologists reviewed the videos, assigning each participant a score (out of 10) for performance on each step. Participants rated their perceived workload by completing a validated task-load questionnaire (NASA Task Load Index), and averages for both groups were calculated. Twenty participants were evaluated, with an average age of 26.4±7.12 years in the VREMS group and 26.1±7.72 years in the IH group. The VR-assisted group achieved an average score of 7.78±0.99 compared to 6.65±1.72 in the IH group. This result was statistically significant with p=0.0001, and side dominance did not appear to play a factor. Analyzing each step of the Epley maneuver demonstrated that assisted maneuvers were performed more accurately, with statistically significant results in steps 2-4. Results of the NASA-TLX scores were variable with no significant findings. We have developed and demonstrated face validity for VREMS through our randomized controlled trial. The VREMS platform is a promising technology which may improve the accuracy and effectiveness of home Epley treatments. N/A. Copyright © 2017 Elsevier Inc. All rights reserved.
Differentiating levels of surgical experience on a virtual reality temporal bone simulator.
Zhao, Yi C; Kennedy, Gregor; Hall, Richard; O'Leary, Stephen
2010-11-01
Virtual reality simulation is increasingly being incorporated into surgical training and may have a role in temporal bone surgical education. Here we test whether metrics generated by a virtual reality surgical simulation can differentiate between three levels of experience, namely novices, otolaryngology residents, and experienced qualified surgeons. Cohort study. Royal Victorian Eye and Ear Hospital. Twenty-seven participants were recruited. There were 12 experts, six residents, and nine novices. After orientation, participants were asked to perform a modified radical mastoidectomy on the simulator. Comparisons of time taken, injury to structures, and forces exerted were made between the groups to determine which specific metrics would discriminate experience levels. Experts completed the simulated task in significantly shorter time than the other two groups (experts 22 minutes, residents 36 minutes, and novices 46 minutes; P = 0.001). Novices exerted significantly higher average forces when dissecting close to vital structures compared with experts (0.24 Newton [N] vs 0.13 N, P = 0.002). Novices were also more likely to injure structures such as dura compared to experts (23 injuries vs 3 injuries, P = 0.001). Compared with residents, the experts modulated their force between initial cortex dissection and dissection close to vital structures. Using the combination of these metrics, we were able to correctly classify the participants' level of experience 90 percent of the time. This preliminary study shows that measurements of performance obtained from within a virtual reality simulator can differentiate between levels of users' experience. These results suggest that simulator training may have a role in temporal bone training beyond foundational training. Copyright © 2010 American Academy of Otolaryngology–Head and Neck Surgery Foundation. Published by Mosby, Inc. All rights reserved.
Altering User Movement Behaviour in Virtual Environments.
Simeone, Adalberto L; Mavridou, Ifigeneia; Powell, Wendy
2017-04-01
In immersive Virtual Reality systems, users tend to move in a Virtual Environment as they would in an analogous physical environment. In this work, we investigated how user behaviour is affected when the Virtual Environment differs from the physical space. We created two sets of four environments each, plus a virtual replica of the physical environment as a baseline. The first set focused on aesthetic discrepancies, such as a water surface in place of solid ground. The second focused on mixing immaterial objects together with those paired to tangible objects, for example, barring an area with walls or obstacles. We designed a study where participants had to reach three waypoints laid out in such a way as to prompt a decision on which path to follow, based on the conflict between the mismatching visual stimuli and their awareness of the real layout of the room. We analysed their performances to determine whether their trajectories were altered significantly from the shortest route. Our results indicate that participants altered their trajectories in the presence of surfaces representing higher walking difficulty (for example, water instead of grass). However, when the graphical appearance was found to be ambiguous, there was no significant trajectory alteration. The environments mixing immaterial with physical objects had the most impact on trajectories, with a mean deviation from the shortest route of 60 cm against the 37 cm of environments with aesthetic alterations. The co-existence of paired and unpaired virtual objects was reported to support the idea that all objects participants saw were backed by physical props. From these results and our observations, we derive guidelines on how to alter user movement behaviour in Virtual Environments.
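Deviation from the shortest route can be operationalised as the mean perpendicular distance of the walked trajectory from the straight segment between waypoints. The small geometric sketch below shows one plausible way to compute it; the sampled positions are invented, and this is not necessarily the authors' exact analysis.

import math

def point_to_segment(p, a, b):
    """Shortest distance from point p to the segment a-b (2-D tuples, metres)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def mean_deviation(trajectory, start, waypoint):
    """Mean deviation (m) of sampled positions from the shortest route."""
    return sum(point_to_segment(p, start, waypoint) for p in trajectory) / len(trajectory)

# Hypothetical walk that bows out around a virtual water surface.
walk = [(0.0, 0.0), (1.0, 0.4), (2.0, 0.6), (3.0, 0.3), (4.0, 0.0)]
print(round(mean_deviation(walk, (0.0, 0.0), (4.0, 0.0)), 2))  # 0.26 m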
Virtual Reality: Emerging Applications and Future Directions
ERIC Educational Resources Information Center
Ludlow, Barbara L.
2015-01-01
Virtual reality is an emerging technology that has resulted in rapid expansion in the development of virtual immersive environments for use as educational simulations in schools, colleges and universities. This article presents an overview of virtual reality, describes a number of applications currently being used by special educators for…
Virtual Reality: A Dream Come True or a Nightmare.
ERIC Educational Resources Information Center
Cornell, Richard; Bailey, Dan
Virtual Reality (VR) is a new medium which allows total stimulation of one's senses through human/computer interfaces. VR has applications in training simulators, nano-science, medicine, entertainment, electronic technology, and manufacturing. This paper focuses on some current and potential problems of virtual reality and virtual environments…
Web-based interactive 3D visualization as a tool for improved anatomy learning.
Petersson, Helge; Sinkvist, David; Wang, Chunliang; Smedby, Orjan
2009-01-01
Despite a long tradition, conventional anatomy education based on dissection is declining. This study tested a new virtual reality (VR) technique for anatomy learning based on virtual contrast injection. The aim was to assess whether students value this new three-dimensional (3D) visualization method as a learning tool and what value they gain from its use in reaching their anatomical learning objectives. Several 3D vascular VR models were created using an interactive segmentation tool based on the "virtual contrast injection" method. This method allows users, with relative ease, to convert computer tomography or magnetic resonance images into vivid 3D VR movies using the OsiriX software equipped with the CMIV CTA plug-in. Once created using the segmentation tool, the image series were exported in Quick Time Virtual Reality (QTVR) format and integrated within a web framework of the Educational Virtual Anatomy (EVA) program. A total of nine QTVR movies were produced encompassing most of the major arteries of the body. These movies were supplemented with associated information, color keys, and notes. The results indicate that, in general, students' attitudes towards the EVA-program were positive when compared with anatomy textbooks, but results were not the same with dissections. Additionally, knowledge tests suggest a potentially beneficial effect on learning.
Virtual reality hardware and graphic display options for brain-machine interfaces
Marathe, Amar R.; Carey, Holle L.; Taylor, Dawn M.
2009-01-01
Virtual reality hardware and graphic displays are reviewed here as a development environment for brain-machine interfaces (BMIs). Two desktop stereoscopic monitors and one 2D monitor were compared in a visual depth discrimination task and in a 3D target-matching task where able-bodied individuals used actual hand movements to match a virtual hand to different target hands. Three graphic representations of the hand were compared: a plain sphere, a sphere attached to the fingertip of a realistic hand and arm, and a stylized pacman-like hand. Several subjects had great difficulty using either stereo monitor for depth perception when perspective size cues were removed. A mismatch in stereo and size cues generated inappropriate depth illusions. This phenomenon has implications for choosing target and virtual hand sizes in BMI experiments. Target matching accuracy was about as good with the 2D monitor as with either 3D monitor. However, users achieved this accuracy by exploring the boundaries of the hand in the target with carefully controlled movements. This method of determining relative depth may not be possible in BMI experiments if movement control is more limited. Intuitive depth cues, such as including a virtual arm, can significantly improve depth perception accuracy with or without stereo viewing. PMID:18006069
Virtual reality in surgical skills training.
Palter, Vanessa N; Grantcharov, Teodor P
2010-06-01
With recent concerns regarding patient safety, and legislation regarding resident work hours, it is accepted that a certain amount of surgical skills training will transition to the surgical skills laboratory. Virtual reality offers enormous potential to enhance technical and non-technical skills training outside the operating room. Virtual-reality systems range from basic low-fidelity devices to highly complex virtual environments. These systems can act as training and assessment tools, with the learned skills effectively transferring to an analogous clinical situation. Recent developments include expanding the role of virtual reality to allow for holistic, multidisciplinary team training in simulated operating rooms, and focusing on the role of virtual reality in evidence-based surgical curriculum design. Copyright 2010 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Cooper, Rory A.; Ding, Dan; Simpson, Richard; Fitzgerald, Shirley G.; Spaeth, Donald M.; Guo, Songfeng; Koontz, Alicia M.; Cooper, Rosemarie; Kim, Jongbae; Boninger, Michael L.
2005-01-01
Some aspects of assistive technology can be enhanced by the application of virtual reality. Although virtual simulation offers a range of new possibilities, learning to navigate in a virtual environment is not equivalent to learning to navigate in the real world. Therefore, virtual reality simulation is advocated as a useful preparation for…
Knowledge and Valorization of Historical Sites Through 3d Documentation and Modeling
NASA Astrophysics Data System (ADS)
Farella, E.; Menna, F.; Nocerino, E.; Morabito, D.; Remondino, F.; Campi, M.
2016-06-01
The paper presents the first results of an interdisciplinary project related to the 3D documentation, dissemination, valorization and digital access of archaeological sites. Besides the 3D documentation aim, the project has two goals: (i) to easily explore and share via the web the references and results of the interdisciplinary work, including the interpretative process and the final reconstruction of the remains; (ii) to promote and valorize archaeological areas using reality-based 3D data and Virtual Reality devices. This method has been verified on the ruins of the archaeological site of Pausilypon, a maritime villa of the Roman period (Naples, Italy). Using Unity3D, the virtual tour of the heritage site was integrated and enriched with the surveyed 3D data, text documents, CAAD reconstruction hypotheses, drawings, photos, etc. In this way, starting from the actual appearance of the ruins (panoramic images), passing through the 3D digital survey models and several other pieces of historical information, the user is able to access virtual contents and reconstructed scenarios, all in a single virtual, interactive and immersive environment. These contents and scenarios allow the user to derive documentation and geometrical information, understand the site, perform analyses, see interpretative processes, communicate historical information and valorize the heritage location.
Hu, Jian; Xu, Xiang-yang; Song, En-min; Tan, Hong-bao; Wang, Yi-ning
2009-09-01
To establish a new visual educational system of virtual reality for clinical dentistry based on World Wide Web (WWW) webpages, in order to provide more three-dimensional multimedia resources to dental students and an online three-dimensional consulting system for patients. Based on computer graphics and three-dimensional webpage technologies, the software packages 3Dsmax and Webmax were adopted for the system development. In the Windows environment, the architecture of the whole system was established step by step, including three-dimensional model construction, three-dimensional scene setup, transplanting the three-dimensional scene into the webpage, reediting the virtual scene, realization of interactions within the webpage, initial testing, and necessary adjustment. Five cases of three-dimensional interactive webpages for clinical dentistry were completed. The three-dimensional interactive webpages could be accessed through a web browser on a personal computer, and users could interact with the webpage by rotating, panning and zooming the virtual scene. It is technically feasible to implement a visual educational system of virtual reality for clinical dentistry based on WWW webpages. Information related to clinical dentistry can be transmitted properly, visually and interactively through three-dimensional webpages.
The benefits of virtual reality simulator training for laparoscopic surgery.
Hart, Roger; Karthigasu, Krishnan
2007-08-01
Virtual reality is a computer-generated system that provides a representation of an environment. This review will analyse the literature with regard to any benefit to be derived from training with virtual reality equipment and will describe the equipment currently available. Virtual reality systems are not currently realistic representations of the live operating environment because they lack tactile sensation and do not represent a complete operation. The literature suggests that virtual reality training is a valuable learning tool for gynaecologists in training, particularly those in the early stages of their careers. Furthermore, it may be of benefit for the ongoing audit of surgical skills and for the early identification of a surgeon's deficiencies before operative incidents occur. It is only a matter of time before realistic virtual reality models of most complete gynaecological operations are available, with improved haptics as a result of improved computer technology. It is inevitable that in the modern climate of litigation virtual reality training will become an essential part of clinical training, as evidence for its effectiveness as a training tool exists, and in many countries training by operating on live animals is not possible.
Webizing mobile augmented reality content
NASA Astrophysics Data System (ADS)
Ahn, Sangchul; Ko, Heedong; Yoo, Byounghyun
2014-01-01
This paper presents a content structure for building mobile augmented reality (AR) applications in HTML5 to achieve a clean separation of the mobile AR content and the application logic for scaling as on the Web. We propose that the content structure contains the physical world as well as virtual assets for mobile AR applications as document object model (DOM) elements and that their behaviour and user interactions are controlled through DOM events by representing objects and places with a uniform resource identifier. Our content structure enables mobile AR applications to be seamlessly developed as normal HTML documents under the current Web eco-system.
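The idea of treating physical places and virtual assets alike as addressable document elements, with behaviour attached through events dispatched to their URIs, can be sketched in a few lines. The element names, URIs, and event wiring below are invented for illustration and are not the authors' actual HTML5 vocabulary.

import xml.etree.ElementTree as ET

# Build a DOM-like content tree in which a physical place and a virtual
# asset are both elements identified by URIs (illustrative names only).
scene = ET.Element("ar-scene")
place = ET.SubElement(scene, "place", {"uri": "urn:example:room/lobby"})
asset = ET.SubElement(place, "asset",
                      {"uri": "urn:example:model/poster", "src": "poster.gltf"})

# Behaviour is attached by dispatching named events to element URIs.
handlers = {("urn:example:model/poster", "select"): lambda: print("poster selected")}

def dispatch(uri, event):
    """Route a user-interaction event to the handler registered for that URI."""
    handler = handlers.get((uri, event))
    if handler:
        handler()

dispatch(asset.get("uri"), "select")
print(ET.tostring(scene, encoding="unicode"))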
Web-based Three-dimensional Virtual Body Structures: W3D-VBS
Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex
2002-01-01
Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user’s progress through evaluation tools helps customize lesson plans. A self-guided “virtual tour” of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it. PMID:12223495
The Potential of Virtual Reality for the Investigation of Awe
Chirico, Alice; Yaden, David B.; Riva, Giuseppe; Gaggioli, Andrea
2016-01-01
The emotion of awe is characterized by the perception of vastness and a need for accommodation, which can include a positive and/or negative valence. While a number of studies have successfully manipulated this emotion, the issue of how to elicit particularly intense awe experiences in laboratory settings remains. We suggest that virtual reality (VR) is a particularly effective mood induction tool for eliciting awe. VR provides three key assets for improving awe. First, VR provides users with immersive and ecological yet controlled environments that can elicit a sense of “presence,” the subjective experience of “being there” in a simulated reality. Further, VR can be used to generate complex, vast stimuli, which can target specific theoretical facets of awe. Finally, VR allows for convenient tracking of participants’ behavior and physiological responses, allowing for more integrated assessment of emotional experience. We discussed the potential and challenges of the proposed approach with an emphasis on VR’s capacity to raise the signal of reactions to emotions such as awe in laboratory settings. PMID:27881970
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practises for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
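GPU-friendly space-filling rendering commonly ray-casts sphere glyphs per pixel instead of tessellating them, so the core operation is a ray-sphere intersection test. The Python sketch below is a CPU illustration of that general technique, not the HoloLens implementation evaluated in the paper.

import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along a normalized ray to its first intersection
    with a sphere, or None if the ray misses (the per-pixel test used when
    ray-casting sphere glyphs for space-filling molecular models)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = ox * direction[0] + oy * direction[1] + oz * direction[2]
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None                      # ray misses the atom
    t = -b - math.sqrt(disc)             # nearer of the two intersections
    return t if t >= 0.0 else None

# A ray along -z toward a 1.5-radius sphere centred at the origin:
print(ray_sphere_hit((0.0, 0.0, 5.0), (0.0, 0.0, -1.0), (0.0, 0.0, 0.0), 1.5))  # 3.5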
A microbased shared virtual world prototype
NASA Technical Reports Server (NTRS)
Pitts, Gerald; Robinson, Mark; Strange, Steve
1993-01-01
Virtual reality (VR) allows sensory immersion and interaction with a computer-generated environment. The user adopts a physical interface with the computer, through Input/Output devices such as a head-mounted display, data glove, mouse, keyboard, or monitor, to experience an alternate universe. What this means is that the computer generates an environment which, in its ultimate extension, becomes indistinguishable from the real world. 'Imagine a wraparound television with three-dimensional programs, including three-dimensional sound, and solid objects that you can pick up and manipulate, even feel with your fingers and hands.... 'Imagine that you are the creator as well as the consumer of your artificial experience, with the power to use a gesture or word to remold the world you see and hear and feel. That part is not fiction... three-dimensional computer graphics, input/output devices, computer models that constitute a VR system make it possible, today, to immerse yourself in an artificial world and to reach in and reshape it.' Our research's goal was to propose a feasibility experiment in the construction of a networked virtual reality system, making use of current personal computer (PC) technology. The prototype was built using Borland C compiler, running on an IBM 486 33 MHz and a 386 33 MHz. Each game currently is represented as an IPX client on a non-dedicated Novell server. We initially posed the two questions: (1) Is there a need for networked virtual reality? (2) In what ways can the technology be made available to the most people possible?
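For readers unfamiliar with the networking side, here is a minimal sketch of broadcasting avatar state between peers on a LAN; it substitutes UDP for the original IPX/Novell transport, and the message layout and port are invented for illustration.

```python
import socket
import struct

# Hypothetical wire format: player id plus position and heading (not from the paper).
STATE_FMT = "!I4f"   # uint32 id, then x, y, z, heading as floats

def send_state(sock, addr, player_id, x, y, z, heading):
    """Broadcast one avatar state update to the shared-world peers."""
    sock.sendto(struct.pack(STATE_FMT, player_id, x, y, z, heading), addr)

def recv_state(sock):
    """Receive and decode one avatar state update."""
    data, _ = sock.recvfrom(struct.calcsize(STATE_FMT))
    return struct.unpack(STATE_FMT, data)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
send_state(sock, ("255.255.255.255", 5005), 1, 0.0, 1.5, -2.0, 90.0)
```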
archAR: an archaeological augmented reality experience
NASA Astrophysics Data System (ADS)
Wiley, Bridgette; Schulze, Jürgen P.
2015-03-01
We present an application for Android phones or tablets called "archAR" that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD's Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm's Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to "zoom" into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.
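The touch-to-artifact interaction described above can be approximated by projecting the plotted artifact points to screen space and picking the one nearest the touch; this Python sketch uses a placeholder projection matrix rather than Vuforia's actual pose tracking, and all names are illustrative.

```python
import numpy as np

def project(points_world, mvp, viewport_w, viewport_h):
    """Project Nx3 world-space artifact points to pixel coordinates."""
    homo = np.hstack([points_world, np.ones((len(points_world), 1))])
    clip = homo @ mvp.T
    ndc = clip[:, :3] / clip[:, 3:4]
    x = (ndc[:, 0] * 0.5 + 0.5) * viewport_w
    y = (1.0 - (ndc[:, 1] * 0.5 + 0.5)) * viewport_h
    return np.stack([x, y], axis=1)

def pick_artifact(touch_xy, screen_pts, max_px=30.0):
    """Return the index of the plotted artifact nearest to the touch, if close enough."""
    d = np.linalg.norm(screen_pts - np.asarray(touch_xy), axis=1)
    i = int(np.argmin(d))
    return i if d[i] <= max_px else None
```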
[What do virtual reality tools bring to child and adolescent psychiatry?]
Bioulac, S; de Sevin, E; Sagaspe, P; Claret, A; Philip, P; Micoulaud-Franchi, J A; Bouvard, M P
2018-06-01
Virtual reality is a relatively new technology that enables individuals to immerse themselves in a virtual world. It offers several advantages, including a more realistic, lifelike environment that may allow subjects to "forget" they are being assessed, better participation, and increased generalization of learning. Moreover, the virtual reality system can provide multimodal stimuli, such as visual and auditory stimuli, and can also be used to evaluate the patient's multimodal integration and to aid rehabilitation of cognitive abilities. The use of virtual reality to treat various psychiatric disorders in adults (phobic anxiety disorders, post-traumatic stress disorder, eating disorders, addictions…) is well established and its efficacy is supported by numerous studies. Similar research for children and adolescents is lagging behind, although virtual reality may be particularly beneficial to children, who often show great interest and considerable success on computer, console, or videogame tasks. This article reviews the main studies that have used virtual reality with children and adolescents suffering from psychiatric disorders. The use of virtual reality to treat anxiety disorders in adults is gaining popularity and its efficacy is supported by various studies, most of which attest to the significant efficacy of virtual reality exposure therapy (or in virtuo exposure). In children, studies have covered arachnophobia, social anxiety, and school refusal phobia. Despite the limited number of studies, results are very encouraging for the treatment of anxiety disorders. Several studies have reported the clinical use of virtual reality technology for children and adolescents with autistic spectrum disorders (ASD). Extensive research has demonstrated the efficiency of these technologies as support tools for therapy, focusing on communication and on learning and social imitation skills. Virtual reality is also well accepted by subjects with ASD. The virtual environment offers the opportunity to administer controlled tasks, as typical neuropsychological tools do, but in an environment much more like a standard classroom. The virtual reality classroom thus offers several advantages over classical tools, such as a more realistic and lifelike environment, while also recording various measures under standardized conditions. Most of the studies using a virtual classroom have found that children with Attention Deficit/Hyperactivity Disorder make significantly fewer correct hits and more commission errors compared with controls. The virtual classroom has proven to be a good clinical tool for the evaluation of attention in ADHD. For eating disorders, a cognitive behavioural therapy (CBT) program enhanced by a body-image-specific component using virtual reality techniques was shown to be more efficient than cognitive behavioural therapy alone; the body-image-specific component boosts efficiency and accelerates the CBT change process. Virtual reality is a relatively new technology and its application in child and adolescent psychiatry is recent. The technique is still in its infancy and much work is needed, including controlled trials, before it can be introduced into routine clinical use. Virtual reality interventions should also investigate how newly acquired skills are transferred to the real world. At present, virtual reality can be considered a useful tool in the evaluation and treatment of child and adolescent disorders. Copyright © 2017 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
Improving Balance in TBI Using a Low-Cost Customized Virtual Reality Rehabilitation Tool
2016-10-01
AWARD NUMBER: W81XWH-14-2-0150. TITLE: Improving Balance in TBI Using a Low-Cost Customized Virtual Reality Rehabilitation Tool. Distribution Unlimited. ABSTRACT: The proposed study will implement and evaluate a novel, low-cost, Virtual Reality (VR...
Virtual reality for dermatologic surgery: virtually a reality in the 21st century.
Gladstone, H B; Raugi, G J; Berg, D; Berkley, J; Weghorst, S; Ganter, M
2000-01-01
In the 20th century, virtual reality has predominantly played a role in training pilots and in the entertainment industry. Despite much publicity, virtual reality did not live up to its perceived potential. During the past decade, it has also been applied for medical uses, particularly as training simulators, for minimally invasive surgery. Because of advances in computer technology, virtual reality is on the cusp of becoming an effective medical educational tool. At the University of Washington, we are developing a virtual reality soft tissue surgery simulator. Based on fast finite element modeling and using a personal computer, this device can simulate three-dimensional human skin deformations with real-time tactile feedback. Although there are many cutaneous biomechanical challenges to solve, it will eventually provide more realistic dermatologic surgery training for medical students and residents than the currently used models.
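As a hedged sketch of the kind of finite element step such a simulator performs (solve K u = f for nodal displacements, then feed a reaction force back to the haptic device), the toy one-dimensional chain below stands in for a soft-tissue mesh; the stiffness, load, and boundary conditions are invented and do not reproduce the simulator's model.

```python
import numpy as np

# Toy 1D chain of linear elements standing in for a soft-tissue mesh (illustrative only).
n_nodes, k = 5, 120.0                  # node count, element stiffness (N/m)
K = np.zeros((n_nodes, n_nodes))
for e in range(n_nodes - 1):           # assemble element stiffness matrices
    K[e:e+2, e:e+2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

f = np.zeros(n_nodes)
f[-1] = 0.5                            # 0.5 N applied by the virtual instrument at the tip
K_bc, f_bc = K[1:, 1:], f[1:]          # fix node 0 (tissue attached to bone)
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K_bc, f_bc)    # nodal displacements

reaction = k * (u[-1] - u[-2])         # force returned to the haptic device
print(u, reaction)
```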
An Intelligent Virtual Human System For Providing Healthcare Information And Support
2011-01-01
Over the last 15 years, a virtual revolution has taken place in the use of Virtual Reality for clinical purposes. Shifts in the social and scientific landscape have now set the stage for the next major movement in Clinical Virtual Reality with the "birth" of intelligent virtual humans. Seminal research and development has appeared in the creation of highly interactive...
Virtual reality for stroke rehabilitation.
Laver, Kate E; George, Stacey; Thomas, Susie; Deutsch, Judith E; Crotty, Maria
2011-09-07
Virtual reality and interactive video gaming have emerged as new treatment approaches in stroke rehabilitation. In particular, commercial gaming consoles are being rapidly adopted in clinical settings; however, there is currently little information about their effectiveness. To evaluate the effects of virtual reality and interactive video gaming on upper limb, lower limb and global motor function after stroke. We searched the Cochrane Stroke Group Trials Register (March 2010), the Cochrane Central Register of Controlled Trials (The Cochrane Library 2010, Issue 1), MEDLINE (1950 to March 2010), EMBASE (1980 to March 2010) and seven additional databases. We also searched trials registries, conference proceedings, reference lists and contacted key researchers in the area and virtual reality equipment manufacturers. Randomised and quasi-randomised trials of virtual reality ('an advanced form of human-computer interface that allows the user to 'interact' with and become 'immersed' in a computer-generated environment in a naturalistic fashion') in adults after stroke. The primary outcomes of interest were: upper limb function and activity, gait and balance function and activity and global motor function. Two review authors independently selected trials based on pre-defined inclusion criteria, extracted data and assessed risk of bias. A third review author moderated disagreements when required. The authors contacted all investigators to obtain missing information. We included 19 trials which involved 565 participants. Study sample sizes were generally small and interventions and outcome measures varied, limiting the extent to which studies could be compared. Intervention approaches in the included studies were predominantly designed to improve motor function rather than cognitive function or activity performance. The majority of participants were relatively young and more than one year post stroke. Results were statistically significant for arm function (standardised mean difference (SMD) 0.53, 95% confidence intervals (CI) 0.25 to 0.81 based on seven studies with 205 participants). There were no statistically significant effects for grip strength or gait speed. We were unable to determine the effect on global motor function due to insufficient numbers of comparable studies. Results were statistically significant for activities of daily living (ADL) outcome (SMD 0.81, 95% CI 0.39 to 1.22 based on three studies with 101 participants); however, we were unable to pool results for cognitive function, participation restriction and quality of life or imaging studies. There were few adverse events reported across studies and those reported were relatively mild. Studies that reported on eligibility rates showed that only 34% (standard deviation (SD) 26, range 17 to 80) of participants screened were recruited. We found limited evidence that the use of virtual reality and interactive video gaming may be beneficial in improving arm function and ADL function when compared with the same dose of conventional therapy. There was insufficient evidence to reach conclusions about the effect of virtual reality and interactive video gaming on grip strength or gait speed. It is unclear at present which characteristics of virtual reality are most important and it is unknown whether effects are sustained in the longer term. Furthermore, there are currently very few studies evaluating the use of commercial gaming consoles (such as the Nintendo Wii).
NASA Astrophysics Data System (ADS)
Narasimha Rao, Gudikandhula; Jagadeeswara Rao, Peddada; Duvvuru, Rajesh
2016-09-01
Wildfires have a significant impact on the atmosphere and on lives. Predicting the exact burned area in a forest can help fire management teams, for example by using drones as robots. Such flexible, inexpensive, elevated-motion remote sensing systems that use drones as platforms are important for filling substantial data gaps and for supplementing the capabilities of manned aircraft and satellite remote sensing systems. In addition, powerful computational tools are essential for predicting the area burned over the course of a forest fire. The purpose of this study is to build a smart system based on semantic neural networking, the Semantic Neural Network System (SNNS), for forecasting burned areas. A virtual reality simulator is used to support the training of firefighters and other users in protecting the surrounding wildlife. Semantics are valuable first of all for obtaining an enhanced representation of the burned-area prediction and a better adaptation of the simulated situation to the users. In particular, results obtained with geometric semantic neural networking are substantially superior to other methods. This study suggests that deeper investigation of neural networks in the field of forest fire prediction could be productive.
A virtual reality system for arm and hand rehabilitation
NASA Astrophysics Data System (ADS)
Luo, Zhiqiang; Lim, Chee Kian; Chen, I.-Ming; Yeo, Song Huat
2011-03-01
This paper presents a virtual reality (VR) system for upper limb rehabilitation. The system incorporates two motion-tracking components, the Arm Suit and the Smart Glove, which are built from optical linear encoders (OLE) and inertial measurement units (IMU), and two interactive practice applications designed to drive users to perform the required functional and non-functional motor recovery tasks. We describe the technical details of the two motion-tracking components and the rationale behind the design of the two practice applications. The experimental results show that, compared with a marker-based tracking system, the Arm Suit can accurately track elbow and wrist positions, and the repeatability of the Smart Glove in measuring the movement of the five fingers is satisfactory. Given its low cost, high accuracy, and easy installation, the system promises to be a valuable complement to conventional therapeutic programs offered in rehabilitation clinics and at home.
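A common way to fuse a drifting gyroscope rate with a stable absolute angle reading (such as one derived from an optical linear encoder) is a complementary filter; the sketch below is a generic illustration, and the coefficient, rates, and signal names are assumptions rather than the Arm Suit's actual algorithm.

```python
def complementary_filter(angle, gyro_rate, abs_angle, dt, alpha=0.98):
    """Fuse a fast-but-drifting gyro integral with a slow-but-stable absolute angle.

    angle      -- current fused joint angle estimate (deg)
    gyro_rate  -- angular rate from the IMU gyroscope (deg/s)
    abs_angle  -- absolute angle, e.g. derived from an optical linear encoder (deg)
    """
    predicted = angle + gyro_rate * dt                      # integrate the gyro (drifts)
    return alpha * predicted + (1.0 - alpha) * abs_angle    # correct drift with the OLE reading

# Example: 100 Hz updates while the elbow flexes at roughly 10 deg/s
angle, true_angle = 0.0, 0.0
for _ in range(100):                                        # one second of motion
    true_angle += 10.0 * 0.01
    angle = complementary_filter(angle, gyro_rate=10.0, abs_angle=true_angle, dt=0.01)
print(angle)                                                # close to 10 degrees
```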
Using Virtual Reality to Dynamically Setting an Electrical Wheelchair
NASA Astrophysics Data System (ADS)
Dir, S.; Habert, O.; Pruski, A.
2008-06-01
This work uses virtual reality to find or iteratively refine the best fit between a person with a physical disability and his or her electrical wheelchair. A system architecture based on "Experiment→Analyze and decide→Modify the wheelchair" cycles is proposed. This architecture uses a decision-making module based on a fuzzy inference system, which has to be parameterized so that the system converges quickly towards the optimal solution. The first challenge consists in computing criteria that represent, as well as possible, the particular situations the user meets during each navigation experiment. The second challenge consists in transforming these criteria into relevant modifications of the active or inactive functionalities, or into adjustments of the wheelchair's intrinsic settings. These modifications must remain as stable as possible across successive experiments. The objective is to find the best wheelchair configuration to provide a starting point for mobility for a given person with a physical disability.
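To make the fuzzy inference idea concrete, the following toy Mamdani-style sketch maps two navigation criteria to a speed adjustment; the linguistic variables, membership functions, and rules are invented for illustration and do not reproduce the authors' module.

```python
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adjust_max_speed(collision_rate, completion_time):
    """Toy Mamdani inference: map two navigation criteria to a speed adjustment."""
    many_collisions = tri(collision_rate, 0.2, 0.6, 1.0)
    clean_run = tri(collision_rate, -0.4, 0.0, 0.4)
    slow_run = tri(completion_time, 60.0, 120.0, 180.0)

    # Rule strengths (min for AND), each rule pointing at a singleton output value.
    reduce_speed = many_collisions                   # IF many collisions THEN reduce
    keep_speed = min(clean_run, slow_run)            # IF clean but slow THEN keep
    raise_speed = min(clean_run, 1.0 - slow_run)     # IF clean and fast THEN raise

    outputs = {-0.2: reduce_speed, 0.0: keep_speed, 0.2: raise_speed}
    total = sum(outputs.values()) or 1.0
    return sum(v * w for v, w in outputs.items()) / total   # centroid of singletons

print(adjust_max_speed(collision_rate=0.5, completion_time=90.0))
```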
Virtual reality in anxiety disorders: the past and the future.
Gorini, Alessandra; Riva, Giuseppe
2008-02-01
One of the most effective treatments of anxiety is exposure therapy: a person is exposed to specific feared situations or objects that trigger anxiety. This exposure process may be done through actual exposure, with visualization, by imagination or using virtual reality (VR), that provides users with computer simulated environments with and within which they can interact. VR is made possible by the capability of computers to synthesize a 3D graphical environment from numerical data. Furthermore, because input devices sense the subject's reactions and motions, the computer can modify the synthetic environment accordingly, creating the illusion of interacting with, and thus being immersed within the environment. Starting from 1995, different experimental studies have been conducted in order to investigate the effect of VR exposure in the treatment of subclinical fears and anxiety disorders. This review will discuss their outcome and provide guidelines for the use of VR exposure for the treatment of anxious patients.
NASA Technical Reports Server (NTRS)
Schulte, Erin
2017-01-01
As augmented and virtual reality grows in popularity, and more researchers focus on its development, other fields of technology have grown in the hopes of integrating with the up-and-coming hardware currently on the market. Namely, there has been a focus on how to make an intuitive, hands-free human-computer interaction (HCI) utilizing AR and VR that allows users to control their technology with little to no physical interaction with hardware. Computer vision, which is utilized in devices such as the Microsoft Kinect, webcams and other similar hardware has shown potential in assisting with the development of a HCI system that requires next to no human interaction with computing hardware and software. Object and facial recognition are two subsets of computer vision, both of which can be applied to HCI systems in the fields of medicine, security, industrial development and other similar areas.
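As one concrete example of the off-the-shelf computer vision building blocks mentioned above, the sketch below runs OpenCV's bundled Haar cascade face detector on a webcam frame (assuming the opencv-python package is installed); it is illustrative only and not code from the report.

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)              # webcam, one of the hardware options mentioned above
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:         # each detection could drive a hands-free interaction
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cap.release()
```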
Fransson, Boel A; Chen, Chi-Ya; Noyes, Julie A; Ragle, Claude A
2016-11-01
To determine the construct and concurrent validity of instrument motion metrics for laparoscopic skills assessment in virtual reality and augmented reality simulators. Evaluation study. Veterinarian students (novice, n = 14) and veterinarians (experienced, n = 11) with no or variable laparoscopic experience. Participants' minimally invasive surgery (MIS) experience was determined by hospital records of MIS procedures performed in the Teaching Hospital. Basic laparoscopic skills were assessed by 5 tasks using a physical box trainer. Each participant completed 2 tasks for assessments in each type of simulator (virtual reality: bowel handling and cutting; augmented reality: object positioning and a pericardial window model). Motion metrics such as instrument path length, angle or drift, and economy of motion of each simulator were recorded. None of the motion metrics in a virtual reality simulator showed correlation with experience, or to the basic laparoscopic skills score. All metrics in augmented reality were significantly correlated with experience (time, instrument path, and economy of movement), except for the hand dominance metric. The basic laparoscopic skills score was correlated to all performance metrics in augmented reality. The augmented reality motion metrics differed between American College of Veterinary Surgeons diplomates and residents, whereas basic laparoscopic skills score and virtual reality metrics did not. Our results provide construct validity and concurrent validity for motion analysis metrics for an augmented reality system, whereas a virtual reality system was validated only for the time score. © Copyright 2016 by The American College of Veterinary Surgeons.
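Definitions of motion metrics vary between simulators, but a common formulation of instrument path length and economy of movement from sampled 3D tip positions looks like the sketch below; the sample data are invented, and this is not necessarily the formulation used in the study.

```python
import numpy as np

def path_length(tip_positions):
    """Total distance travelled by the instrument tip (Nx3 samples, in mm)."""
    steps = np.diff(tip_positions, axis=0)
    return float(np.sum(np.linalg.norm(steps, axis=1)))

def economy_of_movement(tip_positions):
    """Ratio of straight-line start-to-end distance to actual path length (1.0 = ideal)."""
    direct = float(np.linalg.norm(tip_positions[-1] - tip_positions[0]))
    total = path_length(tip_positions)
    return direct / total if total > 0 else 0.0

tip = np.array([[0, 0, 0], [10, 2, 0], [20, -1, 3], [30, 0, 0]], dtype=float)
print(path_length(tip), economy_of_movement(tip))
```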
Real-time 3D human capture system for mixed-reality art and entertainment.
Nguyen, Ta Huynh Duy; Qui, Tran Cong Thien; Xu, Ke; Cheok, Adrian David; Teo, Sze Lee; Zhou, ZhiYing; Mallawaarachchi, Asitha; Lee, Shang Ping; Liu, Wei; Teo, Hui Siang; Thang, Le Nam; Li, Yu; Kato, Hirokazu
2005-01-01
A real-time system for capturing humans in 3D and placing them into a mixed reality environment is presented in this paper. The subject is captured by nine cameras surrounding her. Looking through a head-mounted display with a camera in front pointing at a marker, the user can see the 3D image of this subject overlaid onto a mixed reality scene. The 3D images of the subject viewed from this viewpoint are constructed using a robust and fast shape-from-silhouette algorithm. The paper also presents several techniques to improve image quality and speed up the whole system. The frame rate of our system is around 25 fps using only standard Intel processor-based personal computers. Besides a remote live 3D conferencing and collaborating system, we also describe an application of the system in art and entertainment, named Magic Land, which is a mixed reality environment where captured human avatars and 3D computer-generated virtual animations can form an interactive story and play with each other. This system demonstrates many technologies in human computer interaction: mixed reality, tangible interaction, and 3D communication. The result of the user study not only emphasizes the benefits, but also addresses some issues of these technologies.
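The shape-from-silhouette idea can be sketched as coarse voxel carving: a voxel survives only if it projects inside the foreground silhouette in every camera. The projection matrices and masks below are placeholders, and the paper's real-time algorithm is considerably more optimized than this illustration.

```python
import numpy as np

def carve(voxel_centers, cameras, silhouettes):
    """Keep voxels whose projections fall inside every binary silhouette mask.

    voxel_centers -- Nx3 world coordinates
    cameras       -- list of 3x4 projection matrices (placeholder calibration)
    silhouettes   -- list of HxW boolean foreground masks, one per camera
    """
    keep = np.ones(len(voxel_centers), dtype=bool)
    homo = np.hstack([voxel_centers, np.ones((len(voxel_centers), 1))])
    for P, mask in zip(cameras, silhouettes):
        h, w = mask.shape
        proj = homo @ P.T                       # project all voxels into this view
        u = (proj[:, 0] / proj[:, 2]).round().astype(int)
        v = (proj[:, 1] / proj[:, 2]).round().astype(int)
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        fg = np.zeros(len(voxel_centers), dtype=bool)
        fg[inside] = mask[v[inside], u[inside]]
        keep &= fg                              # carve away voxels outside any silhouette
    return voxel_centers[keep]
```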
Chen, Karen B; Ponto, Kevin; Tredinnick, Ross D; Radwin, Robert G
2015-06-01
This study was a proof of concept for virtual exertions, a novel method that involves the use of body tracking and electromyography for grasping and moving projections of objects in virtual reality (VR). The user views objects in his or her hands during rehearsed co-contractions of the same agonist-antagonist muscles normally used for the desired activities to suggest exerting forces. Unlike physical objects, virtual objects are images and lack mass. There is currently no practical physically demanding way to interact with virtual objects to simulate strenuous activities. Eleven participants grasped and lifted similar physical and virtual objects of various weights in an immersive 3-D Cave Automatic Virtual Environment. Muscle activity, localized muscle fatigue, ratings of perceived exertions, and NASA Task Load Index were measured. Additionally, the relationship between levels of immersion (2-D vs. 3-D) was studied. Although the overall magnitude of biceps activity and workload were greater in VR, muscle activity trends and fatigue patterns for varying weights within VR and physical conditions were the same. Perceived exertions for varying weights were not significantly different between VR and physical conditions. Perceived exertion levels and muscle activity patterns corresponded to the assigned virtual loads, which supported the hypothesis that the method evoked the perception of physical exertions and showed that the method was promising. Ultimately this approach may offer opportunities for research and training individuals to perform strenuous activities under potentially safer conditions that mimic situations while seeing their own body and hands relative to the scene. © 2014, Human Factors and Ergonomics Society.
Schmitt, Yuko S; Hoffman, Hunter G; Blough, David K; Patterson, David R; Jensen, Mark P; Soltani, Maryam; Carrougher, Gretchen J; Nakamura, Dana; Sharar, Sam R
2011-02-01
This randomized, controlled, within-subjects (crossover design) study examined the effects of immersive virtual reality as an adjunctive analgesic technique for hospitalized pediatric burn inpatients undergoing painful physical therapy. Fifty-four subjects (6-19 years old) performed range-of-motion exercises under a therapist's direction for 1-5 days. During each session, subjects spent equivalent time in both the virtual reality and the control conditions (treatment order randomized and counterbalanced). Graphic rating scale scores assessing the sensory, affective, and cognitive components of pain were obtained for each treatment condition. Secondary outcomes assessed subjects' perception of the virtual reality experience and maximum range-of-motion. Results showed that on study day one, subjects reported significant decreases (27-44%) in pain ratings during virtual reality. They also reported improved affect ("fun") during virtual reality. The analgesia and affect improvements were maintained with repeated virtual reality use over multiple therapy sessions. Maximum range-of-motion was not different between treatment conditions, but was significantly greater after the second treatment condition (regardless of treatment order). These results suggest that immersive virtual reality is an effective nonpharmacologic, adjunctive pain reduction technique in the pediatric burn population undergoing painful rehabilitation therapy. The magnitude of the analgesic effect is clinically meaningful and is maintained with repeated use. Copyright © 2010 Elsevier Ltd and ISBI. All rights reserved.
Thomsen, Ann Sofia Skou; Bach-Holm, Daniella; Kjærbo, Hadi; Højgaard-Olsen, Klavs; Subhi, Yousif; Saleh, George M; Park, Yoon Soo; la Cour, Morten; Konge, Lars
2017-04-01
To investigate the effect of virtual reality proficiency-based training on actual cataract surgery performance. The secondary purpose of the study was to define which surgeons benefit from virtual reality training. Multicenter masked clinical trial. Eighteen cataract surgeons with different levels of experience. Cataract surgical training on a virtual reality simulator (EyeSi) until a proficiency-based test was passed. Technical performance in the operating room (OR) assessed by 3 independent, masked raters using a previously validated task-specific assessment tool for cataract surgery (Objective Structured Assessment of Cataract Surgical Skill). Three surgeries before and 3 surgeries after the virtual reality training were video-recorded, anonymized, and presented to the raters in random order. Novices (non-independently operating surgeons) and surgeons having performed fewer than 75 independent cataract surgeries showed significant improvements in the OR (32% and 38%, respectively) after virtual reality training (P = 0.008 and P = 0.018). More experienced cataract surgeons did not benefit from simulator training. The reliability of the assessments was high with a generalizability coefficient of 0.92 and 0.86 before and after the virtual reality training, respectively. Clinically relevant cataract surgical skills can be improved by proficiency-based training on a virtual reality simulator. Novices as well as surgeons with an intermediate level of experience showed improvement in OR performance score. Copyright © 2017 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Effective Communication with Cultural Heritage Using Virtual Technologies
NASA Astrophysics Data System (ADS)
Reffat, R. M.; Nofal, E. M.
2013-07-01
Cultural heritage is neither static nor stable. There is a need to explore ways for effectively communicating with cultural heritage to tourists and society at large, in an age of immediacy, a time of multiple realities and to multi-cultural tourists. It is vital to consider cultural heritage as a creative and relational process where places and communities are constantly remade through creative performance. The paper introduces virtual technologies as an approach to attain effective communication with cultural heritage. This approach emphasizes the importance of "user, content and context" in guiding the production of virtual heritage, as opposed to technology being the sole motivator. It addresses how these three issues in virtual heritage need to be transformed from merely representing quantitative data towards cultural information using the proposed effective communication triangle through representing meaningful relationships between cultural heritage elements, users and context. The paper offers a focused articulation of a proposed computational platform of "interactive, personalized and contextual-based navigation" with Egyptian heritage monuments as one step forward towards achieving effective communication with Egyptian cultural heritage.
Exploring 4D Flow Data in an Immersive Virtual Environment
NASA Astrophysics Data System (ADS)
Stevens, A. H.; Butkiewicz, T.
2017-12-01
Ocean models help us to understand and predict a wide range of intricate physical processes which comprise the atmospheric and oceanic systems of the Earth. Because these models output an abundance of complex time-varying three-dimensional (i.e., 4D) data, effectively conveying the myriad information from a given model poses a significant visualization challenge. The majority of the research effort into this problem has concentrated around synthesizing and examining methods for representing the data itself; by comparison, relatively few studies have looked into the potential merits of various viewing conditions and virtual environments. We seek to improve our understanding of the benefits offered by current consumer-grade virtual reality (VR) systems through an immersive, interactive 4D flow visualization system. Our dataset is a Regional Ocean Modeling System (ROMS) model representing a 12-hour tidal cycle of the currents within New Hampshire's Great Bay estuary. The model data was loaded into a custom VR particle system application using the OpenVR software library and the HTC Vive hardware, which tracks a headset and two six-degree-of-freedom (6DOF) controllers within a 5m-by-5m area. The resulting visualization system allows the user to coexist in the same virtual space as the data, enabling rapid and intuitive analysis of the flow model through natural interactions with the dataset and within the virtual environment. Whereas a traditional computer screen typically requires the user to reposition a virtual camera in the scene to obtain the desired view of the data, in virtual reality the user can simply move their head to the desired viewpoint, completely eliminating the mental context switches from data exploration/analysis to view adjustment and back. The tracked controllers become tools to quickly manipulate (reposition, reorient, and rescale) the dataset and to interrogate it by, e.g., releasing dye particles into the flow field, probing scalar velocities, placing a cutting plane through a region of interest, etc. It is hypothesized that the advantages afforded by head-tracked viewing and 6DOF interaction devices will lead to faster and more efficient examination of 4D flow data. A human factors study is currently being prepared to empirically evaluate this method of visualization and interaction.
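A minimal sketch of the dye-particle advection underlying such a flow visualization is shown below: particles are stepped through a time-varying velocity field with a midpoint integrator. The grid layout, nearest-neighbour sampling, and synthetic tidal field are illustrative choices, not details of the authors' system.

```python
import numpy as np

def sample_velocity(field, pos, t):
    """Nearest-neighbour lookup in a (T, H, W, 2) velocity field on a unit grid."""
    ti = int(np.clip(round(t), 0, field.shape[0] - 1))
    yi = int(np.clip(round(pos[1]), 0, field.shape[1] - 1))
    xi = int(np.clip(round(pos[0]), 0, field.shape[2] - 1))
    return field[ti, yi, xi]

def advect(field, particles, t, dt):
    """Advance dye particles one step with a midpoint (RK2) integrator."""
    out = []
    for p in particles:
        v1 = sample_velocity(field, p, t)
        mid = p + 0.5 * dt * v1
        v2 = sample_velocity(field, mid, t + 0.5 * dt)
        out.append(p + dt * v2)
    return np.array(out)

# Tiny synthetic tide: uniform flow that reverses halfway through the cycle.
field = np.ones((12, 16, 16, 2))
field[6:] *= -1.0
particles = np.array([[8.0, 8.0], [4.0, 10.0]])
for step in range(12):
    particles = advect(field, particles, t=step, dt=1.0)
```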
Pan, Xueni; Hamilton, Antonia F de C
2018-03-05
As virtual reality (VR) technology and systems become more commercially available and accessible, more and more psychologists are starting to integrate VR as part of their methods. This approach offers major advantages in experimental control, reproducibility, and ecological validity, but also has limitations and hidden pitfalls which may distract the novice user. This study aimed to guide the psychologist into the novel world of VR, reviewing available instrumentation and mapping the landscape of possible systems. We use examples of state-of-the-art research to describe challenges which research is now solving, including embodiment, uncanny valley, simulation sickness, presence, ethics, and experimental design. Finally, we propose that the biggest challenge for the field would be to build a fully interactive virtual human who can pass a VR Turing test - and that this could only be achieved if psychologists, VR technologists, and AI researchers work together. © 2018 The Authors British Journal of Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.
Proof-of-Concept Part Task Trainer for Close Air Support Procedures
2016-06-01
...in training of USMC pilots for close air support operations? What is the feasibility of developing a prototype virtual reality (VR) system that... Chapter IV provides a review of virtual reality (VR)/virtual environment (VE) and part-task trainers currently used in military training.
Virtual reality training for surgical trainees in laparoscopic surgery.
Nagendran, Myura; Gurusamy, Kurinchi Selvan; Aggarwal, Rajesh; Loizidou, Marilena; Davidson, Brian R
2013-08-27
Standard surgical training has traditionally been one of apprenticeship, where the surgical trainee learns to perform surgery under the supervision of a trained surgeon. This is time-consuming, costly, and of variable effectiveness. Training using a virtual reality simulator is an option to supplement standard training. Virtual reality training improves the technical skills of surgical trainees such as decreased time for suturing and improved accuracy. The clinical impact of virtual reality training is not known. To assess the benefits (increased surgical proficiency and improved patient outcomes) and harms (potentially worse patient outcomes) of supplementary virtual reality training of surgical trainees with limited laparoscopic experience. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) in The Cochrane Library, MEDLINE, EMBASE and Science Citation Index Expanded until July 2012. We included all randomised clinical trials comparing virtual reality training versus other forms of training including box-trainer training, no training, or standard laparoscopic training in surgical trainees with little laparoscopic experience. We also planned to include trials comparing different methods of virtual reality training. We included only trials that assessed the outcomes in people undergoing laparoscopic surgery. Two authors independently identified trials and collected data. We analysed the data with both the fixed-effect and the random-effects models using Review Manager 5 analysis. For each outcome we calculated the mean difference (MD) or standardised mean difference (SMD) with 95% confidence intervals based on intention-to-treat analysis. We included eight trials covering 109 surgical trainees with limited laparoscopic experience. Of the eight trials, six compared virtual reality versus no supplementary training. One trial compared virtual reality training versus box-trainer training and versus no supplementary training, and one trial compared virtual reality training versus box-trainer training. There were no trials that compared different forms of virtual reality training. All the trials were at high risk of bias. Operating time and operative performance were the only outcomes reported in the trials. The remaining outcomes such as mortality, morbidity, quality of life (the primary outcomes of this review) and hospital stay (a secondary outcome) were not reported. Virtual reality training versus no supplementary training: The operating time was significantly shorter in the virtual reality group than in the no supplementary training group (3 trials; 49 participants; MD -11.76 minutes; 95% CI -15.23 to -8.30). Two trials that could not be included in the meta-analysis also showed a reduction in operating time (statistically significant in one trial). The numerical values for operating time were not reported in these two trials. The operative performance was significantly better in the virtual reality group than the no supplementary training group using the fixed-effect model (2 trials; 33 participants; SMD 1.65; 95% CI 0.72 to 2.58). The results became non-significant when the random-effects model was used (2 trials; 33 participants; SMD 2.14; 95% CI -1.29 to 5.57). One trial could not be included in the meta-analysis as it did not report the numerical values. The authors stated that the operative performance of virtual reality group was significantly better than the control group. 
Virtual reality training versus box-trainer training: The only trial that reported operating time did not report the numerical values. In this trial, the operating time in the virtual reality group was significantly shorter than in the box-trainer group. Of the two trials that reported operative performance, only one trial reported the numerical values. The operative performance was significantly better in the virtual reality group than in the box-trainer group (1 trial; 19 participants; SMD 1.46; 95% CI 0.42 to 2.50). In the other trial that did not report the numerical values, the authors stated that the operative performance in the virtual reality group was significantly better than the box-trainer group. Virtual reality training appears to decrease the operating time and improve the operative performance of surgical trainees with limited laparoscopic experience when compared with no training or with box-trainer training. However, the impact of this decreased operating time and improvement in operative performance on patients and healthcare funders in terms of improved outcomes or decreased costs is not known. Further well-designed trials at low risk of bias and random errors are necessary. Such trials should assess the impact of virtual reality training on clinical outcomes.
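The flip from a significant fixed-effect result to a non-significant random-effects result reported above can be reproduced on made-up numbers with standard inverse-variance and DerSimonian-Laird pooling; the effect sizes below are invented and are not the review's data.

```python
import numpy as np

def pool_fixed(effects, variances):
    """Inverse-variance fixed-effect pooled estimate and its 95% CI."""
    w = 1.0 / variances
    est = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)

def pool_random(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate and its 95% CI."""
    w = 1.0 / variances
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)     # between-study variance
    w_star = 1.0 / (variances + tau2)
    est = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Two heterogeneous hypothetical trials: the random-effects CI widens to cross zero.
smd = np.array([0.4, 3.2])
var = np.array([0.10, 0.25])
print(pool_fixed(smd, var))
print(pool_random(smd, var))
```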
Virtual Reality simulator for dental anesthesia training in the inferior alveolar nerve block.
Corrêa, Cléber Gimenez; Machado, Maria Aparecida de Andrade Moreira; Ranzini, Edith; Tori, Romero; Nunes, Fátima de Lourdes Santos
2017-01-01
This study shows the development and validation of a dental anesthesia-training simulator, specifically for the inferior alveolar nerve block (IANB). The system developed provides the tactile sensation of inserting a real needle in a human patient, using Virtual Reality (VR) techniques and a haptic device that can provide a perceived force feedback in the needle insertion task during the anesthesia procedure. To simulate a realistic anesthesia procedure, a Carpule syringe was coupled to a haptic device. The Volere method was used to elicit requirements from users in the Dentistry area; repeated-measures two-way ANOVA (analysis of variance), Tukey post-hoc tests, and averages were used for the analysis of the results. A questionnaire-based subjective evaluation method was applied to collect information about the simulator, and 26 people participated in the experiments (12 beginners, 12 at intermediate level, and 2 experts). The questionnaire included profile, preferences (number of viewpoints, texture of the objects, and haptic device handler), as well as visual (appearance, scale, and position of objects) and haptic aspects (motion space, tactile sensation, and motion reproduction). The visual aspect was considered appropriate, while the haptic feedback must be improved, which the users can do by calibrating the virtual tissues' resistance. The evaluation of visual aspects was influenced by the participants' experience, according to the ANOVA test (F=15.6, p=0.0002, with p<0.01). The user preferences were the simulator with two viewpoints, objects with texture based on images and the device with a syringe coupled to it. The simulation was considered thoroughly satisfactory for the anesthesia training, considering the needle insertion task, which includes the correct insertion point and depth, as well as the perception of tissue resistance during the insertion.
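Haptic needle simulators often approximate insertion with a piecewise force model: elastic loading before puncture, then friction plus a cutting term. The sketch below illustrates that pattern with placeholder parameters; it is not the tissue model developed in the study.

```python
def needle_force(depth_mm, punctured, k=0.6, f_puncture=1.8, friction=0.05, cutting=0.9):
    """Return (force_N, punctured) for the current insertion depth.

    Before puncture the tissue behaves like a spring; once the elastic force
    exceeds the puncture threshold, the force drops to friction plus a constant
    cutting term. Values are placeholders, not calibrated tissue parameters.
    """
    if not punctured:
        force = k * depth_mm                   # elastic deformation of the mucosa
        if force >= f_puncture:
            punctured = True                   # membrane gives way
            force = cutting + friction * depth_mm
    else:
        force = cutting + friction * depth_mm  # post-puncture friction and cutting
    return force, punctured

punctured = False
for depth in [0.5, 1.5, 2.5, 3.5, 5.0]:        # mm, fed to the haptic device each frame
    f, punctured = needle_force(depth, punctured)
    print(depth, round(f, 2), punctured)
```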
Stability and Workload of the Virtual Reality-Based Simulator-2.
Kamaraj, Deepan C; Dicianno, Brad E; Mahajan, Harshal P; Buhari, Alhaji M; Cooper, Rory A
2016-07-01
To assess the stability of clinicians' and users' rating of electric-powered wheelchair (EPW) driving while using 4 different human-machine interfaces (HMIs) within the Virtual Reality-based SIMulator-version 2 (VRSIM-2) and in the real world (accounting for a total of 5 unique driving conditions). Within-subjects repeated-measures design. Simulation-based assessment in a research laboratory. A convenience sample of EPW athletes (N=21) recruited at the 31st National Veterans Wheelchair Games. Not applicable. Composite PMRT scores from the Power Mobility Road Test (PMRT); Raw Task Load Index; and the 6 subscale scores from the Task Load Index developed by the National Aeronautics and Space Administration (NASA-TLX). There was moderate stability (intraclass correlation coefficient between .50 and .75) in the total composite PMRT scores (P<.001) and the users' self-reported performance scores (P<.001) among the 5 driving conditions. There was a significant difference in the workload among the 5 different driving conditions as reflected by the Raw Task Load Index (P=.009). Subanalyses revealed this difference was due to the difference in the mental demand (P=.007) and frustration (P=.007) subscales. Post hoc analyses revealed that these differences in the NASA-TLX subscale scores were due to the differences between real-world and virtual driving scores, particularly attributable to the conditions (1 and 3) that lacked the rollers as a part of the simulation. Further design improvements in the simulator to increase immersion experienced by the EPW user, along with a standardized training program for clinicians to deliver PMRT in VRSIM-2, could improve the stability between the different HMIs and real-world driving. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Assessment of individual hand performance in box trainers compared to virtual reality trainers.
Madan, Atul K; Frantzides, Constantine T; Shervin, Nina; Tebbit, Christopher L
2003-12-01
Training residents in laparoscopic skills is ideally initiated in an inanimate laboratory with both box trainers and virtual reality trainers. Virtual reality trainers have the ability to score individual hand performance although they are expensive. Here we compared the ability to assess dominant and nondominant hand performance in box trainers with virtual reality trainers. Medical students without laparoscopic experience were utilized in this study (n = 16). Each student performed tasks on the LTS 2000, an inanimate box trainer (placing pegs with both hands and transferring pegs from one hand to another), as well as a task on the MIST-VR, a virtual reality trainer (grasping a virtual object and placing it in a virtual receptacle with alternating hands). A surgeon scored students for the inanimate box trainer exercises (time and errors) while the MIST-VR scored students (time, economy of movements, and errors for each hand). Statistical analysis included Pearson correlations. Errors and time for the one-handed tasks on the box trainer did not correlate with errors, time, or economy measured for each hand by the MIST-VR (r = 0.01 to 0.30; P = NS). Total errors on the virtual reality trainer did correlate with errors on transferring pegs (r = 0.61; P < 0.05). Economy and time of both dominant and nondominant hand from the MIST-VR correlated with time of transferring pegs in the box trainer (r = 0.53 to 0.77; P < 0.05). While individual hand assessment by the box trainer during 2-handed tasks was related to assessment by the virtual reality trainer, individual hand assessment during 1-handed tasks did not correlate with the virtual reality trainer. Virtual reality trainers, such as the MIST-VR, allow assessment of individual hand skills which may lead to improved laparoscopic skill acquisition. It is difficult to assess individual hand performance with box trainers alone.
Mixed reality ultrasound guidance system: a case study in system development and a cautionary tale.
Ameri, Golafsoun; Baxter, John S H; Bainbridge, Daniel; Peters, Terry M; Chen, Elvis C S
2018-04-01
Real-time ultrasound has become a crucial aspect of several image-guided interventions. One of the main constraints of such an approach is the difficulty in interpretability of the limited field of view of the image, a problem that has recently been addressed using mixed reality, such as augmented reality and augmented virtuality. The growing popularity and maturity of mixed reality has led to a series of informal guidelines to direct development of new systems and to facilitate regulatory approval. However, the goals of mixed reality image guidance systems and the guidelines for their development have not been thoroughly discussed. The purpose of this paper is to identify and critically examine development guidelines in the context of a mixed reality ultrasound guidance system through a case study. A mixed reality ultrasound guidance system tailored to central line insertions was developed in close collaboration with an expert user. This system outperformed ultrasound-only guidance in a novice user study and has obtained clearance for clinical use in humans. A phantom study with 25 experienced physicians was carried out to compare the performance of the mixed reality ultrasound system against conventional ultrasound-only guidance. Despite the previous promising results, there was no statistically significant difference between the two systems. Guidelines for developing mixed reality image guidance systems cannot be applied indiscriminately. Each design decision, no matter how well justified, should be the subject of scientific and technical investigation. Iterative and small-scale evaluation can readily unearth issues and previously unknown or implicit system requirements. We recommend a wary eye in development of mixed reality ultrasound image guidance systems emphasizing small-scale iterative evaluation alongside system development. Ultimately, we recommend that the image-guided intervention community furthers and deepens this discussion into best practices in developing image-guided interventions.
The Reality of Virtual Reality Product Development
NASA Astrophysics Data System (ADS)
Dever, Clark
Virtual Reality and Augmented Reality are emerging areas of research and product development in enterprise companies. This talk will discuss industry-standard tools and current areas of application in the commercial market. Attendees will gain insights into how to research, design, and (most importantly) ship world-class products. The presentation will recount the lessons learned to date developing a Virtual Reality tool to solve physics problems resulting from trying to perform aircraft maintenance on ships at sea.
NASA Technical Reports Server (NTRS)
1994-01-01
This symposium on measurement and control in robotics included sessions on: (1) rendering, including tactile perception and applied virtual reality; (2) applications in simulated medical procedures and telerobotics; (3) tracking sensors in a virtual environment; (4) displays for virtual reality applications; (5) sensory feedback including a virtual environment application with partial gravity simulation; and (6) applications in education, entertainment, technical writing, and animation.
Virtual Reality: A Strategy for Training in Cross-Cultural Communication.
ERIC Educational Resources Information Center
Meyer, Catherine; Dunn-Roberts, Richard
1992-01-01
Defines virtual reality and explains terminology, theoretical concepts, and enabling technologies. Research and applications are described; limitations of current technology are considered; and future possibilities are discussed, including the use of virtual reality in training for cross-cultural communication. (22 references) (LRW)
[Virtual reality in medical education].
Edvardsen, O; Steensrud, T
1998-02-28
Virtual reality technology has found new applications in industry over the last few years. Medical literature has for several years predicted a break-through in this technology for medical education. Although there is a great potential for this technology in medical education, there seems to be a wide gap between expectations and actual possibilities at present. State of the technology was explored by participation at the conference "Medicine meets virtual reality V" (San Diego Jan. 22-25 1997) and a visit to one of the leading laboratories on virtual reality in medical education. In this paper we introduce some of the basic terminology and technology, review some of the topics covered by the conference, and describe projects running in one of the leading laboratories on virtual reality technology for medical education. With this information in mind, we discuss potential applications of the current technology in medical education. Current virtual reality systems are judged to be too costly and their usefulness in education too limited for routine use in medical education.
Tal, Aner; Wansink, Brian
2011-01-01
Virtual reality (VR) provides a potentially powerful tool for researchers seeking to investigate eating and physical activity. Some unique conditions are necessary to ensure that the psychological processes that influence real eating behavior also influence behavior in VR environments. Accounting for these conditions is critical if VR-assisted research is to accurately reflect real-world situations. The current work discusses key considerations VR researchers must take into account to ensure similar psychological functioning in virtual and actual reality and does so by focusing on the process of spontaneous mental simulation. Spontaneous mental simulation is prevalent under real-world conditions but may be absent under VR conditions, potentially leading to differences in judgment and behavior between virtual and actual reality. For simulation to occur, the virtual environment must be perceived as being available for action. A useful chart is supplied as a reference to help researchers to investigate eating and physical activity more effectively. PMID:21527088
ERIC Educational Resources Information Center
Huang, Hsiu-Mei; Liaw, Shu-Sheng; Lai, Chung-Min
2016-01-01
Advanced technologies have been widely applied in medical education, including human-patient simulators, immersive virtual reality Cave Automatic Virtual Environment systems, and video conferencing. Evaluating learner acceptance of such virtual reality (VR) learning environments is a critical issue for ensuring that such technologies are used to…
French Military Applications of Virtual Reality
2000-11-01
Jean Paul Papin and Pascal Hue, DGA/DCE/ETC4/ETAS, Etablissement Technique d'Angers, BP 36, 49460 MONTREUIL JUIGNE, France. INTRODUCTION: France is now applying virtual...
Authoring Tours of Geospatial Data With KML and Google Earth
NASA Astrophysics Data System (ADS)
Barcay, D. P.; Weiss-Malik, M.
2008-12-01
As virtual globes become widely adopted by the general public, the use of geospatial data has expanded greatly. With the popularization of Google Earth and other platforms, GIS systems have become virtual reality platforms. Using these platforms, a casual user can easily explore the world, browse massive data-sets, create powerful 3D visualizations, and share those visualizations with millions of people using the KML language. This technology has raised the bar for professionals and academics alike. It is now expected that studies and projects will be accompanied by compelling, high-quality visualizations. In this new landscape, a presentation of geospatial data can be the most effective form of advertisement for a project: engaging both the general public and the scientific community in a unified interactive experience. On the other hand, merely dumping a dataset into a virtual globe can be a disorienting, alienating experience for many users. To create an effective, far-reaching presentation, an author must take care to make their data approachable to a wide variety of users with varying knowledge of the subject matter, expertise in virtual globes, and attention spans. To that end, we present techniques for creating self-guided interactive tours of data represented in KML and visualized in Google Earth. Using these methods, we provide the ability to move the camera through the world while dynamically varying the content, style, and visibility of the displayed data. Such tours can automatically guide users through massive, complex datasets: engaging a broad user-base, and conveying subtle concepts that aren't immediately apparent when viewing the raw data. To the casual user these techniques result in an extremely compelling experience similar to watching video. Unlike video though, these techniques maintain the rich interactive environment provided by the virtual globe, allowing users to explore the data in detail and to add other data sources to the presentation.
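A self-guided tour of the kind described can be expressed with the gx:Tour extension elements of KML; the short Python sketch below emits a two-stop tour, with placeholder coordinates and durations, and assumes Google's documented KML extension namespace.

```python
def flyto(lon, lat, range_m, duration_s):
    """One smooth camera move in a KML gx:Tour playlist."""
    return f"""    <gx:FlyTo>
      <gx:duration>{duration_s}</gx:duration>
      <gx:flyToMode>smooth</gx:flyToMode>
      <LookAt>
        <longitude>{lon}</longitude><latitude>{lat}</latitude>
        <range>{range_m}</range><tilt>60</tilt>
      </LookAt>
    </gx:FlyTo>"""

stops = [(-155.28, 19.41, 50000, 6.0), (-155.60, 19.47, 15000, 8.0)]  # placeholder stops
tour = "\n".join(flyto(*s) for s in stops)
kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:gx="http://www.google.com/kml/ext/2.2">
  <gx:Tour><name>Self-guided tour</name><gx:Playlist>
{tour}
  </gx:Playlist></gx:Tour>
</kml>"""
print(kml)
```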
Perpiñá, Conxa; Roncero, María
2016-05-01
Virtual reality has demonstrated promising results in the treatment of eating disorders (ED); however, few studies have examined its usefulness in treating obesity. The aim of this study was to compare ED and obese patients on their reality judgment of a virtual environment (VE) designed to normalize their eating pattern. A second objective was to study which variables predicted the reality of the experience of eating a virtual forbidden-fattening food. ED patients, obese patients, and a non-clinical group (N=62) experienced a non-immersive VE, and then completed reality judgment and presence measures. All participants rated the VE with similar scores for quality, interaction, engagement, and ecological validity; however, ED patients obtained the highest scores on emotional involvement, attention, reality judgment/presence, and negative effects. The obese group gave the lowest scores to reality judgment/presence, satisfaction and sense of physical space, and they held an intermediate position in the attribution of reality to virtually eating a "fattening" food. The palatability of a virtual food was predicted by attention capturing and belonging to the obese group, while the attribution of reality to the virtual eating was predicted by engagement and belonging to the ED group. This study offers preliminary results about the differential impact on ED and obese patients of the exposure to virtual food, and about the need to implement a VE that can be useful as a virtual lab for studying eating behavior and treating obesity. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Schonlau, William J.
2006-05-01
An immersive viewing engine providing basic telepresence functionality for a variety of application types is presented. Augmented reality, teleoperation, and virtual reality applications all benefit from the use of head-mounted display devices that present imagery appropriate to the user's head orientation at full frame rates. Our primary application is the viewing of remote environments, as with a camera-equipped teleoperated vehicle. The conventional approach, where imagery from a narrow-field camera onboard the vehicle is presented to the user on a small rectangular screen, is contrasted with an immersive viewing system in which a cylindrical or spherical format image is received from a panoramic camera on the vehicle, resampled in response to sensed user head orientation, and presented via a wide-field eyewear display approaching 180 degrees of horizontal field. Of primary interest is the user's enhanced ability to perceive and understand image content, even when image resolution parameters are poor, due to the innate visual integration and 3-D model generation capabilities of the human visual system. A mathematical model for tracking user head position and resampling the panoramic image to attain distortion-free viewing of the region appropriate to the user's current head pose is presented, and consideration is given to providing the user with stereo viewing generated from depth-map information derived using stereo-from-motion algorithms.
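The resampling step described above can be pictured with a minimal numpy sketch: given a sensed yaw and pitch, a perspective view window is sampled out of an equirectangular (spherical-format) panorama. The field of view, image sizes, axis conventions, and nearest-neighbour lookup are assumptions for illustration and are not the paper's mathematical model.

    import numpy as np

    def view_from_panorama(pano, yaw, pitch, fov_deg=90.0, out_w=320, out_h=240):
        # pano: HxWx3 equirectangular image (longitude spans 2*pi, latitude spans pi).
        # yaw, pitch: sensed head orientation in radians.
        # Nearest-neighbour lookup; a real system would interpolate and correct distortion.
        H, W = pano.shape[:2]
        f = 0.5 * out_w / np.tan(0.5 * np.radians(fov_deg))   # focal length in pixels

        # Pixel grid of the output view, centred on the optical axis.
        x = np.arange(out_w) - 0.5 * out_w
        y = np.arange(out_h) - 0.5 * out_h
        xx, yy = np.meshgrid(x, y)

        # Ray directions in camera coordinates (z forward, x right, y down).
        dirs = np.stack([xx, yy, np.full_like(xx, f)], axis=-1)
        dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

        # Rotate rays by head pitch (about x), then yaw (about the vertical axis).
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        d = dirs @ (Ry @ Rx).T

        # Convert rotated rays to panorama coordinates (longitude, latitude).
        lon = np.arctan2(d[..., 0], d[..., 2])          # -pi..pi
        lat = np.arcsin(np.clip(-d[..., 1], -1, 1))     # -pi/2..pi/2
        u = ((lon + np.pi) / (2 * np.pi) * (W - 1)).astype(int)
        v = ((np.pi / 2 - lat) / np.pi * (H - 1)).astype(int)
        return pano[v, u]

    # Usage sketch: a random stand-in panorama, head turned 30 deg right, 10 deg up.
    pano = np.random.randint(0, 255, (512, 1024, 3), dtype=np.uint8)
    view = view_from_panorama(pano, np.radians(30), np.radians(10))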
A Discussion of Virtual Reality As a New Tool for Training Healthcare Professionals.
Fertleman, Caroline; Aubugeau-Williams, Phoebe; Sher, Carmel; Lim, Ai-Nee; Lumley, Sophie; Delacroix, Sylvie; Pan, Xueni
2018-01-01
Virtual reality technology is an exciting and emerging field with vast applications. Our study sets out the viewpoint that virtual reality software could be a new focus of direction in the development of training tools in medical education. We carried out a panel discussion at the Center for Behavior Change 3rd Annual Conference, prompted by the study, "The Responses of Medical General Practitioners to Unreasonable Patient Demand for Antibiotics--A Study of Medical Ethics Using Immersive Virtual Reality" (1). In Pan et al.'s study, 21 general practitioners (GPs) and GP trainees took part in a videoed, 15-min virtual reality scenario involving unnecessary patient demands for antibiotics. This paper was discussed in depth at the Center for Behavior Change 3rd Annual Conference; the present article is a culmination of findings and feedback from that panel discussion. The experts involved have backgrounds in virtual reality, general practice, medicines management, medical education and training, ethics, and philosophy. Virtual reality is an unexplored methodology for instigating positive behavioral change among clinicians where other methods have been unsuccessful, such as in antimicrobial stewardship. There are several arguments in favor of the use of virtual reality in medical education: it can be used for "difficult to simulate" scenarios and to standardize a scenario, for example, for use in exams. However, there are limitations to its usefulness because of the cost implications and the lack of evidence that it results in demonstrable behavior change.
Stereoscopic virtual reality models for planning tumor resection in the sellar region.
Wang, Shou-sen; Zhang, Shang-ming; Jing, Jun-jie
2012-11-28
It is difficult for neurosurgeons to perceive the complex three-dimensional anatomical relationships in the sellar region. This study investigated the value of using a virtual reality system for planning resection of sellar region tumors. The study included 60 patients with sellar tumors. All patients underwent computed tomography angiography, MRI-T1W1, and contrast-enhanced MRI-T1W1 image sequence scanning. The CT and MRI scanning data were collected and then imported into a Dextroscope imaging workstation, a virtual reality system that allows structures to be viewed stereoscopically. During preoperative assessment, typical images for each patient were chosen and printed out for use by the surgeons as references during surgery. All sellar tumor models clearly displayed bone, the internal carotid artery, the circle of Willis and its branches, the optic nerve and chiasm, the ventricular system, tumor, brain, soft tissue, and adjacent structures. Depending on the location of the tumors, we simulated the transmononasal sphenoid sinus approach, the transpterional approach, and other approaches. Eleven surgeons who used the virtual reality models completed a survey questionnaire. Nine of the participants said that the virtual reality images were superior to other images but needed to be used in combination with them. The three-dimensional virtual reality models were helpful for individualized planning of surgery in the sellar region. Virtual reality appears to be a promising and valuable tool for sellar region surgery in the future.
Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces
NASA Astrophysics Data System (ADS)
Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana
Aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual reality system capable of immersing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system is encouraging.
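To make the kinaesthetic feedback loop mentioned above concrete, the sketch below computes a generic spring-damper (impedance-style) contact force against a virtual surface, the kind of quantity a haptic device driver renders every servo cycle. The gains, the flat-plane contact model, and the function names are illustrative assumptions and are unrelated to ViRstperson or the CyberForce API.

    import numpy as np

    def contact_force(tool_pos, tool_vel, surface_z=0.0, k=800.0, b=5.0):
        # Spring-damper contact against a virtual horizontal plane at surface_z.
        # k (N/m) and b (N*s/m) are illustrative gains; a real device driver would
        # clamp the output to the hardware's force limits.
        penetration = surface_z - tool_pos[2]
        if penetration <= 0.0:
            return np.zeros(3)                     # no contact, no force
        fz = k * penetration - b * tool_vel[2]     # stiffness plus damping term
        return np.array([0.0, 0.0, max(fz, 0.0)])  # push the tool back out of the surface

    # Usage sketch: tool tip 2 mm below the virtual surface, still moving downward.
    f = contact_force(np.array([0.0, 0.0, -0.002]), np.array([0.0, 0.0, -0.05]))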
Shiri, Shimon; Feintuch, Uri; Lorber-Haddad, Adi; Moreh, Elior; Twito, Dvora; Tuchner-Arieli, Maya; Meiner, Zeev
2012-01-01
This report introduces the rationale of a novel virtual reality system based on self-face viewing and mirror visual feedback, and examines its feasibility as a rehabilitation tool for poststroke patients. A novel motion capture virtual reality system integrating online self-face viewing and mirror visual feedback has been developed for stroke rehabilitation. The system allows the replacement of the impaired arm by a virtual arm: upon making small movements of the paretic arm, patients view themselves virtually performing healthy full-range movements. A sample of 6 patients in the acute poststroke phase received the virtual reality treatment concomitantly with conservative rehabilitation treatment. Feasibility was assessed during 10 sessions for each participant. All participants succeeded in operating the system, demonstrating its feasibility in terms of adherence and improvement in task performance. Patients' performance within the virtual environment and a set of clinical-functional measures recorded before the virtual reality treatment, at 1 week, and after 3 months indicated improvement in neurological status and general functioning. These preliminary results indicate that this newly developed virtual reality system is safe and feasible. Future randomized controlled studies are required to assess whether this system has beneficial effects in terms of enhancing upper limb function and quality of life in poststroke patients.
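The mapping from small paretic-arm movements to full-range virtual movements can be pictured as a simple amplification gain applied to each tracked joint angle. The sketch below illustrates only that general idea under assumed values; the published system does not specify its mapping, so the gain, cap, and function name are hypothetical.

    def amplify_joint_angle(measured_deg, gain=3.0, max_deg=150.0):
        # Map a small measured paretic-arm joint angle to a larger virtual-arm angle.
        # gain and max_deg are illustrative assumptions, not the system's parameters.
        return min(measured_deg * gain, max_deg)

    # A 20-degree real elbow flexion rendered as 60 degrees on the virtual arm.
    virtual_angle = amplify_joint_angle(20.0)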
Army Logistician. Volume 38, Issue 6, November-December 2006
2006-12-01
functioning electrically, magnetically, or thermally; or performing self-diagnosis and self-healing actions)—will offer extraordinary capabilities for...receives sufficient information about a remote, real-world site (a battlefield) through a machine (a robot) so that the user feels physically present at...collaborative planning. • Improved training and education because of advances in virtual reality environments and perception capabilities. Robots have been
2006-09-01
application with the aim of finding an affordable display with acceptable resolution and field of view (5DT, Cyvisor, eMagin). The HMD that was chosen was the... eMagin z800, which contains OLED displays capable of 800x600 (SVGA) resolution with a 40 degree diagonal field of view (http://www.emagin.com
Virtual Reality, Combat, and Communication.
ERIC Educational Resources Information Center
Thrush, Emily Austin; Bodary, Michael
2000-01-01
Presents a brief examination of the evolution of virtual reality devices that illustrates how the development of this new medium is influenced by emerging technologies and by marketing pressures. Notes that understanding these influences may help prepare for the role of technical communicators in building virtual reality applications for education…
The Use of Virtual Reality Technology in the Treatment of Anxiety and Other Psychiatric Disorders.
Maples-Keller, Jessica L; Bunnell, Brian E; Kim, Sae-Jin; Rothbaum, Barbara O
After participating in this activity, learners should be better able to: (1) evaluate the literature regarding the effectiveness of incorporating virtual reality (VR) in the treatment of psychiatric disorders; and (2) assess the use of exposure-based intervention for anxiety disorders. ABSTRACT: Virtual reality (VR) allows users to experience a sense of presence in a computer-generated, three-dimensional environment. Sensory information is delivered through a head-mounted display and specialized interface devices. These devices track head movements so that the movements and images change in a natural way with head motion, allowing for a sense of immersion. VR, which allows for controlled delivery of sensory stimulation via the therapist, is a convenient and cost-effective treatment. This review focuses on the available literature regarding the effectiveness of incorporating VR within the treatment of various psychiatric disorders, with particular attention to exposure-based intervention for anxiety disorders. A systematic literature search was conducted in order to identify studies implementing VR-based treatment for anxiety or other psychiatric disorders. This article reviews the history of the development of VR-based technology and its use within psychiatric treatment, the empirical evidence for VR-based treatment, and the benefits of using VR for psychiatric research and treatment. It also presents recommendations for how to incorporate VR into psychiatric care and discusses future directions for VR-based treatment and clinical research.
The challenge of using virtual reality in telerehabilitation.
Rizzo, Albert A; Strickland, Dorothy; Bouchard, Stéphane
2004-01-01
Continuing advances in virtual reality (VR) technology along with concomitant system cost reductions have supported the development of more useful and accessible VR systems that can uniquely target a wide range of physical, psychological, and cognitive rehabilitation concerns and research questions. VR offers the potential to deliver systematic human testing, training, and treatment environments that allow for the precise control of complex dynamic three-dimensional stimulus presentations, within which sophisticated interaction, behavioral tracking, and performance recording is possible. The next step in this evolution will allow for Internet accessibility to libraries of VR scenarios as a likely form of distribution and use. VR applications that are Internet deliverable could open up new possibilities for home-based therapy and rehabilitation. If executed thoughtfully, they could increase client involvement, enhance outcomes and reduce costs. However, before this vision can be achieved, a number of significant challenges will need to be addressed and solved. This article will first present three fictional case vignettes that illustrate the ways that VR telerehabilitation might be implemented with varying degrees of success in the future. We then describe a system that is currently being used to deliver virtual worlds over the Internet for training safety skills to children with learning disabilities. From these illustrative fictional and reality-based applications, we will then briefly discuss the technical, practical, and user-based challenges for implementing VR telerehabilitation, along with views regarding the future of this emerging clinical application.
Virtual reality based therapy for post operative rehabilitation of children with cerebral palsy.
Sharan, Deepak; Ajeesh, P S; Rameshkumar, R; Mathankumar, M; Paulina, R Jospin; Manjula, M
2012-01-01
Virtual reality is the use of interactive replications created with computer hardware and software to provide users with opportunities to engage in environments that appear and feel similar to real-world objects and events. The most common rehabilitation program for children with cerebral palsy involves stretching, strengthening, mobilization, and various other activities, whereas the use of virtual reality based training (VRBT) for the rehabilitation of cerebral palsy is not common. A study was therefore designed to assess the effect of VRBT. Twenty-nine subjects participated (study group: 14; control group: 15). Outcome measures were the MACS, the PBS, and the level of participation, motivation, cooperation, and satisfaction of the child. Results revealed that balance and manual ability improved significantly in both groups (balance: study t=2.28, p<0.05; control t=3.5, p<0.01; manual ability: study t=5.58, p<0.001; control t=7.06, p<0.001). The PBS showed significantly greater improvement in the study group (t=2.02, p<0.05). The level of participation, motivation, cooperation, and satisfaction of the child was also reported to be significantly higher in the study group than in the control group. To the authors' best knowledge, this is the first study on the use of VR-based therapy for the postoperative rehabilitation of children with CP; the findings need further elaboration with a larger sample size.
Chen, Xiaojun; Xu, Lu; Wang, Yiping; Wang, Huixiang; Wang, Fang; Zeng, Xiangsen; Wang, Qiugen; Egger, Jan
2015-06-01
The surgical navigation system has experienced tremendous development over the past decades, with the goals of minimizing the risks and improving the precision of surgery. Nowadays, Augmented Reality (AR)-based surgical navigation is a promising technology for clinical applications. In an AR system, virtual and actual reality are mixed, offering real-time, high-quality visualization of an extensive variety of information to the users (Moussa et al., 2012) [1]. For example, virtual anatomical structures such as soft tissues, blood vessels, and nerves can be integrated with the real-world scenario in real time. In this study, an AR-based surgical navigation system (AR-SNS) was developed using an optical see-through HMD (head-mounted display), aiming at improving the safety and reliability of surgery. With this system, after the calibration of instruments, patient registration, and the calibration of the HMD, the 3D virtual critical anatomical structures shown in the head-mounted display are aligned with the actual structures of the patient in the real-world scenario during intra-operative motion tracking. The accuracy verification experiment demonstrated that the mean distance and angular errors were 0.809±0.05 mm and 1.038°±0.05°, respectively, which is sufficient to meet clinical requirements. Copyright © 2015 Elsevier Inc. All rights reserved.
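The registration step mentioned in this abstract, aligning virtual anatomy with the patient, commonly reduces to a rigid point-set alignment between corresponding fiducials localised in image space and in tracker space. The sketch below is a minimal Kabsch/Horn-style least-squares solution under the assumption of known correspondences; it is not the authors' pipeline, and the synthetic fiducial data are purely illustrative.

    import numpy as np

    def rigid_register(src, dst):
        # Least-squares rigid transform (R, t) mapping src points onto dst points.
        # src, dst: Nx3 arrays of corresponding fiducial positions; classic
        # Kabsch/Horn solution via SVD, with a reflection check.
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = dst_c - R @ src_c
        return R, t

    def mean_registration_error(R, t, src, dst):
        # Mean residual distance after applying the transform (cf. the sub-millimetre
        # distance error reported in the abstract).
        return np.linalg.norm((src @ R.T + t) - dst, axis=1).mean()

    # Usage sketch with synthetic fiducials and a known ground-truth motion.
    rng = np.random.default_rng(0)
    src = rng.normal(size=(6, 3))
    true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(true_R) < 0:          # ensure a proper rotation, not a reflection
        true_R[:, 0] *= -1
    dst = src @ true_R.T + np.array([1.0, 2.0, 3.0])
    R, t = rigid_register(src, dst)
    print(mean_registration_error(R, t, src, dst))   # ~0 for noise-free data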
Fonseca, Luciana Mara Monti; Dias, Danielle Monteiro Vilela; Góes, Fernanda Dos Santos Nogueira; Seixas, Carlos Alberto; Scochi, Carmen Gracinda Silvan; Martins, José Carlos Amado; Rodrigues, Manuel Alves
2014-09-01
The present study describes the development process of a serious game that enables users to evaluate the respiratory process in a preterm infant, based on an emotional design model. The e-Baby serious game was built to feature the simulated environment of an incubator, in which the user performs a clinical evaluation of the respiratory process in a virtual preterm infant. The user learns about the preterm baby's history, chooses the tools for the clinical evaluation, evaluates the baby, and determines whether his/her evaluation is appropriate. The e-Baby game presents phases that contain respiratory impairments of higher or lower complexity in the virtual preterm baby. Included links give the user the option of recording the entire evaluation procedure and sharing his/her performance on a social network. e-Baby is integrated into a Clinical Evaluation of the Preterm Baby course in the Moodle virtual environment. This game, which evaluates the respiratory process in preterm infants, could support a more flexible, attractive, and interactive teaching and learning process that includes simulations with features very similar to neonatal unit realities, thus allowing more appropriate training for clinical oxygenation evaluations in at-risk preterm infants. e-Baby allows advanced interaction among the user, the technology, and the educational content, because it requires active participation in the process and integrates emotional design.
Use of display technologies for augmented reality enhancement
NASA Astrophysics Data System (ADS)
Harding, Kevin
2016-06-01
Augmented reality (AR) is seen as an important tool for the future of user interfaces as well as training applications. An important application area for AR is expected to be the digitization of training and worker instructions used in the Brilliant Factory environment. The transition of work instruction methods from printed pages in a book, or pages taped to a machine, to virtual simulations is a long step with many challenges along the way. A variety of augmented reality tools are being explored today for industrial applications, ranging from simple programmable projections in the work space to 3D displays and head-mounted gear. This paper reviews where some of these tools are today and some of the pros and cons being considered for the future worker environment.
[Virtual reality therapy in anxiety disorders].
Mitrousia, V; Giotakos, O
2016-01-01
During the last decade, a number of studies have examined whether virtual reality exposure therapy can be an alternative form of therapy for the treatment of mental disorders, and particularly of anxiety disorders. Imaginal exposure, one of the components of Cognitive Behavioral Therapy, cannot be easily applied to all patients, and in such cases virtual reality can be used as an alternative or supportive psychotherapeutic technique. Most studies using virtual reality have focused on anxiety disorders, mainly specific phobias, but some extend to other conditions such as eating disorders, drug dependence, pain control, palliative care, and rehabilitation. The main characteristics of virtual reality therapy are "interaction", "immersion", and "presence". High levels of "immersion" and "presence" are associated with increased response to exposure therapy in virtual environments, as well as better therapeutic outcomes and sustained therapeutic gains. The typical devices used to achieve the patient's immersion are head-mounted displays (HMD), which are for individual use, and the Cave Automatic Virtual Environment (CAVE), which is a multiuser system. The disadvantages of virtual reality therapy lie in the specialized technology skills it demands, the cost of the devices, and side effects. Therapists need training in order to operate the software and hardware and to adjust them to each case's needs. The cost of the devices is high, but it decreases steadily as the technology improves. Immersion during virtual reality therapy can induce mild and temporary side effects such as nausea, dizziness, or headache. To date, however, experience shows that virtual reality offers several advantages. The patient's avoidance of exposure to phobic stimuli is reduced, since the patient can be exposed to them as many times as he or she wishes and under the supervision of the therapist. The technique takes place in the therapist's office, which ensures confidentiality and privacy. The therapist is able to control unpredicted events that can occur during exposure in real environments; above all, the therapist can control the intensity of exposure and adapt it to the patient's needs. Virtual reality can prove particularly useful in specific clinical situations, for instance with patients with post-traumatic stress disorder (PTSD) who tend to avoid reminders of the traumatic event. Exposure in virtual reality can address this problem by providing a large number of stimuli that activate the senses and elicit the necessary physiological and psychological anxiety reactions, regardless of the patient's willingness or ability to recall the traumatic event in imagination.
Augmenting breath regulation using a mobile driven virtual reality therapy framework.
Abushakra, Ahmad; Faezipour, Miad
2014-05-01
This paper presents a conceptual framework for a virtual reality therapy to assist individuals, especially lung cancer patients or those with breathing disorders, in regulating their breathing through real-time analysis of respiration movements using a smartphone. Virtual reality technology is an attractive means for medical simulation and treatment, particularly for patients with cancer. The theories, methodologies, and real-world dynamic content for all the components of this virtual reality therapy (VRT) framework on the smartphone are discussed. The architecture and technical aspects of the offshore platform of the virtual environment are also presented.
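The framework's respiration analysis is not detailed in the abstract; one common way to estimate breathing rate from a phone held against the chest is to smooth the accelerometer signal and count breath cycles. The sketch below illustrates that generic approach only; the sampling rate, smoothing window, zero-crossing detector, and all names are assumptions, not the cited framework's algorithm.

    import numpy as np

    def breaths_per_minute(accel_z, fs_hz=50.0, smooth_s=0.5):
        # Smooth the z-axis accelerometer trace, detrend it, and count rising
        # zero crossings (roughly one per breath cycle).
        accel_z = np.asarray(accel_z, dtype=float)
        win = max(1, int(smooth_s * fs_hz))
        smoothed = np.convolve(accel_z, np.ones(win) / win, mode="same")
        detrended = smoothed - smoothed.mean()
        crossings = np.sum((detrended[:-1] < 0) & (detrended[1:] >= 0))
        duration_min = len(accel_z) / fs_hz / 60.0
        return crossings / duration_min

    # Usage sketch: 60 s of synthetic chest motion at ~15 breaths per minute.
    t = np.arange(0, 60, 1 / 50.0)
    signal = 0.02 * np.sin(2 * np.pi * 0.25 * t) + 0.005 * np.random.randn(t.size)
    print(breaths_per_minute(signal))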
Intelligent virtual reality in the setting of fuzzy sets
NASA Technical Reports Server (NTRS)
Dockery, John; Littman, David
1992-01-01
The authors have previously introduced the concept of virtual reality worlds governed by artificial intelligence. Creation of an intelligent virtual reality was further proposed as a universal interface for the handicapped. This paper extends consideration of intelligent virtual reality to a context in which fuzzy set principles are explored as a major tool for implementing the theory in the domain of applications for the disabled.
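For readers unfamiliar with the fuzzy-set machinery the abstract refers to, the sketch below shows a single fuzzy membership function and a hypothetical rule that an assistive interface might apply. The thresholds, the rule itself, and the function names are illustrative assumptions, not the authors' system.

    def ramp(x, a, b):
        # Shoulder fuzzy membership: 0 below a, 1 above b, linear in between.
        if x <= a:
            return 0.0
        if x >= b:
            return 1.0
        return (x - a) / (b - a)

    # Hypothetical rule: slow the virtual camera when the user's head-pointing
    # error is "large".  All thresholds are illustrative.
    def camera_speed(pointing_error_deg, max_speed=1.0):
        large = ramp(pointing_error_deg, 5.0, 30.0)   # degree of membership in "large error"
        return max_speed * (1.0 - 0.8 * large)

    print(camera_speed(18.0))   # partially "large" error -> reduced camera speed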
3D Modelling and Mapping for Virtual Exploration of Underwater Archaeology Assets
NASA Astrophysics Data System (ADS)
Liarokapis, F.; Kouřil, P.; Agrafiotis, P.; Demesticha, S.; Chmelík, J.; Skarlatos, D.
2017-02-01
This paper investigates immersive technologies to increase exploration time at an underwater archaeological site, both for the public and for researchers and scholars. The focus is on the Mazotos shipwreck site in Cyprus, which is located 44 meters underwater. The aim of this work is two-fold: (a) realistic modelling and mapping of the site and (b) an immersive virtual reality visit. Optical data were used for 3D modelling and mapping. The underwater scene is composed of a variety of sea elements, including plants, fish, stones, and artefacts, which are randomly positioned. Users can experience an immersive virtual underwater visit to the Mazotos shipwreck site and obtain information about the shipwreck and its contents, raising their archaeological knowledge and cultural awareness.
Virtual reality in surgery and medicine.
Chinnock, C
1994-01-01
This report documents the state of development of enhanced and virtual reality-based systems in medicine. Virtual reality systems seek to simulate a surgical procedure in a computer-generated world in order to improve training. Enhanced reality systems seek to augment or enhance reality by providing improved imaging alternatives for specific patient data. Virtual reality represents a paradigm shift in the way we teach and evaluate the skills of medical personnel. Driving the development of virtual reality-based simulators is laparoscopic abdominal surgery, where there is a perceived need for better training techniques; within a year, systems will be fielded for second-year residency students. Further refinements over perhaps the next five years should allow surgeons to evaluate and practice new techniques in a simulator before using them on patients. Technical developments are rapidly improving the realism of these machines to an amazing degree, as well as bringing the price down to affordable levels. In the next five years, many new anatomical models, procedures, and skills are likely to become available on simulators. Enhanced reality systems are generally being developed to improve visualization of specific patient data. Three-dimensional (3-D) stereovision systems for endoscopic applications, head-mounted displays, and stereotactic image navigation systems are being fielded now, with neurosurgery and laparoscopic surgery being major driving influences. Over perhaps the next five years, enhanced and virtual reality systems are likely to merge. This will permit patient-specific images to be used on virtual reality simulators or computer-generated landscapes to be input into surgical visualization instruments. Percolating all around these activities are developments in robotics and telesurgery. An advanced information infrastructure eventually will permit remote physicians to share video, audio, medical records, and imaging data with local physicians in real time. Surgical robots are likely to be deployed for specific tasks in the operating room (OR) and to support telesurgery applications. Technical developments in robotics and motion control are key components of many virtual reality systems. Since almost all of the virtual reality and enhanced reality systems will be digitally based, they are also capable of being put "on-line" for tele-training, consulting, and even surgery. Advancements in virtual and enhanced reality systems will be driven in part by consumer applications of this technology. Many of the companies that will supply systems for medical applications are also working on commercial products. A big consumer hit can benefit the entire industry by increasing volumes and bringing down costs.(ABSTRACT TRUNCATED AT 400 WORDS)
A Cost-Effective Virtual Environment for Simulating and Training Powered Wheelchairs Manoeuvres.
Headleand, Christopher J; Day, Thomas; Pop, Serban R; Ritsos, Panagiotis D; John, Nigel W
2016-01-01
Control of a powered wheelchair is often not intuitive, making training of new users a challenging and sometimes hazardous task. Collisions due to a lack of experience can result in injury to the user and other individuals. By conducting training activities in virtual reality (VR), we can potentially improve driving skills whilst avoiding the risks inherent in the real world. However, until recently VR technology has been expensive, limiting the commercial feasibility of a general training solution. We describe Wheelchair-Rift, a cost-effective prototype simulator that makes use of the Oculus Rift head-mounted display and the Leap Motion hand-tracking device. It has been assessed for face validity by a panel of experts from a local Posture and Mobility Service. Initial results augur well for our cost-effective training solution.
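A simulator of this kind typically advances the wheelchair with a simple differential-drive kinematic update driven by joystick input. The sketch below shows that generic update only; the speeds, time step, and function name are illustrative assumptions and are not the Wheelchair-Rift code.

    import math

    def step_wheelchair(x, y, heading, joystick_fwd, joystick_turn,
                        dt=0.02, max_speed=1.2, max_turn=1.5):
        # One step of a simple differential-drive wheelchair model.
        # joystick_fwd and joystick_turn are normalised to [-1, 1]; max_speed (m/s),
        # max_turn (rad/s), and dt (s) are illustrative values.
        v = joystick_fwd * max_speed            # linear velocity
        w = joystick_turn * max_turn            # angular velocity
        heading += w * dt
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        return x, y, heading

    # Usage sketch: half a second of driving forward while turning gently left.
    pose = (0.0, 0.0, 0.0)
    for _ in range(25):
        pose = step_wheelchair(*pose, joystick_fwd=0.8, joystick_turn=0.3)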
The Impact of Virtual Reality Programs in Career and Technical Education
ERIC Educational Resources Information Center
Catterson, Anna J.
2013-01-01
Instructional technology has evolved from blackboards and chalk to, in some cases, three-dimensional virtual reality environments in which students interact and engage with other students worldwide. The use of this new instructional methodology, known as "virtual reality," has experienced substantial growth in higher education…
When Rural Reality Goes Virtual.
ERIC Educational Resources Information Center
Husain, Dilshad D.
1998-01-01
In rural towns where sparse population and few businesses are barriers, virtual reality may be the only way to bring work-based learning to students. A partnership between a small-town high school, the Ohio Supercomputer Center, and a high-tech business will enable students to explore the workplace using virtual reality. (JOW)
Designing a Virtual-Reality-Based, Gamelike Math Learning Environment
ERIC Educational Resources Information Center
Xu, Xinhao; Ke, Fengfeng
2016-01-01
This exploratory study examined the design issues related to a virtual-reality-based, gamelike learning environment (VRGLE) developed via OpenSimulator, an open-source virtual reality server. The researchers collected qualitative data to examine the VRGLE's usability, playability, and content integration for math learning. They found it important…
Sweaty Palms! Virtual Reality Applied to Training.
ERIC Educational Resources Information Center
Treiber, Karin
A qualitative case study approach was used to identify the psychosocial effects of the high-fidelity, virtual reality simulation provided in the college-level air traffic control (ATC) training program offered at the Minnesota Air Traffic Control Training Center and to evaluate the applicability of virtual reality to academic/training situations.…
ERIC Educational Resources Information Center
Miller, Erez Cedric
This paper discusses some of the potential benefits and hazards that virtual reality holds for exceptional children in the special education system. Topics addressed include (1) applications of virtual reality, including developing academic skills via cyberspace, vocational training, and social learning in cyberspace; (2) telepresence and distance…
Assessment method of digital Chinese dance movements based on virtual reality technology
NASA Astrophysics Data System (ADS)
Feng, Wei; Shao, Shuyuan; Wang, Shumin
2008-03-01
Virtual reality has played an increasing role in areas such as medicine, architecture, aviation, engineering science, and advertising. In the arts, however, virtual reality is still in its infancy with respect to the representation of human movement. Based on the techniques of motion capture and the reuse of motion capture data in a virtual reality environment, this paper presents an assessment method for quantifying dancers' basic Arm Position movements in Chinese traditional dance. The data for quantifying traits of dance motions are defined and measured on dances performed by an expert and two beginners; the results indicate that these data are useful for evaluating dance skill and distinctiveness, and that the proposed assessment method of digital Chinese dance movements based on virtual reality technology is valid and feasible.
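The paper's exact metric is not given in the abstract; one common way to quantify an arm-position trial against an expert reference from motion-capture data is the mean joint-angle error across frames. The sketch below illustrates only that generic comparison; the marker layout, angle definition, and synthetic data are assumptions, not the authors' method.

    import numpy as np

    def joint_angle(shoulder, elbow, wrist):
        # Elbow angle in degrees from three 3D marker positions.
        u = np.asarray(shoulder) - np.asarray(elbow)
        v = np.asarray(wrist) - np.asarray(elbow)
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    def mean_angle_error(expert_frames, learner_frames):
        # Mean absolute per-frame difference between two angle trajectories.
        e = np.array([joint_angle(*f) for f in expert_frames])
        l = np.array([joint_angle(*f) for f in learner_frames])
        return np.abs(e - l).mean()

    # Usage sketch with two synthetic frames of (shoulder, elbow, wrist) markers in metres.
    expert = [((0, 0, 1.4), (0.3, 0, 1.4), (0.3, 0.3, 1.4))] * 2
    learner = [((0, 0, 1.4), (0.3, 0, 1.4), (0.35, 0.25, 1.4))] * 2
    print(mean_angle_error(expert, learner))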