Science.gov

Sample records for 3-d virtual reality

  1. 3D Virtual Reality Check: Learner Engagement and Constructivist Theory

    ERIC Educational Resources Information Center

    Bair, Richard A.

    2013-01-01

    The inclusion of three-dimensional (3D) virtual tools has created a need to communicate the engagement of 3D tools and specify learning gains that educators and the institutions, which are funding 3D tools, can expect. A review of literature demonstrates that specific models and theories for 3D Virtual Reality (VR) learning do not exist "per…

  2. 3D Virtual Reality for Teaching Astronomy

    NASA Astrophysics Data System (ADS)

    Speck, Angela; Ruzhitskaya, L.; Laffey, J.; Ding, N.

    2012-01-01

    We are developing 3D virtual learning environments (VLEs) as learning materials for an undergraduate astronomy course, which will utilize advances both in available technologies and in our understanding of the social nature of learning. These learning materials will be used to test whether such VLEs can indeed augment science learning so that it is more engaging, active, visual and effective. Our project focuses on the challenges and requirements of introductory college astronomy classes. Here we present our virtual world of the Jupiter system and how we plan to implement it to allow students to learn course material - physical laws and concepts in astronomy - while engaging them in exploration of the Jupiter system, encouraging their imagination, curiosity, and motivation. The VLE can allow students to work individually or collaboratively. The 3D world also provides an opportunity for research in astronomy education to investigate the impact of social interaction, gaming features, and the manipulatives offered by a learning tool on students' motivation and learning outcomes. Use of this VLE is also a valuable source for exploring how learners' spatial awareness can be enhanced by working in a 3D environment. We will present the Jupiter-system environment along with a preliminary study of the efficacy and usability of our Jupiter 3D VLE.

  3. Organizational Learning Goes Virtual?: A Study of Employees' Learning Achievement in Stereoscopic 3D Virtual Reality

    ERIC Educational Resources Information Center

    Lau, Kung Wong

    2015-01-01

    Purpose: This study aims to deepen understanding of the use of stereoscopic 3D technology (stereo3D) in facilitating organizational learning. The emergence of advanced virtual technologies, in particular stereo3D virtual reality, has fundamentally changed the ways in which organizations train their employees. However, in academic or…

  4. Virtual reality 3D headset based on DMD light modulators

    SciTech Connect

    Bernacki, Bruce E.; Evans, Allan; Tang, Edward

    2014-06-13

    We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micro-mirror devices (DMD). Our approach leverages silicon micro mirrors offering 720p resolution displays in a small form-factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high resolution and low power consumption. Applications include night driving, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. We describe a design in which light from the DMD is imaged to infinity and the user's own eye lens forms a real image on the user's retina.

  5. Virtual reality 3D headset based on DMD light modulators

    NASA Astrophysics Data System (ADS)

    Bernacki, Bruce E.; Evans, Allan; Tang, Edward

    2014-06-01

    We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micromirror devices (DMD). Current methods for presenting information for virtual reality are focused on either polarization-based modulators such as liquid crystal on silicon (LCoS) devices, or miniature LCD or LED displays, often using lenses to place the image at infinity. LCoS modulators are an area of active research and development, but reduce the amount of viewing light by 50% due to the use of polarization. Viewable LCD or LED screens may suffer low resolution, cause eye fatigue, and exhibit a "screen door" or pixelation effect due to the low pixel fill factor. Our approach leverages a mature technology based on silicon micro mirrors delivering 720p resolution displays in a small form-factor with high fill factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high-definition resolution and low power consumption, and many of the design methods developed for DMD projector applications can be adapted to display use. Potential applications include night driving with natural depth perception, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. We describe a design concept in which light from the DMD is imaged to infinity and the user's own eye lens forms a real image on the user's retina, resulting in a virtual retinal display.
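    In the collimation scheme described above (display imaged to infinity, with the eye's own lens forming the retinal image), the angular size of each pixel is set by the mirror pitch and the collimator's focal length. A minimal sketch of that relation, with illustrative numbers (the 7.6 um pitch and 20 mm focal length are assumptions, not the authors' design values):

```python
import math

def pixel_angular_resolution(pixel_pitch_m: float, focal_length_m: float) -> float:
    """Angle (radians) subtended by one pixel when the display sits at the
    focal plane of a collimating lens, so its image appears at infinity."""
    return math.atan2(pixel_pitch_m, focal_length_m)

# A 720p micromirror array with an assumed 7.6 um pitch behind a 20 mm collimator:
theta = pixel_angular_resolution(7.6e-6, 20e-3)
arcmin = math.degrees(theta) * 60  # roughly 1.3 arcmin per pixel
```

    For comparison, normal visual acuity resolves about 1 arcmin, so a design in this regime would be near the eye's resolution limit.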

  6. 3-D Sound for Virtual Reality and Multimedia

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Trejo, Leonard J. (Technical Monitor)

    2000-01-01

    Technology and applications for the rendering of virtual acoustic spaces are reviewed. Chapter 1 deals with acoustics and psychoacoustics. Chapters 2 and 3 cover cues to spatial hearing and review psychoacoustic literature. Chapter 4 covers signal processing and systems overviews of 3-D sound systems. Chapter 5 covers applications to computer workstations, communication systems, aeronautics and space, and sonic arts. Chapter 6 lists resources. This TM is a reprint of the 1994 book from Academic Press.

  7. Dental impressions using 3D digital scanners: virtual becomes reality.

    PubMed

    Birnbaum, Nathan S; Aaronson, Heidi B

    2008-10-01

    The technologies that have made the use of three-dimensional (3D) digital scanners an integral part of many industries for decades have been improved and refined for application to dentistry. Since the introduction of the first dental impressioning digital scanner in the 1980s, development engineers at a number of companies have enhanced the technologies and created in-office scanners that are increasingly user-friendly and able to produce precisely fitting dental restorations. These systems are capable of capturing 3D virtual images of tooth preparations, from which restorations may be fabricated directly (ie, CAD/CAM systems) or fabricated indirectly (ie, dedicated impression scanning systems for the creation of accurate master models). The use of these products is increasing rapidly around the world and presents a paradigm shift in the way in which dental impressions are made. Several of the leading 3D dental digital scanning systems are presented and discussed in this article.

  8. Virtual reality and 3D animation in forensic visualization.

    PubMed

    Ma, Minhua; Zheng, Huiru; Lallie, Harjinder

    2010-09-01

    Computer-generated three-dimensional (3D) animation is an ideal medium to accurately visualize crime or accident scenes to viewers and in courtrooms. Based upon factual data, forensic animations can reproduce the scene and demonstrate the activity at various points in time. The use of computer animation techniques to reconstruct crime scenes is beginning to replace the traditional illustrations, photographs, and verbal descriptions, and is becoming popular in today's forensics. This article integrates work in the areas of 3D graphics, computer vision, motion tracking, natural language processing, and forensic computing, to investigate the state-of-the-art in forensic visualization. It identifies and reviews areas where new applications of 3D digital technologies and artificial intelligence could be used to enhance particular phases of forensic visualization to create 3D models and animations automatically and quickly. Having discussed the relationships between major crime types and level-of-detail in corresponding forensic animations, we recognized that high level-of-detail animation involving human characters, which is appropriate for many major crime types but has had limited use in courtrooms, could be useful for crime investigation. PMID:20533989

  10. Anesthesiology training using 3D imaging and virtual reality

    NASA Astrophysics Data System (ADS)

    Blezek, Daniel J.; Robb, Richard A.; Camp, Jon J.; Nauss, Lee A.

    1996-04-01

    Current training for regional nerve block procedures by anesthesiology residents requires expert supervision and the use of cadavers, both of which are relatively expensive commodities in today's cost-conscious medical environment. We are developing methods to augment and eventually replace these training procedures with real-time and realistic computer visualizations and manipulations of the anatomical structures involved in anesthesiology procedures, such as nerve plexus injections (e.g., celiac blocks). The initial work is focused on visualizations: both static images and rotational renderings. From the initial results, a coherent paradigm for virtual patient and scene representation will be developed.

  11. Three Primary School Students' Cognition about 3D Rotation in a Virtual Reality Learning Environment

    ERIC Educational Resources Information Center

    Yeh, Andy

    2010-01-01

    This paper reports on three primary school students' explorations of 3D rotation in a virtual reality learning environment (VRLE) named VRMath. When asked to investigate if you would face the same direction when you turn right 45 degrees first then roll up 45 degrees, or when you roll up 45 degrees first then turn right 45 degrees, the students…
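    The question posed to the students is a direct consequence of the non-commutativity of 3D rotations. A minimal sketch (the axis and sign conventions are assumptions, not VRMath's) showing that the two orders of rotation produce different facing directions:

```python
import numpy as np

def yaw(deg):
    """Turn right/left: rotation about the vertical (y) axis."""
    t = np.radians(deg)
    return np.array([[ np.cos(t), 0, np.sin(t)],
                     [ 0,         1, 0        ],
                     [-np.sin(t), 0, np.cos(t)]])

def pitch(deg):
    """Roll up/down: rotation about the sideways (x) axis."""
    t = np.radians(deg)
    return np.array([[1, 0,          0         ],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t),  np.cos(t)]])

forward = np.array([0.0, 0.0, 1.0])             # initial facing direction
turn_then_roll = pitch(45) @ yaw(45) @ forward  # turn right 45, then roll up 45
roll_then_turn = yaw(45) @ pitch(45) @ forward  # roll up 45, then turn right 45
# The two final headings differ: the order of 3D rotations matters.
```

    Both results are unit vectors, but they point in different directions, which is exactly what the students were asked to discover.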

  12. The Learner Characteristics, Features of Desktop 3D Virtual Reality Environments, and College Chemistry Instruction: A Structural Equation Modeling Analysis

    ERIC Educational Resources Information Center

    Merchant, Zahira; Goetz, Ernest T.; Keeney-Kennicutt, Wendy; Kwok, Oi-man; Cifuentes, Lauren; Davis, Trina J.

    2012-01-01

    We examined a model of the impact of a 3D desktop virtual reality environment on the learner characteristics (i.e. perceptual and psychological variables) that can enhance chemistry-related learning achievements in an introductory college chemistry class. The relationships between the 3D virtual reality features and the chemistry learning test as…

  13. Mackay campus of environmental education and digital cultural construction: the application of 3D virtual reality

    NASA Astrophysics Data System (ADS)

    Chien, Shao-Chi; Chung, Yu-Wei; Lin, Yi-Hsuan; Huang, Jun-Yi; Chang, Jhih-Ting; He, Cai-Ying; Cheng, Yi-Wen

    2012-04-01

    This study uses 3D virtual reality technology to create the "Mackay campus environmental education and digital cultural 3D navigation system" for local historical sites in the Tamsui (Hoba) area, in hopes of providing tourism information and navigation through historical sites using a 3D navigation system. We used AutoCAD, SketchUp, and SpaceEyes 3D software to construct the virtual reality scenes and create the school's historical sites, such as the House of Reverends, the House of Maidens, the Residence of Mackay, and the Education Hall. We used this technology to complete the Mackay campus environmental education and digital cultural platform. The platform we established can indeed achieve the desired function of providing tourism information and historical site navigation. The interactive multimedia style and the presentation of the information allow users to receive information directly. In addition to showing the external appearances of buildings, the navigation platform also allows users to enter the buildings to view lifelike scenes and textual information related to the historical sites. The historical sites are designed according to their actual size, which gives users a more realistic feel. In terms of the navigation route, the navigation system does not force users along a fixed route, but instead allows users to freely control the route they would like to take to view the historical sites on the platform.

  14. Using the CAVE virtual-reality environment as an aid to 3-D electromagnetic field computation

    SciTech Connect

    Turner, L.R.; Levine, D.; Huang, M.; Papka, M.; Kettunen, L.

    1995-08-01

    One of the major problems in three-dimensional (3-D) field computation is visualizing the resulting 3-D field distributions. A virtual-reality environment, such as the CAVE (CAVE Automatic Virtual Environment), is helping to overcome this problem, thus making the results of computation more usable for designers and users of magnets and other electromagnetic devices. As a demonstration of the capabilities of the CAVE, the elliptical multipole wiggler (EMW), an insertion device being designed for the Advanced Photon Source (APS) now being commissioned at Argonne National Laboratory (ANL), was made visible, along with its fields and beam orbits. Other uses of the CAVE in preprocessing and postprocessing computation for electromagnetic applications are also discussed.

  15. Effects of 3D Virtual Reality of Plate Tectonics on Fifth Grade Students' Achievement and Attitude toward Science

    ERIC Educational Resources Information Center

    Kim, Paul

    2006-01-01

    This study examines the effects of a teaching method using 3D virtual reality simulations on achievement and attitude toward science. An experiment was conducted with fifth-grade students (N = 41) to examine the effects of 3D simulations, designed to support inquiry-based science curriculum. An ANOVA analysis revealed that the 3D group scored…

  16. Modulation of cortical activity in 2D versus 3D virtual reality environments: an EEG study.

    PubMed

    Slobounov, Semyon M; Ray, William; Johnson, Brian; Slobounov, Elena; Newell, Karl M

    2015-03-01

    There is growing empirical evidence that virtual reality (VR) is valuable for education, training, entertainment and medical rehabilitation due to its capacity to represent real-life events and situations. However, the neural mechanisms underlying behavioral confounds in VR environments are still poorly understood. In two experiments, we examined the effect of fully immersive 3D stereoscopic presentations and less immersive 2D VR environments on brain functions and behavioral outcomes. In Experiment 1, we examined behavioral and neural underpinnings of spatial navigation tasks using electroencephalography (EEG). In Experiment 2, we examined EEG correlates of postural stability and balance. Our major findings showed that fully immersive 3D VR induced a higher subjective sense of presence along with an enhanced success rate of spatial navigation compared to 2D. In Experiment 1, the power of frontal midline EEG (FM-theta) was significantly higher during the encoding phase of route presentation in the 3D VR. In Experiment 2, the 3D VR resulted in greater postural instability and modulation of EEG patterns as a function of 3D versus 2D environments. The findings support the inference that the fully immersive 3D enriched environment requires allocation of more brain and sensory resources for cognitive/motor control during both tasks than 2D presentations. This is further evidence that 3D VR tasks using EEG may be a promising approach for performance enhancement and potential applications in clinical/rehabilitation settings. PMID:25448267
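    The frontal-midline theta finding above rests on estimating spectral power in the theta band (roughly 4-7 Hz). A minimal sketch of such a band-power estimate on a synthetic signal (the plain-FFT approach, sampling rate, and band edges are assumptions, not the study's actual pipeline):

```python
import numpy as np

def band_power(signal, fs, f_lo=4.0, f_hi=7.0):
    """Mean spectral power in [f_lo, f_hi] Hz from a one-sided FFT,
    a simple stand-in for a frontal-midline theta analysis."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

fs = 256                                  # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)               # one 4 s epoch
eeg = (np.sin(2 * np.pi * 6 * t)          # dominant 6 Hz "theta" component
       + 0.3 * np.sin(2 * np.pi * 10 * t))  # weaker 10 Hz "alpha" component
# Theta-band power dominates alpha-band power for this synthetic signal.
```

    In practice, epochs would be averaged over trials and a windowed estimator (e.g. Welch's method) would replace the raw FFT, but the band-masking step is the same.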

  17. 3D Visualization of Cultural Heritage Artefacts with Virtual Reality devices

    NASA Astrophysics Data System (ADS)

    Gonizzi Barsanti, S.; Caruso, G.; Micoli, L. L.; Covarrubias Rodriguez, M.; Guidi, G.

    2015-08-01

    Although 3D models are useful for preserving information about historical artefacts, the potential of this digital content is not fully realized until it is used to interactively communicate the artefacts' significance to non-specialists. Starting from this consideration, a new way to provide museum visitors with more information was investigated. The research is aimed at valorising and making more accessible the Egyptian funeral objects exhibited in the Sforza Castle in Milan. The results of the research will be used for the renewal of the current exhibition at the Archaeological Museum in Milan, by making it more attractive. A 3D virtual interactive scenario regarding the "path of the dead", an important ritual in ancient Egypt, was realized to augment the experience and the comprehension of the public through interactivity. Four important artefacts were considered for this scope: two ushabty, a wooden sarcophagus and a heart scarab. The scenario was realized by integrating low-cost Virtual Reality technologies, such as the Oculus Rift DK2 and the Leap Motion controller, and implementing specific software using Unity. The 3D models were implemented by adding responsive points of interest in relation to important symbols or features of the artefact. This allows highlighting single parts of the artefact in order to better identify the hieroglyphs and provide their translation. The paper describes the process for optimizing the 3D models, the implementation of the interactive scenario and the results of some tests that were carried out in the lab.

  18. Load Assembly of the Ignitor Machine with 3D Interactive Virtual Reality

    NASA Astrophysics Data System (ADS)

    Migliori, S.; Pierattini, S.

    2003-10-01

    The main purpose of this work is to assist the Ignitor team in every phase of the project using the new Virtual Reality (VR) technology. Through VR it is possible to see, plan and test the machine assembly sequence and the total layout. We are also planning to simulate the remote handling systems in VR. The complexity of the system requires a large and powerful graphical device. The ENEA's "Advanced Visualization Technology" team has implemented a repository file data structure integrated with the CATIA drawings coming from the designers of Ignitor. The 3D virtual mockup software is used to view and analyze all objects that compose the mockup and also to analyze the correct assembly sequences. The ENEA's 3D immersive system and software are fully integrated in the ENEA's supercomputing GRID infrastructure. At any time, all members of the Ignitor Project can view the status of the mockup in 3D (draft and/or final objects) through the net. During the conference, examples of the assembly sequence and load assembly structure will be presented.

  19. Virtual reality hardware for use in interactive 3D data fusion and visualization

    NASA Astrophysics Data System (ADS)

    Gourley, Christopher S.; Abidi, Mongi A.

    1997-09-01

    Virtual reality has become a tool for use in many areas of research. We have designed and built a VR system for use in range data fusion and visualization. One major VR tool is the CAVE. This is the ultimate visualization tool, but comes with a large price tag. Our design uses a unique CAVE whose graphics are powered by a desktop computer instead of a larger rack machine, making it much less costly. The system consists of a screen eight feet tall by twenty-seven feet wide, giving a variable field-of-view currently set at 160 degrees. A Silicon Graphics Indigo2 MaxImpact with the impact channel option is used for display. This gives the capability to drive three projectors at a resolution of 640 by 480 for use in displaying the virtual environment and one 640 by 480 display for a user control interface. This machine is also the first desktop package which has built-in hardware texture mapping. This feature allows us to quickly fuse the range and intensity data and other multi-sensory data. The final goal is a complete 3D texture-mapped model of the environment. A dataglove, magnetic tracker, and spaceball are to be used for manipulation of the data and navigation through the virtual environment. This system gives several users the ability to interactively create 3D models from multiple range images.

  20. Design and implementation of a 3D ocean virtual reality and visualization engine

    NASA Astrophysics Data System (ADS)

    Chen, Ge; Li, Bo; Tian, Fenglin; Ji, Pengbo; Li, Wenqing

    2012-12-01

    In this study, a 3D virtual reality and visualization engine for rendering the ocean, named VV-Ocean, is designed for marine applications. The design goals of VV-Ocean aim at high fidelity simulation of the ocean environment, visualization of massive and multidimensional marine data, and imitation of marine life. VV-Ocean is composed of five modules, i.e. memory management module, resources management module, scene management module, rendering process management module and interaction management module. There are three core functions in VV-Ocean: reconstructing vivid virtual ocean scenes, visualizing real data dynamically in real time, and imitating and simulating marine life intuitively. Based on VV-Ocean, we establish a sea-land integration platform which can reproduce drifting and diffusion processes of oil spilling from sea bottom to surface. Environment factors such as ocean current and wind field have been considered in this simulation. On this platform the oil spilling process can be abstracted as movements of abundant oil particles. The result shows that oil particles blend with water well and the platform meets the requirement for real-time and interactive rendering. VV-Ocean can be widely used in ocean applications such as demonstrating marine operations, facilitating maritime communications, developing ocean games, reducing marine hazards, forecasting the weather over oceans, serving marine tourism, and so on. Finally, further technological improvements of VV-Ocean are discussed.
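    The abstract describes the oil spill as movements of abundant particles driven by current and wind. A minimal sketch of one explicit particle-advection step for surface particles (the 3% wind factor and the diffusion coefficient are illustrative assumptions, not VV-Ocean's actual values):

```python
import numpy as np

rng = np.random.default_rng(0)

def advect(positions, current, wind, dt, wind_factor=0.03, diff=0.5):
    """One explicit time step for surface oil particles: drift with the
    ocean current plus a fraction of the wind velocity, plus a random
    walk modeling turbulent diffusion."""
    drift = current + wind_factor * wind
    noise = diff * rng.standard_normal(positions.shape)
    return positions + dt * (drift + noise)

particles = np.zeros((1000, 2))   # all particles start at the spill site (x, y in m)
current = np.array([0.2, 0.0])    # m/s eastward surface current
wind = np.array([5.0, 5.0])       # m/s wind
for _ in range(100):              # simulate 100 one-second steps
    particles = advect(particles, current, wind, dt=1.0)
```

    After 100 steps the particle cloud has drifted downwind and down-current while spreading out, which is the qualitative behaviour the platform reproduces in real time.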

  1. A 3-D Virtual Reality Model of the Sun and the Moon for E-Learning at Elementary Schools

    ERIC Educational Resources Information Center

    Sun, Koun-Tem; Lin, Ching-Ling; Wang, Sheng-Min

    2010-01-01

    The relative positions of the sun, moon, and earth, their movements, and their relationships are abstract astronomical concepts that are difficult to understand in elementary school science. This study proposes a three-dimensional (3-D) virtual reality (VR) model named the "Sun and Moon System." This e-learning resource was designed by combining…

  2. Re-Dimensional Thinking in Earth Science: From 3-D Virtual Reality Panoramas to 2-D Contour Maps

    ERIC Educational Resources Information Center

    Park, John; Carter, Glenda; Butler, Susan; Slykhuis, David; Reid-Griffin, Angelia

    2008-01-01

    This study examines the relationship of gender and spatial perception on student interactivity with contour maps and non-immersive virtual reality. Eighteen eighth-grade students elected to participate in a six-week activity-based course called "3-D GeoMapping." The course included nine days of activities related to topographic mapping. At the end…

  3. Assessing the precision of gaze following using a stereoscopic 3D virtual reality setting.

    PubMed

    Atabaki, Artin; Marciniak, Karolina; Dicke, Peter W; Thier, Peter

    2015-07-01

    Despite the ecological importance of gaze following, little is known about the underlying neuronal processes, which allow us to extract gaze direction from the geometric features of the eye and head of a conspecific. In order to understand the neuronal mechanisms underlying this ability, a careful description of the capacity and the limitations of gaze following at the behavioral level is needed. Previous studies of gaze following, which relied on naturalistic settings, have the disadvantage of allowing only very limited control of potentially relevant visual features guiding gaze following, such as the contrast of the iris and sclera or the shape of the eyelids, and--in the case of photographs--they lack depth. Hence, in order to get full control of potentially relevant features we decided to study gaze following of human observers guided by the gaze of a human avatar seen stereoscopically. To this end we established a stereoscopic 3D virtual reality setup, in which we tested human subjects' ability to detect which target a human avatar was looking at. Following the gaze of the avatar showed all the features of the gaze following of a natural person, namely a substantial degree of precision associated with a consistent pattern of systematic deviations from the target. Poor stereo vision affected performance surprisingly little (only in certain experimental conditions). Only gaze following guided by targets at larger downward eccentricities exhibited a differential effect of the presence or absence of accompanying movements of the avatar's eyelids and eyebrows. PMID:25982719
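    Precision of gaze following is naturally quantified as the angle between the inferred gaze direction and the true eye-to-target direction. A minimal sketch of that metric (whether this is the exact measure used in the study is an assumption):

```python
import numpy as np

def angular_error_deg(gaze_dir, eye_pos, target_pos):
    """Angle in degrees between a gaze direction and the eye-to-target
    direction; a common precision measure in gaze-following experiments."""
    to_target = target_pos - eye_pos
    to_target = to_target / np.linalg.norm(to_target)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    cosang = np.clip(np.dot(gaze_dir, to_target), -1.0, 1.0)
    return np.degrees(np.arccos(cosang))

eye = np.array([0.0, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])   # target 1 m straight ahead
gaze = np.array([0.05, 0.0, 1.0])    # estimate deviated slightly to one side
err = angular_error_deg(gaze, eye, target)  # just under 3 degrees
```

    Averaging this error over trials gives precision; averaging the signed components gives the systematic deviations the abstract mentions.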

  4. Enhancing Time-Connectives with 3D Immersive Virtual Reality (IVR)

    ERIC Educational Resources Information Center

    Passig, David; Eden, Sigal

    2010-01-01

    This study sought to test the most efficient representation mode with which children with hearing impairment could express a story while producing connectives indicating relations of time and of cause and effect. Using Bruner's (1973, 1986, 1990) representation stages, we tested the comparative effectiveness of Virtual Reality (VR) as a mode of…

  5. The Input-Interface of Webcam Applied in 3D Virtual Reality Systems

    ERIC Educational Resources Information Center

    Sun, Huey-Min; Cheng, Wen-Lin

    2009-01-01

    Our research explores a virtual reality application based on a Web camera (Webcam) input-interface. The interface can replace the mouse, capturing a user's directional intention with a frame-difference method. We divide each Webcam frame into nine grids and make use of background registration to compute the moving object. In order to…
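    The nine-grid frame-difference idea described above can be sketched in a few lines: difference two frames, threshold, and report the grid cell with the most changed pixels (the 3x3 grid comes from the abstract; the change threshold and tie-breaking are assumptions):

```python
import numpy as np

def motion_cell(prev, curr, thresh=25):
    """Return the (row, col) of the 3x3 grid cell containing the most
    changed pixels between two grayscale frames -- a minimal sketch of
    a frame-difference direction input."""
    diff = np.abs(curr.astype(int) - prev.astype(int)) > thresh
    h, w = diff.shape
    counts = diff[: h - h % 3, : w - w % 3] \
        .reshape(3, h // 3, 3, w // 3).sum(axis=(1, 3))
    return np.unravel_index(counts.argmax(), counts.shape)

prev = np.zeros((90, 120), dtype=np.uint8)   # synthetic grayscale frames
curr = prev.copy()
curr[10:20, 90:110] = 255                    # "motion" in the top-right region
cell = motion_cell(prev, curr)               # -> top-right cell (0, 2)
```

    Mapping each cell to a movement direction (e.g. top-right means "up and right") then gives mouse-free directional control; background registration would replace the static `prev` frame in a live system.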

  6. Exploring 3-D Virtual Reality Technology for Spatial Ability and Chemistry Achievement

    ERIC Educational Resources Information Center

    Merchant, Z.; Goetz, E. T.; Keeney-Kennicutt, W.; Cifuentes, L.; Kwok, O.; Davis, T. J.

    2013-01-01

    We investigated the potential of Second Life® (SL), a three-dimensional (3-D) virtual world, to enhance undergraduate students' learning of a vital chemistry concept. A quasi-experimental pre-posttest control group design was used to conduct the study. A total of 387 participants completed three assignment activities either in SL or using…

  7. Generic precise augmented reality guiding system and its calibration method based on 3D virtual model.

    PubMed

    Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua

    2016-05-30

    Augmented reality systems can be applied to provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are decided by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. Meanwhile, the proposed system is realized with a digital projector, and the general back projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. The corresponding calibration method is also designed for the proposed system to obtain the parameters of the projector. To validate the proposed back projection model, coordinate data collected by 3D positioning equipment is used to calculate and optimize the extrinsic parameters. The final projecting indication accuracy of the system is verified with a subpixel pattern projection technique. PMID:27410124
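    The calibration described above hinges on a model relating 3D world points to projector pixels. A minimal pinhole sketch of such a projection (a generic model with assumed intrinsic/extrinsic parameters; the paper's "general back projection model" is not reproduced here):

```python
import numpy as np

def project(K, R, t, points_3d):
    """Pinhole projection of world points into projector pixel coordinates:
    transform into the projector frame, apply intrinsics, divide by depth."""
    cam = (R @ points_3d.T + t[:, None]).T   # world -> projector frame
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]            # perspective divide

K = np.array([[1500.0,    0.0, 640.0],       # assumed intrinsics (pixels)
              [   0.0, 1500.0, 400.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                                # assumed extrinsics: no rotation,
t = np.array([0.0, 0.0, 0.0])                # projector at the world origin
pts = np.array([[0.0,  0.00, 2.0],           # points 2 m in front
                [0.1, -0.05, 2.0]])
uv = project(K, R, t, pts)
```

    Calibration then amounts to adjusting R and t (and possibly K) so that projected pixels for measured 3D points match the observed pattern, typically by least-squares minimization of the reprojection error.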

  9. Using virtual reality technology and hand tracking technology to create software for training surgical skills in 3D game

    NASA Astrophysics Data System (ADS)

    Zakirova, A. A.; Ganiev, B. A.; Mullin, R. I.

    2015-11-01

    The lack of accessible ways of training surgical skills is one of the main problems in medical education. Existing simulation training devices are not designed to teach students, and are not available due to the high cost of the equipment. Using modern technologies such as virtual reality and hand-movement tracking, we want to create an innovative method for learning the techniques of conducting operations in a 3D game format, which can make the education process interesting and effective. Creating a 3D virtual simulator will solve several conceptual problems at once: the opportunity to improve practical skills without time limits and without risk to the patient; a highly realistic operating-room environment and anatomical body structures; the use of game mechanics to ease information perception and accelerate the memorization of methods; and the accessibility of the program.

  10. 3D graphics, virtual reality, and motion-onset visual evoked potentials in neurogaming.

    PubMed

    Beveridge, R; Wilson, S; Coyle, D

    2016-01-01

    A brain-computer interface (BCI) offers movement-free control of a computer application and is achieved by reading and translating the cortical activity of the brain into semantic control signals. Motion-onset visual evoked potentials (mVEP) are neural potentials employed in BCIs and occur when motion-related stimuli are attended visually. mVEP dynamics are correlated with the position and timing of the moving stimuli. To investigate the feasibility of utilizing the mVEP paradigm with video games of various graphical complexities including those of commercial quality, we conducted three studies over four separate sessions comparing the performance of classifying five mVEP responses with variations in graphical complexity and style, in-game distractions, and display parameters surrounding mVEP stimuli. To investigate the feasibility of utilizing contemporary presentation modalities in neurogaming, one of the studies compared mVEP classification performance when stimuli were presented using the Oculus Rift virtual reality headset. Results from 31 independent subjects were analyzed offline. The results show classification performances ranging up to 90% with variations in conditions in graphical complexity having limited effect on mVEP performance; thus, demonstrating the feasibility of using the mVEP paradigm within BCI-based neurogaming. PMID:27590974
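    Classifying the five mVEP responses amounts to matching single-trial epochs against per-class response templates. A deliberately simple correlation-with-template sketch (the paper's actual classifier is not specified here; all waveforms below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)

def nearest_template(epoch, templates):
    """Classify a single-trial epoch by its correlation with each class's
    average waveform, returning the index of the best-matching class."""
    scores = [np.corrcoef(epoch, tpl)[0, 1] for tpl in templates]
    return int(np.argmax(scores))

# Five synthetic "mVEP" classes: a negative deflection at five latencies.
n_samples = 200
templates = np.zeros((5, n_samples))
for k in range(5):
    templates[k, 40 + 20 * k : 60 + 20 * k] = -1.0

# A noisy single trial belonging to class 3 is still matched correctly.
trial = templates[3] + 0.3 * rng.standard_normal(n_samples)
pred = nearest_template(trial, templates)
```

    Real mVEP pipelines average multiple trials and typically use trained classifiers (e.g. LDA on windowed amplitudes), but the template-matching intuition is the same.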

  12. 3D Elevation Program—Virtual USA in 3D

    USGS Publications Warehouse

    Lukas, Vicki; Stoker, J.M.

    2016-01-01

    The U.S. Geological Survey (USGS) 3D Elevation Program (3DEP) uses a laser system called ‘lidar’ (light detection and ranging) to create a highly accurate virtual reality map of the Nation. 3D maps have many uses, with new ones being discovered all the time.

  14. Visuomotor learning in immersive 3D virtual reality in Parkinson's disease and in aging.

    PubMed

    Messier, Julie; Adamovich, Sergei; Jack, David; Hening, Wayne; Sage, Jacob; Poizner, Howard

    2007-05-01

    Successful adaptation to novel sensorimotor contexts critically depends on efficient sensory processing and integration mechanisms, particularly those required to combine visual and proprioceptive inputs. If the basal ganglia are a critical part of specialized circuits that adapt motor behavior to new sensorimotor contexts, then patients suffering from basal ganglia dysfunction, as in Parkinson's disease, should show sensorimotor learning impairments. However, this issue has been under-explored. We tested the ability of eight patients with Parkinson's disease (PD), off medication, ten healthy elderly subjects, and ten healthy young adults to reach to a remembered 3D location presented in an immersive virtual environment. A multi-phase learning paradigm was used, having four conditions: baseline, initial learning, reversal learning, and aftereffect. In initial learning, the computer altered the position of a simulated arm endpoint used for movement feedback by shifting its apparent location diagonally, thereby requiring both horizontal and vertical compensations. This visual distortion forced subjects to learn new coordinations between what they saw in the virtual environment and the actual position of their limbs, which they had to derive from proprioceptive information (or efference copy). In reversal learning, the sign of the distortion was reversed. Both elderly subjects and PD patients showed learning-phase-dependent difficulties. First, elderly controls were slower than young subjects when learning both dimensions of the initial biaxial discordance; however, their performance improved during reversal learning, and as a result elderly and young controls showed similar adaptation rates in that phase. Second, in striking contrast to healthy elderly subjects, PD patients were more profoundly impaired during the reversal phase of learning; they were able to learn the initial biaxial discordance, but were on average slower than age-matched controls.

  15. An Augmented Reality based 3D Catalog

    NASA Astrophysics Data System (ADS)

    Yamada, Ryo; Kishimoto, Katsumi

    This paper presents a 3D catalog system that uses Augmented Reality technology. The use of Web-based catalog systems that present products in 3D form is increasing in various fields, along with the rapid and widespread adoption of Electronic Commerce. However, 3D shapes could previously only be seen in a virtual space, and it was difficult to understand how the products would actually look in the real world. To solve this, we propose a method that combines the virtual and real worlds simply and intuitively. The method applies Augmented Reality technology, and the system developed based on the method enables users to evaluate 3D virtual products in a real environment.

  16. Three‐dimensional immersive virtual reality for studying cellular compartments in 3D models from EM preparations of neural tissues

    PubMed Central

    Calì, Corrado; Baghabra, Jumana; Boges, Daniya J.; Holst, Glendon R.; Kreshuk, Anna; Hamprecht, Fred A.; Srinivasan, Madhusudhanan; Lehväslaiho, Heikki; Magistretti, Pierre J.

    2016-01-01

    Advances in the application of electron microscopy (EM) to serial imaging are opening doors to new ways of analyzing cellular structure. New and improved algorithms and workflows for manual and semiautomated segmentation allow us to observe the spatial arrangement of the smallest cellular features with unprecedented detail in full three‐dimensions. From larger samples, higher complexity models can be generated; however, they pose new challenges to data management and analysis. Here we review some currently available solutions and present our approach in detail. We use the fully immersive virtual reality (VR) environment CAVE (cave automatic virtual environment), a room in which we are able to project a cellular reconstruction and visualize in 3D, to step into a world created with Blender, a free, fully customizable 3D modeling software with NeuroMorph plug‐ins for visualization and analysis of EM preparations of brain tissue. Our workflow allows for full and fast reconstructions of volumes of brain neuropil using ilastik, a software tool for semiautomated segmentation of EM stacks. With this visualization environment, we can walk into the model containing neuronal and astrocytic processes to study the spatial distribution of glycogen granules, a major energy source that is selectively stored in astrocytes. The use of CAVE was key to the observation of a nonrandom distribution of glycogen, and led us to develop tools to quantitatively analyze glycogen clustering and proximity to other subcellular features. J. Comp. Neurol. 524:23–38, 2016. © 2015 Wiley Periodicals, Inc. PMID:26179415

  18. 3D chromosome rendering from Hi-C data using virtual reality

    NASA Astrophysics Data System (ADS)

    Zhu, Yixin; Selvaraj, Siddarth; Weber, Philip; Fang, Jennifer; Schulze, Jürgen P.; Ren, Bing

    2015-01-01

    Most genome browsers display DNA linearly, using single-dimensional depictions that are useful to examine certain epigenetic mechanisms such as DNA methylation. However, these representations are insufficient to visualize intrachromosomal interactions and relationships between distal genome features. Relationships between DNA regions may be difficult to decipher or missed entirely if those regions are distant in one dimension but could be spatially proximal when mapped to three-dimensional space. For example, the visualization of enhancers folding over genes is only fully expressed in three-dimensional space. Thus, to accurately understand DNA behavior during gene expression, a means to model chromosomes is essential. Using coordinates generated from Hi-C interaction frequency data, we have created interactive 3D models of whole-chromosome structures and their respective domains. We have also rendered information on genomic features such as genes, CTCF binding sites, and enhancers. The goal of this article is to present the procedure, findings, and conclusions of our models and renderings.
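The core payoff of such models, relationships between loci that are distal in 1D but proximal in 3D, can be sketched in a few lines. The coordinates below are a made-up toy "loop", not Hi-C-derived data; only the proximity test reflects the idea in the abstract.

```python
import math

# Hypothetical 3D coordinates for eleven consecutive loci along one chromosome,
# folded back on itself so linearly distal loci come close in space (a toy loop).
coords = [(math.cos(t), math.sin(t), 0.1 * t)
          for t in (i * 2 * math.pi / 10 for i in range(11))]

def spatially_proximal_pairs(coords, linear_gap=5, radius=1.2):
    # Pairs of loci far apart in 1D (index gap >= linear_gap) yet within
    # `radius` of each other in 3D - candidate long-range contacts such as
    # an enhancer folding over a gene.
    pairs = []
    for i in range(len(coords)):
        for j in range(i + linear_gap, len(coords)):
            if math.dist(coords[i], coords[j]) <= radius:
                pairs.append((i, j))
    return pairs

print(spatially_proximal_pairs(coords))  # → [(0, 9), (0, 10), (1, 10)]
```

A linear browser view would show loci 0 and 10 as maximally distant; the 3D embedding reveals them as spatial neighbors.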

  19. A Learner-Centered Approach for Training Science Teachers through Virtual Reality and 3D Visualization Technologies: Practical Experience for Sharing

    ERIC Educational Resources Information Center

    Yeung, Yau-Yuen

    2004-01-01

    This paper presentation will report on how some science educators at the Science Department of The Hong Kong Institute of Education have successfully employed an array of innovative learning media such as three-dimensional (3D) and virtual reality (VR) technologies to create seven sets of resource kits, most of which are being placed on the…

  20. Virtual Reality.

    ERIC Educational Resources Information Center

    Newby, Gregory B.

    1993-01-01

    Discusses the current state of the art in virtual reality (VR), its historical background, and future possibilities. Highlights include applications in medicine, art and entertainment, science, business, and telerobotics; and VR for information science, including graphical display of bibliographic data, libraries and books, and cyberspace.…

  1. Transforming Clinical Imaging and 3D Data for Virtual Reality Learning Objects: HTML5 and Mobile Devices Implementation

    ERIC Educational Resources Information Center

    Trelease, Robert B.; Nieder, Gary L.

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android…

  2. The use of a low-cost visible light 3D scanner to create virtual reality environment models of actors and objects

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    A low-cost 3D scanner has been developed with a parts cost of approximately USD $5,000. This scanner uses visible light sensing to capture both structural as well as texture and color data of a subject. This paper discusses the use of this type of scanner to create 3D models for incorporation into a virtual reality environment. It describes the basic scanning process (which takes under a minute for a single scan), which can be repeated to collect multiple positions, if needed for actor model creation. The efficacy of visible light versus other scanner types is also discussed.

  3. A New Approach to Improve Cognition, Muscle Strength, and Postural Balance in Community-Dwelling Elderly with a 3-D Virtual Reality Kayak Program.

    PubMed

    Park, Junhyuck; Yim, JongEun

    2016-01-01

    Aging is usually accompanied by deterioration of physical abilities, such as muscular strength, sensory sensitivity, and functional capacity. Recently, intervention methods using virtual reality have been introduced, providing an enjoyable therapy for the elderly. The aim of this study was to investigate whether a 3-D virtual reality kayak program could improve the cognitive function, muscle strength, and balance of community-dwelling elderly. Importantly, kayaking involves most of the upper-body musculature and requires balance control. Seventy-two participants were randomly allocated into the kayak program group (n = 36) and the control group (n = 36). The two groups were well matched with respect to general characteristics at baseline. The participants in both groups performed a conventional exercise program for 30 min, after which the kayak program group performed the 3-D virtual reality kayak program for 20 min, two times a week for 6 weeks. Cognitive function was measured using the Montreal Cognitive Assessment. Muscle strength was measured using the arm curl and handgrip strength tests. Standing and sitting balance was measured using the Good Balance system. The post-test was performed in the same manner as the pre-test; the overall outcomes, such as cognitive function (p < 0.05), muscle strength (p < 0.05), and balance (standing and sitting, p < 0.05), were significantly improved in the kayak program group compared to the control group. We propose that the 3-D virtual reality kayak program is a promising intervention method for improving the cognitive function, muscle strength, and balance of the elderly.

  4. On the Usability and Usefulness of 3d (geo)visualizations - a Focus on Virtual Reality Environments

    NASA Astrophysics Data System (ADS)

    Çöltekin, A.; Lokka, I.; Zahner, M.

    2016-06-01

    Whether and when should we show data in 3D is an on-going debate in communities conducting visualization research. A strong opposition exists in the information visualization (Infovis) community, and seemingly unnecessary/unwarranted use of 3D, e.g., in plots, bar or pie charts, is heavily criticized. The scientific visualization (Scivis) community, on the other hand, is more supportive of the use of 3D as it allows `seeing' invisible phenomena, or designing and printing things that are used in e.g., surgeries, educational settings etc. Geographic visualization (Geovis) stands between the Infovis and Scivis communities. In geographic information science, most visuo-spatial analyses have been sufficiently conducted in 2D or 2.5D, including analyses related to terrain and much of the urban phenomena. On the other hand, there has always been a strong interest in 3D, with similar motivations as in Scivis community. Among many types of 3D visualizations, a popular one that is exploited both for visual analysis and visualization is the highly realistic (geo)virtual environments. Such environments may be engaging and memorable for the viewers because they offer highly immersive experiences. However, it is not yet well-established if we should opt to show the data in 3D; and if yes, a) what type of 3D we should use, b) for what task types, and c) for whom. In this paper, we identify some of the central arguments for and against the use of 3D visualizations around these three considerations in a concise interdisciplinary literature review.

  5. Quality of Grasping and the Role of Haptics in a 3-D Immersive Virtual Reality Environment in Individuals With Stroke.

    PubMed

    Levin, Mindy F; Magdalon, Eliane C; Michaelsen, Stella M; Quevedo, Antonio A F

    2015-11-01

    Reaching and grasping parameters with and without haptic feedback were characterized in people with chronic stroke. Twelve (67 ± 10 years) individuals with chronic stroke and arm/hand paresis (Fugl-Meyer Assessment-Arm: ≥ 46/66 pts) participated. Three-dimensional (3-D) temporal and spatial kinematics of reaching and grasping movements to three objects (can: cylindrical grasp; screwdriver: power grasp; pen: precision grasp) in a physical environment (PE), with and without additional haptic feedback, and in a 3-D virtual environment (VE) with haptic feedback were recorded. Participants reached, grasped, and transported physical and virtual objects using similar movement strategies in all conditions. Reaches made in the VE were less smooth and slower compared to the PE. Arm and trunk kinematics were similar in both environments and glove conditions. For grasping, stroke subjects preserved aperture scaling to object size but used wider hand apertures, with longer delays between times to maximal reaching velocity and maximal grasping aperture. Wearing the glove decreased reaching velocity. Our results in a small group of subjects suggest that providing haptic information in the VE did not affect the validity of reaching and grasping movements. Small disparities in movement parameters between environments may be due to differences in perception of object distance in the VE. Reach-to-grasp kinematics to smaller objects may be improved by better 3-D rendering. The comparable kinematics between environments and conditions are encouraging for the incorporation of high-quality VEs in rehabilitation programs aimed at improving upper limb recovery.

  6. Virtual reality welder training

    NASA Astrophysics Data System (ADS)

    White, Steven A.; Reiners, Dirk; Prachyabrued, Mores; Borst, Christoph W.; Chambers, Terrence L.

    2010-01-01

    This document describes the Virtual Reality Simulated MIG Lab (sMIG), a system for virtual reality welder training. It is designed to reproduce the experience of metal inert gas (MIG) welding faithfully enough to be used as a teaching tool for beginning welding students. To make the experience as realistic as possible, it employs physically accurate, tracked input devices, a real-time welding simulation, real-time sound generation, and a 3D display for output. Because it is a fully digital system, it can go beyond providing a realistic welding experience by giving interactive and immediate feedback that keeps the student from learning wrong movements from day one.

  7. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurements of the maximum reach of occupants of a microgravity environment provide knowledge about maximum functional placement for tasking situations. Calculations for a full-body functional reach envelope for microgravity environments are imperative. To this end, three-dimensional computer-modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full-body functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.
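The idea of locating envelope data points from modeled figures can be illustrated with a planar two-segment "arm" whose joint angles are sampled and whose endpoint positions are collected. The segment lengths and angle ranges here are invented for illustration; the actual study used full 3D human figure models.

```python
import math

def reach_points(upper=0.30, fore=0.25, steps=12):
    # Sample a planar two-segment "arm" (shoulder + elbow angles) and collect
    # fingertip positions; the reach envelope is defined by these data points.
    pts = []
    for i in range(steps):
        sh = math.pi * i / (steps - 1)          # shoulder: 0..180 degrees
        for j in range(steps):
            el = math.pi * j / (steps - 1)      # elbow: 0..180 degrees
            x = upper * math.cos(sh) + fore * math.cos(sh + el)
            y = upper * math.sin(sh) + fore * math.sin(sh + el)
            pts.append((x, y))
    return pts

pts = reach_points()
max_reach = max(math.hypot(x, y) for x, y in pts)
print(round(max_reach, 2))  # → 0.55 (fully extended arm: 0.30 + 0.25)
```

In 3D the same sampling runs over additional joint angles, and the envelope surface is fitted to the outermost collected points.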

  8. Cortical correlate of spatial presence in 2D and 3D interactive virtual reality: an EEG study.

    PubMed

    Kober, Silvia Erika; Kurzmann, Jürgen; Neuper, Christa

    2012-03-01

    The present study is the first that examined neuronal underpinnings of spatial presence using multi-channel EEG in an interactive virtual reality (VR). We compared two VR-systems: a highly immersive Single-Wall-VR-system (three-dimensional view, large screen) and a less immersive Desktop-VR-system (two-dimensional view, small screen). Twenty-nine participants performed a spatial navigation task in a virtual maze and had to state their sensation of "being there" on a 5-point rating scale. Task-related power decrease/increase (TRPD/TRPI) in the Alpha band (8-12 Hz) and coherence analyses in different frequency bands were used to analyze the EEG data. The Single-Wall-VR-system caused a more intense presence experience than the Desktop-VR-system. This increased feeling of presence in the Single-Wall-VR-condition was accompanied by an increased parietal TRPD in the Alpha band, which is associated with cortical activation. The lower presence experience in the Desktop-VR-group was accompanied by a stronger functional connectivity between frontal and parietal brain regions indicating that the communication between these two brain areas is crucial for the presence experience. Hence, we found a positive relationship between presence and parietal brain activation and a negative relationship between presence and frontal brain activation in an interactive VR-paradigm, supporting the results of passive non-interactive VR-studies.
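The task-related power decrease (TRPD) measure used here has a simple definition: the percentage drop in band power during the task relative to a reference interval. Below is a self-contained sketch with a naive DFT and synthetic 10 Hz "alpha" signals; the signals and sampling rate are invented, not the study's EEG data.

```python
import math

def band_power(signal, fs, lo, hi):
    # Naive DFT power in [lo, hi] Hz - fine for short illustrative signals.
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(-signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

def trpd_percent(reference, task, fs, lo=8.0, hi=12.0):
    # Task-related power decrease: positive when alpha power drops during the
    # task relative to the reference interval.
    ref = band_power(reference, fs, lo, hi)
    act = band_power(task, fs, lo, hi)
    return 100.0 * (ref - act) / ref

fs = 128
rest = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]        # strong 10 Hz alpha
task = [0.5 * math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]  # attenuated alpha
print(round(trpd_percent(rest, task, fs)))  # → 75
```

A positive alpha-band TRPD (here 75%) is conventionally read as cortical activation, which is how the parietal effect in the more immersive condition is interpreted above.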

  9. 3D augmented reality with integral imaging display

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-06-01

    In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be transferred into the identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for augmented reality display is generated. The newly generated elemental images contain both the virtual objects and real world scene with desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.

  10. 3D Virtual Reality Applied in Tectonic Geomorphic Study of the Gombori Range of Greater Caucasus Mountains

    NASA Astrophysics Data System (ADS)

    Sukhishvili, Lasha; Javakhishvili, Zurab

    2016-04-01

    The Gombori Range forms the southern part of the young Greater Caucasus Mountains and stretches from NW to SE, separating the Alazani and Iori basins within the eastern Georgian province of Kakheti. The active phase of the Caucasian orogeny started in the Pliocene, but according to the alluvial sediments of the Gombori Range (mapped in the Soviet geologic map), its uplift appears to be a Quaternary event. The highest peak of the range has an absolute elevation of 1991 m, while the neighboring Alazani valley reaches only 400 m. We assume the range has a very fast uplift rate, which could have triggered reversals of stream flow direction in the Quaternary. To check this preliminary assumption we will use tectonic, fluvial geomorphic, and stratigraphic approaches, including paleocurrent analyses and various affordable absolute dating techniques, to detect evidence of river course reversals and date them. For these purposes we have selected the Turdo River outcrop. The river flows northwards from the Gombori Range and, near the region's main city of Telavi, generates a 30-40 m high continuous outcrop along a 1 km section. The Turdo outcrop has very steep walls and requires special climbing skills to work on it. The goal of this particular study is to avoid the time- and resource-consuming ground survey of this steep, high, and wide outcrop and to test 3D aerial and ground-based photogrammetric modelling and analysis approaches in the initial stage of the tectonic geomorphic study. Using this type of remote sensing and virtual lab analysis of the 3D outcrop model, we roughly delineated stratigraphic layers, selected exact locations for applying various research techniques, and planned safe and suitable climbing routes to the investigation sites.

  11. Transforming clinical imaging and 3D data for virtual reality learning objects: HTML5 and mobile devices implementation.

    PubMed

    Trelease, Robert B; Nieder, Gary L

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android tablets. This article describes complementary methods for creating comparable, multiplatform VR learning objects in the new HTML5 standard format, circumventing platform-specific limitations imposed by the QuickTime VR multimedia file format. Multiple types or "dimensions" of anatomical information can be embedded in such learning objects, supporting different kinds of online learning applications, including interactive atlases, examination questions, and complex, multi-structure presentations. Such HTML5 VR learning objects are usable on new mobile devices that do not support QuickTime VR, as well as on personal computers. Furthermore, HTML5 VR learning objects can be embedded in "ebook" document files, supporting the development of new types of electronic textbooks on mobile devices that are increasingly popular and self-adopted for mobile learning. PMID:23212750

  13. Computer Vision Assisted Virtual Reality Calibration

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1999-01-01

    A computer vision assisted semi-automatic virtual reality (VR) calibration technology has been developed that can accurately match a virtual environment of graphically simulated three-dimensional (3-D) models to the video images of the real task environment.

  14. Virtual Reality: An Overview.

    ERIC Educational Resources Information Center

    Franchi, Jorge

    1994-01-01

    Highlights of this overview of virtual reality include optics; interface devices; virtual worlds; potential applications, including medicine and archaeology; problems, including costs; current research and development; future possibilities; and a listing of vendors and suppliers of virtual reality products. (Contains 11 references.) (LRW)

  15. Eye-tracking and EMG supported 3D Virtual Reality - an integrated tool for perceptual and motor development of children with severe physical disabilities: a research concept.

    PubMed

    Pulay, Márk Ágoston

    2015-01-01

    Enabling children with severe physical disabilities (such as tetraparesis spastica) to get relevant movement experiences of appropriate quality and quantity is now the greatest challenge for us in the field of neurorehabilitation. These movement experiences may establish many cognitive processes, and their absence may cause additional secondary cognitive dysfunctions such as disorders in body image, figure invariance, visual perception, auditory differentiation, concentration, analytic and synthetic ways of thinking, visual memory, etc. Virtual Reality is a technology that provides a sense of presence in a real environment with the help of 3D pictures and animations formed in a computer environment, and enables the person to interact with the objects in that environment. One of our biggest challenges is to find a well-suited input device (hardware) that lets children with severe physical disabilities interact with the computer. Based on our own experiences and a thorough literature review, we have come to the conclusion that an effective combination of eye-tracking and EMG devices should work well.

  16. Virtual Reality Enhanced Instructional Learning

    ERIC Educational Resources Information Center

    Nachimuthu, K.; Vijayakumari, G.

    2009-01-01

    Virtual Reality (VR) is the creation of a virtual 3D world in which one can feel and sense the world as if it were real. It allows engineers to design machines, and educationists to design AV [audiovisual] equipment, in real time as a three-dimensional hologram, as if the actual material were being made and worked upon. VR allows a least-cost (energy…

  17. Inertial Motion-Tracking Technology for Virtual 3-D

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In the 1990s, NASA pioneered virtual reality research. The concept was present long before, but prior to this the technology did not exist to make a viable virtual reality system. Scientists had theories and ideas, and they knew the concept had potential, but the computers of the 1970s and 1980s were not fast enough, sensors were heavy and cumbersome, and people had difficulty blending fluidly with the machines. Scientists at Ames Research Center built upon the research of previous decades and put the necessary technology behind it, making the theories of virtual reality a reality. Virtual reality systems depend on complex motion-tracking sensors to convey information between the user and the computer, giving the user the feeling of operating in the real world. These motion-tracking sensors measure and report an object's position and orientation as it changes. A simple example of motion tracking is the cursor on a computer screen moving in correspondence with the movement of the mouse. Tracking in 3-D, necessary to create virtual reality, is much more complex. To be successful, the perspective of the virtual image seen on the computer must be an accurate representation of what is seen in the real world. As the user's head or camera moves, turns, or tilts, the computer-generated environment must change accordingly with no noticeable lag, jitter, or distortion. Historically, the lack of smooth and rapid tracking of the user's motion has thwarted the widespread use of immersive 3-D computer graphics. NASA uses virtual reality technology for a variety of purposes, mostly the training of astronauts. The actual missions are costly and dangerous, so any opportunity the crews have to practice maneuvering in accurate simulations before the mission is valuable and instructive. For that purpose, NASA has funded a great deal of virtual reality research and benefited from the results.
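The orientation half of inertial motion tracking reduces to integrating angular rates into an orientation estimate. A minimal quaternion-integration sketch follows; the rotation rate and time step are made up for illustration, and this is not any specific NASA tracker.

```python
import math

def quat_mul(q, r):
    # Hamilton product of quaternions given as (w, x, y, z) tuples.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, omega, dt):
    # Advance orientation q by body-frame angular rate omega (rad/s) over dt
    # using the incremental-rotation quaternion - the core of inertial tracking.
    wx, wy, wz = (w * dt for w in omega)
    angle = math.sqrt(wx*wx + wy*wy + wz*wz)
    if angle == 0:
        return q
    s = math.sin(angle / 2) / angle
    dq = (math.cos(angle / 2), wx * s, wy * s, wz * s)
    return quat_mul(q, dq)

# Simulate a head turning at 90 deg/s about the vertical axis for one second.
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(100):
    q = integrate_gyro(q, (0.0, 0.0, math.radians(90)), 0.01)
yaw = math.degrees(2 * math.atan2(q[3], q[0]))
print(round(yaw, 1))  # → 90.0
```

Real trackers fuse this gyro integration with accelerometer and magnetometer (or optical) measurements to cancel the drift that pure integration accumulates.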

  18. Virtual sound for virtual reality

    SciTech Connect

    Blattner, M.M.; Papp, A.L., III

    1993-02-01

    The computational limitations of real-time interactive computing do not meet our requirements for producing realistic images for virtual reality in a convincing manner. Regardless of the real-time restrictions on virtual reality interfaces, the representations can be no better than the graphics. Computer graphics is still limited in its ability to generate complex objects such as landscapes and humans. Nevertheless, useful and convincing visualizations are made through a variety of techniques. The central theme of this article is that a similar situation holds for sound in virtual reality. It is beyond our ability to create interactive soundscapes that faithfully reproduce real-world sounds; however, by choosing one's application carefully and using sound to enhance a display rather than only mimic real-world scenes, very effective use of sound can be made.

  20. Virtual reality via photogrammetry

    NASA Astrophysics Data System (ADS)

    Zahrt, John D.; Papcun, George; Childers, Randy A.; Rubin, Naama

    1996-03-01

    We wish to walk into a photograph just as Alice walked into the looking glass. From a mathematical perspective, this problem is exceedingly ill-posed (e.g., is that a large, distant object or a small, nearby object?). A human expert can supply a large amount of a priori information that can function as mathematical constraints. The constrained problem can then be attacked with photogrammetry to obtain a great deal of quantitative information which is otherwise only qualitatively apparent. The user determines whether the object to be analyzed contains two or three vanishing points, then selects an appropriate number of points from the photograph to enable the code to compute the locations of the vanishing points. Using this information and the standard photogrammetric geometric algorithms, the location of the camera relative to the structure is determined. The user must also enter information regarding an absolute sense of scale. As the vectors from the camera to the various points chosen from the photograph are determined, the vector components (coordinates) are handed to a virtual reality software package. Once the objects are entered, the appropriate surfaces of the 3D object are 'wallpapered' with the surface from the photograph. The user is then able to move through the virtual scene. A video will demonstrate our work.
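
    The vanishing-point step described above can be sketched in homogeneous coordinates: each pair of user-selected image points defines a line (their cross product), and two image lines from parallel scene edges intersect at the vanishing point, again via a cross product. This is a generic illustration, not the authors' code, and the segment coordinates are made up.

```python
def cross(a, b):
    # 3-vector cross product
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def line_through(p, q):
    # homogeneous line through two image points (x, y)
    return cross((p[0], p[1], 1.0), (q[0], q[1], 1.0))

def vanishing_point(seg1, seg2):
    """Intersect two image lines; each segment is ((x1, y1), (x2, y2))."""
    x, y, w = cross(line_through(*seg1), line_through(*seg2))
    return (x / w, y / w)

# example: two segments whose supporting lines converge at (2, 2)
print(vanishing_point(((0.0, 0.0), (1.0, 1.0)),
                      ((0.0, 4.0), (1.0, 3.0))))  # (2.0, 2.0)
```

    If the two lines are parallel in the image, w is zero (a vanishing point at infinity), which a robust implementation would need to handle.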

  1. Spacecraft 3D Augmented Reality Mobile App

    NASA Technical Reports Server (NTRS)

    Hussey, Kevin J.; Doronila, Paul R.; Kumanchik, Brian E.; Chan, Evan G.; Ellison, Douglas J.; Boeck, Andrea; Moore, Justin M.

    2013-01-01

    The Spacecraft 3D application allows users to learn about and interact with iconic NASA missions in a new and immersive way using common mobile devices. Using Augmented Reality (AR) techniques to project 3D renditions of the mission spacecraft into real-world surroundings, users can interact with and learn about Curiosity, GRAIL, Cassini, and Voyager. Additional updates on future missions, animations, and information will be ongoing. Using a printed AR Target and camera on a mobile device, users can get up close with these robotic explorers, see how some move, and learn about these engineering feats, which are used to expand knowledge and understanding about space. The software receives input from the mobile device's camera to recognize the presence of an AR marker in the camera's field of view. It then displays a 3D rendition of the selected spacecraft in the user's physical surroundings, on the mobile device's screen, while it tracks the device's movement in relation to the physical position of the spacecraft's 3D image on the AR marker.

  2. Virtual Reality Lab Assistant

    NASA Technical Reports Server (NTRS)

    Saha, Hrishikesh; Palmer, Timothy A.

    1996-01-01

    The Virtual Reality Lab Assistant (VRLA) demonstration model is designed for engineering and materials science experiments to be performed by undergraduate and graduate students as a pre-lab simulation experience. This will help students get a preview of how to use the lab equipment and run experiments without using the lab hardware/software. The quality of the time available for laboratory experiments can be significantly improved through the use of virtual reality technology.

  3. Learning in Virtual Reality.

    ERIC Educational Resources Information Center

    Bricken, William

    The essence of the computer revolution is yet to come, for computers are essentially generators of realities. Virtual reality (VR) is the next step in the evolutionary path; the user is placed inside the image and becomes a participant within the computational space. A VR computer generates a direct experience of the computational environment. The…

  4. Virtual reality and virtual bodies

    NASA Astrophysics Data System (ADS)

    Richards, Catherine; Korba, Larry W.; Shaw, Christopher D.; Green, Mark

    1994-04-01

    There are many ways to produce the sense of 'presence' or telepresence in the user of virtual reality. For example, attempting to increase the realism of the visual environment is a commonly accepted strategy. In contrast, this paper explores a way for the user to feel present in an unrealistic virtual body. It investigates an unusual approach: proprioceptive illusions. Proprioceptive or body illusions are used to generate and explore the experience of virtuality and presence outside of normal body limits. These projects are realized in art installations.

  5. A Comparative Analysis of 2D and 3D Tasks for Virtual Reality Therapies Based on Robotic-Assisted Neurorehabilitation for Post-stroke Patients.

    PubMed

    Lledó, Luis D; Díez, Jorge A; Bertomeu-Motos, Arturo; Ezquerro, Santiago; Badesa, Francisco J; Sabater-Navarro, José M; García-Aracil, Nicolás

    2016-01-01

    Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to create a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. In contrast, 2D virtual environments represent the tasks with a lower degree of realism, using bidimensional graphics techniques. However, the type of visualization can influence the perceived quality of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in kinematic movement patterns when post-stroke patients performed a reaching task while viewing a virtual therapeutic game with two different types of virtual environment: 2D and 3D. Nine post-stroke patients participated in the study, receiving virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aims of the tasks, which consisted of reaching peripheral or perspective targets depending on the virtual environment shown. Parameters such as maximum speed, reaction time, path length, and initial movement were analyzed from data acquired objectively by the robotic device to evaluate the influence of task visualization. At the end of the study, a usability survey was given to each patient to analyze his or her satisfaction level. For all patients, movement trajectories improved as they completed the therapy, suggesting increased motor recovery. Despite the similarity of most kinematic parameters, differences in reaction time and path length were higher for the 3D task. Regarding the success rates
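
    Two of the kinematic parameters mentioned, path length and maximum speed, can be computed from a sampled trajectory with a short, hypothetical sketch (these helpers are illustrative, not the PUPArm software):

```python
import math

def path_length(points):
    """Total distance travelled along sampled (x, y) positions."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def peak_speed(points, dt):
    """Maximum finite-difference speed for uniformly sampled positions."""
    return max(math.dist(a, b) / dt for a, b in zip(points, points[1:]))

# a made-up reaching trajectory sampled every 0.5 s
reach = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0)]
print(path_length(reach))       # 3.0
print(peak_speed(reach, 0.5))   # 4.0
```

    A shorter path length and a faster reaction time for the same target would be read here, as in the study, as signs of a more efficient movement.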

  9. Computer-assisted three-dimensional surgical planning and simulation: 3D virtual osteotomy.

    PubMed

    Xia, J; Ip, H H; Samman, N; Wang, D; Kot, C S; Yeung, R W; Tideman, H

    2000-02-01

    A computer-assisted three-dimensional virtual osteotomy system for orthognathic surgery (CAVOS) is presented. The virtual reality workbench is used for surgical planning. The surgeon is immersed in a virtual reality environment with stereo eyewear, holds a virtual "scalpel" (3D mouse), and operates on a "real" patient (3D visualization) to obtain a pre-surgical prediction (3D bony segment movements). Virtual surgery on a computer-generated 3D head model is simulated and can be visualized from any arbitrary viewing point in a personal computer system.

  10. Virtual Reality and the Virtual Library.

    ERIC Educational Resources Information Center

    Oppenheim, Charles

    1993-01-01

    Explains virtual reality, including proper and improper uses of the term, and suggests ways that libraries might be affected by it. Highlights include elements of virtual reality systems; possible virtual reality applications, including architecture, the chemical industry, transport planning, armed forces, and entertainment; and the virtual…

  11. Virtual Reality Calibration for Telerobotic Servicing

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1994-01-01

    A virtual reality calibration technique of matching a virtual environment of simulated graphics models in 3-D geometry and perspective with actual camera views of the remote site task environment has been developed to enable high-fidelity preview/predictive displays with calibrated graphics overlay on live video.

  12. Magical Stories: Blending Virtual Reality and Artificial Intelligence.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    Artificial intelligence (AI) techniques and virtual reality (VR) make possible powerful interactive stories, and this paper focuses on examples of virtual characters in three dimensional (3-D) worlds. Waldern, a virtual reality game designer, has theorized about and implemented software design of virtual teammates and opponents that incorporate AI…

  13. Virtual reality for emergency training

    SciTech Connect

    Altinkemer, K.

    1995-12-31

    Virtual reality is a sequence of scenes generated by a computer in response to the five different senses: sight, sound, taste, touch, and smell. Other senses that can be used in virtual reality include balance, pheromonal, and immunological senses. Application areas include leisure and entertainment, medicine, architecture, engineering, manufacturing, and training. Virtual reality is especially important when it is used for emergency training and the management of natural disasters, including earthquakes, floods, tornadoes, and other situations which are hard to emulate. Classical training methods for these extraordinary environments lack the realistic surroundings that virtual reality can provide. In order for virtual reality to be a successful training tool, the design needs to address certain aspects, such as how realistic the virtual reality should be and how much fixed cost is entailed in setting up the virtual reality trainer. There are also pricing questions regarding the price per training session on the virtual reality trainer and the appropriate training session length.

  14. Virtual reality systems

    NASA Technical Reports Server (NTRS)

    Johnson, David W.

    1992-01-01

    Virtual realities are a type of human-computer interface (HCI) and as such may be understood from a historical perspective. In the earliest era, the computer was a very simple, straightforward machine. Interaction was human manipulation of an inanimate object, little more than the provision of an explicit instruction set to be carried out without deviation. In short, control resided with the user. In the second era of HCI, some level of intelligence and control was imparted to the system to enable a dialogue with the user. Simple context sensitive help systems are early examples, while more sophisticated expert system designs typify this era. Control was shared more equally. In this, the third era of the HCI, the constructed system emulates a particular environment, constructed with rules and knowledge about 'reality'. Control is, in part, outside the realm of the human-computer dialogue. Virtual reality systems are discussed.

  15. A randomized controlled trial of the effects of hypnosis with 3-D virtual reality animation on tiredness, mood, and salivary cortisol.

    PubMed

    Thompson, Trevor; Steffert, Tony; Steed, Anthony; Gruzelier, John

    2011-01-01

    Case studies suggest hypnosis with a virtual reality (VR) component may be an effective intervention, although few follow-up randomized, controlled trials have been performed comparing such interventions with standard hypnotic treatments. Thirty-five healthy participants were randomized to self-hypnosis with VR imagery, standard self-hypnosis, or relaxation interventions. Changes in sleep, cortisol levels, and mood were examined. Self-hypnosis involved 10- to 20-minute sessions visualizing a healthy immune scenario. Trait absorption was also recorded as a possible moderator. Moderated regression indicated that both hypnosis interventions produced significantly lower tiredness ratings than relaxation when trait absorption was high. When trait absorption was low, VR resulted in significantly higher engagement ratings, although this did not translate into demonstrable improvement in outcome. Results suggest that VR imagery may increase engagement relative to traditional methods, but further investigation into its potential to enhance therapeutic efficacy is required. PMID:21104488

  16. Virtual Reality in the Classroom.

    ERIC Educational Resources Information Center

    Pantelidis, Veronica S.

    1993-01-01

    Considers the concept of virtual reality; reviews its history; describes general uses of virtual reality, including entertainment, medicine, and design applications; discusses classroom uses of virtual reality, including a software program called Virtus WalkThrough for use with a computer monitor; and suggests future possibilities. (34 references)…

  17. Constructing Meaning with Virtual Reality.

    ERIC Educational Resources Information Center

    Iaonnou-Georgiou, Sophie

    2002-01-01

    Presents a constructivist rationale for introducing virtual reality in language learning and teaching and describes various virtual reality environments that are available. Ways of implementing constructivist learning through virtual reality are suggested, as well as basic guidelines for successful implementation in the classroom. (Author/VWL)

  18. Medical applications of virtual reality.

    PubMed

    Satava, R M

    1995-06-01

    Medical applications for virtual reality (VR) are just beginning to emerge. These include VR surgical simulators, telepresence surgery, complex medical database visualization, and rehabilitation. These applications are mediated through the computer interface and as such are the embodiment of VR as an integral part of the paradigm shift in the field of medicine. The Green Telepresence Surgery System consists of two components, the surgical workstation and remote worksite. At the remote site there is a 3-D camera system and responsive manipulators with sensory input. At the workstation there is a 3-D monitor and dexterous handles with force feedback. The VR surgical simulator is a stylized recreation of the human abdomen with several essential organs. Using a helmet mounted display and DataGlove, a person can learn anatomy from a new perspective by 'flying' inside and around the organs, or can practice surgical procedures with a scalpel and clamps. Database visualization creates 3-D images of complex medical data for new perspectives in analysis. Rehabilitation medicine permits impaired individuals to explore worlds not otherwise available to them, allows accurate assessment and therapy for their disabilities, and helps architects understand their critical needs in public or personal space. And to support these advanced technologies, the operating room and hospital of the future will be first designed and tested in virtual reality, bringing together the full power of the digital physician.

  19. Virtual Representations in 3D Learning Environments

    ERIC Educational Resources Information Center

    Shonfeld, Miri; Kritz, Miki

    2013-01-01

    This research explores the extent to which virtual worlds can serve as online collaborative learning environments for students by increasing social presence and engagement. 3D environments enable learning, which simulates face-to-face encounters while retaining the advantages of online learning. Students in Education departments created avatars…

  20. Transparent 3D display for augmented reality

    NASA Astrophysics Data System (ADS)

    Lee, Byoungho; Hong, Jisoo

    2012-11-01

    Two types of transparent three-dimensional display systems applicable for augmented reality are demonstrated. One of them is a head-mounted-display-type implementation which utilizes the principle of the system adopting the concave floating lens to the virtual mode integral imaging. Such a configuration has the advantage that the three-dimensional image can be displayed at a sufficiently far distance, resolving the accommodation conflict with the real-world scene. Incorporating the convex half mirror, which shows a partial transparency, instead of the concave floating lens makes it possible to implement the transparent three-dimensional display system. The other type is the projection-type implementation, which is more appropriate for general use than the head-mounted-display-type implementation. Its imaging principle is based on the well-known reflection-type integral imaging. We realize the feature of transparent display by imposing partial transparency on the array of concave mirrors which is used for the screen of reflection-type integral imaging. Two types of configurations, relying on incoherent and coherent light sources, are both possible. For the incoherent configuration, we introduce the concave half mirror array, whereas the coherent one adopts the holographic optical element which replicates the functionality of the lenslet array. Though the projection-type implementation is in principle more beneficial than the head-mounted display, the present status of technical advances in spatial light modulators still does not provide satisfactory visual quality of the displayed three-dimensional image. Hence we expect that the head-mounted-display-type and projection-type implementations will come to market in sequence.

  1. [3D virtual endoscopy of heart].

    PubMed

    Du, Aan; Yang, Xin; Xue, Haihong; Yao, Liping; Sun, Kun

    2012-10-01

    In this paper, we present a virtual endoscopy (VE) system for the diagnosis of heart diseases that is efficient, affordable, and easy to popularize for viewing the interior of the heart. Dual-source CT (DSCT) data were used as the primary data in our system. The 3D structure of the virtual heart was reconstructed with 3D texture-mapping technology based on the graphics processing unit (GPU) and could be displayed dynamically in real time. During real-time display, we could not only observe the inside of the chambers of the heart but also examine, from a new angle of view, the 3D data already clipped according to the doctor's wishes. For observation, we used both an interactive mode and an auto mode. In the auto mode, we used Dijkstra's algorithm, treating the 3D Euclidean distance as the weighting factor, to find the view path quickly, and used the view path to calculate the four-chamber plane. PMID:23198444
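
    The auto-mode path search described above (Dijkstra's algorithm with Euclidean distance as the edge weight) can be sketched generically as follows; the node names and graph layout below are illustrative, not from the paper.

```python
import heapq
import math

def view_path(nodes, edges, start, goal):
    """Shortest path through labelled 3-D points; edge weight is the
    Euclidean distance between the endpoints' coordinates."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        if u == goal:
            break
        for v in edges.get(u, ()):
            nd = d + math.dist(nodes[u], nodes[v])
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # walk predecessors back from the goal to recover the path
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# toy centerline: a -> b -> c, total length 3 + 5 = 8
demo_nodes = {'a': (0, 0, 0), 'b': (3, 0, 0), 'c': (3, 4, 0)}
demo_edges = {'a': ['b'], 'b': ['c']}
print(view_path(demo_nodes, demo_edges, 'a', 'c'))  # ['a', 'b', 'c']
```

    In the endoscopy setting, the nodes would be candidate camera positions inside the chambers and the resulting path would drive the fly-through.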

  2. Virtual VMASC: A 3D Game Environment

    NASA Technical Reports Server (NTRS)

    Manepalli, Suchitra; Shen, Yuzhong; Garcia, Hector M.; Lawsure, Kaleen

    2010-01-01

    The advantages of creating interactive 3D simulations that allow viewing, exploring, and interacting with land improvements, such as buildings, in digital form are manifold and range from allowing individuals from anywhere in the world to explore those virtual land improvements online, to training military personnel in dealing with war-time environments, and to making those land improvements available in virtual worlds such as Second Life. While we haven't fully explored the true potential of such simulations, we have identified a requirement within our organization to use simulations like those to replace our front-desk personnel and allow visitors to query, navigate, and communicate virtually with various entities within the building. We implemented the Virtual VMASC 3D simulation of the Virginia Modeling Analysis and Simulation Center (VMASC) office building not only to meet our front-desk requirement but also to evaluate the effort required in designing such a simulation and, thereby, leverage the experience we gained in future projects of this kind. This paper describes the goals we set for our implementation, the software approach taken, the modeling contribution made, and the technologies used, such as XNA Game Studio, the .NET framework, and Autodesk software packages, and, finally, the applicability of our implementation on a variety of architectures including Xbox 360 and PC. This paper also summarizes the result of our evaluation and the lessons learned from our effort.

  3. Virtual reality at work

    NASA Technical Reports Server (NTRS)

    Brooks, Frederick P., Jr.

    1991-01-01

    The utility of virtual reality computer graphics in telepresence applications is not hard to grasp and promises to be great. When the virtual world is entirely synthetic, as opposed to real but remote, the utility is harder to establish. Vehicle simulators for aircraft, vessels, and motor vehicles are proving their worth every day. Entertainment applications such as Disney World's StarTours are technologically elegant, good fun, and economically viable. Nevertheless, some of us have no real desire to spend our lifework serving the entertainment craze of our sick culture; we want to see this exciting technology put to work in medicine and science. The topics covered include the following: testing a force display for scientific visualization -- molecular docking; and testing a head-mounted display for scientific and medical visualization.

  4. Engineering applications of virtual reality

    NASA Astrophysics Data System (ADS)

    Smith, James R.; Grimes, Robert V.; Plant, Tony A.

    1996-04-01

    This paper addresses some of the practical applications, advantages, and difficulties associated with the engineering applications of virtual reality. The paper tracks actual investigative work in progress on this subject at the BNR research lab in RTP, NC. This work attempts to demonstrate the actual value added to the engineering process by using existing 3D CAD data for interactive information navigation and evaluation of design concepts and products. Specifically, the work includes translation of Parametric Technology's Pro/ENGINEER models into a virtual world to evaluate potential attributes such as multiple concept exploration and product installation assessment. Other work discussed in this paper includes extensive evaluation of two new tools, VRML and SGI's/Template Graphics' WebSpace, for navigation through Pro/ENGINEER models with links to supporting technical documentation and data. The benefits of using these tools for 3D interactive navigation and exploration throughout three key phases of the physical design process are discussed in depth. The three phases are Design Concept Development, Product Design Evaluation, and Product Design Networking. The predicted values added include reduced time to 'concept ready', reduced prototype iterations, increased 'design readiness', and shorter manufacturing introduction cycles.

  5. When Rural Reality Goes Virtual.

    ERIC Educational Resources Information Center

    Husain, Dilshad D.

    1998-01-01

    In rural towns where sparse population and few businesses are barriers, virtual reality may be the only way to bring work-based learning to students. A partnership between a small-town high school, the Ohio Supercomputer Center, and a high-tech business will enable students to explore the workplace using virtual reality. (JOW)

  6. Virtual reality in surgery and medicine.

    PubMed

    Chinnock, C

    1994-01-01

    This report documents the state of development of enhanced and virtual reality-based systems in medicine. Virtual reality systems seek to simulate a surgical procedure in a computer-generated world in order to improve training. Enhanced reality systems seek to augment or enhance reality by providing improved imaging alternatives for specific patient data. Virtual reality represents a paradigm shift in the way we teach and evaluate the skills of medical personnel. Driving the development of virtual reality-based simulators is laparoscopic abdominal surgery, where there is a perceived need for better training techniques; within a year, systems will be fielded for second-year residency students. Further refinements over perhaps the next five years should allow surgeons to evaluate and practice new techniques in a simulator before using them on patients. Technical developments are rapidly improving the realism of these machines to an amazing degree, as well as bringing the price down to affordable levels. In the next five years, many new anatomical models, procedures, and skills are likely to become available on simulators. Enhanced reality systems are generally being developed to improve visualization of specific patient data. Three-dimensional (3-D) stereovision systems for endoscopic applications, head-mounted displays, and stereotactic image navigation systems are being fielded now, with neurosurgery and laparoscopic surgery being major driving influences. Over perhaps the next five years, enhanced and virtual reality systems are likely to merge. This will permit patient-specific images to be used on virtual reality simulators or computer-generated landscapes to be input into surgical visualization instruments. Percolating all around these activities are developments in robotics and telesurgery. An advanced information infrastructure eventually will permit remote physicians to share video, audio, medical records, and imaging data with local physicians in real time.

  7. Surgery applications of virtual reality

    NASA Technical Reports Server (NTRS)

    Rosen, Joseph

    1994-01-01

    Virtual reality is a computer-generated technology which allows information to be displayed in a simulated, but lifelike, environment. In this simulated 'world', users can move and interact as if they were actually a part of that world. This new technology will be useful in many different fields, including the field of surgery. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and perform surgical procedures (telesurgery), and predict the outcomes of surgery. The authors of this paper describe the basic components of a virtual reality surgical system. These components include: the virtual world, the virtual tools, the anatomical model, the software platform, the host computer, the interface, and the head-coupled display. In the chapter they also review the progress towards using virtual reality for surgical training, planning, telesurgery, and predicting outcomes. Finally, the authors present a training system being developed for the practice of new procedures in abdominal surgery.

  8. Augmented Virtual Reality Laboratory

    NASA Technical Reports Server (NTRS)

    Tully-Hanson, Benjamin

    2015-01-01

    Real-time motion tracking hardware has, until recently, been too cost prohibitive for research to regularly take place. With the release of the Microsoft Kinect in November 2010, researchers gained access to a device that, for a few hundred dollars, provides red-green-blue (RGB), depth, and skeleton data, and is capable of tracking multiple people in real time. For its originally intended purpose, gaming with the Xbox 360 and eventually the Xbox One, it performs quite well. However, researchers soon found that although the sensor is versatile, it has limitations in real-world applications. I was brought aboard this summer by William Little in the Augmented Virtual Reality (AVR) Lab at Kennedy Space Center to find solutions to these limitations.

  9. Overestimation of heights in virtual reality is influenced more by perceived distal size than by the 2-D versus 3-D dimensionality of the display

    NASA Technical Reports Server (NTRS)

    Dixon, Melissa W.; Proffitt, Dennis R.; Kaiser, M. K. (Principal Investigator)

    2002-01-01

    One important aspect of the pictorial representation of a scene is the depiction of object proportions. Yang, Dixon, and Proffitt (1999 Perception 28 445-467) recently reported that the magnitude of the vertical-horizontal illusion was greater for vertical extents presented in three-dimensional (3-D) environments compared to two-dimensional (2-D) displays. However, because all of the 3-D environments were large and all of the 2-D displays were small, the question remains whether the observed magnitude differences were due solely to the dimensionality of the displays (2-D versus 3-D) or to the perceived distal size of the extents (small versus large). We investigated this question by comparing observers' judgments of vertical relative to horizontal extents on a large but 2-D display compared to the large 3-D and the small 2-D displays used by Yang et al (1999). The results confirmed that the magnitude differences for vertical overestimation between display media are influenced more by the perceived distal object size rather than by the dimensionality of the display.

  10. Virtual reality and psychotherapy.

    PubMed

    Botella, Cristina; Quero, Soledad; Baños, Rosa M; Perpiñá, Conxa; García Palacios, Azucena; Riva, Giuseppe

    2004-01-01

    Virtual Reality (VR) is a new technology consisting of a graphic environment in which the user not only has the feeling of being physically present in a virtual world but can also interact with it. The first VR workstations were designed for big companies in order to create environments that simulate certain situations to train professionals. However, this technology is now expanding rapidly into several fields, including the area of health. Especially interesting for us is the use of VR as a therapeutic tool in the treatment of psychological disorders. Compared to traditional treatments, VR has many advantages (e.g., it is a protected environment for the patient, who can re-experience the feared situation many times). There are already data on the effectiveness of this technology in the treatment of different psychological disorders; here anxiety disorders, eating disorders and sexual disorders are reviewed. Finally, this chapter ends with some words about the limitations of VR and future perspectives.

  11. A Collaborative Virtual Environment for Situated Language Learning Using VEC3D

    ERIC Educational Resources Information Center

    Shih, Ya-Chun; Yang, Mau-Tsuen

    2008-01-01

    A 3D virtually synchronous communication architecture for situated language learning has been designed to foster communicative competence among undergraduate students who have studied English as a foreign language (EFL). We present an innovative approach that offers better e-learning than the previous virtual reality educational applications. The…

  12. Learning in 3-D Virtual Worlds: Rethinking Media Literacy

    ERIC Educational Resources Information Center

    Qian, Yufeng

    2008-01-01

    3-D virtual worlds, as a new form of learning environments in the 21st century, hold great potential in education. Learning in such environments, however, demands a broader spectrum of literacy skills. This article identifies a new set of media literacy skills required in 3-D virtual learning environments by reviewing exemplary 3-D virtual…

  13. Virtual Reality: The Promise of the Future.

    ERIC Educational Resources Information Center

    Lanier, Jaron

    1992-01-01

    Defines virtual reality and describes the equipment or clothing necessary to achieve the illusion of being in a virtual world. Recent developments with this technology and current virtual reality applications are discussed, including experiential prototyping, telepresence, and educational applications. (MES)

  14. Virtual Realities and the Future of Text.

    ERIC Educational Resources Information Center

    Marcus, Stephen

    1992-01-01

    Discusses issues surrounding virtual reality and "virtual books." Suggests that those who are exploring the territory of virtual realities are already helping to expand and enrich expectations and visions for integrating technology into reading and writing. (RS)

  15. A specification of 3D manipulation in virtual environments

    NASA Technical Reports Server (NTRS)

    Su, S. Augustine; Furuta, Richard

    1994-01-01

    In this paper we discuss the modeling of three basic kinds of 3-D manipulations in the context of a logical hand device and our virtual panel architecture. The logical hand device is a useful software abstraction representing hands in virtual environments. The virtual panel architecture is the 3-D component of the 2-D window systems. Both of the abstractions are intended to form the foundation for adaptable 3-D manipulation.
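The abstract describes the logical hand device only as a software abstraction; no API is given. The following minimal Python sketch is purely illustrative (the class and method names are invented) of how such an abstraction might mediate grab-and-move manipulation of objects in a virtual environment:

```python
from dataclasses import dataclass


@dataclass
class VirtualObject:
    """An object living in the virtual environment."""
    name: str
    position: tuple = (0.0, 0.0, 0.0)


@dataclass
class LogicalHand:
    """Hypothetical 'logical hand device': a software stand-in for the
    user's hand, decoupled from any particular tracking hardware."""
    position: tuple = (0.0, 0.0, 0.0)
    held: VirtualObject = None

    def move_to(self, x, y, z):
        self.position = (x, y, z)
        if self.held is not None:
            self.held.position = (x, y, z)  # a grabbed object follows the hand

    def grab(self, obj):
        self.held = obj

    def release(self):
        self.held = None


hand = LogicalHand()
panel = VirtualObject("panel")
hand.grab(panel)
hand.move_to(1.0, 2.0, 0.5)
print(panel.position)  # → (1.0, 2.0, 0.5)
```

A virtual panel architecture, as described above, could build on the same abstraction by treating panels as grabbable objects.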

  16. Telemedicine, virtual reality, and surgery

    NASA Technical Reports Server (NTRS)

    Mccormack, Percival D.; Charles, Steve

    1994-01-01

    Two types of synthetic experience are covered: virtual reality (VR) and surgery, and telemedicine. The topics are presented in viewgraph form and include the following: geometric models; physiological sensors; surgical applications; virtual cadaver; VR surgical simulation; telesurgery; VR Surgical Trainer; abdominal surgery pilot study; advanced abdominal simulator; examples of telemedicine; and telemedicine spacebridge.

  17. Image-based panoramic virtual reality system

    NASA Astrophysics Data System (ADS)

    Ritchey, Kurtis J.

    1992-06-01

    An extensive family of advanced virtual reality-telepresence systems and components has been developed. The purpose of these systems and components is to facilitate recording, processing, display, and interaction with audio and video signals representing a three-dimensional (3-D) scene or subject. An overview of the systems currently available for license includes: a color video camera with real-time simultaneous spherical FOV coverage; a similar camera for recording various sides of a 3-D subject; an image-based system for real-time processing and distribution of said camera-based images onto 3-D wireframes; the resultant camcorders, generally referred to as virtual reality/telepresence 'VRT camcorders'TM; a 'VIDEOROOM'TM large theater display system in which the floor, walls, and ceiling form a continuous display about the viewer for display of said images; 'INaVISION'TM, an HMD system for viewing the same images; and interactive control devices for manipulating said 3-D image and audio signals. Feasible applications of the above technology include visual and auditory simulation, host vehicle control, remote vehicle control, and video teleconferencing. Rough costs of systems and components, photographs of a prototype system, and component illustrations are provided. Future directions of R&D are presented (i.e., Project HEAVEN: Humankind Eternal-Life Artificial-Intelligence Virtual Environment Network).

  18. Virtual annotation: Verbal communication in virtual reality

    NASA Astrophysics Data System (ADS)

    Verlinden, Jouke C.; Bolter, Jay David; Vandermast, Charles

    We describe a system developed to explore communication in virtual reality, which offers a simple and powerful method for embedding verbal communication in simulations and visualizers by means of voice annotation. The prototype demonstrates that the addition of verbal communication opens up a range of new uses for virtual environments. A similar voice annotation facility is easily added to existing visualizers and simulations, and it enables reading, writing and communicating.
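The paper does not give an implementation, but the voice-annotation idea, clips anchored to positions in the virtual world and retrieved when the user comes near, can be sketched in a few lines of Python. All names here, and the use of file names in place of real audio handles, are assumptions for illustration:

```python
import math


class AnnotationStore:
    """Illustrative sketch: attach voice notes to positions in a virtual scene."""

    def __init__(self):
        self._notes = []  # list of (position, audio clip reference)

    def annotate(self, position, clip):
        self._notes.append((position, clip))

    def nearest(self, position, radius=1.0):
        """Return the clip closest to `position`, if any lies within `radius`."""
        candidates = [(math.dist(position, p), clip) for p, clip in self._notes]
        in_range = [c for c in candidates if c[0] <= radius]
        return min(in_range)[1] if in_range else None


store = AnnotationStore()
store.annotate((0.0, 0.0, 0.0), "note-entrance.wav")
store.annotate((5.0, 0.0, 0.0), "note-exhibit.wav")
print(store.nearest((4.5, 0.0, 0.0)))  # → note-exhibit.wav
```

In a real visualizer the clip reference would be recorded audio rather than a file name, and playback would trigger as the user's viewpoint enters the annotation's radius.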

  19. Virtual Reality: The Future of Animated Virtual Instructor, the Technology and Its Emergence to a Productive E-Learning Environment.

    ERIC Educational Resources Information Center

    Jiman, Juhanita

    This paper discusses the use of Virtual Reality (VR) in e-learning environments where an intelligent three-dimensional (3D) virtual person plays the role of an instructor. With the existence of this virtual instructor, it is hoped that the teaching and learning in the e-environment will be more effective and productive. This virtual 3D animated…

  20. Learning in 3D Virtual Environments: Collaboration and Knowledge Spirals

    ERIC Educational Resources Information Center

    Burton, Brian G.; Martin, Barbara N.

    2010-01-01

    The purpose of this case study was to determine if learning occurred within a 3D virtual learning environment by determining if elements of collaboration and Nonaka and Takeuchi's (1995) knowledge spiral were present. A key portion of this research was the creation of a Virtual Learning Environment. This 3D VLE utilized the Torque Game Engine…

  1. Mobile viewer system for virtual 3D space using infrared LED point markers and camera

    NASA Astrophysics Data System (ADS)

    Sakamoto, Kunio; Taneji, Shoto

    2006-09-01

    The authors have developed a 3D workspace system using collaborative imaging devices. A stereoscopic display enables this system to project 3D information. In this paper, we describe the position detecting system for a see-through 3D viewer. A 3D display system is a useful technology for virtual reality, mixed reality and augmented reality. We have researched spatial imaging and interaction systems, and have previously proposed 3D displays using a slit as a parallax barrier, the lenticular screen and holographic optical elements (HOEs) for displaying active images 1)2)3)4). The purpose of this paper is to propose an interactive system using these 3D imaging technologies. The observer can view virtual images in the real world when watching the screen of a see-through 3D viewer. The goal of our research is to build a display system in which, when users see the real world through the mobile viewer, the system presents virtual 3D images floating in the air, and observers can touch and interact with these floating images, for example so that children can model virtual clay. The key technologies of this system are the position recognition system and the spatial imaging display. The 3D images are presented by the improved parallax barrier 3D display. Here the authors discuss the measuring method for the mobile viewer using infrared LED point markers and a camera in the 3D workspace (augmented reality world). The authors show the geometric analysis of the proposed measuring method, the simplest method using a single camera rather than a stereo camera, and the results of our viewer system.
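The paper's own geometric analysis is not reproduced in the abstract, but the simplest single-camera ranging idea behind LED point markers can be illustrated with a pinhole-camera model: two markers a known distance apart appear closer together in the image the farther away they are, so Z = f·D/p. The sketch below is a toy illustration; the focal length and marker separation are made-up values, not figures from the paper:

```python
def distance_from_marker_pair(focal_px, marker_sep_m, pixel_sep):
    """Pinhole-camera range estimate: Z = f * D / p, where
    f = focal length in pixels, D = physical marker separation (m),
    p = observed separation of the two markers in the image (pixels)."""
    return focal_px * marker_sep_m / pixel_sep


# synthetic check: two infrared LEDs 0.10 m apart viewed from Z = 2.0 m
f = 800.0           # assumed focal length in pixels
sep = 0.10          # assumed physical LED separation in metres
Z_true = 2.0
pixel_sep = f * sep / Z_true          # forward projection: 40 px apart
print(distance_from_marker_pair(f, sep, pixel_sep))  # → 2.0
```

Recovering the full viewer pose (position and orientation) requires at least three non-collinear markers, which is where the paper's geometric analysis comes in.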

  2. Molecular Rift: Virtual Reality for Drug Designers.

    PubMed

    Norrby, Magnus; Grebner, Christoph; Eriksson, Joakim; Boström, Jonas

    2015-11-23

    Recent advances in interaction design have created new ways to use computers. One example is the ability to create enhanced 3D environments that simulate physical presence in the real world--a virtual reality. This is relevant to drug discovery since molecular models are frequently used to obtain deeper understandings of, say, ligand-protein complexes. We have developed a tool (Molecular Rift), which creates a virtual reality environment steered with hand movements. Oculus Rift, a head-mounted display, is used to create the virtual settings. The program is controlled by gesture-recognition, using the gaming sensor MS Kinect v2, eliminating the need for standard input devices. The Open Babel toolkit was integrated to provide access to powerful cheminformatics functions. Molecular Rift was developed with a focus on usability, including iterative test-group evaluations. We conclude with reflections on virtual reality's future capabilities in chemistry and education. Molecular Rift is open source and can be downloaded from GitHub.
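Molecular Rift's actual implementation (Kinect gesture recognition driving an Oculus-rendered scene) is not shown in the abstract. As a toy illustration of the core mapping, a tracked hand displacement converted into a rotation of the model's atom coordinates, here is a hedged Python sketch; the gain constant and every name in it are invented:

```python
import math


def rotate_y(points, angle):
    """Rotate 3D points about the y-axis by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x + s * z, y, -s * x + c * z) for x, y, z in points]


def swipe_to_angle(dx_pixels, gain=0.005):
    """Map horizontal hand displacement (pixels) to a rotation angle (radians).
    The gain is an invented tuning constant, not a value from the paper."""
    return dx_pixels * gain


# two dummy 'atom' positions standing in for a molecular model
atoms = [(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)]
angle = swipe_to_angle(314.159)      # a ~314 px swipe -> roughly 90 degrees
rotated = rotate_y(atoms, angle)
# rotated[0] is now approximately (0.0, 0.0, -1.0)
```

In a real gesture pipeline the displacement would come from per-frame skeleton data, and the rotation would be applied to the render transform rather than to the coordinates themselves.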

  4. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.

  5. Virtual reality and stereoscopic telepresence

    SciTech Connect

    Mertens, E.P.

    1994-12-01

    Virtual reality technology is commonly thought to have few, if any, applications beyond the national research laboratories, the aerospace industry, and the entertainment world. A team at Westinghouse Hanford Company (WHC) is developing applications for virtual reality technology that make it a practical, viable, portable, and cost-effective business and training tool. The technology transfer is particularly applicable to the waste management industry and has become a tool that can serve the entire work force spectrum, from industrial sites to business offices. For three and a half years, a small team of WHC personnel has been developing an effective and practical method of bringing virtual reality technology to the job site. The applications are practical, the results are repeatable, and the equipment costs are within the range of present-day office machines. That combination can evolve into a competitive advantage for commercial business interests. The WHC team has contained system costs by using commercially available equipment and personal computers to create effective virtual reality work stations for less than $20,000.

  6. Embedding speech into virtual realities

    NASA Technical Reports Server (NTRS)

    Bohn, Christian-Arved; Krueger, Wolfgang

    1993-01-01

    In this work a speaker-independent speech recognition system is presented which is suitable for implementation in Virtual Reality applications. The use of an artificial neural network in connection with a special compression of the acoustic input leads to a system which is robust, fast, easy to use, and needs no additional hardware besides common VR equipment.
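The pipeline named in the abstract, compress the acoustic input, then classify it, can be caricatured in a few lines. The sketch below substitutes a nearest-centroid lookup for the paper's artificial neural network and works on crude amplitude bins rather than real acoustic features; every value and name in it is invented for illustration:

```python
def compress(signal, n_bins=4):
    """Crude stand-in for acoustic compression: mean absolute
    amplitude over n_bins equal time slices of the signal."""
    step = len(signal) // n_bins
    return [sum(abs(s) for s in signal[i * step:(i + 1) * step]) / step
            for i in range(n_bins)]


def classify(features, templates):
    """Nearest-centroid stand-in for the paper's neural network."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda word: dist(features, templates[word]))


templates = {
    "stop": [0.9, 0.1, 0.1, 0.1],   # energy concentrated at the onset
    "go":   [0.1, 0.1, 0.1, 0.9],   # energy concentrated at the end
}
utterance = [0.8, 1.0, 0.9, 0.7] + [0.05] * 12   # toy 16-sample waveform
print(classify(compress(utterance), templates))  # → stop
```

The compressed feature vector plays the role of the network's input layer; a real system would use spectral features and learned weights instead of hand-written templates.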

  7. Virtual Reality and Engineering Education.

    ERIC Educational Resources Information Center

    Pantelidis, Veronica S.

    1997-01-01

    Virtual Reality (VR) offers benefits to engineering education. This article defines VR and describes types; outlines reasons for using VR in engineering education; provides guidelines for using VR; presents a model for determining when to use VR; discusses VR applications; and describes hardware and software needed for a low-budget VR and…

  9. Designing Virtual Museum Using Web3D Technology

    NASA Astrophysics Data System (ADS)

    Zhao, Jianghai

    Virtual reality technology (VRT) has the potential to construct an effective learning environment thanks to its 3I characteristics: Interaction, Immersion and Imagination. As VRT develops, it is being applied to education in increasingly profound ways; the Virtual Museum is one such application. The Virtual Museum is based on Web3D technology, and extensibility is the most important factor. Considering the advantages and disadvantages of each Web3D technology, VRML, Cult3D and Viewpoint were chosen. A web chatroom based on Flash and ASP technology has also been created in order to make the Virtual Museum an interactive learning environment.

  10. Immersive virtual reality simulations in nursing education.

    PubMed

    Kilmon, Carol A; Brown, Leonard; Ghosh, Sumit; Mikitiuk, Artur

    2010-01-01

    This article explores immersive virtual reality as a potential educational strategy for nursing education and describes an immersive learning experience now being developed for nurses. This pioneering project is a virtual reality application targeting speed and accuracy of nurse response in emergency situations requiring cardiopulmonary resuscitation. Other potential uses and implications for the development of virtual reality learning programs are discussed. PMID:21086871

  11. Simulated maintenance a virtual reality

    SciTech Connect

    Lirvall, P.

    1995-10-01

    The article describes potential applications of personal computer-based virtual reality software. The applications are being investigated by Atomic Energy of Canada Limited's (AECL) Chalk River Laboratories for the Canadian deuterium-uranium (Candu) reactor. Objectives include: (1) reduction of outage duration and improved safety, (2) cost-effective and safe maintenance of equipment, (3) reduction of exposure times and identification of overexposure situations, (4) cost-effective training in a virtual control room simulator, (5) human factors evaluation of design interface, and (6) visualization of conceptual and detailed designs of critical nuclear field environments. A demonstration model of a typical reactor control room, the use of virtual reality in outage planning, and safety issues are outlined.

  12. Virtual Reality: You Are There

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Telepresence, or "virtual reality," allows a person, with assistance from advanced technology devices, to figuratively project himself into another environment. This technology is marketed by several companies, among them Fakespace, Inc., a former Ames Research Center contractor. Fakespace developed a teleoperational motion platform for transmitting sounds and images from remote locations. The "Molly" matches the user's head motion and, when coupled with a stereo viewing device and appropriate software, creates the telepresence experience. Its companion piece is the BOOM-the user's viewing device that provides the sense of involvement in the virtual environment. Either system may be used alone. Because suits, gloves, headphones, etc. are not needed, a whole range of commercial applications is possible, including computer-aided design techniques and virtual reality visualizations. Customers include Sandia National Laboratories, Stanford Research Institute and Mattel Toys.

  13. 3D Viewing: Odd Perception - Illusion? reality? or both?

    NASA Astrophysics Data System (ADS)

    Kisimoto, K.; Iizasa, K.

    2008-12-01

    We live in three-dimensional space, don't we? It could be at least four dimensions, but that is another story. Either way, our capability for 3D viewing is constrained by our intrinsically 2D tools of perception. I carried out a few visual experiments using topographic data to show our intrinsic (or biological) shortcomings in 3D recognition of our world. The results suggest: (1) a 3D surface model displayed on a 2D computer screen (or paper) always admits two interpretations of the 3D surface geometry; if we choose one of them (in other words, if we are hooked by one of the two percepts), we maintain that perception even as the model's viewing perspective changes over time on the screen; (2) more interesting, a real 3D solid object (e.g., made of clay) also yields the same two interpretations of its geometry if we observe it with one eye. The most famous example of this viewing illusion comes from the magician Jerry Andrus, who died in 2007 and who made a super-cool paper-crafted dragon that causes a striking illusion for a one-eyed viewer. My experiments confirmed this phenomenon in another perceptually persuasive (deceptive?) way. My conclusion is that this illusion is intrinsic, i.e., a reality for humans: even though we live in 3D space, our perceptual tools (eyes) are 2D sensors whose information is reconstructed to 3D by our experience-based brains. So, (3) when we observe a 3D surface model on a computer screen, we are always one eye short, even if we use both eyes. One last suggestion from my experiments is that recent highly sophisticated 3D models may contain more information than human perception can handle properly; we might not be understanding the 3D world (geospace) at all, just perceiving an illusion.
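The two-interpretation ambiguity described in point (1) has a simple geometric basis: once depth is discarded by projection, a scene and its depth-inverted mirror produce exactly the same picture. A minimal Python demonstration (using orthographic projection for clarity; perspective projection weakens but does not remove the ambiguity for one-eyed viewing):

```python
def orthographic_project(point):
    """Project a 3D point onto the image plane; depth is simply discarded."""
    x, y, z = point
    return (x, y)


# a toy 3D shape and its depth-inverted counterpart (z -> -z)
shape = [(0.0, 0.0, 1.0), (1.0, 0.0, 2.0), (0.5, 1.0, 1.5)]
depth_inverted = [(x, y, -z) for x, y, z in shape]

images = [orthographic_project(p) for p in shape]
images_inv = [orthographic_project(p) for p in depth_inverted]
print(images == images_inv)  # → True: both interpretations yield the same picture
```

Since the two scenes are indistinguishable in the image, the viewer's brain must pick one interpretation, which is exactly the bistable perception the abstract describes.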

  14. Marshall Engineers Use Virtual Reality

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) facility used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup is to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  15. Virtual manufacturing in reality

    NASA Astrophysics Data System (ADS)

    Papstel, Jyri; Saks, Alo

    2000-10-01

    SMEs play an important role in the manufacturing industry, but from time to time resources fall short of completing a particular order in time. A number of systems have been introduced to produce digital information in support of product and process development activities. The main problem is the lack of direct data transfer between design system modules when a temporary extension of design capacity is needed (virtuality) or when integrated concurrent product development principles are to be implemented. Planning experience in the field is also weakly used. The concept of virtual manufacturing is a supporting idea for solving this problem. At the same time, a number of practical problems must be solved, such as information conformity, data transfer, and acceptance of unified technological concepts. This paper describes proposed ways to solve the practical problems of virtual manufacturing. The general objective is to introduce a knowledge-based CAPP system as the missing module for virtual manufacturing in the selected product domain. A surface-centered planning concept based on STEP-based modeling principles and a knowledge-based process planning methodology will be used to attain the objectives. The expected result is a planning module supplied with design data by direct access, together with a supporting advising environment. A mould-producing SME will serve as the test basis.

  16. Virtual Libraries: Service Realities.

    ERIC Educational Resources Information Center

    Novak, Jan

    This paper discusses client service issues to be considered when transitioning to a virtual library situation. Themes related to the transitional nature of society in the knowledge era are presented, including: paradox and a contradictory nature; blurring of boundaries; networks, systems, and holistic thinking; process/not product, becoming/not…

  17. Development of visual 3D virtual environment for control software

    NASA Technical Reports Server (NTRS)

    Hirose, Michitaka; Myoi, Takeshi; Amari, Haruo; Inamura, Kohei; Stark, Lawrence

    1991-01-01

    Virtual environments for software visualization may enable complex programs to be created and maintained. A typical application might be control of regional electric power systems. As these encompass ever broader computer networks, construction of such systems becomes very difficult. Conventional text-oriented environments are useful for programming individual processors, but they are obviously insufficient for programming a large and complicated system that includes large numbers of computers connected to each other; such programming is called 'programming in the large.' As a solution to this problem, the authors are developing a graphic programming environment wherein one can visualize complicated software in a virtual 3D world. One of the major features of the environment is the 3D representation of concurrent processes. 3D representation is used to supply both network-wide interprocess programming capability (capability for 'programming in the large') and real-time programming capability. The authors' idea is to fuse both the block diagram (useful for checking relationships among large numbers of processes or processors) and the time chart (useful for checking precise timing for synchronization) into a single 3D space. The 3D representation gives a capability for direct and intuitive planning and understanding of complicated relationships among many concurrent processes. To realize the 3D representation, a technology enabling easy handling of virtual 3D objects is a definite necessity. Using a stereo display system and a gesture input device (VPL DataGlove), a prototype of the virtual workstation has been implemented. The workstation can supply the 'sensation' of the virtual 3D space to a programmer. Software for the 3D programming environment is implemented on the workstation. According to preliminary assessments, a 50 percent reduction of programming effort is achieved by using the virtual 3D environment. The authors expect that the 3D

  18. Mixed reality orthognathic surgical simulation by entity model manipulation and 3D-image display

    NASA Astrophysics Data System (ADS)

    Shimonagayoshi, Tatsunari; Aoki, Yoshimitsu; Fushima, Kenji; Kobayashi, Masaru

    2005-12-01

    In orthognathic surgery, 3D surgical planning that considers the anteroposterior position and symmetry of the jawbone, as well as the dental occlusion, is essential. In this study, a support system for orthognathic surgery has been developed that visualizes changes in the mandible and the occlusal condition and helps determine the optimum position in mandibular osteotomy. The system integrates an entity (physical) tooth model, manipulated by hand to determine the optimum occlusal position, with 3D-CT skeletal images (the 3D image display portion) that are simultaneously displayed in real time; this makes it possible to determine a mandibular position and posture that improves both skeletal morphology and the occlusal condition. The realistic operation of the entity model combined with the virtual 3D image display enabled the construction of a surgical simulation system involving augmented reality.

  19. The virtual reality arthroscopy training simulator.

    PubMed

    Müller, W; Bockholt, U

    1998-01-01

    Arthroscopy has already become an irreplaceable method in diagnostics. The arthroscope, with optics and light source, and the exploratory probe are inserted into the knee joint through two small incisions underneath the patella. Currently, the skills required for arthroscopy are taught through hands-on clinical experience. As arthroscopy became a more common procedure even in smaller hospitals, it became obvious that special training was necessary to guarantee the qualification of surgeons; on-the-job training proved to be insufficient. Therefore, research groups from the Berufsgenossenschaftliche Unfallklinik Frankfurt am Main approached the Fraunhofer Institute for Computer Graphics to develop a training system for arthroscopy based on virtual reality (VR) techniques. Two main issues are addressed: the three-dimensional (3-D) reconstruction process and 3-D interaction. The virtual environment requires a realistic representation of the region of interest with all relevant anatomical structures. Based on a magnetic resonance image sequence, a realistic representation of the knee joint suitable for computer simulation was obtained. Two main components of the VR interface can be distinguished: the 3-D interaction to guide the surgical instruments and the 2-D graphical user interface for visual feedback and control of the session. Moreover, the 3-D interaction has to be realized by means of virtual reality techniques providing a simulation of an arthroscope and intuitive handling of the other surgical instruments. Currently, the main drawback of the developed simulator is the lack of haptic perception, especially force feedback. In cooperation with the Department of Electro-Mechanical Construction at the Technical University of Darmstadt, a haptic display has been designed and built for the VR arthroscopy training simulator. In parallel, we developed a concept for integrating the haptic display in a configurable way. PMID:10180528

  20. Virtual Reality--Learning by Immersion.

    ERIC Educational Resources Information Center

    Dunning, Jeremy

    1998-01-01

    Discusses the use of virtual reality in educational software. Topics include CAVE (Computer-Assisted Virtual Environments); cost-effective virtual environment tools including QTVR (Quick Time Virtual Reality); interactive exercises; educational criteria for technology-based educational tools; and examples of screen displays. (LRW)

  1. Direct Manipulation in Virtual Reality

    NASA Technical Reports Server (NTRS)

    Bryson, Steve

    2003-01-01

    Virtual Reality interfaces offer several advantages for scientific visualization such as the ability to perceive three-dimensional data structures in a natural way. The focus of this chapter is direct manipulation, the ability for a user in virtual reality to control objects in the virtual environment in a direct and natural way, much as objects are manipulated in the real world. Direct manipulation provides many advantages for the exploration of complex, multi-dimensional data sets, by allowing the investigator the ability to intuitively explore the data environment. Because direct manipulation is essentially a control interface, it is better suited for the exploration and analysis of a data set than for the publishing or communication of features found in that data set. Thus direct manipulation is most relevant to the analysis of complex data that fills a volume of three-dimensional space, such as a fluid flow data set. Direct manipulation allows the intuitive exploration of that data, which facilitates the discovery of data features that would be difficult to find using more conventional visualization methods. Using a direct manipulation interface in virtual reality, an investigator can, for example, move a data probe about in space, watching the results and getting a sense of how the data varies within its spatial volume.
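
A direct-manipulation data probe of the kind described above must resample the gridded data wherever the user drags it. The sketch below is an illustrative assumption, not Bryson's implementation: it samples a regularly gridded 3D vector field at a fractional position with trilinear interpolation, which is the standard way such a probe reads "the data at this point in space."

```python
import numpy as np

def probe_vector_field(field, point):
    """Sample a regularly gridded 3D vector field at a fractional
    position using trilinear interpolation, as a direct-manipulation
    data probe would while the user drags it through the volume."""
    x, y, z = point
    i, j, k = int(x), int(y), int(z)          # origin of the enclosing cell
    fx, fy, fz = x - i, y - j, z - k          # fractional offsets inside it
    sample = np.zeros(field.shape[-1])
    for di in (0, 1):
        for dj in (0, 1):
            for dk in (0, 1):
                # Weight of each of the 8 surrounding grid nodes
                w = ((fx if di else 1 - fx) *
                     (fy if dj else 1 - fy) *
                     (fz if dk else 1 - fz))
                sample += w * field[i + di, j + dj, k + dk]
    return sample

# A tiny synthetic "flow" whose x-component grows linearly with x.
grid = np.zeros((4, 4, 4, 3))
for ix in range(4):
    grid[ix, :, :, 0] = ix

print(probe_vector_field(grid, (1.5, 0.0, 0.0)))  # x-component is 1.5
```

Moving the probe then amounts to calling `probe_vector_field` with the tracked hand position each frame and rendering the returned vector.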

  2. What Are the Learning Affordances of 3-D Virtual Environments?

    ERIC Educational Resources Information Center

    Dalgarno, Barney; Lee, Mark J. W.

    2010-01-01

    This article explores the potential learning benefits of three-dimensional (3-D) virtual learning environments (VLEs). Drawing on published research spanning two decades, it identifies a set of unique characteristics of 3-D VLEs, which includes aspects of their representational fidelity and aspects of the learner-computer interactivity they…

  3. ESL Teacher Training in 3D Virtual Worlds

    ERIC Educational Resources Information Center

    Kozlova, Iryna; Priven, Dmitri

    2015-01-01

    Although language learning in 3D Virtual Worlds (VWs) has become a focus of recent research, little is known about the knowledge and skills teachers need to acquire to provide effective task-based instruction in 3D VWs and the type of teacher training that best prepares instructors for such an endeavor. This study employs a situated learning…

  4. Educational Visualizations in 3D Collaborative Virtual Environments: A Methodology

    ERIC Educational Resources Information Center

    Fominykh, Mikhail; Prasolova-Forland, Ekaterina

    2012-01-01

    Purpose: Collaborative virtual environments (CVEs) have become increasingly popular in educational settings and the role of 3D content is becoming more and more important. Still, there are many challenges in this area, such as lack of empirical studies that provide design for educational activities in 3D CVEs and lack of norms of how to support…

  5. Virtual 3d City Modeling: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2013-08-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to urban areas. Various terms are used for 3D city models, such as "Cybertown", "Cybercity", "Virtual City", or "Digital City". A 3D city model is basically a computerized or digital model of a city containing graphic representations of buildings and other objects in 2.5D or 3D. Generally, three main Geomatics approaches are used to generate virtual 3D city models: in the first, researchers use conventional techniques such as vector map data, DEMs, and aerial images; the second is based on high-resolution satellite images with laser scanning; and in the third, researchers use terrestrial images, applying close-range photogrammetry with DSM and texture mapping. This paper starts with an introduction to the various Geomatics techniques for 3D city modeling. These techniques are divided into two main categories: one based on the degree of automation (automatic, semi-automatic, and manual methods), and another based on data-input techniques (photogrammetry and laser techniques). After a detailed study, the paper draws conclusions, together with a short analysis and the present trends in 3D city modeling. It gives an overview of the techniques for generating virtual 3D city models using Geomatics, and of the applications of virtual 3D city models. Photogrammetry (close-range, aerial, satellite), lasergrammetry, GPS, or a combination of these modern Geomatics techniques plays a major role in creating a virtual 3D city model. Every technique and method has some advantages and some drawbacks. Point cloud models are a modern trend for virtual 3D city modeling. 
Photo-realistic, Scalable, Geo-referenced virtual 3

  6. Game-Like Language Learning in 3-D Virtual Environments

    ERIC Educational Resources Information Center

    Berns, Anke; Gonzalez-Pardo, Antonio; Camacho, David

    2013-01-01

    This paper presents our recent experiences with the design of game-like applications in 3-D virtual environments as well as its impact on student motivation and learning. Therefore our paper starts with a brief analysis of the motivational aspects of videogames and virtual worlds (VWs). We then go on to explore the possible benefits of both in the…

  7. Virtual reality and anthropology.

    PubMed

    Recheis, W; Weber, G W; Schäfer, K; Knapp, R; Seidler, H; zur Nedden, D

    1999-08-01

    Since the discovery of the Tyrolean Iceman in 1991, advanced imaging and post-processing techniques have been successfully applied in anthropology. Specific techniques include spiral computed tomography and 3-dimensional reconstructions, including stereolithographic and fused-deposition modeling of volume data sets. The Iceman's skull was the first to be reproduced using stereolithography, before this method was successfully applied in preoperative planning. With the advent of high-end graphics workstations and biomedical image processing software packages, 3-dimensional reconstructions were established as a routine tool for analyzing volume data sets. These techniques opened totally new insights in the field of physical anthropology. Computed tomography became the ideal research tool for accessing the internal structures of various precious fossils without damaging or even touching them. Many of the most precious specimens from the species Australopithecus (1.8-3.5 Myears), Homo heidelbergensis (200-600 kyears) or Homo neanderthalensis (40-100 kyears) were scanned during the last 5 years. Often the fossils are filled with a stone matrix or other materials; during post-processing, highly advanced algorithms were used to virtually remove these incrustations. Thus it was possible to visualize the morphological structures that lie beneath the matrix. Some specimens were partially destroyed, so the missing parts were reconstructed on the computer screen in order to obtain estimates of brain volume and endocranial morphology, both major fields of interest in physical anthropology. Moreover, the computerized form of the data allows new descriptions of morphologic structures by means of 'geometric morphometrics'. Some of the results may change aspects and interpretations in human evolution. The introduction of new imaging and post-processing techniques created a new field of research: Virtual Anthropology.

  8. Rapid prototyping 3D virtual world interfaces within a virtual factory environment

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.

  9. Integration of the virtual 3D model of a control system with the virtual controller

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2015-11-01

    Nowadays the design process includes simulation analysis of the different components of a constructed object, which creates the need to integrate different virtual objects in order to simulate the whole investigated technical system. The paper presents issues related to the integration of a virtual 3D model of a chosen control system with a virtual controller. The goal of the integration is to verify that the modeled object operates in accordance with the established control program. The object of the simulation work is the drive system of a tunneling machine for trenchless work. In the first stage of the work, an interactive visualization of the functioning of the 3D virtual model of the tunneling machine was created using VR (Virtual Reality) class software. In this interactive application, procedures were created for controlling the translatory-motion drive system, the rotary-motion drive system, and the manipulator drive system, along with a procedure for turning the crushing head, mounted on the last element of the manipulator, on and off. Procedures were also established for receiving input data from external software via dynamic data exchange (DDE), which allow controlling the actuators of the particular control systems of the considered machine. In the next stage of the work, the program for the virtual controller was created in the ladder diagram (LD) language, based on the adopted work cycle of the tunneling machine. The element integrating the virtual model of the tunneling machine for trenchless work with the virtual controller is an application written in a high-level language (Visual Basic). 
In this application, procedures were created that collect data from the virtual controller running in simulation mode and transfer them to the interactive application, in which is verified the
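
The bridge pattern described in this abstract, a middle layer that polls the (simulated) controller's outputs and forwards them to the 3D application's actuator procedures, can be sketched as follows. This is a hedged illustration only: the class names, output signals, and cycle logic are invented stand-ins, not the authors' Visual Basic/DDE implementation.

```python
# Hypothetical sketch of the integration layer: a bridge polls a
# simulated virtual controller for actuator commands and forwards them
# to the interactive 3D application, mimicking a DDE-style exchange.
# All names and signals below are illustrative assumptions.

class SimulatedController:
    """Stands in for the virtual controller running the LD program."""
    def __init__(self):
        self.step = 0

    def read_outputs(self):
        # One boolean output per drive system of the tunneling machine.
        self.step += 1
        return {"translate": self.step % 2 == 1,
                "rotate": False,
                "manipulator": False,
                "crushing_head": self.step > 2}

class VisualizationStub:
    """Stands in for the VR application's actuator procedures."""
    def __init__(self):
        self.log = []

    def apply(self, outputs):
        for actuator, state in outputs.items():
            if state:                     # drive only the active actuators
                self.log.append(actuator)

def bridge(controller, viz, cycles):
    """Poll controller outputs each cycle and push them to the 3D app."""
    for _ in range(cycles):
        viz.apply(controller.read_outputs())

viz = VisualizationStub()
bridge(SimulatedController(), viz, 3)
print(viz.log)  # ['translate', 'translate', 'crushing_head']
```

A real integration would replace `read_outputs` with the DDE (or OPC) read of the controller's output image, but the polling loop has the same shape.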

  10. Virtual reality: Avatars in human spaceflight training

    NASA Astrophysics Data System (ADS)

    Osterlund, Jeffrey; Lawrence, Brad

    2012-02-01

    With advancements in high-spatial- and temporal-resolution graphics, along with advancements in 3D display capabilities to model, simulate, and analyze human-to-machine interfaces and interactions, virtual environments are being used to develop everything from games, movie special effects, and animations to automobile designs. The use of multiple-object motion capture technology and digital human tools in aerospace has proven to be a more cost-effective alternative to physical prototypes; it provides a more efficient, flexible, and responsive environment for changes in design and training, and it supports early human factors considerations concerning the operation of a complex launch vehicle or spacecraft. United Space Alliance (USA) has deployed these techniques and tools in Research and Development (R&D) activities on both spacecraft assembly and ground processing operations design and training for the Orion Crew Module. USA utilizes specialized products chosen for their functionality, including software and fixed-base hardware (e.g., infrared and visible-red cameras), along with cyber gloves to capture fine motor dexterity of the hands. The key findings of the R&D were: mock-ups should be built so as not to obstruct the cameras' view of the markers being tracked; a mock-up toolkit should be assembled to facilitate dynamic design changes; markers should be placed in accurate positions on humans and flight hardware to aid tracking; 3D models used in the virtual environment should be stripped of non-essential data; highly capable computational workstations are required to handle the large model data sets; and Technology Interchange Meetings with vendors and other industries also utilizing virtual reality applications need to occur on a continual basis, enabling USA to maintain its leading edge in this technology. Parameters of interest and benefit in human spaceflight simulation training that utilizes virtual reality technologies are to

  11. Introduction to augmented and virtual reality

    NASA Astrophysics Data System (ADS)

    Caudell, Thomas P.

    1995-12-01

    This paper introduces the field of augmented reality as a prologue to the body of papers in the remainder of this session. I describe the use of head-mounted display technologies to improve the efficiency and quality of human workers in their performance of engineering design, manufacturing, construction, testing, and maintenance activities. This technology is used to 'augment' the visual field of the wearer with information necessary for the performance of the current task. The enabling technology is head-up (see-through) display headsets (HUDsets) combined with head-position sensing, real-world registration systems, and database access software. A primary difference between virtual reality (VR) and 'augmented reality' (AR) is the complexity of the perceived graphical objects: in AR systems, only simple wire frames, template outlines, designators, and text are displayed. An immediate result of this difference is that augmented reality systems can be driven by standard, inexpensive microprocessors. Many research issues must be addressed before this technology can be widely used, including tracking and registration, human 3D perception and reasoning, and human task performance.

  12. [Development of a software for 3D virtual phantom design].

    PubMed

    Zou, Lian; Xie, Zhao; Wu, Qi

    2014-02-01

    In this paper, we present a 3D virtual phantom design software package, developed using object-oriented programming methodology and dedicated to medical physics research. The software, named Magical Phantom (MPhantom), is composed of a 3D visual builder module and a virtual CT scanner. Users can conveniently construct any complex 3D phantom and then export it as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful tool for 3D phantom configuration and has passed application tests on real scenes. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and research on X-ray imaging reconstruction algorithms. PMID:24804488
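
The kind of volume a builder like MPhantom assembles can be illustrated with a few lines of array code. The sketch below is an assumption for illustration, not MPhantom's API: it composes a voxelized phantom, a water cube with an embedded bone-like sphere inside air, as a 3D array of Hounsfield-style values, the sort of volume such a tool would then export slice by slice as CT images.

```python
import numpy as np

def make_phantom(n=64, sphere_radius=10, water_hu=0, bone_hu=700, air_hu=-1000):
    """Build a simple voxel phantom: air background, water cube,
    bone-like sphere at the center. Values are Hounsfield-style units."""
    phantom = np.full((n, n, n), air_hu, dtype=np.int16)
    phantom[8:-8, 8:-8, 8:-8] = water_hu            # water cube insert
    zz, yy, xx = np.mgrid[:n, :n, :n]
    c = n // 2
    sphere = (xx - c) ** 2 + (yy - c) ** 2 + (zz - c) ** 2 <= sphere_radius ** 2
    phantom[sphere] = bone_hu                       # bone-like sphere
    return phantom

p = make_phantom()
# Center voxel is bone, an off-center interior voxel is water,
# and the corner is air.
print(p.shape, p[32, 32, 32], p[10, 10, 10], p[0, 0, 0])
```

Exporting the result as DICOM would then mean writing each `p[z]` slice with appropriate rescale slope/intercept metadata (e.g. via a library such as pydicom).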

  13. 3D interactive augmented reality-enhanced digital learning systems for mobile devices

    NASA Astrophysics Data System (ADS)

    Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie

    2013-03-01

    With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving an enhanced user experience (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. UX of the real-world environment via the display can be effectively enhanced with the adoption of interactive AR technology, and such enhancement can be beneficial for digital learning systems. There are existing research works based on AR targeting the design of e-learning systems; however, none of them focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, 3-D interactive augmented reality-enhanced learning (IARL) systems are proposed to provide enhanced UX for digital learning. The proposed IARL systems consist of two major components: markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. A realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. UX in digital learning can be greatly improved with the adoption of the proposed IARL systems.
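
The abstract does not detail the VOT algorithm, but the simplest form of velocity-based tracking can be sketched as follows: predict each tracked object's next position from its last estimated velocity, then blend the prediction with the new measurement. The class name and blending factor below are illustrative assumptions, not the authors' algorithm.

```python
# Hedged sketch of a minimal velocity-based object tracker:
# constant-velocity prediction corrected by each new measurement.

class VelocityTracker:
    def __init__(self, alpha=0.5):
        self.alpha = alpha          # weight of prediction vs. measurement
        self.pos = None             # last estimated 2D position
        self.vel = (0.0, 0.0)       # last estimated 2D velocity

    def update(self, measured):
        if self.pos is None:        # first observation: just adopt it
            self.pos = measured
            return measured
        # Constant-velocity prediction from the previous estimate
        pred = (self.pos[0] + self.vel[0], self.pos[1] + self.vel[1])
        # Blend prediction with the new measurement
        new = tuple(self.alpha * p + (1 - self.alpha) * m
                    for p, m in zip(pred, measured))
        # Velocity estimate = displacement since the last estimate
        self.vel = (new[0] - self.pos[0], new[1] - self.pos[1])
        self.pos = new
        return new

t = VelocityTracker()
for obs in [(0, 0), (2, 0), (4, 0)]:   # object moving steadily in x
    est = t.update(obs)
print(est)  # (3.0, 0.0): the estimate lags but follows the motion
```

A production tracker on a mobile AR platform would typically replace this blend with a Kalman filter, but the predict-then-correct structure is the same.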

  14. Transforming Clinical Imaging Data for Virtual Reality Learning Objects

    ERIC Educational Resources Information Center

    Trelease, Robert B.; Rosset, Antoine

    2008-01-01

    Advances in anatomical informatics, three-dimensional (3D) modeling, and virtual reality (VR) methods have made computer-based structural visualization a practical tool for education. In this article, the authors describe streamlined methods for producing VR "learning objects," standardized interactive software modules for anatomical sciences…

  15. PC-Based Virtual Reality for CAD Model Viewing

    ERIC Educational Resources Information Center

    Seth, Abhishek; Smith, Shana S.-F.

    2004-01-01

    Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…

  16. New Desktop Virtual Reality Technology in Technical Education

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2008-01-01

    Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…

  17. Virtual reality in radiation therapy training.

    PubMed

    Boejen, Annette; Grau, Cai

    2011-09-01

    Integration of virtual reality (VR) into clinical training programs is a novel tool in radiotherapy. This paper presents a review of experience with VR and immersive 3D visualization for planning and delivery of external radiotherapy. Planning and delivering radiation therapy is a complex process involving physicians, physicists, radiographers, and radiation therapists/nurses (RTTs). These specialists must be able to understand spatial relationships in the patient's anatomy. Although still in its infancy, VR tools have become available for radiotherapy training, enabling students to simulate and train for clinical situations without interfering with the clinical workflow and without the risk of making errors. Immersive tools such as a 3D linear accelerator and 3D display of dose distributions have been integrated into training, together with IT labs running clinical software. Training in a VR environment appears to be cost-effective for the clinic. Initial reports suggest that 3D display of dose distributions may improve treatment planning and decision making. Whether VR training qualifies students better than conventional training is still unsettled, but the first results are encouraging. PMID:20724144

  18. Virtual reality in laparoscopic surgery.

    PubMed

    Uranüs, Selman; Yanik, Mustafa; Bretthauer, Georg

    2004-01-01

    Although the many advantages of laparoscopic surgery have made it an established technique, training in laparoscopic surgery posed problems not encountered in conventional surgical training. Virtual reality simulators open up new perspectives for training in laparoscopic surgery. Under realistic conditions in real time, trainees can tailor their sessions with the VR simulator to suit their needs and goals, and can repeat exercises as often as they wish. VR simulators reduce the number of experimental animals needed for training purposes and are suited to the pursuit of research in laparoscopic surgery. PMID:15747974

  20. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    NASA-Ames is intensively developing virtual-reality (VR) capabilities that can extend and augment computer-generated and remote spatial environments. VR is envisioned not only as a basis for improving the human/machine interactions involved in planetary exploration, but also as a medium for more widespread sharing of the experience of exploration, thereby broadening the support base for lunar and planetary-exploration endeavors. Imagery representative of Mars is being gathered for VR presentation at such terrestrial sites as Antarctica and Death Valley.

  1. The SEE Experience: Edutainment in 3D Virtual Worlds.

    ERIC Educational Resources Information Center

    Di Blas, Nicoletta; Paolini, Paolo; Hazan, Susan

    Shared virtual worlds are innovative applications where several users, represented by Avatars, simultaneously access via Internet a 3D space. Users cooperate through interaction with the environment and with each other, manipulating objects and chatting as they go. Apart from in the well documented online action games industry, now often played…

  2. 3D Virtual Worlds as Environments for Literacy Learning

    ERIC Educational Resources Information Center

    Merchant, Guy

    2010-01-01

    Background: Although much has been written about the ways in which new technology might transform educational practice, particularly in the area of literacy learning, there is relatively little empirical work that explores the possibilities and problems--or even what such a transformation might look like in the classroom. 3D virtual worlds offer a…

  3. Cognitive Aspects of Collaboration in 3d Virtual Environments

    NASA Astrophysics Data System (ADS)

    Juřík, V.; Herman, L.; Kubíček, P.; Stachoň, Z.; Šašinka, Č.

    2016-06-01

    Human-computer interaction has entered the 3D era. The most important models representing spatial information — maps — are being transferred into 3D versions according to the specific content to be displayed. Virtual worlds (VWs) have become a promising area of interest because of the possibility of dynamically modifying content and of multi-user cooperation in solving tasks regardless of physical presence. They can be used for sharing and elaborating information via virtual images or avatars, and their attractiveness is emphasized by the possibility of measuring operators' actions and complex strategies. Collaboration in 3D environments is a crucial issue in many areas where visualizations are important for group cooperation. Within a specific 3D user interface, the operators' ability to manipulate the displayed content is explored with regard to such phenomena as situation awareness, cognitive workload, and human error. For this purpose, VWs offer a great number of tools for measuring operators' responses, such as recording virtual movement or spots of interest in the visual field. The study focuses on the methodological issues of measuring the usability of 3D VWs and of comparing them with the existing principles of 2D maps. We explore operators' strategies for reaching and interpreting information with regard to the specific type of visualization and different levels of immersion.

  4. Web-based three-dimensional Virtual Body Structures: W3D-VBS.

    PubMed

    Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex

    2002-01-01

    Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user's progress through evaluation tools helps customize lesson plans. A self-guided "virtual tour" of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it. PMID:12223495

  6. A virtual reality platform for assessment and rehabilitation of neglect using a kinect.

    PubMed

    Cipresso, Pietro; Serino, Silvia; Pedroli, Elisa; Gaggioli, Andrea; Riva, Giuseppe

    2014-01-01

    Unilateral Spatial Neglect (USN) is normally assessed with paper-and-pencil tests. Virtual reality can be an effective neuropsychological tool for a more ecological and functional assessment and rehabilitation of neglect. We developed a 3D Virtual Reality platform - NeuroVirtual 3D - for the assessment and rehabilitation of cognitive deficits, in particular for USN. Within the virtual environments it is possible to interact with virtual objects and execute specific exercises using a Microsoft Kinect. Through the analysis of different grasping tasks it is possible to evaluate in an ecological way the patients' ability to find and handle objects in both sides of the virtual space.
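
The grasping-task analysis described above implies a simple left/right asymmetry measure. The sketch below is a hedged illustration, not the NeuroVirtual 3D implementation: given target object positions in the virtual scene and the set of targets the patient managed to reach (from Kinect-tracked hand positions), it counts reached versus presented targets per hemispace, the kind of score used to screen for neglect. All names and coordinates are invented.

```python
# Hypothetical hemispace scoring for a virtual grasping task.
# Targets to the left of the body midline (x < midline_x) belong to
# the left hemispace; a markedly lower reach rate there is the
# neglect-like pattern the assessment looks for.

def hemispace_scores(targets, reached_ids, midline_x=0.0):
    """targets: {id: (x, y, z)}; reached_ids: set of reached target ids.
    Returns {'left': [reached, total], 'right': [reached, total]}."""
    scores = {"left": [0, 0], "right": [0, 0]}
    for tid, (x, y, z) in targets.items():
        side = "left" if x < midline_x else "right"
        scores[side][1] += 1            # target presented on this side
        if tid in reached_ids:
            scores[side][0] += 1        # target actually reached
    return scores

targets = {1: (-0.4, 0.1, 0.5), 2: (-0.2, 0.0, 0.6),
           3: (0.3, 0.1, 0.5), 4: (0.5, 0.0, 0.4)}
print(hemispace_scores(targets, reached_ids={3, 4}))
# {'left': [0, 2], 'right': [2, 2]} -- right targets reached, left missed
```

In a real session the `reached_ids` set would be derived from proximity between the tracked hand joint and each object over time, rather than supplied directly.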

  7. Visualizing Compound Rotations with Virtual Reality

    ERIC Educational Resources Information Center

    Flanders, Megan; Kavanagh, Richard C.

    2013-01-01

    Mental rotations are among the most difficult of all spatial tasks to perform, and even those with high levels of spatial ability can struggle to visualize the result of compound rotations. This pilot study investigates the use of the virtual reality-based Rotation Tool, created using the Virtual Reality Modeling Language (VRML) together with…

  8. An Onboard ISS Virtual Reality Trainer

    NASA Technical Reports Server (NTRS)

    Miralles, Evelyn

    2013-01-01

Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the Station to perform these specific repairs. With the retirement of the shuttle, this is no longer an option. As such, the need for ISS crew members to review scenarios while in flight, either for tasks they already trained for on the ground or for contingency operations, has become critical. NASA astronauts prepare for Extra-Vehicular Activities (EVA), or spacewalks, through numerous training media: self-study, part-task training, underwater training in the Neutral Buoyancy Laboratory (NBL), hands-on hardware reviews, and training at the Virtual Reality Laboratory (VRLab). In many situations, the time between the last training session and an EVA task might be 6 to 8 months. EVA tasks are critical for a mission, and as time passes the crew members may lose proficiency on previously trained tasks; their options to refresh or learn a new skill while in flight are limited to reading training materials and watching videos. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the Station ages. To help ISS crew members maintain EVA proficiency or train for contingency repairs during their mission, the Johnson Space Center's VRLab designed an immersive ISS Virtual Reality Trainer (VRT). The VRT incorporates a unique optical system that makes use of the already successful Dynamic On-board Ubiquitous Graphics (DOUG) software to assist crew members with procedure reviews and contingency EVAs while on board the Station. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not been trained and potentially have never seen before. The Virtual Reality Trainer (VRT

  9. D3D augmented reality imaging system: proof of concept in mammography

    PubMed Central

    Douglas, David B; Petricoin, Emanuel F; Liotta, Lance; Wilson, Eugene

    2016-01-01

    Purpose The purpose of this article is to present images from simulated breast microcalcifications and assess the pattern of the microcalcifications with a technical development called “depth 3-dimensional (D3D) augmented reality”. Materials and methods A computer, head display unit, joystick, D3D augmented reality software, and an in-house script of simulated data of breast microcalcifications in a ductal distribution were used. No patient data was used and no statistical analysis was performed. Results The D3D augmented reality system demonstrated stereoscopic depth perception by presenting a unique image to each eye, focal point convergence, head position tracking, 3D cursor, and joystick fly-through. Conclusion The D3D augmented reality imaging system offers image viewing with depth perception and focal point convergence. The D3D augmented reality system should be tested to determine its utility in clinical practice. PMID:27563261

  10. Virtual environment interaction through 3D audio by blind children.

    PubMed

    Sánchez, J; Lumbreras, M

    1999-01-01

Interactive software is actively used for learning, cognition, and entertainment purposes. Educational entertainment software is not very popular among blind children because most computer games and electronic toys have interfaces that are only accessible through visual cues. This work applies the concept of interactive hyperstories to blind children. Hyperstories are implemented in a 3D acoustic virtual world. In past studies we have conceptualized a model to design hyperstories. This study illustrates the feasibility of the model. It also provides an introduction for researchers to the field of entertainment software for blind children. As a result, we have designed and field-tested AudioDoom, a virtual environment that blind children interact with through 3D audio. AudioDoom is also software that enables testing nontrivial interfaces and cognitive tasks with blind children. We explored the construction of cognitive spatial structures in the minds of blind children through audio-based entertainment and spatial sound navigable experiences. Children playing AudioDoom were exposed to first-person experiences by exploring highly interactive virtual worlds through the use of 3D aural representations of the space. This experience was structured in several cognitive tasks in which they had to build concrete models of their spatial representations, constructed through the interaction with AudioDoom, using Lego blocks. We analyze our preliminary results after testing AudioDoom with Chilean children from a school for blind children. We discuss issues such as interactivity in software without visual cues, the representation of spatial sound navigable experiences, and entertainment software such as computer games for blind children. We also evaluate the feasibility of constructing virtual environments through the design of dynamic learning materials with audio cues.

  11. Special Section: New Ways to Detect Colon Cancer 3-D virtual screening now being used

    MedlinePlus

    ... tech medical fields of biomedical visualization, computer graphics, virtual reality, and multimedia. The year was 1994. Kaufman's "two- ... organ, like the colon—and view it in virtual reality." Later, he and his team used it with ...

  12. Virtual reality training improves balance function.

    PubMed

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-09-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function.

  13. Virtual reality training improves balance function

    PubMed Central

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-01-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function. PMID:25368651

  14. Brave New (Interactive) Worlds: A Review of the Design Affordances and Constraints of Two 3D Virtual Worlds as Interactive Learning Environments

    ERIC Educational Resources Information Center

    Dickey, Michele D.

    2005-01-01

    Three-dimensional virtual worlds are an emerging medium currently being used in both traditional classrooms and for distance education. Three-dimensional (3D) virtual worlds are a combination of desk-top interactive Virtual Reality within a chat environment. This analysis provides an overview of Active Worlds Educational Universe and Adobe…

  15. Using Virtual Reality For Outreach Purposes in Planetology

    NASA Astrophysics Data System (ADS)

    Civet, François; Le Mouélic, Stéphane; Le Menn, Erwan; Beaunay, Stéphanie

    2016-10-01

2016 has been a year marked by a technological breakthrough: the availability, for the first time, of technologically mature virtual reality devices to the general public. Virtual reality consists in visually immersing a user in a 3D environment reproduced from real and/or imaginary data, with the possibility to move and eventually interact with the different elements. In planetology, most of the places will remain inaccessible to the public for a while, but a fleet of dedicated spacecraft such as orbiters, landers and rovers makes it possible to virtually reconstruct the environments, using image processing, cartography and photogrammetry. Virtual reality can then bridge the gap to virtually "send" any user into the place and enjoy the exploration. We are investigating several types of devices to render orbital or ground-based data of planetological interest, mostly from Mars. The simplest system consists of a "cardboard" headset, in which the user can simply use his cellphone as the screen. A more comfortable experience is obtained with more complex systems such as the HTC Vive or Oculus Rift headsets, which include a tracking system important to minimize motion sickness. The third environment that we have developed is based on the CAVE concept, where four 3D video projectors are used to project on three 2x3 m walls plus the ground. These systems can be used for scientific data analysis, but also prove to be perfectly suited for outreach and education purposes.

  16. Gravity and spatial orientation in virtual 3D-mazes.

    PubMed

    Vidal, Manuel; Lipshits, Mark; McIntyre, Joseph; Berthoz, Alain

    2003-01-01

In order to bring new insights into the processing of 3D spatial information, we conducted experiments on the capacity of human subjects to memorize 3D-structured environments, such as buildings with several floors or the potentially complex 3D structure of an orbital space station. We had subjects move passively, in one of two different exploration modes, through a visual virtual environment that consisted of a series of connected tunnels. In upright displacement, self-rotation when going around corners in the tunnels was limited to yaw rotations. For horizontal translations, subjects faced forward in the direction of motion. When moving up or down through vertical segments of the 3D tunnels, however, subjects faced the tunnel wall, remaining upright as if moving up and down in a glass elevator. In the unconstrained displacement mode, subjects would appear to climb or dive face-forward when moving vertically; thus, in this mode subjects could experience visual flow consistent with rotations about any of the 3 canonical axes. In a previous experiment, subjects were asked to determine whether a static, outside view of a test tunnel corresponded or not to the tunnel through which they had just passed. Results showed that performance was better on this task for the upright than for the unconstrained displacement mode, i.e. when subjects remained "upright" with respect to the virtual environment as defined by the subject's posture in the first segment. This effect suggests that gravity may provide a key reference frame used in the shift between egocentric and allocentric representations of the 3D virtual world. To check whether it is the polarizing effect of gravity that leads to the favoring of the upright displacement mode, the experimental paradigm was adapted for orbital flight and performed by cosmonauts onboard the International Space Station. For these flight experiments the previous recognition task was replaced by a computerized reconstruction task, which proved

  17. STS-133 Crew Trains in Virtual Reality

    NASA Video Gallery

    In this episode of NASA "Behind the Scenes," STS-133 Pilot Eric Boe and space station Flight Director Royce Renfrew discuss how the virtual reality laboratory at the Johnson Space Center is helping...

  18. Virtual Reality in Education: Defining Researchable Issues.

    ERIC Educational Resources Information Center

    Hedburg, John; Alexander, Shirley

    1994-01-01

    Discusses situated learning and virtual reality, focusing on the pedagogical aspects of the technology and its importance in achieving a learning environment which challenges and supports effective learning. (AEF)

  19. Virtual reality applications in T and D engineering

    SciTech Connect

    Breen, P.T. Jr.; Scott, W.G.

    1995-12-31

Visualization Technology (VT)--the authors' more meaningful term for Virtual Reality--is a commercial reality. Visualization technology can provide a realistic model of the real world, place a user within the synthetic space, and allow him or her to interact within that space through head-mounted displays, CRTs, data gloves, and 3D mice. Existing commercial applications of VT include the emulation of power plant control room panels, 3D models of commercial and industrial buildings, and virtual models of transportation systems to train the handicapped. The authors believe that VT can greatly reduce the costs and increase the productivity of training T and D personnel, especially for hazardous assignments such as live-line maintenance. VT can also reduce the costs of design, construction and maintenance of major facilities such as power plants, substations, vaults, transmission lines and underground facilities.

  20. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional data parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life, and more recently VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.), in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D game engine, coded using C#, JavaScript, and the Unity scripting language. This visualization tool can be used through a standard web browser, or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects of various shapes, colors, and sizes, with XYZ positions and other visual attributes encoding various dimensions of the parameter space, which can be assigned interactively. Multiple users can navigate through this data space simultaneously, either with their own, independent vantage points, or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking on a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added. We expect to make this
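The encoding scheme described in this record (three spatial axes, with further dimensions carried by shape, color, and size) can be sketched in a few lines. This is an illustrative mapping in Python, not the project's actual Unity/C# code; all names are hypothetical:

```python
def encode_point(row):
    """Map one high-dimensional record to visual attributes:
    the first three values become XYZ, the next three are encoded
    as shape (categorical), color (continuous ramp), and size
    (continuous). Illustrative only; a real system would expose
    the dimension-to-attribute assignment interactively."""
    shapes = ["sphere", "cube", "cone", "cylinder"]
    x, y, z, d4, d5, d6 = row[:6]
    return {
        "position": (x, y, z),
        "shape": shapes[int(d4) % len(shapes)],            # categorical dimension
        "color": (min(1.0, d5), 0.0, 1.0 - min(1.0, d5)),  # blue-to-red ramp
        "size": 0.5 + d6,                                   # continuous dimension
    }
```

A renderer would then instantiate one mesh per returned record and attach the point's linked-information URL for click-through.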

  1. Virtual Reality at the PC Level

    NASA Technical Reports Server (NTRS)

    Dean, John

    1998-01-01

    The main objective of my research has been to incorporate virtual reality at the desktop level; i.e., create virtual reality software that can be run fairly inexpensively on standard PC's. The standard language used for virtual reality on PC's is VRML (Virtual Reality Modeling Language). It is a new language so it is still undergoing a lot of changes. VRML 1.0 came out only a couple years ago and VRML 2.0 came out around last September. VRML is an interpreted language that is run by a web browser plug-in. It is fairly flexible in terms of allowing you to create different shapes and animations. Before this summer, I knew very little about virtual reality and I did not know VRML at all. I learned the VRML language by reading two books and experimenting on a PC. The following topics are presented: CAD to VRML, VRML 1.0 to VRML 2.0, VRML authoring tools, VRML browsers, finding virtual reality applications, the AXAF project, the VRML generator program, web communities and future plans.
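As this record notes, VRML worlds are plain text interpreted by a web-browser plug-in, which makes them easy to generate programmatically, e.g. from CAD exports. A minimal sketch in Python (the function name is an illustrative assumption) that emits a valid VRML 2.0 (VRML97) world containing one colored box:

```python
def vrml_box(size=(1.0, 1.0, 1.0), color=(1.0, 0.0, 0.0)):
    """Emit a minimal VRML 2.0 world with a single colored box,
    the kind of file a VRML browser plug-in would render."""
    sx, sy, sz = size
    r, g, b = color
    return (
        "#VRML V2.0 utf8\n"          # mandatory VRML97 header line
        "Shape {\n"
        "  appearance Appearance {\n"
        f"    material Material {{ diffuseColor {r} {g} {b} }}\n"
        "  }\n"
        f"  geometry Box {{ size {sx} {sy} {sz} }}\n"
        "}\n"
    )

# Writing the string to a .wrl file yields a world a VRML browser can open.
```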

  2. Enhanced LOD Concepts for Virtual 3d City Models

    NASA Astrophysics Data System (ADS)

    Benner, J.; Geiger, A.; Gröger, G.; Häfele, K.-H.; Löwner, M.-O.

    2013-09-01

Virtual 3D city models contain digital three-dimensional representations of city objects like buildings, streets or technical infrastructure. Because the size and complexity of these models continuously grow, a Level of Detail (LoD) concept is indispensable: one that effectively supports partitioning a complete model into alternative models of different complexity, and that provides metadata addressing the informational content, complexity and quality of each alternative model. After a short overview of various LoD concepts, this paper discusses the existing LoD concept of the CityGML standard for 3D city models and identifies a number of deficits. Based on this analysis, an alternative concept is developed and illustrated with several examples. It differentiates, first, between a Geometric Level of Detail (GLoD) and a Semantic Level of Detail (SLoD), and second, between the building interior and its exterior shell. Finally, a possible implementation of the new concept is demonstrated by means of a UML model.
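One way to read the proposed GLoD/SLoD separation is as per-variant metadata that a viewer queries when deciding which alternative model to load. A minimal sketch, with hypothetical names (this is not part of the CityGML standard or the paper's UML model):

```python
from dataclasses import dataclass

@dataclass
class LodVariant:
    """Metadata for one alternative representation of a city object,
    separating geometric from semantic detail as proposed above."""
    glod: int       # Geometric Level of Detail
    slod: int       # Semantic Level of Detail
    interior: bool  # True: interior building model; False: exterior shell

def pick_variant(variants, max_glod, want_interior):
    """Choose the most detailed variant within a geometric budget,
    restricted to the requested interior/exterior part."""
    candidates = [v for v in variants
                  if v.glod <= max_glod and v.interior == want_interior]
    return max(candidates, key=lambda v: (v.glod, v.slod), default=None)
```

A client rendering a distant facade might ask for a low GLoD exterior shell, while an indoor-navigation client requests the interior variant with the highest SLoD it can afford.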

  3. Virtual environment display for a 3D audio room simulation

    NASA Technical Reports Server (NTRS)

    Chapin, William L.; Foster, Scott H.

    1992-01-01

    The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations.

  4. Research on 3D virtual campus scene modeling based on 3ds Max and VRML

    NASA Astrophysics Data System (ADS)

    Kang, Chuanli; Zhou, Yanliu; Liang, Xianyue

    2015-12-01

With the rapid development of modern technology, digital information management and virtual reality simulation have become research hotspots. A 3D virtual campus model can not only represent real-world objects in a natural, realistic and vivid way, but can also extend the campus across the dimensions of time and space, combining the school environment with information. This paper mainly uses 3ds Max to create three-dimensional models of campus buildings, special land use, etc. Dynamic interactive functions are then realized by programming the object models from 3ds Max with VRML. This research focuses on virtual campus scene modeling technology and VRML scene design, and on optimization strategies for the various real-time processing techniques used in the scene design process. The approach preserves texture map image quality while improving the running speed of texture mapping. According to the features and architecture of Guilin University of Technology, 3ds Max, AutoCAD and VRML were used to model the different objects of the virtual campus. Finally, the resulting virtual campus scene is summarized.

  5. Augmented Reality Imaging System: 3D Viewing of a Breast Cancer

    PubMed Central

    Douglas, David B.; Boone, John M.; Petricoin, Emanuel; Liotta, Lance; Wilson, Eugene

    2016-01-01

Objective To display images of breast cancer from a dedicated breast CT using Depth 3-Dimensional (D3D) augmented reality. Methods A case of breast cancer imaged using contrast-enhanced breast CT (Computed Tomography) was viewed with the augmented reality imaging system, which uses a head display unit (HDU) and joystick control interface. Results The augmented reality system demonstrated 3D viewing of the breast mass with head position tracking, stereoscopic depth perception, focal point convergence, and use of a 3D cursor and joystick-enabled fly-through with visualization of the spiculations extending from the breast cancer. Conclusion The augmented reality system provided 3D visualization of the breast cancer with depth perception and visualization of the mass's spiculations. The augmented reality system should be further researched to determine its utility in clinical practice. PMID:27774517

  6. The Application of Virtual Reality on Distance Education

    NASA Astrophysics Data System (ADS)

    Zhan, Zehui

The features and classifications of Virtual Reality techniques are summarized, and recommendations for applying Virtual Reality to distance education are made. Future research is needed on the design and implementation of virtual classrooms and courseware.

  7. From Multi-User Virtual Environment to 3D Virtual Learning Environment

    ERIC Educational Resources Information Center

    Livingstone, Daniel; Kemp, Jeremy; Edgar, Edmund

    2008-01-01

    While digital virtual worlds have been used in education for a number of years, advances in the capabilities and spread of technology have fed a recent boom in interest in massively multi-user 3D virtual worlds for entertainment, and this in turn has led to a surge of interest in their educational applications. In this paper we briefly review the…

  8. Virtual Reality: A New Learning Environment.

    ERIC Educational Resources Information Center

    Ferrington, Gary; Loge, Kenneth

    1992-01-01

    Discusses virtual reality (VR) technology and its possible uses in military training, medical education, industrial design and development, the media industry, and education. Three primary applications of VR in the learning process--visualization, simulation, and construction of virtual worlds--are described, and pedagogical and moral issues are…

  9. A New Navigation Method for 3D Virtual Environment Exploration

    NASA Astrophysics Data System (ADS)

    Haydar, Mahmoud; Maidi, Madjid; Roussel, David; Mallem, Malik

    2009-03-01

Navigation in virtual environments is a complex task which imposes a high cognitive load on the user: it consists in maintaining knowledge of the user's current position and orientation while he moves through the space. In this paper, we present a novel approach for navigation in 3D virtual environments. The method is based on the principle of skiing, and the idea is to give the user full control of navigation speed and rotation using his two hands. This technique enables user-steered exploration by determining the direction and speed of motion from the positions of the user's hands. A speed-control module lets the user easily adjust the speed through the angle between the hands, while the direction of motion is given by the axis orthogonal to the segment joining the two hands. A user study shows the efficiency of the method in performing exploration tasks in complex, large-scale 3D environments. Furthermore, we propose an experimental protocol to show that this technique provides a high level of navigation guidance and control, achieving significantly better performance in comparison to simple navigation techniques.
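The geometry described in this abstract (direction of motion orthogonal to the segment joining the hands, speed derived from the hands' angle) can be sketched as follows. The function name, coordinate conventions, and speed mapping are illustrative assumptions, not the authors' implementation:

```python
import math

def navigation_from_hands(left, right, max_speed=5.0):
    """Derive a horizontal motion direction and a speed from two
    tracked hand positions (x, y, z), following the skiing metaphor:
    direction is orthogonal to the hand-to-hand segment, speed grows
    with the angle that segment makes with the frontal (x) axis."""
    lx, ly, lz = left
    rx, ry, rz = right
    # Hand-to-hand segment projected onto the horizontal (x, z) plane.
    sx, sz = rx - lx, rz - lz
    length = math.hypot(sx, sz)
    # Direction of motion: axis orthogonal to the segment.
    direction = (-sz / length, sx / length)
    # Speed: 0 when hands are level, max when rotated a quarter turn.
    angle = abs(math.atan2(sz, sx))
    speed = min(max_speed, max_speed * angle / (math.pi / 2))
    return direction, speed
```

With the hands level and side by side the user stands still facing forward; rotating the pair of hands turns the motion axis and simultaneously raises the speed, which is what gives the technique its single-gesture steering.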

  10. Virtual reality and hallucination: a technoetic perspective

    NASA Astrophysics Data System (ADS)

    Slattery, Diana R.

    2008-02-01

    Virtual Reality (VR), especially in a technologically focused discourse, is defined by a class of hardware and software, among them head-mounted displays (HMDs), navigation and pointing devices; and stereoscopic imaging. This presentation examines the experiential aspect of VR. Putting "virtual" in front of "reality" modifies the ontological status of a class of experience-that of "reality." Reality has also been modified [by artists, new media theorists, technologists and philosophers] as augmented, mixed, simulated, artificial, layered, and enhanced. Modifications of reality are closely tied to modifications of perception. Media theorist Roy Ascott creates a model of three "VR's": Verifiable Reality, Virtual Reality, and Vegetal (entheogenically induced) Reality. The ways in which we shift our perceptual assumptions, create and verify illusions, and enter "the willing suspension of disbelief" that allows us entry into imaginal worlds is central to the experience of VR worlds, whether those worlds are explicitly representational (robotic manipulations by VR) or explicitly imaginal (VR artistic creations). The early rhetoric surrounding VR was interwoven with psychedelics, a perception amplified by Timothy Leary's presence on the historic SIGGRAPH panel, and the Wall Street Journal's tag of VR as "electronic LSD." This paper discusses the connections-philosophical, social-historical, and psychological-perceptual between these two domains.

  11. Mobile Virtual Reality : A Solution for Big Data Visualization

    NASA Astrophysics Data System (ADS)

    Marshall, E.; Seichter, N. D.; D'sa, A.; Werner, L. A.; Yuen, D. A.

    2015-12-01

Pursuits in the geological sciences and other branches of quantitative science often require data visualization frameworks that are in continual need of improvement and new ideas. Virtual reality is a visualization medium with large audiences, originally designed for gaming purposes. Virtual reality can be delivered in CAVE-like environments, but these are unwieldy and expensive to maintain. Recent efforts by major companies such as Facebook have focused on a larger market; the Oculus Rift is the first of this kind of mobile device. The Unity game engine makes it possible for us to convert data files into a mesh of isosurfaces rendered in 3D. A user is immersed inside the virtual reality and is able to move within and around the data using arrow keys and other steering devices, similar to those employed with an Xbox. With the introduction of products like the Oculus Rift and HoloLens, combined with ever-increasing mobile computing strength, mobile virtual reality data visualization can be implemented for better analysis of 3D geological and mineralogical data sets. As more new products like the Surface Pro 4 and other high-powered yet very mobile computers are introduced to the market, the RAM and graphics card capacity necessary to run these models is more widely available, opening doors to this new reality. The computing requirements needed to run these models are a mere 8 GB of RAM and 2 GHz of CPU speed, which many mobile computers are starting to exceed. Using Unity 3D software to create a virtual environment containing a visual representation of the data, any data set converted into FBX or OBJ format can be traversed by wearing the Oculus Rift device. This new method of analysis, in conjunction with 3D scanning, has potential applications in many fields, including the analysis of precious stones or jewelry. Using hologram technology to capture in high resolution the 3D shape, color, and imperfections of minerals and stones, detailed review and

  12. 3D Technology Selection for a Virtual Learning Environment by Blending ISO 9126 Standard and AHP

    ERIC Educational Resources Information Center

    Cetin, Aydin; Guler, Inan

    2011-01-01

    Web3D presents many opportunities for learners in a virtual world or virtual environment over the web. This is a great opportunity for open-distance education institutions to benefit from web3d technologies to create courses with interactive 3d materials. There are many open source and commercial products offering 3d technologies over the web…

  13. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. The Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright (c) 2008 by the Association for Computing Machinery, Inc.

  14. Virtual reality in a children's hospital.

    PubMed

    Nihei, K; Shirakawa, K; Isshiki, N; Hirose, M; Iwata, H; Kobayashi, N

    1999-01-01

We used virtual reality technology to improve the quality of life and amenities of in-patients in a children's hospital. Children in the hospital could enjoy a virtual zoo, amusement park, and aquarium. They played soccer, skied and rode horses virtually. They could communicate with persons outside the hospital and attend the school they had gone to before entering the hospital. They played music with children who had been admitted to other children's hospitals. By using this virtual technology, the quality of life of children who suffered from psychological and physiological stress in the hospital greatly improved. It is useful not only for their QOL but also for the healing of illness. However, these methods are very rare. Our system in our children's hospital is the first to be reported in Japan; both the software and hardware of virtual reality technology to increase the QOL of sick children need further development.

  15. Second Life, a 3-D Animated Virtual World: An Alternative Platform for (Art) Education

    ERIC Educational Resources Information Center

    Han, Hsiao-Cheng

    2011-01-01

    3-D animated virtual worlds are no longer only for gaming. With the advance of technology, animated virtual worlds not only are found on every computer, but also connect users with the internet. Today, virtual worlds are created not only by companies, but also through the collaboration of users. Online 3-D animated virtual worlds provide a new…

  16. 3D virtual screening of large combinatorial spaces.

    PubMed

    Muegge, Ingo; Zhang, Qiang

    2015-01-01

    A new method for 3D in silico screening of large virtual combinatorial chemistry spaces is described. The software PharmShape screens millions of individual compounds applying a multi-conformational pharmacophore- and shape-based approach. Its extension, PharmShapeCC, is capable of screening trillions of compounds from tens of thousands of combinatorial libraries. Key elements of PharmShape and PharmShapeCC are customizable pharmacophore features, a composite inclusion sphere, library core intermediate clustering, and the determination of combinatorial library consensus orientations that allow for orthogonal enumeration of libraries. The performance of the software is illustrated by the prospective identification of a novel CXCR5 antagonist and by examples of finding novel chemotypes from synthesizing and evaluating combinatorial hit libraries identified from PharmShapeCC screens for CCR1, LTA4 hydrolase, and MMP-13.
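
    The scale gap described here, from millions of enumerable compounds to trillions of combinatorial products, follows directly from multiplying the reagent counts at each variation point. A minimal sketch of that arithmetic (function name and counts are illustrative, not from the paper):

```python
from math import prod

def virtual_library_size(reagent_counts):
    """Size of a fully enumerated combinatorial library: the product of the
    number of available reagents at each variation point. A three-component
    library with ~10,000 reagents per position already reaches a trillion
    products, which is why combinatorial screening avoids full enumeration."""
    return prod(reagent_counts)
```

    For example, `virtual_library_size([10000, 10000, 10000])` yields 10^12 virtual compounds, far beyond what per-compound 3D screening can enumerate.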

  17. From Vesalius to Virtual Reality: How Embodied Cognition Facilitates the Visualization of Anatomy

    ERIC Educational Resources Information Center

    Jang, Susan

    2010-01-01

    This study examines the facilitative effects of embodiment of a complex internal anatomical structure through three-dimensional ("3-D") interactivity in a virtual reality ("VR") program. Since Shepard and Metzler's influential 1971 study, it has been known that 3-D objects (e.g., multiple-armed cube or external body parts) are visually and…

  18. ICCE/ICCAI 2000 Full & Short Papers (Virtual Reality in Education).

    ERIC Educational Resources Information Center

    2000

    This document contains the full text of the following full and short papers on virtual reality in education from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A CAL System for Appreciation of 3D Shapes by Surface Development (C3D-SD)" (Stephen C. F. Chan, Andy…

  19. Using Virtual Reality Computer Models to Support Student Understanding of Astronomical Concepts

    ERIC Educational Resources Information Center

    Barnett, Michael; Yamagata-Lynch, Lisa; Keating, Tom; Barab, Sasha A.; Hay, Kenneth E.

    2005-01-01

    The purpose of this study was to examine how 3-dimensional (3-D) models of the Solar System supported student development of conceptual understandings of various astronomical phenomena that required a change in frame of reference. In the course described in this study, students worked in teams to design and construct 3-D virtual reality computer…

  20. Virtual Learning Is Becoming Reality.

    ERIC Educational Resources Information Center

    Jancek, Richard L.

    Once a school district decides to offer students virtual classes, it has to recognize the costs associated with the implementation, the logistical needs, the staff that will be needed to assist students, and the maintenance of the technology. Adapting the philosophy of virtual education is only the beginning. The role of the traditional teacher…

  1. Virtual environment display for a 3D audio room simulation

    NASA Astrophysics Data System (ADS)

    Chapin, William L.; Foster, Scott

    1992-06-01

    Recent developments in virtual 3D audio and synthetic aural environments have produced a complex acoustical room simulation. The acoustical simulation models a room with walls, ceiling, and floor of selected sound reflecting/absorbing characteristics and unlimited independent localizable sound sources. This non-visual acoustic simulation, implemented with 4 audio Convolvotrons™ by Crystal River Engineering and coupled to the listener with a Polhemus Isotrak™ tracking the listener's head position and orientation, and stereo headphones returning binaural sound, is quite compelling to most listeners with eyes closed. This immersive effect should be reinforced when properly integrated into a full, multi-sensory virtual environment presentation. This paper discusses the design of an interactive, visual virtual environment, complementing the acoustic model and specified to: 1) allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; 2) reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; 3) enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and 4) serve as a research testbed and technology transfer demonstration. The hardware/software designs of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations. The installed system implements a head-coupled, wide-angle, stereo-optic tracker/viewer and multi-computer simulation control. The portable demonstration system implements a head-mounted wide-angle, stereo-optic display, separate head and pointer electro-magnetic position trackers, a heterogeneous parallel graphics processing system, and object-oriented C++ program code.
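
    Binaural rendering of the kind Convolvotron-style hardware performs relies on interaural cues such as the arrival-time difference between the ears. As an illustrative sketch only (the actual system convolves source signals with measured head-related transfer functions, which this toy formula does not do), Woodworth's classic spherical-head approximation of the interaural time delay is:

```python
import math

def interaural_time_delay(azimuth_deg, head_radius=0.0875, c=343.0):
    """Woodworth's spherical-head approximation of the interaural time delay
    (in seconds) for a source at a given azimuth (0 deg = straight ahead,
    90 deg = fully lateral). head_radius is a typical adult value in metres;
    c is the speed of sound in m/s."""
    theta = math.radians(azimuth_deg)
    return (head_radius / c) * (theta + math.sin(theta))
```

    A fully lateral source (90°) yields a delay of roughly 0.66 ms, the dominant localization cue the headphone rendering must reproduce per source.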

  2. Applied virtual reality at the Research Triangle Institute

    NASA Technical Reports Server (NTRS)

    Montoya, R. Jorge

    1994-01-01

    Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric data bases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object data base is created using data captured by hand or electronically. The objects' realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the data base using software tools which also support interactions with and immersivity in the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and the maintenance training simulator for the National Guard.

  3. Objective and subjective quality assessment of geometry compression of reconstructed 3D humans in a 3D virtual room

    NASA Astrophysics Data System (ADS)

    Mekuria, Rufael; Cesar, Pablo; Doumanis, Ioannis; Frisiello, Antonella

    2015-09-01

    Compression of 3D object based video is relevant for 3D Immersive applications. Nevertheless, the perceptual aspects of the degradation introduced by codecs for meshes and point clouds are not well understood. In this paper we evaluate the subjective and objective degradations introduced by such codecs in a state of art 3D immersive virtual room. In the 3D immersive virtual room, users are captured with multiple cameras, and their surfaces are reconstructed as photorealistic colored/textured 3D meshes or point clouds. To test the perceptual effect of compression and transmission, we render degraded versions with different frame rates in different contexts (near/far) in the scene. A quantitative subjective study with 16 users shows that negligible distortion of decoded surfaces compared to the original reconstructions can be achieved in the 3D virtual room. In addition, a qualitative task based analysis in a full prototype field trial shows increased presence, emotion, user and state recognition of the reconstructed 3D Human representation compared to animated computer avatars.
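
    A common objective metric for the geometry degradations such codec studies evaluate is the symmetric point-to-point RMSE between a reference point cloud and its decoded version. The sketch below is illustrative only (brute-force nearest neighbours, suitable for small clouds) and is not claimed to be the specific metric used in this paper:

```python
import numpy as np

def point_to_point_rmse(ref, deg):
    """Symmetric point-to-point RMSE between two point clouds (arrays of
    shape (N, 3) and (M, 3)). Each point is matched to its nearest
    neighbour in the other cloud; the worse of the two directions is
    reported, as is conventional for symmetric geometry distortion."""
    def mean_sq_nn(a, b):
        # Pairwise squared distances via broadcasting: (N, 1, 3) - (1, M, 3).
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
        return d2.min(axis=1).mean()
    return float(np.sqrt(max(mean_sq_nn(deg, ref), mean_sq_nn(ref, deg))))
```

    Real evaluations (e.g. of MPEG-style point-cloud codecs) use KD-trees for the nearest-neighbour search and often report the error relative to the bounding-box diagonal (PSNR form), but the underlying distance is the same.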

  4. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: technical note.

    PubMed

    Abe, Yuichiro; Sato, Shigenobu; Kato, Koji; Hyakumachi, Takahiko; Yanagibashi, Yasushi; Ito, Manabu; Abumi, Kuniyoshi

    2013-10-01

    Augmented reality (AR) is an imaging technology by which virtual objects are overlaid onto images of real objects captured in real time by a tracking camera. This study aimed to introduce a novel AR guidance system called virtual protractor with augmented reality (VIPAR) to visualize a needle trajectory in 3D space during percutaneous vertebroplasty (PVP). The AR system used for this study comprised a head-mounted display (HMD) with a tracking camera and a marker sheet. An augmented scene was created by overlaying the preoperatively generated needle trajectory path onto a marker detected on the patient using AR software, thereby providing the surgeon with augmented views in real time through the HMD. The accuracy of the system was evaluated by using a computer-generated simulation model in a spine phantom and also evaluated clinically in 5 patients. In the 40 spine phantom trials, the error of the insertion angle (EIA), defined as the difference between the attempted angle and the insertion angle, was evaluated using 3D CT scanning. Computed tomography analysis of the 40 spine phantom trials showed that the EIA in the axial plane significantly improved when VIPAR was used compared with when it was not used (0.96° ± 0.61° vs 4.34° ± 2.36°, respectively). The same held true for EIA in the sagittal plane (0.61° ± 0.70° vs 2.55° ± 1.93°, respectively). In the clinical evaluation of the AR system, 5 patients with osteoporotic vertebral fractures underwent VIPAR-guided PVP from October 2011 to May 2012. The postoperative EIA was evaluated using CT. The clinical results of the 5 patients showed that the EIA in all 10 needle insertions was 2.09° ± 1.3° in the axial plane and 1.98° ± 1.8° in the sagittal plane. There was no pedicle breach or leakage of polymethylmethacrylate. VIPAR was successfully used to assist in needle insertion during PVP by providing the surgeon with an ideal insertion point and needle trajectory through the HMD. The findings indicate
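
    The error of the insertion angle (EIA) reduces to the angle between the planned and achieved trajectory directions, measured separately in the axial and sagittal planes. A minimal sketch of that computation (function name and axis convention are ours, not the paper's):

```python
import numpy as np

def angle_error_deg(planned, actual, drop_axis):
    """Angle (degrees) between two 3-D direction vectors after projecting
    onto a plane by dropping one axis, e.g. drop_axis=2 projects onto the
    x-y plane. This mimics reporting an angular error 'in the axial plane'
    or 'in the sagittal plane' separately."""
    keep = [i for i in range(3) if i != drop_axis]
    p = np.asarray(planned, dtype=float)[keep]
    a = np.asarray(actual, dtype=float)[keep]
    cos_ang = np.dot(p, a) / (np.linalg.norm(p) * np.linalg.norm(a))
    # Clip guards against rounding just outside [-1, 1] before arccos.
    return float(np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0))))
```

    In the study this difference was measured from postoperative 3D CT; the sketch only illustrates the geometry of the reported quantity.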

  5. Participatory Gis: Experimentations for a 3d Social Virtual Globe

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Zamboni, G.

    2013-08-01

    The dawn of GeoWeb 2.0, the geographic extension of Web 2.0, has opened new possibilities in terms of online dissemination and sharing of geospatial contents, thus laying the foundations for a fruitful development of Participatory GIS (PGIS). The purpose of the study is to investigate the extension of PGIS applications, which are quite mature in the traditional bi-dimensional framework, up to the third dimension. More specifically, the system should couple a powerful 3D visualization with an increase of public participation by means of a tool allowing data collection from mobile devices (e.g. smartphones and tablets). The PGIS application, built using the open source NASA World Wind virtual globe, is focussed on the cultural and tourism heritage of Como city, located in Northern Italy. An authentication mechanism was implemented, which allows users to create and manage customized projects through cartographic mash-ups of Web Map Service (WMS) layers. Saved projects populate a catalogue which is available to the entire community. Together with historical maps and the current cartography of the city, the system is also able to manage geo-tagged multimedia data, which come from user field-surveys performed through mobile devices and report POIs (Points Of Interest). Each logged user can then contribute to POIs characterization by adding textual and multimedia information (e.g. images, audios and videos) directly on the globe. All in all, the resulting application allows users to create and share contributions as it usually happens on social platforms, additionally providing a realistic 3D representation enhancing the expressive power of data.
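
    The cartographic mash-ups described are built from standard Web Map Service requests. As an illustrative sketch, a WMS 1.3.0 GetMap URL of the kind such a client composes can be assembled like this (the endpoint and layer name below are placeholders, not the project's actual services):

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width=512, height=512, crs="EPSG:4326"):
    """Compose a WMS 1.3.0 GetMap request URL. bbox is (min, min, max, max)
    in the axis order of the chosen CRS. Only the parameters required by
    the WMS specification are included."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height,
        "FORMAT": "image/png", "STYLES": "",
    }
    return base + "?" + urlencode(params)
```

    A virtual-globe client issues such requests per tile or per view and drapes the returned imagery over the terrain, which is how WMS layers from different servers end up mashed up in one scene.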

  6. Applications of Virtual Reality to Nuclear Safeguards

    SciTech Connect

    Stansfield, S.

    1998-11-03

    This paper explores two potential applications of Virtual Reality (VR) to international nuclear safeguards: training and information organization and navigation. The applications are represented by two existing prototype systems, one for training nuclear weapons dismantlement and one utilizing a VR model to facilitate intuitive access to related sets of information.

  7. NASA employee utilizes Virtual Reality (VR) equipment

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Bebe Ly of the Information Systems Directorate's Software Technology Branch at JSC gives virtual reality a try. The stero video goggles and headphones allow her to see and hear in a computer-generated world and the gloves allow her to move around and grasp objects.

  8. Surgery, virtual reality, and the future.

    PubMed

    Vosburgh, Kirby G; Golby, Alexandra; Pieper, Steven D

    2013-01-01

    MMVR has provided the leading forum for the multidisciplinary interaction and development of the use of Virtual Reality (VR) techniques in medicine, particularly in surgical practice. Here we look back at the foundations of our field, focusing on the use of VR in Surgery and similar interventional procedures, sum up the current status, and describe the challenges and opportunities going forward.

  9. Virtual Reality Training Environments: Contexts and Concerns.

    ERIC Educational Resources Information Center

    Harmon, Stephen W.; Kenney, Patrick J.

    1994-01-01

    Discusses the contexts where virtual reality (VR) training environments might be appropriate; examines the advantages and disadvantages of VR as a training technology; and presents a case study of a VR training environment used at the NASA Johnson Space Center in preparation for the repair of the Hubble Space Telescope. (AEF)

  10. Evaluation of Virtual Reality Training Using Affect

    ERIC Educational Resources Information Center

    Tichon, Jennifer

    2012-01-01

    Training designed to support and strengthen higher-order mental abilities now often involves immersion in Virtual Reality (VR) where dangerous real world scenarios can be safely replicated. However, despite the growing popularity of VR to train cognitive skills such as decision-making and situation awareness, methods for evaluating their use rely…

  11. Virtual Reality: Visualization in Three Dimensions.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    Virtual reality is a newly emerging tool for scientific visualization that makes possible multisensory, three-dimensional modeling of scientific data. While the emphasis is on visualization, the other senses are added to enhance what the scientist can visualize. Researchers are working to extend the sensory range of what can be perceived in…

  12. Virtual Reality: Is It for Real?

    ERIC Educational Resources Information Center

    Dowding, Tim J.

    1994-01-01

    Defines virtual reality and describes its application to psychomotor skills training. A description of a system that could be used to teach a college course in physical therapy, including the use of miniature computer workstation, sensory gloves, a programmable mannequin, and other existing technology, is provided. (Contains 10 references.) (KRN)

  13. Integrated Data Visualization and Virtual Reality Tool

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  14. Presence Pedagogy: Teaching and Learning in a 3D Virtual Immersive World

    ERIC Educational Resources Information Center

    Bronack, Stephen; Sanders, Robert; Cheney, Amelia; Riedl, Richard; Tashner, John; Matzen, Nita

    2008-01-01

    As the use of 3D immersive virtual worlds in higher education expands, it is important to examine which pedagogical approaches are most likely to bring about success. AET Zone, a 3D immersive virtual world in use for more than seven years, is one embodiment of pedagogical innovation that capitalizes on what virtual worlds have to offer to social…

  15. Controlling social stress in virtual reality environments.

    PubMed

    Hartanto, Dwi; Kampmann, Isabel L; Morina, Nexhmedin; Emmelkamp, Paul G M; Neerincx, Mark A; Brinkman, Willem-Paul

    2014-01-01

    Virtual reality exposure therapy has been proposed as a viable alternative in the treatment of anxiety disorders, including social anxiety disorder. Therapists could benefit from extensive control of anxiety-eliciting stimuli during virtual exposure. Two stimulus controls are examined here: the social dialogue situation, and the dialogue feedback responses (negative or positive) between a human and a virtual character. In the first study, 16 participants were exposed to three virtual reality scenarios: a neutral virtual world, a blind date scenario, and a job interview scenario. Results showed a significant difference between the three virtual scenarios in the level of self-reported anxiety and heart rate. In the second study, 24 participants were exposed to a job interview scenario in a virtual environment where the ratio between negative and positive dialogue feedback responses of a virtual character was systematically varied on the fly. Within a dialogue, more positive feedback resulted in less self-reported anxiety, lower heart rate, and longer answers, while more negative feedback from the virtual character resulted in the opposite. The correlations between the dialogue stressor ratio and the mean SUD score, heart rate, and answer length across the eight dialogue conditions were strong: r(6) = 0.91, p = 0.002; r(6) = 0.76, p = 0.028; and r(6) = -0.94, p = 0.001, respectively. Furthermore, more anticipatory anxiety reported before exposure was found to coincide with more self-reported anxiety and shorter answers during the virtual exposure. These results demonstrate that social dialogues in a virtual environment can be effectively manipulated for therapeutic purposes.
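
    The reported r(6) values are Pearson correlation coefficients computed over the eight dialogue conditions, with degrees of freedom n − 2 = 6. For reference, the coefficient itself is computed as in this minimal sketch (illustrative only; the data below are synthetic, not the study's):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples.
    With n = 8 paired condition means, the degrees of freedom for the
    significance test are n - 2 = 6, matching the reported r(6)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs)
                    * sum((y - my) ** 2 for y in ys))
    return num / den
```

    A perfectly linear positive relationship gives r = 1, a perfectly inverse one gives r = −1, so the reported 0.91 and −0.94 indicate near-linear dependence of anxiety and answer length on the stressor ratio.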

  16. Controlling Social Stress in Virtual Reality Environments

    PubMed Central

    Hartanto, Dwi; Kampmann, Isabel L.; Morina, Nexhmedin; Emmelkamp, Paul G. M.; Neerincx, Mark A.; Brinkman, Willem-Paul

    2014-01-01

    Virtual reality exposure therapy has been proposed as a viable alternative in the treatment of anxiety disorders, including social anxiety disorder. Therapists could benefit from extensive control of anxiety-eliciting stimuli during virtual exposure. Two stimulus controls are examined here: the social dialogue situation, and the dialogue feedback responses (negative or positive) between a human and a virtual character. In the first study, 16 participants were exposed to three virtual reality scenarios: a neutral virtual world, a blind date scenario, and a job interview scenario. Results showed a significant difference between the three virtual scenarios in the level of self-reported anxiety and heart rate. In the second study, 24 participants were exposed to a job interview scenario in a virtual environment where the ratio between negative and positive dialogue feedback responses of a virtual character was systematically varied on the fly. Within a dialogue, more positive feedback resulted in less self-reported anxiety, lower heart rate, and longer answers, while more negative feedback from the virtual character resulted in the opposite. The correlations between the dialogue stressor ratio and the mean SUD score, heart rate, and answer length across the eight dialogue conditions were strong: r(6) = 0.91, p = 0.002; r(6) = 0.76, p = 0.028; and r(6) = −0.94, p = 0.001, respectively. Furthermore, more anticipatory anxiety reported before exposure was found to coincide with more self-reported anxiety and shorter answers during the virtual exposure. These results demonstrate that social dialogues in a virtual environment can be effectively manipulated for therapeutic purposes. PMID:24671006

  17. Implementation of virtual models from sheet metal forming simulation into physical 3D colour models using 3D printing

    NASA Astrophysics Data System (ADS)

    Junk, S.

    2016-08-01

    Today the methods of numerical simulation of sheet metal forming offer a great diversity of possibilities for optimization in product development and in process design. However, the results from simulation are only available as virtual models. Because no forming tools are available during the early stages of product development, physical models that could represent the virtual results are lacking. Physical 3D models can be created using 3D printing; they serve as an illustration and provide a better understanding of the simulation results. In this way, the results from the simulation can be made more “comprehensible” within a development team. This paper presents the possibilities of 3D colour printing with particular consideration of the requirements regarding the implementation of sheet metal forming simulation. Using concrete examples of sheet metal forming, the manufacturing of 3D colour models is expounded upon on the basis of simulation results.

  18. Web Reference: A Virtual Reality.

    ERIC Educational Resources Information Center

    Foster, Janet

    1999-01-01

    Presents ideas and strategies to enhance digital reference services available via the Internet in public libraries. Describes print publications which include Web reference columns; subject guides, both print and online; and the resources of the Internet Public Library and other virtual reference desks. (LRW)

  19. Transportation planning: A virtual reality

    SciTech Connect

    Bradley, J.; Hefele, J.; Dolin, R.M.

    1994-07-01

    An important factor in the development of any base technology is generating it in such a way that these technologies will continue to be useful through systems upgrades and implementation philosophy metamorphoses. Base technologies of traffic engineering including transportation modeling, traffic impact forecasting, traffic operation management, emergency situation routing and re-routing, and signal systems optimization should all be designed with the future in mind. Advanced Traffic Engineering topics, such as Intelligent Vehicle Highway Systems, are designed with advanced engineering concepts such as rules-based design and artificial intelligence. All aspects of development of base technologies must include Total Quality Engineering as the primary factor in order to succeed. This philosophy for development of base technologies for the County of Los Alamos is being developed leveraging the resources of the Center for Advanced Engineering Technology (CAET) at the Los Alamos National Laboratory. The mission of the CAET is to develop next-generation engineering technology that supports the Los Alamos National Laboratory's mission and to transfer that technology to industry and academia. The CAET's goal is to promote industrial, academic, and government interactions in diverse areas of engineering technology, such as design, analysis, manufacturing, virtual enterprise, robotics, telepresence, rapid prototyping, and virtual environment technology. The Center is expanding, enhancing, and increasing core competencies at the Los Alamos National Laboratory. The CAET has three major thrust areas: development of base technologies, virtual environment technology applications, and educational outreach and training. Virtual environment technology immerses a user in a nonexistent or augmented environment for research or training purposes. Virtual environment technology illustrates the axiom, “The best way to learn is by doing.”

  20. Development of real-time motion capture system for 3D on-line games linked with virtual character

    NASA Astrophysics Data System (ADS)

    Kim, Jong Hyeong; Ryu, Young Kee; Cho, Hyung Suck

    2004-10-01

    With the development of 3-D virtual reality, motion tracking is becoming an essential part of the entertainment, medical, sports, education, and industrial fields. Virtual human characters in digital animation and game applications have been controlled by interface devices: mice, joysticks, MIDI sliders, and so on. Such devices cannot make a virtual human character move smoothly and naturally. Furthermore, high-end human motion capture systems on the commercial market are expensive and complicated. In this paper, we propose a practical and fast motion capture system consisting of optical sensors, and link its data to a 3-D game character in real time. The prototype setup was successfully applied to a boxing game, which requires very fast movement of the human character.

  1. Virtual hospitalization: reality or utopia?

    PubMed

    Maceratini, R; Rafanelli, M; Ricci, F L

    1995-01-01

    This paper describes the way in which the increasing capacity and facilities of telemedicine can link points of care, supporting services, and health care sectors. Virtual hospitalization is discussed, as well as its insertion into a new health care system whose services will provide everyone with effective health care in their homes, in isolated places, in their workplaces, or in emergencies, and which will permit remote consultations between professionals in specialized centers, hospitals, and other peripheral points of care.

  2. Virtual reality for dragline planners

    SciTech Connect

    Cobcroft, T.

    2007-03-15

    3d-Dig, developed by Earth Technology Pty Ltd, is an invaluable mine planning and communication tool that makes it possible to communicate a mine plan easily through animations and other graphics. An Australian company has been using it to plan pits and strips in detail for up to five years in advance; a US operator is using it to optimise dragline stripping around inside corners and to accurately plan the traverse of ramps. The system offers better prediction of rehandled volumes, linear coal advance, and dig time within a strip. It is useful for optimising waste stripping and the timing of uncovered coal to enhance blending and shipping reliability. It presents volumetric, spoil placement, and positioning data while generating animations that communicate the plan. 5 figs.

  3. Fully Three-Dimensional Virtual-Reality System

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1994-01-01

    Proposed virtual-reality system presents visual displays to simulate free flight in three-dimensional space. System, virtual space pod, is testbed for control and navigation schemes. Unlike most virtual-reality systems, virtual space pod would not depend for orientation on ground plane, which hinders free flight in three dimensions. Space pod provides comfortable seating, convenient controls, and dynamic virtual-space images for virtual traveler. Controls include buttons plus joysticks with six degrees of freedom.

  4. iVirtualWorld: A Domain-Oriented End-User Development Environment for Building 3D Virtual Chemistry Experiments

    ERIC Educational Resources Information Center

    Zhong, Ying

    2013-01-01

    Virtual worlds are well-suited for building virtual laboratories for educational purposes to complement hands-on physical laboratories. However, educators may face technical challenges because developing virtual worlds requires skills in programming and 3D design. Current virtual world building tools are developed for users who have programming…

  5. The Virtual Tablet: Virtual Reality as a Control System

    NASA Technical Reports Server (NTRS)

    Chronister, Andrew

    2016-01-01

    In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating issues with difficulty of visually determining the interface location and lack of tactile feedback discovered in the development of previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment. My contribution was to combine the pieces of
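
    Placing the tracked surrogate in the virtual scene amounts to composing the orientation from the sensor package with the position from the motion-capture markers into a single rigid-body transform relative to the user's viewpoint. A minimal sketch of that composition (names and conventions are ours, not the project's code):

```python
import numpy as np

def surrogate_pose(rotation, position):
    """Build a 4x4 homogeneous transform for the tracked surrogate:
    rotation is a 3x3 matrix (e.g. derived from the onboard orientation
    sensor), position a 3-vector (e.g. from the motion-capture markers).
    Applying the result to homogeneous points maps surrogate-local
    coordinates into the tracking frame."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T
```

    The renderer then draws the surrogate model, and the control widgets projected onto its surface, at this transform each frame, so the virtual tablet stays registered with the physical sheet the user is holding.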

  6. Applying Virtual Reality to commercial Edutainment

    NASA Technical Reports Server (NTRS)

    Grissom, F.; Goza, Sharon P.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR), when defined as a computer-generated, immersive, three-dimensional graphics environment which provides varying degrees of interactivity, remains an expensive, highly specialized application yet to find its way into the school, home, or business. As a novel approach to a theme-park-type attraction, though, its use can be justified. This paper describes how a virtual reality 'tour of the human digestive system' was created for the Omniplex Science Museum of Oklahoma City, Oklahoma. The customer's main objectives were: (1) to educate; (2) to entertain; (3) to draw visitors; and (4) to generate revenue. The 'Edutainment' system ultimately delivered met these goals. As more such systems come into existence, the resulting library of licensable programs will greatly reduce development costs to individual institutions.

  7. Feedback from video for virtual reality navigation

    SciTech Connect

    Tsap, L V

    2000-10-27

    Important preconditions for wide acceptance of virtual reality (VR) systems include their comfort, ease and naturalness of use. Most existing trackers suffer from discomfort-related issues. For example, body-based trackers (hand controllers, joysticks, helmet attachments, etc.) restrict spontaneity and naturalness of motion, while ground-based devices (e.g., hand controllers) limit the workspace by literally binding an operator to the ground. There are similar problems with controls. This paper describes using real-time video with registered depth information (from a commercially available camera) for virtual reality navigation. A camera-based setup can replace cumbersome trackers. The method includes selective depth processing for increased speed and a robust skin-color segmentation that accounts for illumination variations.
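
    The two ingredients the abstract names, selective depth processing and illumination-tolerant skin-color segmentation, can be combined into a single pixel mask. A minimal sketch under assumed thresholds; the function name and all numeric values are illustrative, not the paper's:

    ```python
    import numpy as np

    def segment_hand(rgb, depth, near=0.3, far=1.2):
        """Return a boolean mask of likely hand pixels.

        rgb   -- HxWx3 float array, channels in [0, 1]
        depth -- HxW float array, metres from the camera

        Thresholds are illustrative assumptions, not the paper's values.
        """
        # Selective depth processing: only pixels within arm's reach are examined.
        in_range = (depth > near) & (depth < far)

        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        # Crude illumination-tolerant skin rule on normalized chromaticity:
        # skin stays redder than green/blue regardless of overall brightness.
        total = r + g + b + 1e-6
        rn, gn = r / total, g / total
        skin = (rn > 0.36) & (gn > 0.28) & (gn < 0.36) & (r > b)

        return in_range & skin
    ```

    A real system would follow the mask with connected-component cleanup and gesture tracking; the point here is only how the depth gate prunes work before the color test runs.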

  8. Virtual reality training for health-care professionals.

    PubMed

    Mantovani, Fabrizia; Castelnuovo, Gianluca; Gaggioli, Andrea; Riva, Giuseppe

    2003-08-01

    Emerging changes in health-care delivery are having a significant impact on the structure of health-care professionals' education. Today it is recognized that medical knowledge doubles every 6-8 years, with new medical procedures emerging every day. While the half-life of medical information is so short, the average physician practices 30 years and the average nurse 40 years. Continuing education thus represents an important challenge to face. Recent advances in educational technology are offering an increasing number of innovative learning tools. Among these, Virtual Reality represents a promising area with high potential for enhancing the training of health-care professionals. Virtual Reality training can provide a rich, interactive, engaging educational context, thus supporting experiential learning-by-doing; it can, in fact, contribute to raising interest and motivation in trainees and to effectively supporting skills acquisition and transfer, since the learning process can be situated within an experiential framework. Current virtual training applications for health care differ considerably in both their technological/multimedia sophistication and the types of skills trained, varying for example from telesurgical applications to interactive simulations of the human body and brain, to virtual worlds for emergency training. Other interesting applications include the development of immersive 3D environments for training psychiatrists and psychologists in the treatment of mental disorders. The main aim of this paper is to discuss the rationale and main benefits of the use of virtual reality in health-care education and training. Significant research and projects carried out in this field will also be presented, followed by discussion of key issues concerning current limitations and future development directions. PMID:14511451

  10. Sound For Animation And Virtual Reality

    NASA Technical Reports Server (NTRS)

    Hahn, James K.; Docter, Pete; Foster, Scott H.; Mangini, Mark; Myers, Tom; Wenzel, Elizabeth M.; Null, Cynthia (Technical Monitor)

    1995-01-01

    Sound is an integral part of the experience in computer animation and virtual reality. In this course, we will present some of the important technical issues in sound modeling, rendering, and synchronization as well as the "art" and business of sound that are being applied in animations, feature films, and virtual reality. The central theme is to bring leading researchers and practitioners from various disciplines to share their experiences in this interdisciplinary field. The course will give the participants an understanding of the problems and techniques involved in producing and synchronizing sounds, sound effects, dialogue, and music. The problem spans a number of domains including computer animation and virtual reality. Since sound has been an integral part of animations and films much longer than for computer-related domains, we have much to learn from traditional animation and film production. By bringing leading researchers and practitioners from a wide variety of disciplines, the course seeks to give the audience a rich mixture of experiences. It is expected that the audience will be able to apply what they have learned from this course in their research or production.

  11. Issues and Challenges of Teaching and Learning in 3D Virtual Worlds: Real Life Case Studies

    ERIC Educational Resources Information Center

    Pfeil, Ulrike; Ang, Chee Siang; Zaphiris, Panayiotis

    2009-01-01

    We aimed to study the characteristics and usage patterns of 3D virtual worlds in the context of teaching and learning. To achieve this, we organised a full-day workshop to explore, discuss and investigate the educational use of 3D virtual worlds. Thirty participants took part in the workshop. All conversations were recorded and transcribed for…

  12. 3D Inhabited Virtual Worlds: Interactivity and Interaction between Avatars, Autonomous Agents, and Users.

    ERIC Educational Resources Information Center

    Jensen, Jens F.

    This paper addresses some of the central questions currently related to 3-Dimensional Inhabited Virtual Worlds (3D-IVWs), their virtual interactions, and communication, drawing from the theory and methodology of sociology, interaction analysis, interpersonal communication, semiotics, cultural studies, and media studies. First, 3D-IVWs--seen as a…

  13. The Virtual Radiopharmacy Laboratory: A 3-D Simulation for Distance Learning

    ERIC Educational Resources Information Center

    Alexiou, Antonios; Bouras, Christos; Giannaka, Eri; Kapoulas, Vaggelis; Nani, Maria; Tsiatsos, Thrasivoulos

    2004-01-01

    This article presents Virtual Radiopharmacy Laboratory (VR LAB), a virtual laboratory accessible through the Internet. VR LAB is designed and implemented in the framework of the VirRAD European project. This laboratory represents a 3D simulation of a radio-pharmacy laboratory, where learners, represented by 3D avatars, can experiment on…

  14. Virtual reality in the operating room of the future.

    PubMed

    Müller, W; Grosskopf, S; Hildebrand, A; Malkewitz, R; Ziegler, R

    1997-01-01

    In cooperation with the Max-Delbrück-Centrum/Robert-Rössle-Klinik (MDC/RRK) in Berlin, the Fraunhofer Institute for Computer Graphics is currently designing and developing a scenario for the operating room of the future. The goal of this project is to integrate new analysis, visualization and interaction tools in order to optimize and refine tumor diagnostics and therapy in combination with laser technology and remote stereoscopic video transfer. Hence, a human 3-D reference model is reconstructed using CT, MR, and anatomical cryosection images from the National Library of Medicine's Visible Human Project. Applying segmentation algorithms and surface-polygonization methods, a 3-D representation is obtained. In addition, a "fly-through" of the virtual patient is realized using 3-D input devices (data glove, tracking system, 6-DOF mouse). In this way, the surgeon can experience entirely new perspectives of the human anatomy. Moreover, using a virtual cutting plane, any cut of the CT volume can be interactively placed and visualized in real time. In conclusion, this project delivers visions for the application of effective visualization and VR systems. Commonly known as Virtual Prototyping and long applied in the automotive industry, this approach shows that VR techniques can also be used to prototype an operating room. After evaluating the design and functionality of the virtual operating room, MDC plans to build real ORs in the near future. The use of VR techniques provides a more natural interface for the surgeon in the OR (e.g., controlling interactions by voice input). Besides preoperative planning, future work will focus on supporting the surgeon in performing surgical interventions. An optimal synthesis of real and synthetic data, and the inclusion of visual, aural, and tactile senses in virtual environments, can meet these requirements. This Augmented Reality could represent the environment for the surgeons of tomorrow. PMID:10173059
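
    The interactive cutting plane described above amounts to resampling a CT volume along an arbitrary plane. A minimal sketch, assuming the volume is a ZxYxX NumPy array and using nearest-neighbour lookup (the project itself would use interpolated, hardware-accelerated rendering); `cut_plane` and its parameters are hypothetical names:

    ```python
    import numpy as np

    def cut_plane(volume, origin, u, v, size=(64, 64), step=1.0):
        """Resample `volume` on a planar grid defined by an origin point and
        two in-plane unit vectors u, v (all in voxel coordinates).

        Returns a size[0] x size[1] array of sampled voxel values.
        """
        rows, cols = size
        out = np.zeros(size, volume.dtype)
        for i in range(rows):
            for j in range(cols):
                # Voxel-space position of this plane sample, centred on origin.
                p = origin + (i - rows / 2) * step * u + (j - cols / 2) * step * v
                idx = np.round(p).astype(int)  # nearest voxel
                if np.all(idx >= 0) and np.all(idx < volume.shape):
                    out[i, j] = volume[tuple(idx)]
        return out
    ```

    With axis-aligned u and v this degenerates to an ordinary slice; tilting them yields the oblique cuts the surgeon places interactively.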

  15. The Engelbourg's ruins: from 3D TLS point cloud acquisition to 3D virtual and historic models

    NASA Astrophysics Data System (ADS)

    Koehl, Mathieu; Berger, Solveig; Nobile, Sylvain

    2014-05-01

    The Castle of Engelbourg was built at the beginning of the 13th century at the top of the Schlossberg. It is situated on the territory of the municipality of Thann (France), at the crossroads of Alsace and Lorraine, and dominates the outlet of the valley of the Thur. Its strategic position was one of the causes of its systematic destruction during the 17th century, and Louis XIV sealed its fate by ordering its demolition in 1673. Today only a few vestiges remain, among them a section of the main tower, about 7 m in diameter and 4 m wide, lying on its side, a feature unique in the regional castral landscape. Visible from the valley, it was named "the Eye of the Witch" and became a key attraction of the region. The site, which extends over approximately one hectare, has for several years been the object of numerous archaeological studies and is today at the heart of a project to promote the vestiges. A key objective among the numerous planned works was to produce a 3D model of the site in its current state, in other words an "as-built" virtual model, exploitable from a cultural and tourist point of view as well as by scientists in archaeological research. The team of the ICube/INSA lab was responsible for producing this model, from data acquisition to delivery of the virtual model, using 3D TLS and topographic surveying methods. It was also planned to integrate into this 3D model 2D archive data stemming from series of former excavations. The objectives of this project were the following: • Acquisition of 3D digital data of the site and 3D modelling • Digitization of the 2D archaeological data and integration in the 3D model • Implementation of a database connected to the 3D model • Virtual visit of the site The obtained results allowed us to visualize every 3D object individually, in several forms (point clouds, 3D meshed objects and models, etc.) and at several levels of detail

  16. DJ Sim: a virtual reality DJ simulation game

    NASA Astrophysics Data System (ADS)

    Tang, Ka Yin; Loke, Mei Hwan; Chin, Ching Ling; Chua, Gim Guan; Chong, Jyh Herng; Manders, Corey; Khan, Ishtiaq Rasool; Yuan, Miaolong; Farbiz, Farzam

    2009-02-01

    This work describes the process of developing a 3D Virtual Reality (VR) DJ simulation game intended to be displayed on a stereoscopic display. Using a DLP projector and shutter glasses, the user of the system plays a game in which he or she is a DJ in a night club. The night club's music is playing, and the DJ is "scratching" in correspondence to this music. Much in the flavor of Guitar Hero or Dance Dance Revolution, a virtual turntable is manipulated to project information about how the user should perform. The user only needs a small set of hand gestures, corresponding to the turntable scratch movements to play the game. As the music plays, a series of moving arrows approaching the DJ's turntable instruct the user as to when and how to perform the scratches.

  17. Virtual reality simulators and training in laparoscopic surgery.

    PubMed

    Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos

    2015-01-01

    Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, relative evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance of advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to be different from the effect of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual spatial perception and stress coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research efforts should focus on the effect of virtual reality simulation on performance in the context of advanced surgical procedures, on standardization of training, on the possibility of a synergistic effect of virtual reality simulation training combined with mental training, and on personalized training.

  18. Interaction Design and Usability of Learning Spaces in 3D Multi-user Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Minocha, Shailey; Reeves, Ahmad John

    Three-dimensional virtual worlds are multimedia, simulated environments, often managed over the Web, which users can 'inhabit' and interact with via their own graphical self-representations known as 'avatars'. 3D virtual worlds are being used in many applications: education/training, gaming, social networking, marketing and commerce. Second Life is the most widely used 3D virtual world in education. However, problems associated with usability, navigation and wayfinding in 3D virtual worlds may impact on student learning and engagement. Based on empirical investigations of learning spaces in Second Life, this paper presents design guidelines to improve the usability and ease of navigation in 3D spaces. Methods of data collection include semi-structured interviews with Second Life students, educators and designers. The findings have revealed that design principles from the fields of urban planning, Human-Computer Interaction, Web usability, geography and psychology can influence the design of spaces in 3D multi-user virtual environments.

  19. Selected Applications of Virtual Reality in Manufacturing

    NASA Astrophysics Data System (ADS)

    Novak-Marcincin, Jozef

    2011-01-01

    Virtual reality (VR) has become an important and useful tool in science and engineering. VR applications cover a wide range of industrial areas from product design to analysis, from product prototyping to manufacturing. The design and manufacturing of a product can be viewed, evaluated and improved in a virtual environment before its prototype is made, which is an enormous cost saving. Virtual Manufacturing (VM) is the use of computer models and simulations of manufacturing processes to aid in the design and production of manufactured products. VM is the use of manufacturing-based simulations to optimize the design of product and processes for a specific manufacturing goal such as: design for assembly; quality; lean operations; and/or flexibility.

  20. Virtual and Printed 3D Models for Teaching Crystal Symmetry and Point Groups

    ERIC Educational Resources Information Center

    Casas, Lluís; Estop, Eugènia

    2015-01-01

    Both, virtual and printed 3D crystal models can help students and teachers deal with chemical education topics such as symmetry and point groups. In the present paper, two freely downloadable tools (interactive PDF files and a mobile app) are presented as examples of the application of 3D design to study point-symmetry. The use of 3D printing to…

  1. Contextual EFL Learning in a 3D Virtual Environment

    ERIC Educational Resources Information Center

    Lan, Yu-Ju

    2015-01-01

    The purposes of the current study are to develop virtually immersive EFL learning contexts for EFL learners in Taiwan to pre- and review English materials beyond the regular English class schedule. A 2-iteration action research lasting for one semester was conducted to evaluate the effects of virtual contexts on learners' EFL learning. 132…

  2. Intelligent virtual reality in the setting of fuzzy sets

    NASA Technical Reports Server (NTRS)

    Dockery, John; Littman, David

    1992-01-01

    The authors have previously introduced the concept of virtual reality worlds governed by artificial intelligence. Creation of an intelligent virtual reality was further proposed as a universal interface for the handicapped. This paper extends consideration of intelligent virtual reality to a context in which fuzzy set principles are explored as a major tool for implementing theory in the domain of applications to the disabled.

  3. Virtual Reality: A Dream Come True or a Nightmare.

    ERIC Educational Resources Information Center

    Cornell, Richard; Bailey, Dan

    Virtual Reality (VR) is a new medium which allows total stimulation of one's senses through human/computer interfaces. VR has applications in training simulators, nano-science, medicine, entertainment, electronic technology, and manufacturing. This paper focuses on some current and potential problems of virtual reality and virtual environments…

  4. Importance of Virtual Reality to Virtual Reality Exposure Therapy, Study Design of a Randomized Trial.

    PubMed

    McLay, Robert N; Baird, Alicia; Murphy, Jennifer; Deal, William; Tran, Lily; Anson, Heather; Klam, Warren; Johnston, Scott

    2015-01-01

    Post Traumatic Stress Disorder (PTSD) can be a debilitating problem in service members who have served in Iraq or Afghanistan. Virtual Reality Exposure Therapy (VRET) is one of the few interventions demonstrated in randomized controlled trials to be effective for PTSD in this population. There are theoretical reasons to expect that Virtual Reality (VR) adds to the effectiveness of exposure therapy, but there is also added expense and difficulty in using VR. Described is a trial comparing outcomes from VRET and a control exposure therapy (CET) protocol in service members with PTSD. PMID:26799904

  5. Visual landmarks facilitate rodent spatial navigation in virtual reality environments

    PubMed Central

    Youngstrom, Isaac A.; Strowbridge, Ben W.

    2012-01-01

    Because many different sensory modalities contribute to spatial learning in rodents, it has been difficult to determine whether spatial navigation can be guided solely by visual cues. Rodents moving within physical environments with visual cues engage a variety of nonvisual sensory systems that cannot be easily inhibited without lesioning brain areas. Virtual reality offers a unique approach to ask whether visual landmark cues alone are sufficient to improve performance in a spatial task. We found that mice could learn to navigate between two water reward locations along a virtual bidirectional linear track using a spherical treadmill. Mice exposed to a virtual environment with vivid visual cues rendered on a single monitor increased their performance over a 3-day training regimen. Training significantly increased the percentage of time avatars controlled by the mice spent near reward locations in probe trials without water rewards. Neither improvement during training nor spatial learning for reward locations occurred with mice operating a virtual environment without vivid landmarks or with mice deprived of all visual feedback. Mice operating the vivid environment developed stereotyped avatar turning behaviors when alternating between reward zones that were positively correlated with their performance on the probe trial. These results suggest that mice are able to learn to navigate to specific locations using only visual cues presented within a virtual environment rendered on a single computer monitor. PMID:22345484
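
    The probe-trial measure described above, the percentage of time spent near reward locations, reduces to a fraction over tracked avatar positions. A hypothetical sketch; positions, zones, and the radius are illustrative, not the study's values:

    ```python
    import math

    def time_near_rewards(positions, zones, radius):
        """Fraction of tracked positions within `radius` of any reward zone.

        positions -- list of (x, y) avatar samples
        zones     -- list of (x, y) reward-zone centres
        """
        near = sum(
            any(math.dist(p, z) <= radius for z in zones) for p in positions
        )
        return near / len(positions)
    ```

    Comparing this fraction between trained and control animals is what the significance test in the abstract would operate on.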

  6. Evaluation of Home Delivery of Lectures Utilizing 3D Virtual Space Infrastructure

    ERIC Educational Resources Information Center

    Nishide, Ryo; Shima, Ryoichi; Araie, Hiromu; Ueshima, Shinichi

    2007-01-01

    Evaluation experiments have been essential in exploring home delivery of lectures for which users can experience campus lifestyle and distant learning through 3D virtual space. This paper discusses the necessity of virtual space for distant learners by examining the effects of virtual space. The authors have pursued the possibility of…

  7. Virtual reality and consciousness inference in dreaming.

    PubMed

    Hobson, J Allan; Hong, Charles C-H; Friston, Karl J

    2014-01-01

    This article explores the notion that the brain is genetically endowed with an innate virtual reality generator that - through experience-dependent plasticity - becomes a generative or predictive model of the world. This model, which is most clearly revealed in rapid eye movement (REM) sleep dreaming, may provide the theater for conscious experience. Functional neuroimaging evidence for brain activations that are time-locked to rapid eye movements (REMs) endorses the view that waking consciousness emerges from REM sleep - and dreaming lays the foundations for waking perception. In this view, the brain is equipped with a virtual model of the world that generates predictions of its sensations. This model is continually updated and entrained by sensory prediction errors in wakefulness to ensure veridical perception, but not in dreaming. In contrast, dreaming plays an essential role in maintaining and enhancing the capacity to model the world by minimizing model complexity and thereby maximizing both statistical and thermodynamic efficiency. This perspective suggests that consciousness corresponds to the embodied process of inference, realized through the generation of virtual realities (in both sleep and wakefulness). In short, our premise or hypothesis is that the waking brain engages with the world to predict the causes of sensations, while in sleep the brain's generative model is actively refined so that it generates more efficient predictions during waking. We review the evidence in support of this hypothesis - evidence that grounds consciousness in biophysical computations whose neuronal and neurochemical infrastructure has been disclosed by sleep research. PMID:25346710

  10. Sensorimotor Training in Virtual Reality: A Review

    PubMed Central

    Adamovich, Sergei V.; Fluet, Gerard G.; Tunik, Eugene; Merians, Alma S.

    2010-01-01

    Recent experimental evidence suggests that rapid advancement of virtual reality (VR) technologies has great potential for the development of novel strategies for sensorimotor training in neurorehabilitation. We discuss what the adaptive and engaging virtual environments can provide for massive and intensive sensorimotor stimulation needed to induce brain reorganization. Second, discrepancies between the veridical and virtual feedback can be introduced in VR to facilitate activation of targeted brain networks, which in turn can potentially speed up the recovery process. Here we review the existing experimental evidence regarding the beneficial effects of training in virtual environments on the recovery of function in the areas of gait, upper extremity function and balance, in various patient populations. We also discuss possible mechanisms underlying these effects. We feel that future research in the area of virtual rehabilitation should follow several important paths. Imaging studies to evaluate the effects of sensory manipulation on brain activation patterns and the effect of various training parameters on long term changes in brain function are needed to guide future clinical inquiry. Larger clinical studies are also needed to establish the efficacy of sensorimotor rehabilitation using VR approaches in various clinical populations and most importantly, to identify VR training parameters that are associated with optimal transfer into real-world functional improvements. PMID:19713617

  11. Virtual 3D microscopy using multiplane whole slide images in diagnostic pathology.

    PubMed

    Kalinski, Thomas; Zwönitzer, Ralf; Sel, Saadettin; Evert, Matthias; Guenther, Thomas; Hofmann, Harald; Bernarding, Johannes; Roessner, Albert

    2008-08-01

    To reproduce focusing in virtual microscopy, it is necessary to construct 3-dimensional (3D) virtual slides composed of whole slide images with different focuses. As focusing is frequently used for the assessment of Helicobacter pylori colonization in diagnostic pathology, we prepared virtual 3D slides with up to 9 focus planes from 144 gastric biopsy specimens with or without H pylori gastritis. The biopsy specimens were diagnosed in a blinded manner by 3 pathologists according to the updated Sydney classification using conventional microscopy, virtual microscopy with a single focus plane, and virtual 3D microscopy with 5 and 9 focus planes enabling virtual focusing. Regarding the classification of H pylori, we found a positive correlation between the number of focus planes used in virtual microscopy and the number of correct diagnoses as determined by conventional microscopy. Concerning H pylori positivity, the specificity and sensitivity of virtual 3D microscopy using virtual slides with 9 focus planes achieved a minimum of 0.95 each, which was approximately the same as in conventional microscopy. We consider virtual 3D microscopy appropriate for primary diagnosis of H pylori gastritis and equivalent to conventional microscopy.
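
    Sensitivity and specificity figures like those reported above come from comparing each virtual-microscopy diagnosis against the conventional-microscopy reference. A minimal sketch of that computation; the function name and example data are illustrative:

    ```python
    def sensitivity_specificity(truth, predicted):
        """Compute (sensitivity, specificity) from paired boolean diagnoses.

        truth     -- reference diagnoses (True = H. pylori positive)
        predicted -- diagnoses under test (e.g., virtual 3D microscopy)
        """
        tp = sum(t and p for t, p in zip(truth, predicted))          # true positives
        tn = sum((not t) and (not p) for t, p in zip(truth, predicted))  # true negatives
        fn = sum(t and (not p) for t, p in zip(truth, predicted))    # missed positives
        fp = sum((not t) and p for t, p in zip(truth, predicted))    # false alarms
        return tp / (tp + fn), tn / (tn + fp)
    ```

    In the study's terms, both returned values reaching at least 0.95 with nine focus planes is what supports the equivalence claim.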

  12. A rapid algorithm for realistic human reaching and its use in a virtual reality system

    NASA Technical Reports Server (NTRS)

    Aldridge, Ann; Pandya, Abhilash; Goldsby, Michael; Maida, James

    1994-01-01

    The Graphics Analysis Facility (GRAF) at JSC has developed a rapid algorithm for computing realistic human reaching. The algorithm was applied to GRAF's anthropometrically correct human model and used in a 3D computer graphics system and a virtual reality system. The nature of the algorithm and its uses are discussed.

  13. Photographer: Digital Telepresence: Dr Murial Ross's Virtual Reality Application for Neuroscience

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Photographer: Digital Telepresence: Dr Murial Ross's Virtual Reality Application for Neuroscience Research (Biocomputation), used to study human disorders of balance and space motion sickness. Shown here is a 3D reconstruction of a nerve ending in the inner ear, nature's wiring of the balance organs.

  14. Linking Audio and Visual Information while Navigating in a Virtual Reality Kiosk Display

    ERIC Educational Resources Information Center

    Sullivan, Briana; Ware, Colin; Plumlee, Matthew

    2006-01-01

    3D interactive virtual reality museum exhibits should be easy to use, entertaining, and informative. If the interface is intuitive, it will allow the user more time to learn the educational content of the exhibit. This research deals with interface issues concerning activating audio descriptions of images in such exhibits while the user is…

  15. 3-D Virtual and Physical Reconstruction of Bendego Iron

    NASA Astrophysics Data System (ADS)

    Belmonte, S. L. R.; Zucolotto, M. E.; Fontes, R. C.; dos Santos, J. R. L.

    2012-09-01

    The use of 3D laser scanning on meteorites preserves their original shape before cutting, and saving the data in STL (stereolithography) format makes it straightforward to print three-dimensional physical models and to generate a digital replica.
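The STL format mentioned above is simply a list of triangular facets. A minimal sketch of an ASCII STL writer (the single triangle and file name are illustrative; real scan meshes would carry computed facet normals):

```python
# Minimal ASCII STL writer following the stereolithography file layout:
# "solid <name>", a list of facets, "endsolid <name>".  Normals are left
# at zero here, which most viewers simply recompute from the vertices.
def write_ascii_stl(path, triangles, name="model"):
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for v1, v2, v3 in triangles:
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for v in (v1, v2, v3):
                f.write(f"      vertex {v[0]} {v[1]} {v[2]}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

write_ascii_stl("tri.stl", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
print(open("tri.stl").read().splitlines()[0])  # solid model
```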

  16. Using virtual reality to analyze sports performance.

    PubMed

    Bideau, Benoit; Kulpa, Richard; Vignais, Nicolas; Brault, Sébastien; Multon, Franck; Craig, Cathy

    2010-01-01

    Improving performance in sports can be difficult because many biomechanical, physiological, and psychological factors come into play during competition. A better understanding of the perception-action loop employed by athletes is necessary. This requires isolating contributing factors to determine their role in player performance. Because of its inherent limitations, video playback doesn't permit such in-depth analysis. Interactive, immersive virtual reality (VR) can overcome these limitations and foster a better understanding of sports performance from a behavioral-neuroscience perspective. Two case studies using VR technology and a sophisticated animation engine demonstrate how to use information from visual displays to inform a player's future course of action. PMID:20650707

  17. Summer Students in Virtual Reality: A Pilot Study on Educational Applications of Virtual Reality Technology.

    ERIC Educational Resources Information Center

    Bricken, Meredith; Byrne, Chris M.

    The goal of this study was to take a first step in evaluating the potential of virtual reality (VR) as a learning environment. The context of the study was The Technology Academy, a technology-oriented summer day camp for students ages 5-18, where student activities center around hands-on exploration of new technology (e.g., robotics, MIDI digital…

  18. Orchestrating learning during implementation of a 3D virtual world

    NASA Astrophysics Data System (ADS)

    Karakus, Turkan; Baydas, Ozlem; Gunay, Fatma; Coban, Murat; Goktas, Yuksel

    2016-10-01

    There are many issues to consider when designing virtual worlds for educational purposes. In this study, the term orchestration is given a new definition: the moderation of problems encountered while turning a virtual world into an educational setting for winter sports. A development case showed that community plays a key role both in the emergence of challenges and in the determination of their solutions. The study's implications showed that activity theory was a useful tool for understanding contextual issues; accordingly, instructional designers first developed relevant tools and community-based solutions. This study attempts to use activity theory in a prescriptive way, though it is known as a descriptive theory. Finally, since virtual world projects have many aspects, the variety of challenges and practical solutions presented in this study will provide practitioners with suggestions on how to overcome problems in the future.

  19. Virtual reality for automotive design evaluation

    NASA Technical Reports Server (NTRS)

    Dodd, George G.

    1995-01-01

    A general description of Virtual Reality technology and possible applications was given from publicly available material. A video tape was shown demonstrating the use of multiple large-screen stereoscopic displays, configured in a 10' x 10' x 10' room, to allow a person to evaluate and interact with a vehicle which exists only as mathematical data, and is made only of light. The correct viewpoint of the vehicle is maintained by tracking special glasses worn by the subject. Interior illumination was changed by moving a virtual light around by hand; interior colors are changed by pointing at a color on a color palette, then pointing at the desired surface to change. We concluded by discussing research needed to move this technology forward.

  20. Virtual reality for the treatment of autism.

    PubMed

    Strickland, D

    1997-01-01

    Autism is a mental disorder that has received attention in several unrelated studies using virtual reality. One of the first attempts, at Tokyo University, used a sandbox playing technique to diagnose children with special needs. Although operating the computer controls proved too difficult for the individuals with autism in the Tokyo study, research at the University of Nottingham, UK, has been successful in using VR as a learning aid for children with a variety of disorders, including autism. Both centers used flat-screen computer systems with virtual scenes. Another study, which concentrated on using VR as a learning aid with an immersive headset system, is described in detail in this chapter. Perhaps because of the seriousness of the disorder and the lack of effective treatments, autism has received more study than attention deficit disorders, although both would appear to benefit from many of the same technology features. PMID:10184809

  2. Initial validation of a virtual-reality robotic simulator.

    PubMed

    Lendvay, Thomas S; Casale, Pasquale; Sweet, Robert; Peters, Craig

    2008-09-01

    Robotic surgery is an accepted adjunct to minimally invasive surgery, but training is restricted to console time. Virtual-reality (VR) simulation has been shown to be effective for laparoscopic training, so we sought to validate a novel VR robotic simulator. The American Urological Association (AUA) Office of Education approved this study. Subjects enrolled in a robotics training course at the 2007 AUA annual meeting underwent skills training in a da Vinci dry-lab module and a virtual-reality robotics module which included a three-dimensional (3D) VR robotic simulator. Demographic and acceptability data were obtained, and performance metrics from the simulator were compared between experienced and nonexperienced roboticists for a ring transfer task. Fifteen subjects participated: four with previous robotic surgery experience and 11 without. Nine subjects were still in urology training, and nearly half of the group reported playing video games. Acceptability of the da Vinci system and the simulator, rated on a Likert scale (0-6), was 5.23 versus 4.69, respectively. Experienced subjects outperformed nonexperienced subjects on the simulator on three metrics: total task time (96 s versus 159 s, P < 0.02), economy of motion (1,301 mm versus 2,095 mm, P < 0.04), and time the telemanipulators spent outside the center of the platform's workspace (4 s versus 35 s, P < 0.02). This is the first demonstration of face and construct validity of a virtual-reality robotic simulator. Further studies assessing predictive validity are ultimately required to support incorporation of VR robotic simulation into training curricula. PMID:27628251
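The "economy of motion" metric in the abstract above is a total tool-path length in millimetres. A minimal sketch of how such a metric can be computed from sampled tool-tip positions (the trajectory below is hypothetical, not simulator output):

```python
import math

def path_length(samples):
    """Total distance travelled by a tracked tool tip,
    given a sequence of (x, y, z) position samples."""
    return sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))

# Hypothetical tip trajectory in millimetres: two straight segments,
# 5 mm and 12 mm long.
tip = [(0, 0, 0), (3, 4, 0), (3, 4, 12)]
print(path_length(tip))  # 17.0
```

A shorter path length for the same completed task is what distinguishes the experienced group in the study.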

  3. Virtual reality applications in robotic simulations

    NASA Technical Reports Server (NTRS)

    Homan, David J.; Gott, Charles J.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR) provides a means to practice integrated extravehicular activities (EVA)/remote manipulator system (RMS) operations in the on-orbit configuration with no discomfort or risk to crewmembers. VR afforded the STS-61 crew the luxury of practicing the integrated EVA/RMS operations in an on-orbit configuration prior to the actual flight. The VR simulation was developed by the Automation and Robotics Division's Telepresence/Virtual Reality Lab and Integrated Graphics, Operations, and Analysis Lab (IGOAL) at JSC. The RMS Part Task Trainer (PTT) was developed by the IGOAL for RMS training in 1988 as a fully functional, kinematic simulation of the shuttle RMS and served as the RMS portion of the integrated VR simulation. Because the EVA crewmember could get a realistic view of the shuttle and payload bay in the VR simulation, he/she could explore different positions and views to determine the best method for performing a specific task, thus greatly increasing the efficiency of use of the neutral buoyancy facilities.

  4. Virtual Reality Simulation for the Operating Room

    PubMed Central

    Gallagher, Anthony G.; Ritter, E Matt; Champion, Howard; Higgins, Gerald; Fried, Marvin P.; Moses, Gerald; Smith, C Daniel; Satava, Richard M.

    2005-01-01

    Summary Background Data: To inform surgeons about the practical issues to be considered for successful integration of virtual reality simulation into a surgical training program. The learning and practice of minimally invasive surgery (MIS) makes unique demands on surgical training programs. A decade ago Satava proposed virtual reality (VR) surgical simulation as a solution for this problem. Only recently have robust scientific studies supported that vision. Methods: A review of the surgical education, human-factors, and psychology literature to identify important factors that will impinge on the successful integration of VR training into a surgical training program. Results: VR is more likely to be successful if it is systematically integrated into a well-thought-out education and training program which objectively assesses technical skills improvement proximate to the learning experience. Validated performance metrics should be relevant to the surgical task being trained, but in general will require trainees to reach an objectively determined proficiency criterion based on tightly defined metrics, and to perform at this level consistently. VR training is more likely to be successful if the training schedule takes place on an interval basis rather than massed into a short period of extensive practice. High-fidelity VR simulations will confer the greatest skills transfer to the in vivo surgical situation, but less expensive VR trainers will also lead to considerably improved skills generalization. Conclusions: VR for improved performance of MIS is now a reality. However, VR is only a training tool that must be thoughtfully introduced into a surgical training curriculum for it to successfully improve surgical technical skills. PMID:15650649

  5. Spilling the beans on java 3D: a tool for the virtual anatomist.

    PubMed

    Guttmann, G D

    1999-04-15

    The computing world has just provided the anatomist with another tool: Java 3D, within the Java 2 platform. On December 9, 1998, Sun Microsystems released Java 2. Java 3D classes are now included in the jar (Java Archive) archives of the extensions directory of Java 2. Java 3D is also a part of the Java Media Suite of APIs (Application Programming Interfaces). But what is Java? How does Java 3D work? How do you view Java 3D objects? A brief introduction to the concepts of Java and object-oriented programming is provided, along with a short description of the tools of Java 3D and of the Java 3D viewer. Thus, the virtual anatomist has another set of computer tools to use for modeling various aspects of anatomy, such as embryological development. The virtual anatomist will also be able to assist the surgeon with virtual surgery using the tools found in Java 3D. Java 3D will be able to fill gaps currently existing in many anatomical computer-aided learning programs, such as the lack of platform independence, interactivity, and manipulability of 3D images.

  6. Virtual Reality Exposure Therapy Using a Virtual Iraq: Case Report

    PubMed Central

    Gerardi, Maryrose; Rothbaum, Barbara Olasov; Ressler, Kerry; Heekin, Mary; Rizzo, Albert

    2013-01-01

    Posttraumatic stress disorder (PTSD) has been estimated to affect up to 18% of returning Operation Iraqi Freedom (OIF) veterans. Soldiers need to maintain constant vigilance to deal with unpredictable threats, and an unprecedented number of soldiers are surviving serious wounds. These risk factors are significant for development of PTSD; therefore, early and efficient intervention options must be identified and presented in a form acceptable to military personnel. This case report presents the results of treatment utilizing virtual reality exposure (VRE) therapy (virtual Iraq) to treat an OIF veteran with PTSD. Following brief VRE treatment, the veteran demonstrated improvement in PTSD symptoms as indicated by clinically and statistically significant changes in scores on the Clinician Administered PTSD Scale (CAPS; Blake et al., 1990) and the PTSD Symptom Scale Self-Report (PSS-SR; Foa, Riggs, Dancu, & Rothbaum, 1993). These results indicate preliminary promise for this treatment. PMID:18404648

  7. Virtual reality exposure therapy using a virtual Iraq: case report.

    PubMed

    Gerardi, Maryrose; Rothbaum, Barbara Olasov; Ressler, Kerry; Heekin, Mary; Rizzo, Albert

    2008-04-01

    Posttraumatic stress disorder (PTSD) has been estimated to affect up to 18% of returning Operation Iraqi Freedom (OIF) veterans. Soldiers need to maintain constant vigilance to deal with unpredictable threats, and an unprecedented number of soldiers are surviving serious wounds. These risk factors are significant for development of PTSD; therefore, early and efficient intervention options must be identified and presented in a form acceptable to military personnel. This case report presents the results of treatment utilizing virtual reality exposure (VRE) therapy (virtual Iraq) to treat an OIF veteran with PTSD. Following brief VRE treatment, the veteran demonstrated improvement in PTSD symptoms as indicated by clinically and statistically significant changes in scores on the Clinician Administered PTSD Scale (CAPS; Blake et al., 1990) and the PTSD Symptom Scale Self-Report (PSS-SR; Foa, Riggs, Dancu, & Rothbaum, 1993). These results indicate preliminary promise for this treatment. PMID:18404648

  8. Combination of Virtual Tours, 3d Model and Digital Data in a 3d Archaeological Knowledge and Information System

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Brigand, N.

    2012-08-01

    The site of the Engelbourg ruined castle in Thann, Alsace, France, has for some years been the object of close attention from the city, which owns it, and from partners such as historians and archaeologists who are in charge of its study. The valuation of the site is one of the main objectives, as well as its conservation and its knowledge. The aim of this project is to use the environment of the virtual tour viewer as a new base for an Archaeological Knowledge and Information System (AKIS). With available development tools we add functionalities, in particular through diverse scripts that convert the viewer into a real 3D interface. Beginning with a first virtual tour containing about fifteen panoramic images, the site, roughly 150 by 150 meters, can be completely documented, offering the user real interactivity and making visualization very concrete, almost lively. After the choice of pertinent points of view, panoramic images were produced. For the documentation, other sets of images were acquired at various seasons and in various climate conditions, allowing the site to be documented in different environments and states of vegetation; the final virtual tour was derived from them. The initial 3D model of the castle, which is likewise virtual, was also incorporated in the form of panoramic images to complete the understanding of the site. A variety of hotspot types were used to connect the whole digital documentation to the site, including videos (reports during the acquisition phases, the restoration works, the excavations, etc.) and digital georeferenced documents (archaeological reports on the various constituent elements of the castle, interpretation of the excavations and searches, descriptions of the sets of collected objects, etc.). 
The completely personalized interface of the system allows the user either to switch from one panoramic image to another, which is the classic case of virtual tours, or to go from a panoramic photographic image

  9. Three-dimensional virtual reality surgical planning and simulation workbench for orthognathic surgery.

    PubMed

    Xia, J; Samman, N; Yeung, R W; Shen, S G; Wang, D; Ip, H H; Tideman, H

    2000-01-01

    A new integrated computer system, the 3-dimensional (3D) virtual reality surgical planning and simulation workbench for orthognathic surgery (VRSP), is presented. Five major functions are implemented in this system: post-processing and reconstruction of computed tomographic (CT) data, transformation of 3D unique coordinate system geometry, generation of 3D color facial soft tissue models, virtual surgical planning and simulation, and presurgical prediction of soft tissue changes. The basic mensuration functions, such as linear and spatial measurements, are also included. The surgical planning and simulation are based on 3D CT reconstructions, whereas soft tissue prediction is based on an individualized, texture-mapped, color facial soft tissue model. The surgeon "enters" the virtual operatory with virtual reality equipment, "holds" a virtual scalpel, and "operates" on a virtual patient to accomplish actual surgical planning, simulation of the surgical procedure, and prediction of soft tissue changes before surgery. As a final result, a quantitative osteotomy-simulated bone model and predicted color facial model with photorealistic quality can be visualized from any arbitrary viewing point in a personal computer system. This system can be installed in any hospital for daily use.
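The coordinate-system transformation step described above amounts to applying a rigid transform (rotation plus translation) to reconstructed points. A generic sketch of that operation, not the VRSP system's actual code, using a single rotation about the z-axis and hypothetical landmark values:

```python
import math

def rigid_transform(points, angle_deg, translation):
    """Apply a minimal rigid transform to 3D points:
    rotate about the z-axis by angle_deg, then translate."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    tx, ty, tz = translation
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]

# Hypothetical landmark: rotate 90 degrees, then shift by (1, 2, 3).
landmarks = [(10.0, 0.0, 5.0)]
out = rigid_transform(landmarks, 90, (1.0, 2.0, 3.0))
print([tuple(round(v, 6) for v in p) for p in out])  # [(1.0, 12.0, 8.0)]
```

A full registration pipeline would use a general 3x3 rotation (or quaternion) estimated from matched landmarks; the structure of the computation is the same.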

  10. Treatment of Complicated Grief Using Virtual Reality: A Case Report

    ERIC Educational Resources Information Center

    Botella, C.; Osma, J.; Palacios, A. Garcia; Guillen, V.; Banos, R.

    2008-01-01

    This is the first work exploring the application of new technologies, concretely virtual reality, to facilitate emotional processing in the treatment of Complicated Grief. Our research team has designed a virtual reality environment (EMMA's World) to foster the expression and processing of emotions. In this study the authors present a description…

  11. A Desktop Virtual Reality Earth Motion System in Astronomy Education

    ERIC Educational Resources Information Center

    Chen, Chih Hung; Yang, Jie Chi; Shen, Sarah; Jeng, Ming Chang

    2007-01-01

    In this study, a desktop virtual reality earth motion system (DVREMS) is designed and developed to be applied in the classroom. The system is implemented to assist elementary school students to clarify earth motion concepts using virtual reality principles. A study was conducted to observe the influences of the proposed system in learning.…

  12. Designing a Virtual-Reality-Based, Gamelike Math Learning Environment

    ERIC Educational Resources Information Center

    Xu, Xinhao; Ke, Fengfeng

    2016-01-01

    This exploratory study examined the design issues related to a virtual-reality-based, gamelike learning environment (VRGLE) developed via OpenSimulator, an open-source virtual reality server. The researchers collected qualitative data to examine the VRGLE's usability, playability, and content integration for math learning. They found it important…

  13. Design of Learning Spaces in 3D Virtual Worlds: An Empirical Investigation of "Second Life"

    ERIC Educational Resources Information Center

    Minocha, Shailey; Reeves, Ahmad John

    2010-01-01

    "Second Life" (SL) is a three-dimensional (3D) virtual world, and educational institutions are adopting SL to support their teaching and learning. Although the question of how 3D learning spaces should be designed to support student learning and engagement has been raised among SL educators and designers, there is hardly any guidance or research…

  14. Employing Virtual Humans for Education and Training in X3D/VRML Worlds

    ERIC Educational Resources Information Center

    Ieronutti, Lucio; Chittaro, Luca

    2007-01-01

    Web-based education and training provides a new paradigm for imparting knowledge; students can access the learning material anytime by operating remotely from any location. Web3D open standards, such as X3D and VRML, support Web-based delivery of Educational Virtual Environments (EVEs). EVEs have a great potential for learning and training…

  15. Gaming in a 3D Multiuser Virtual Environment: Engaging Students in Science Lessons

    ERIC Educational Resources Information Center

    Lim, Cher P.; Nonis, Darren; Hedberg, John

    2006-01-01

    Based on the exploratory study of a 3D multiuser virtual environment (3D MUVE), known as Quest Atlantis (QA), in a series of Primary Four (10- to 11-year-olds) Science lessons at Orchard Primary School in Singapore, this paper examines the issues of learning engagement and describes the socio-cultural context of QA's implementation. The students…

  16. Reordering virtual reality: recording and recreating real-time experiences

    NASA Astrophysics Data System (ADS)

    Dolinsky, Margaret; Sherman, William; Wernert, Eric; Chi, Yichen Catherine

    2012-03-01

    The proliferation of technological devices and artistic strategies has brought about an urgent and justifiable need to capture site-specific time-based virtual reality experiences. Interactive art experiences are specifically dependent on the orchestration of multiple sources including hardware, software, site-specific location, visitor inputs and 3D stereo and sensory interactions. Although a photograph or video may illustrate a particular component of the work, such as an illustration of the artwork or a sample of the sound, these only represent a fraction of the overall experience. This paper seeks to discuss documentation strategies that combine multiple approaches and capture the interactions between art projection, acting, stage design, sight movement, dialogue and audio design.

  17. Role of virtual reality simulation in endoscopy training.

    PubMed

    Harpham-Lockyer, Louis; Laskaratos, Faidon-Marios; Berlingieri, Pasquale; Epstein, Owen

    2015-12-10

    Recent advancements in virtual reality graphics and models have allowed virtual reality simulators to be incorporated into a variety of endoscopic training programmes. Use of virtual reality simulators in training programmes is thought to improve skill acquisition amongst trainees which is reflected in improved patient comfort and safety. Several studies have already been carried out to ascertain the impact that usage of virtual reality simulators may have upon trainee learning curves and how this may translate to patient comfort. This article reviews the available literature in this area of medical education which is particularly relevant to all parties involved in endoscopy training and curriculum development. Assessment of the available evidence for an optimal exposure time with virtual reality simulators and the long-term benefits of their use are also discussed.

  18. Building a virtual archive using brain architecture and Web 3D to deliver neuropsychopharmacology content over the Internet.

    PubMed

    Mongeau, R; Casu, M A; Pani, L; Pillolla, G; Lianas, L; Giachetti, A

    2008-05-01

    The vast amount of heterogeneous data generated in various fields of neurosciences such as neuropsychopharmacology can hardly be classified using traditional databases. We present here the concept of a virtual archive, spatially referenced over a simplified 3D brain map and accessible over the Internet. A simple prototype (available at http://aquatics.crs4.it/neuropsydat3d) has been realized using current Web-based virtual reality standards and technologies. It illustrates how primary literature or summary information can easily be retrieved through hyperlinks mapped onto a 3D schema while navigating through neuroanatomy. Furthermore, 3D navigation and visualization techniques are used to enhance the representation of the brain's neurotransmitters and pathways and the involvement of specific brain areas in particular physiological or behavioral functions. The proposed system shows how the use of a schematic spatial organization of data, widely exploited in other fields (e.g., Geographical Information Systems), can be extremely useful for developing efficient tools for research and teaching in neurosciences. PMID:18262677

  20. Applications of Panoramic Images: from 720° Panorama to Interior 3d Models of Augmented Reality

    NASA Astrophysics Data System (ADS)

    Lee, I.-C.; Tsai, F.

    2015-05-01

    A series of panoramic images is usually used to generate a 720° panorama image. Although panoramic images are typically used for establishing tour guiding systems, in this research we demonstrate the potential of using panoramic images acquired from multiple sites to create not only a 720° panorama but also three-dimensional (3D) point clouds and 3D indoor models. Since 3D modeling is one of the goals of this research, the locations of the panoramic sites needed to be carefully planned in order to maintain a robust result for close-range photogrammetry. After the images are acquired, they are processed into 720° panoramas, and these panoramas can be used directly in panorama guiding systems or other applications. In addition to these straightforward applications, interior orientation parameters can also be estimated while generating the 720° panorama: focal length, principal point, and lens radial distortion. The panoramic images can then be processed with close-range photogrammetry procedures to extract the exterior orientation parameters and generate 3D point clouds. In this research, VisualSFM, a structure-from-motion software package, is used to estimate the exterior orientation, and the CMVS toolkit is used to generate 3D point clouds. Next, the 3D point clouds are used as references to create building interior models. Trimble SketchUp was used to build the model, and the 3D point cloud supported the determination of the locations of building objects through a plane-finding procedure. In the texturing process, the panorama images are used as the data source for creating model textures. This 3D indoor model was used as an augmented reality model, replacing the guide map or floor plan commonly used in an online touring guide system. The 3D indoor model generating procedure has been utilized in two research projects: a cultural heritage site at Kinmen, and the Taipei Main Station pedestrian zone guidance and navigation system.
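The link between a 720° panorama and photogrammetric rays can be sketched concretely: under the usual equirectangular convention, each pixel column corresponds to a longitude and each row to a latitude, so any pixel maps to a unit viewing ray. This is a generic mapping, not the specific pipeline of the paper above:

```python
import math

def panorama_ray(u, v, width, height):
    """Map an equirectangular-panorama pixel (u, v) to a unit viewing ray.
    Longitude spans 360 degrees across the width,
    latitude spans 180 degrees down the height."""
    lon = (u / width) * 2 * math.pi - math.pi    # -pi .. +pi
    lat = math.pi / 2 - (v / height) * math.pi   # +pi/2 .. -pi/2
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

# The centre pixel of a 4096 x 2048 panorama looks straight ahead (+z).
print(tuple(round(c, 6) for c in panorama_ray(2048, 1024, 4096, 2048)))
# (0.0, 0.0, 1.0)
```

Rays from panoramas at different sites, once their exterior orientations are known, intersect to give the 3D points that structure-from-motion tools recover.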

  1. The Virtual-casing Principle For 3D Toroidal Systems

    SciTech Connect

    Lazerson, Samuel A.

    2014-02-24

    The capability to calculate the magnetic field due to the plasma currents in a toroidally confined magnetic fusion equilibrium is of manifest relevance to equilibrium reconstruction and stellarator divertor design. Two methodologies exist for calculating such quantities. The first is a volume integral over the plasma current density for a given equilibrium; such an integral is computationally expensive. The second is a surface integral over a surface current on the equilibrium boundary. This method is computationally desirable because its cost does not grow with the radial resolution of the equilibrium, as the volume integral's does. This surface integral method has come to be known as the "virtual-casing principle". In this paper, a full derivation of this method is presented, along with a discussion of its optimal application.
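The two formulations described above can be written side by side. A sketch in standard Biot-Savart notation, with K the virtual-casing surface current and the identity holding for observation points x outside the plasma boundary:

```latex
\mathbf{B}_p(\mathbf{x})
  = \frac{\mu_0}{4\pi}\int_V
      \frac{\mathbf{J}(\mathbf{x}')\times(\mathbf{x}-\mathbf{x}')}
           {\lvert\mathbf{x}-\mathbf{x}'\rvert^{3}}\,dV'
  = \frac{\mu_0}{4\pi}\oint_S
      \frac{\mathbf{K}(\mathbf{x}')\times(\mathbf{x}-\mathbf{x}')}
           {\lvert\mathbf{x}-\mathbf{x}'\rvert^{3}}\,dS',
\qquad
\mathbf{K} = \frac{\hat{\mathbf{n}}\times\mathbf{B}}{\mu_0}
```

The volume form sums over every current-carrying cell of the equilibrium, while the surface form needs only the total field B and outward normal n on the boundary, which is why its cost is independent of radial resolution.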

  2. Spatial integration of boundaries in a 3D virtual environment.

    PubMed

    Bouchekioua, Youcef; Miller, Holly C; Craddock, Paul; Blaisdell, Aaron P; Molet, Mikael

    2013-10-01

    Prior research, using two- and three-dimensional environments, has found that when both human and nonhuman animals independently acquire two associations between landmarks with a common landmark (e.g., LM1-LM2 and LM2-LM3), each with its own spatial relationship, they behave as if the two unique LMs have a known spatial relationship despite their never having been paired. Seemingly, they have integrated the two associations to create a third association with its own spatial relationship (LM1-LM3). Using sensory preconditioning (Experiment 1) and second-order conditioning (Experiment 2) procedures, we found that human participants integrated information about the boundaries of pathways to locate a goal within a three-dimensional virtual environment in the absence of any relevant landmarks. Spatial integration depended on the participant experiencing a common boundary feature with which to link the pathways. These results suggest that the principles of associative learning also apply to the boundaries of an environment.
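The integration result above can be illustrated as simple vector addition of learned displacements: knowing the LM1-to-LM2 and LM2-to-LM3 spatial relationships implies the never-trained LM1-to-LM3 relationship. The coordinates below are hypothetical; this is an illustration of the geometry, not the study's model of learning:

```python
# Spatial integration as displacement-vector addition (illustrative only).
def add_displacements(d12, d23):
    """Infer the LM1->LM3 displacement from LM1->LM2 and LM2->LM3."""
    return tuple(a + b for a, b in zip(d12, d23))

d12 = (4.0, 1.0)   # hypothetical metres east/north from LM1 to LM2
d23 = (-1.0, 3.0)  # hypothetical LM2 to LM3
print(add_displacements(d12, d23))  # (3.0, 4.0)
```

The behavioral finding is that participants act as if they possess this summed relationship, provided the two pathways shared a common boundary feature to link them.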

  3. Computer-assisted three-dimensional surgical planning: 3D virtual articulator: technical note.

    PubMed

    Ghanai, S; Marmulla, R; Wiechnik, J; Mühling, J; Kotrikova, B

    2010-01-01

    This study presents a computer-assisted planning system for dysgnathia treatment. It describes the process of information gathering using a virtual articulator and how splints are constructed for orthognathic surgery. The deviation of the virtually planned splints from conventionally planned ones is shown in six cases. In all cases the plaster models were prepared and scanned using a 3D laser scanner. Lateral and posterior-anterior cephalometric images were then used for reconstruction before surgery. By identifying specific points on the X-rays and marking them on the virtual models, it was possible to enhance the 2D images to create a realistic 3D environment and to perform virtual repositioning of the jaw. A hexapod was used to transfer the virtual planning to the real splints. Preliminary results showed that conventional repositioning could be replicated using the virtual articulator.

  4. Serious games for screening pre-dementia conditions: from virtuality to reality? A pilot project.

    PubMed

    Zucchella, Chiara; Sinforiani, Elena; Tassorelli, Cristina; Cavallini, Elena; Tost-Pardell, Daniela; Grau, Sergi; Pazzi, Stefania; Puricelli, Stefano; Bernini, Sara; Bottiroli, Sara; Vecchi, Tomaso; Sandrini, Giorgio; Nappi, Giuseppe

    2014-01-01

    Conventional cognitive assessment is based on a pencil-and-paper neuropsychological evaluation, which is time consuming, expensive and requires the involvement of several professionals. Information and communication technology could be exploited to allow the development of tools that are easy to use, reduce the amount of data processing, and provide controllable test conditions. Serious games (SGs) have the potential to be new and effective tools in the management and treatment of cognitive impairments in the elderly. Moreover, by adopting SGs in 3D virtual reality settings, cognitive functions might be evaluated using tasks that simulate daily activities, increasing the "ecological validity" of the assessment. In this commentary we report our experience in the creation of the Smart Aging platform, a 3D SG- and virtual environment-based platform for the early identification and characterization of mild cognitive impairment.

  6. Toward virtual anatomy: a stereoscopic 3-D interactive multimedia computer program for cranial osteology.

    PubMed

    Trelease, R B

    1996-01-01

    Advances in computer visualization and user interface technologies have enabled development of "virtual reality" programs that allow users to perceive and to interact with objects in artificial three-dimensional environments. Such technologies were used to create an image database and program for studying the human skull, a specimen that has become increasingly expensive and scarce. Stereoscopic image pairs of a museum-quality skull were digitized from multiple views. For each view, the stereo pairs were interlaced into a single, field-sequential stereoscopic picture using an image processing program. The resulting interlaced image files are organized in an interactive multimedia program. At run-time, gray-scale 3-D images are displayed on a large-screen computer monitor and observed through liquid-crystal shutter goggles. Users can then control the program and change views with a mouse and cursor to point-and-click on screen-level control words ("buttons"). For each view of the skull, an ID control button can be used to overlay pointers and captions for important structures. Pointing and clicking on "hidden buttons" overlying certain structures triggers digitized audio spoken word descriptions or mini lectures.

  7. Applied virtual reality in aerospace design

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    1995-01-01

    A virtual reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. The objectives of the MSFC VR Applications Program are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training and science training. Before VR can be used with confidence in a particular application, VR must be validated for that class of applications. For that reason, specific validation studies for selected classes of applications have been proposed and are currently underway. These include macro-ergonomic 'control room class' design analysis, Spacelab stowage reconfiguration training, a full-body microgravity functional reach simulator, a gross anatomy teaching simulator, and micro-ergonomic design analysis. This paper describes the MSFC VR Applications Program and the validation studies.

  8. Dissociation in virtual reality: depersonalization and derealization

    NASA Astrophysics Data System (ADS)

    Garvey, Gregory P.

    2010-01-01

    This paper looks at virtual worlds such as Second Life (SL) as possible incubators of dissociation disorders as classified by the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (also known as the DSM-IV). Depersonalization is where "a person feels that he or she has changed in some way or is somehow unreal"; derealization is when "the same beliefs are held about one's surroundings." Dissociative Identity Disorder (DID), previously known as multiple personality disorder, fits users of Second Life who adopt "in-world" avatars and, in effect, enact multiple distinct identities or personalities (known as alter egos or alters). Select questions from the Structured Clinical Interview for Depersonalization (SCI-DER) will be discussed as they might apply to the user's experience in Second Life. Finally, I would like to consider the hypothesis that rather than being a pathological disorder, dissociation is a normal response to the "artificial reality" of Second Life.

  9. Virtual reality treatment of flying phobia.

    PubMed

    Baños, Rosa M; Botella, Cristina; Perpiñá, Concepción; Alcañiz, Mariano; Lozano, Jose Antonio; Osma, Jorge; Gallardo, Myriam

    2002-09-01

    Flying phobia (FP) might become a very incapacitating and disturbing problem in a person's social, working, and private areas. Psychological interventions based on exposure therapy have proved to be effective, but given the particular nature of this disorder they bear important limitations. Exposure therapy for FP might be excessively costly in terms of time, money, and efforts. Virtual reality (VR) overcomes these difficulties as different significant environments might be created, where the patient can interact with what he or she fears while in a totally safe and protected environment-the therapist's consulting room. This paper intends, on one hand, to show the different scenarios designed by our team for the VR treatment of FP, and on the other, to present the first results supporting the effectiveness of this new tool for the treatment of FP in a multiple baseline study. PMID:12381036

  11. Virtual Presence: One Step Beyond Reality

    NASA Technical Reports Server (NTRS)

    Budden, Nancy Ann

    1997-01-01

    Our primary objective was to team up a group consisting of scientists and engineers from two different NASA cultures, and simulate an interactive teleoperated robot conducting geologic field work on the Moon or Mars. The information derived from the experiment will benefit both the robotics team and the planetary exploration team in the areas of robot design and development, and mission planning and analysis. The Earth Sciences and Space and Life Sciences Division combines the past with the future, contributing experience from Apollo crews exploring the lunar surface, knowledge of reduced-gravity environments, the performance limits of EVA suits, and future goals for human exploration beyond low Earth orbit. The Automation, Robotics, and Simulation Division brings to the table the technical expertise of robotic systems and the future goals of highly interactive robotic capabilities, treading on the edge of technology by joining, for the first time, telepresence with virtual reality.

  12. An Onboard ISS Virtual Reality Trainer

    NASA Technical Reports Server (NTRS)

    Miralles, Evelyn

    2013-01-01

    Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the station to perform these repairs. After the retirement of the shuttle, this is no longer an available option. As such, the need for ISS crew members to review scenarios while in flight, either for tasks in which they have already trained or for contingency operations, has become critical. In many situations, the time between the last session of Neutral Buoyancy Laboratory (NBL) training and an Extravehicular Activity (EVA) task might be 6 to 8 months. In order to help with training for contingency repairs and to maintain EVA proficiency in flight, the Johnson Space Center Virtual Reality Lab (VRLab) designed an onboard immersive ISS Virtual Reality Trainer (VRT), incorporating a unique optical system and making use of the already successful Dynamic Onboard Ubiquitous Graphical (DOUG) graphics software, to assist crew members with current procedures and contingency EVAs while in flight. The VRT provides an immersive environment similar to the one experienced at the VRLab crew training facility at NASA Johnson Space Center. EVA tasks are critical for a mission, since as time passes the crew members may lose proficiency on previously trained tasks. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the ISS ages. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not been trained and potentially have never seen before.

  13. Virtual Reality: A Distraction Intervention for Chemotherapy

    PubMed Central

    Schneider, Susan M.; Hood, Linda E.

    2007-01-01

    Purpose/Objectives To explore virtual reality (VR) as a distraction intervention to relieve symptom distress in adults receiving chemotherapy treatments for breast, colon, and lung cancer. Design Crossover design in which participants served as their own control. Setting Outpatient clinic at a comprehensive cancer center in the southeastern United States. Sample 123 adults receiving initial chemotherapy treatments. Methods Participants were randomly assigned to receive the VR distraction intervention during one chemotherapy treatment and then received no intervention (control) during an alternate matched chemotherapy treatment. The Adapted Symptom Distress Scale–2, Revised Piper Fatigue Scale, and State Anxiety Inventory were used to measure symptom distress. The Presence Questionnaire and an open-ended questionnaire were used to evaluate the subjects’ VR experience. The influence of type of cancer, age, and gender on symptom outcomes was explored. Mixed models were used to test for differences in levels of symptom distress. Main Research Variables Virtual reality and symptom distress. Findings Patients had an altered perception of time (p < 0.001) when using VR, which validates the distracting capacity of the intervention. Evaluation of the intervention indicated that patients believed the head-mounted device was easy to use, they experienced no cybersickness, and 82% would use VR again. However, analysis demonstrated no significant differences in symptom distress immediately or two days following chemotherapy treatments. Conclusions Patients stated that using VR made the treatment seem shorter and that chemotherapy treatments with VR were better than treatments without the distraction intervention. However, positive experiences did not result in a decrease in symptom distress. The findings support the idea that using VR can help to make chemotherapy treatments more tolerable, but clinicians should not assume that use of VR will improve chemotherapy

  14. Virtual reality, augmented reality, and robotics applied to digestive operative procedures: from in vivo animal preclinical studies to clinical use

    NASA Astrophysics Data System (ADS)

    Soler, Luc; Marescaux, Jacques

    2006-04-01

    Technological innovations of the 20th century provided medicine and surgery with new tools, among which virtual reality and robotics belong to the most revolutionary. Our work aims at setting up new techniques for detection, 3D delineation and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems making tumor resection or treatment easier through the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time long-distance connection between practitioners so they can share the same 3D reconstructed patient and interact on that patient, virtually before the intervention and for real during the surgical procedure, thanks to a telesurgical robot. In preclinical studies, our first results obtained from a micro-CT scanner show that these technologies provide an efficient and precise 3D modeling of anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility of improving the therapeutic choice thanks to better detection and representation of the patient before performing the surgical gesture. They also show the efficiency of augmented reality, which provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and check on the virtual patient clone an optimal, error-free procedure, which will be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.

  15. The virtual-casing principle for 3D toroidal systems

    NASA Astrophysics Data System (ADS)

    Lazerson, S. A.

    2012-12-01

    The capability to calculate the magnetic field due to the plasma currents in a toroidally confined magnetic fusion equilibrium is of manifest relevance to equilibrium reconstruction and stellarator divertor design. Two methodologies exist for calculating such quantities. The first is a volume integral over the plasma current density for a given equilibrium; such an integral is computationally expensive. The second is a surface integral over a surface current on the equilibrium boundary. This method is computationally desirable because its cost does not grow with the radial resolution, as that of the volume integral does. This surface integral method has come to be known as the "virtual-casing principle". In this paper, a full derivation of this method is presented along with a discussion regarding its optimal application. This paper has been authored by Princeton University under Contract Number DE-AC02-09CH11466 with the US Department of Energy.

  16. Using voice input and audio feedback to enhance the reality of a virtual experience

    SciTech Connect

    Miner, N.E.

    1994-04-01

    Virtual Reality (VR) is a rapidly emerging technology which allows participants to experience a virtual environment through stimulation of the participant's senses. Intuitive and natural interactions with the virtual world help to create a realistic experience. Typically, a participant is immersed in a virtual environment through the use of a 3-D viewer. Realistic, computer-generated environment models and accurate tracking of a participant's view are important factors for adding realism to a virtual experience. Stimulating a participant's sense of sound and providing a natural form of communication for interacting with the virtual world are equally important. This paper discusses the advantages and importance of incorporating voice recognition and audio feedback capabilities into a virtual world experience. Various approaches and levels of complexity are discussed. Examples of the use of voice and sound are presented through the description of a research application developed in the VR laboratory at Sandia National Laboratories.

  17. Understanding Human Perception of Building Categories in Virtual 3d Cities - a User Study

    NASA Astrophysics Data System (ADS)

    Tutzauer, P.; Becker, S.; Niese, T.; Deussen, O.; Fritsch, D.

    2016-06-01

    Virtual 3D cities are becoming increasingly important as a means of visually communicating diverse urban-related information. To get a deeper understanding of a human's cognitive experience of virtual 3D cities, this paper presents a user study on the human ability to perceive building categories (e.g. residential home, office building, building with shops etc.) from geometric 3D building representations. The study reveals various dependencies between geometric properties of the 3D representations and the perceptibility of the building categories. Knowledge about which geometries are relevant, helpful or obstructive for perceiving a specific building category is derived. The importance and usability of such knowledge is demonstrated based on a perception-guided 3D building abstraction process.

  18. Virtual reality in medical education and assessment

    NASA Technical Reports Server (NTRS)

    Sprague, Laurie A.; Bell, Brad; Sullivan, Tim; Voss, Mark; Payer, Andrew F.; Goza, Stewart Michael

    1994-01-01

    The NASA Johnson Space Center (JSC)/LinCom Corporation, the University of Texas Medical Branch at Galveston (UTMB), and the Galveston Independent School District (GISD) have teamed up to develop a virtual visual environment display (VIVED) that provides a unique educational experience using virtual reality (VR) technologies. The VIVED end product will be a self-contained educational experience allowing students a new method of learning as they interact with the subject matter through VR. This type of interface is intuitive and utilizes spatial and psychomotor abilities which are now constrained or reduced by current two-dimensional terminals and keyboards. The perpetual challenge to educators remains the identification and development of methodologies that conform to learners' abilities and preferences. The unique aspects of VR provide an opportunity to explore a new educational experience. Endowing medical students with an understanding of the human body poses some difficult challenges. One of the most difficult is to convey the three-dimensional nature of anatomical structures. The ideal environment for addressing this problem would be one that allows students to become small enough to enter the body and travel through it - much like a person walks through a building. By using VR technology, this effect can be achieved; when VR is combined with multimedia technologies, the effect can be spectacular.

  19. Implementing virtual reality interfaces for the geosciences

    SciTech Connect

    Bethel, W.; Jacobsen, J.; Austin, A.; Lederer, M.; Little, T.

    1996-06-01

    For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum, running the gamut from command-line interfaces to completely immersive environments. In the former, one uses the keyboard to enter three- or six-dimensional parameters. In the latter, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g. attached to a head-mounted display. Rich, extensible and often complex languages are a vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is the obstacle in that typing is cumbersome, error-prone and typically slow. In the latter, the user can interact with these parameters by means of motor skills which are highly developed. Two specific geoscience application areas will be highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.
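
    The streamline tool described above boils down to numerically integrating dx/dt = v(x) from each seed point along the rake. The sketch below is not the LBNL implementation; it is a minimal, self-contained RK4 streamline tracer over an assumed analytic vector field, using a circular flow so that the traced path can be checked against the known circular streamlines.

    ```python
    import numpy as np

    def streamline(v, seed, h=0.01, steps=1000):
        """Trace a streamline by RK4 integration of dx/dt = v(x)."""
        x = np.asarray(seed, float)
        path = [x.copy()]
        for _ in range(steps):
            k1 = v(x)
            k2 = v(x + 0.5 * h * k1)
            k3 = v(x + 0.5 * h * k2)
            k4 = v(x + h * k3)
            x = x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
            path.append(x.copy())
        return np.array(path)

    # Circular flow: exact streamlines are circles about the origin,
    # so the traced radius should stay constant.
    v = lambda x: np.array([-x[1], x[0]])
    rake = [np.array([r, 0.0]) for r in (0.5, 1.0, 1.5)]  # seeds along a "rake"
    lines = [streamline(v, s) for s in rake]
    radii = np.linalg.norm(lines[1], axis=1)
    print(radii.min(), radii.max())  # both should stay ≈ 1.0
    ```

    In the interactive tool, moving the rake simply re-seeds this integration; the per-streamline cost is independent of where the seeds are placed.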

  20. A new approach towards image based virtual 3D city modeling by using close range photogrammetry

    NASA Astrophysics Data System (ADS)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2014-05-01

    A 3D city model is a digital representation of the Earth's surface and related objects, such as buildings, trees, vegetation, and man-made features belonging to an urban area. The demand for 3D city modeling is increasing daily for various engineering and non-engineering applications. Three main image-based approaches are generally used to generate virtual 3D city models: sketch-based modeling, procedural-grammar-based modeling, and close-range-photogrammetry-based modeling. A literature study shows that, to date, there is no complete solution for creating a complete 3D city model from images, and these image-based methods have limitations. This paper gives a new approach towards image-based virtual 3D city modeling using close range photogrammetry. The approach is divided into three sections: data acquisition, 3D data processing, and data combination. In the data acquisition process, a multi-camera setup was developed and used for video recording of an area; image frames were created from the video data, and the minimum required and suitable frames were selected for 3D processing. In the second section, a 3D model of the area was created based on close range photogrammetric principles and computer vision techniques. In the third section, this 3D model was exported for adding and merging with other pieces of the larger area, and scaling and alignment of the 3D model were done. After applying texturing and rendering, a final photo-realistic textured 3D model was created and transferred into a walk-through model or movie form. Most of the processing steps are automatic, so this method is cost effective and less laborious, and the accuracy of the model is good. For this research work, the study area is the campus of the Department of Civil Engineering, Indian Institute of Technology, Roorkee, which acts as a prototype for a city. Aerial photography is restricted in many countries.
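
    The geometric core of the photogrammetric step above is recovering a 3D point from its projections in two or more calibrated views. The paper does not give its algorithm; the following is a standard linear (DLT) triangulation sketch with assumed camera intrinsics and baseline, verified on a synthetic noise-free point.

    ```python
    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """Linear (DLT) triangulation of one 3D point from two camera projections."""
        A = np.stack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)   # solution is the null vector of A
        X = Vt[-1]
        return X[:3] / X[3]           # dehomogenize

    # Two synthetic pinhole cameras with assumed intrinsics and a 1 m baseline.
    K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

    def project(P, X):
        x = P @ np.append(X, 1.0)
        return x[:2] / x[2]

    X_true = np.array([0.3, -0.2, 4.0])
    X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
    print(X_est)  # ≈ [0.3, -0.2, 4.0]
    ```

    A real pipeline would first estimate the camera poses (e.g. via feature matching and bundle adjustment) and triangulate many points per frame pair; this sketch isolates only the triangulation step.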

  1. Integration of virtual and real scenes within an integral 3D imaging environment

    NASA Astrophysics Data System (ADS)

    Ren, Jinsong; Aggoun, Amar; McCormick, Malcolm

    2002-11-01

    The Imaging Technologies group at De Montfort University has developed an integral 3D imaging system, which is seen as the most likely vehicle for 3D television while avoiding adverse psychological effects. To create truly engaging three-dimensional television programs, a virtual studio that performs the task of generating, editing and integrating 3D content involving virtual and real scenes is required. The paper presents, for the first time, the procedures, factors and methods of integrating computer-generated virtual scenes with real objects captured using the 3D integral imaging camera system. The method of computer generation of 3D integral images, where the lens array is modelled instead of the physical camera, is described. In the model, each micro-lens that captures different elemental images of the virtual scene is treated as an extended pinhole camera. An integration process named integrated rendering is illustrated. Detailed discussion focuses on depth extraction from captured integral 3D images. The depth calculation method from the disparity, and the multiple-baseline method used to improve the precision of depth estimation, are also presented. The concept of colour SSD, and its further improvement in precision, is proposed and verified.
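
    The disparity-based depth extraction mentioned above rests on two standard ingredients: an SSD (sum of squared differences) search for the best-matching shifted patch, and the relation depth = f·B/d between focal length f, baseline B and disparity d. The following is a deliberately simplified 1D grayscale sketch (the paper's colour SSD extends the error term across colour channels); the signal, focal length and baseline values are invented for illustration.

    ```python
    import numpy as np

    def ssd_disparity(left, right, patch, x, max_d):
        """Find the disparity minimizing the sum of squared differences (SSD)
        between a patch around x in `left` and shifted patches in `right`."""
        ref = left[x:x + patch]
        scores = [np.sum((ref - right[x - d:x - d + patch]) ** 2)
                  for d in range(max_d + 1)]
        return int(np.argmin(scores))

    rng = np.random.default_rng(0)
    scene = rng.random(100)          # synthetic 1D "image row"
    true_d = 7                       # horizontal shift between the two views
    left = scene
    right = np.roll(scene, -true_d)  # right view shifted by true_d pixels
    d = ssd_disparity(left, right, patch=9, x=40, max_d=15)
    f, baseline = 500.0, 0.1         # assumed focal length (px) and baseline (m)
    depth = f * baseline / d         # classic disparity-to-depth relation
    print(d, depth)
    ```

    Using multiple baselines, as the abstract describes, amounts to summing such SSD curves from several image pairs before taking the minimum, which sharpens the match and reduces ambiguity.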

  2. Integrated bronchoscopic video tracking and 3D CT registration for virtual bronchoscopy

    NASA Astrophysics Data System (ADS)

    Higgins, William E.; Helferty, James P.; Padfield, Dirk R.

    2003-05-01

    Lung cancer assessment involves an initial evaluation of 3D CT image data followed by interventional bronchoscopy. The physician, with only a mental image inferred from the 3D CT data, must guide the bronchoscope through the bronchial tree to sites of interest. Unfortunately, this procedure depends heavily on the physician's ability to mentally reconstruct the 3D position of the bronchoscope within the airways. In order to assist physicians in performing biopsies of interest, we have developed a method that integrates live bronchoscopic video tracking and 3D CT registration. The proposed method is integrated into a system we have been devising for virtual-bronchoscopic analysis and guidance for lung-cancer assessment. Previously, the system relied on a method that only used registration of the live bronchoscopic video to corresponding virtual endoluminal views derived from the 3D CT data. This procedure only performs the registration at manually selected sites; it does not draw upon the motion information inherent in the bronchoscopic video. Further, the registration procedure is slow. The proposed method has the following advantages: (1) it tracks the 3D motion of the bronchoscope using the bronchoscopic video; (2) it uses the tracked 3D trajectory of the bronchoscope to assist in locating sites in the 3D CT "virtual world" to perform the registration. In addition, the method incorporates techniques to: (1) detect and exclude corrupted video frames (to help make the video tracking more robust); (2) accelerate the computation of the many 3D virtual endoluminal renderings (thus, speeding up the registration process). We have tested the integrated tracking-registration method on a human airway-tree phantom and on real human data.

  3. A computer-based training system combining virtual reality and multimedia

    NASA Technical Reports Server (NTRS)

    Stansfield, Sharon A.

    1993-01-01

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment. The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.

  4. A computer-based training system combining virtual reality and multimedia

    SciTech Connect

    Stansfield, S.A.

    1993-04-28

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment. The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.

  5. The VRFurnace: A Virtual Reality Application for Energy System Data Analysis

    SciTech Connect

    Peter Eric Johnson

    2001-05-01

    This paper presents the Virtual Reality Furnace (VRFurnace) application, an interactive 3-D visualization platform for pulverized coal furnace analysis. The VRFurnace is a versatile toolkit in which a variety of different CFD data sets related to pulverized coal furnaces can be studied interactively. The toolkit combines standard CFD analysis techniques with tools that more effectively utilize the 3-D capabilities of a virtual environment. Interaction with data is achieved through a dynamic instructional menu system. The application has been designed for use in a projection-based system, which allows engineers, management, and operators to see and interact with the data at the same time. Future developments are discussed and will include the ability to combine multiple power plant components into a single application, allow remote collaboration between different virtual environments, and allow users to make changes to a flow field and see the results of these changes as they are made, creating a complete virtual power plant.

  6. Use of Virtual Reality for Space Flight

    NASA Technical Reports Server (NTRS)

    Harm, Deborah; Taylor, L. C.; Reschke, M. F.

    2011-01-01

    Virtual environments offer unique training opportunities, particularly for training astronauts and preadapting them to the novel sensory conditions of microgravity. Two unresolved human factors issues in virtual reality (VR) systems are: 1) potential "cybersickness", and 2) maladaptive sensorimotor performance following exposure to VR systems. Interestingly, these aftereffects are often quite similar to adaptive sensorimotor responses observed in astronauts during and/or following space flight. Active exploratory behavior in a new environment, with resulting feedback and the formation of new associations between sensory inputs and response outputs, promotes appropriate perception and motor control in the new environment. Thus, people adapt to consistent, sustained alterations of sensory input such as those produced by microgravity. Our research examining the effects of repeated exposures to a full field of view dome VR system showed that motion sickness and initial decrements in eye movement and postural control were greatly diminished following three exposures. These results suggest that repeated transitions between VR and the normal environment preflight might be a useful countermeasure for neurosensory and sensorimotor effects of space flight. The range of VR applications is enormous, extending from ground-based VR training for extravehicular activities at NASA, to medical and educational uses. It seems reasonable to suggest that other space related uses of VR should be investigated. For example, 1) use of head-mounted VR on orbit to rehearse/practice upcoming operational activities, and 2) ground-based VR training for emergency egress procedures. 
We propose that combining VR designed for operational activities preflight with an appropriate schedule to facilitate sensorimotor adaptation and improve spatial orientation would accomplish two important goals for astronauts and cosmonauts: preflight sensorimotor adaptation and enhanced operational…

  7. Embodied collaboration support system for 3D shape evaluation in virtual space

    NASA Astrophysics Data System (ADS)

    Okubo, Masashi; Watanabe, Tomio

    2005-12-01

Collaboration consists mainly of two tasks: each partner's individual task, and communication between the partners. Both are important objectives for any collaboration support system. In this paper, a collaboration support system for 3D shape evaluation in virtual space is proposed on the basis of studies in both 3D shape evaluation and communication support in virtual space. The proposed system provides a viewpoint for each task: the view from behind the user's own avatar for smooth communication, and the avatar's-eye view for 3D shape evaluation. Switching between the viewpoints satisfies the task conditions for both 3D shape evaluation and communication. The system consists of a PC, an HMD, and magnetic sensors, and users can share embodied interaction by observing the interaction between their avatars in virtual space. However, the HMD and magnetic sensors worn by the users restrict nonverbal communication. We therefore tried to compensate for the loss of the partner's avatar's nodding by introducing the speech-driven embodied interactive actor InterActor. A sensory evaluation by paired comparison of 3D shapes in collaborative situations in virtual space and in real space, together with a questionnaire, was performed. The results demonstrate the effectiveness of InterActor's nodding in the collaborative situation.

  8. Applying a 3D Situational Virtual Learning Environment to the Real World Business--An Extended Research in Marketing

    ERIC Educational Resources Information Center

    Wang, Shwu-huey

    2012-01-01

    In order to understand (1) what kind of students can be facilitated through the help of three-dimensional virtual learning environment (3D VLE), and (2) the relationship between a conventional test (ie, paper and pencil test) and the 3D VLE used in this study, the study designs a 3D virtual supermarket (3DVS) to help students transform their role…

  9. Retinal imaging with virtual reality stimulus for studying Salticidae retinas

    NASA Astrophysics Data System (ADS)

    Schiesser, Eric; Canavesi, Cristina; Long, Skye; Jakob, Elizabeth; Rolland, Jannick P.

    2014-12-01

    We present a 3-path optical system for studying the retinal movement of jumping spiders: a visible OLED virtual reality system presents stimulus, while NIR illumination and imaging systems observe retinal movement.

  10. STS-118 Astronaut Dave Williams Trains Using Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2007-01-01

STS-118 astronaut and mission specialist Dafydd R. 'Dave' Williams, representing the Canadian Space Agency, uses virtual reality hardware in the Space Vehicle Mock Up Facility at the Johnson Space Center to rehearse some of his duties for the upcoming mission. This type of virtual reality training allows the astronauts to wear special gloves and other gear while looking at a computer display that simulates actual movements around the various locations on the station hardware with which they will be working.

  11. Over-generation-10 size FPD photomasks for virtual reality

    NASA Astrophysics Data System (ADS)

    Shiojiri, Kazuya

    2012-06-01

With the advent of large, high-precision flat panel displays, virtual reality is becoming more familiar. SK-Electronics Co., Ltd. is the only mask maker in the world that can produce over-Generation-10-sized LCD photomasks. We believe we can deliver the pleasure of virtual reality with a higher sense of realism through these super-large, high-precision photomasks.

  12. Management and services for large-scale virtual 3D urban model data based on network

    NASA Astrophysics Data System (ADS)

    He, Zhengwei; Chen, Jing; Wu, Huayi

    2008-10-01

The buildings in a modern city are complex, diverse, and enormous in number. This poses a major challenge for constructing a networked 3D GIS and ultimately realizing the Digital Earth. After analyzing the characteristics of network services for massive 3D urban building model data, this paper focuses on the organization and management of spatial data and on network service strategy, and proposes a progressive network transmission scheme based on spatial resolution and the component elements of 3D building model data. The paper then puts forward a multistage-link 3D spatial data organization model and a spatial-index encoding method based on a fully-level quadtree structure. A virtual earth platform, called GeoGlobe, was developed using this theory. Experimental results show that the above 3D spatial data management model and service theory can effectively provide network services for large-scale 3D urban model data, and that application results and user experience are good.
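The abstract does not give the details of its fully-level quadtree index encoding; a minimal quadkey-style sketch (the function name and digit convention are illustrative assumptions, not GeoGlobe's scheme) might look like:

```python
def quadkey(x, y, level):
    """Encode a point with normalized coordinates x, y in [0, 1) as a
    fully-level quadtree key of the given depth: at each level the cell
    splits into four quadrants, labelled 0-3 by the two coordinate bits."""
    tx = int(x * (1 << level))  # integer tile column at the deepest level
    ty = int(y * (1 << level))  # integer tile row at the deepest level
    digits = []
    for i in range(level, 0, -1):
        mask = 1 << (i - 1)
        digits.append(str((1 if tx & mask else 0) + (2 if ty & mask else 0)))
    return "".join(digits)
```

A parent cell's key is a prefix of all its descendants' keys, which is the property that makes such codes usable for multistage linking and progressive transmission.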

  13. Approach to Constructing 3d Virtual Scene of Irrigation Area Using Multi-Source Data

    NASA Astrophysics Data System (ADS)

    Cheng, S.; Dou, M.; Wang, J.; Zhang, S.; Chen, X.

    2015-10-01

For an irrigation area, which is often complicated by various 3D artificial ground features and the natural environment, the disadvantages of traditional 2D GIS in spatial data representation, management, query, analysis, and visualization are becoming more and more evident. Building a more realistic 3D virtual scene is thus especially urgent for irrigation area managers and decision makers, so that they can carry out various irrigation operations vividly and intuitively. Building on previous researchers' achievements, a simple, practical, and cost-effective approach is proposed in this study, adopting 3D geographic information system (3D GIS) and remote sensing (RS) technology. Based on multi-source data such as Google Earth (GE) high-resolution remote sensing imagery, ASTER G-DEM, and hydrological facility maps, 3D terrain and ground feature models were created interactively. Both models were then rendered with texture data and integrated under the ArcGIS platform. A vivid, realistic 3D virtual scene of the irrigation area, with good visual effect and primary GIS functions for data query and analysis, was constructed. Yet there is still a long way to go toward a true 3D GIS for the irrigation area: issues of this study are discussed in depth and future research directions are pointed out at the end of the paper.

  14. The Usability of Online Geographic Virtual Reality for Urban Planning

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Moore, A. B.

    2013-08-01

Virtual reality (VR) technology is starting to become widely and freely available (for example the online OpenSimulator tool), with potential for use in 3D urban planning and design tasks but still needing rigorous assessment to establish this. A previous study consulted with a small group of urban professionals, who concluded in a satisfaction usability test that online VR had potential value as a usable 3D communication and remote marketing tool but acknowledged that visual quality and geographic accuracy were obstacles to overcome. This research takes the investigation a significant step further to also examine the usability aspects of efficiency (how quickly tasks are completed) and effectiveness (how successfully tasks are completed), relating to OpenSimulator in an urban planning situation. The comparative study pits a three-dimensional VR model (with increased graphic fidelity and geographic content to address the feedback of the previous study) of a subdivision design (in a Dunedin suburb) against 3D models built with GIS (ArcGIS) and CAD (BricsCAD) tools, two types of software environment well established in urban professional practice. Urban professionals participated in the study by attempting to perform timed tasks correctly in each of the environments before being asked questions about the technologies involved and their perceived importance to their professional work. The results reinforce the positive feedback for VR of the previous study, with the graphical and geographic data issues being somewhat addressed (though participants stressed the need for accurate and precise object and terrain modification capabilities in VR). Ease-of-use and the associated fastest task completion speed were significant positive outcomes to emerge from the comparison with GIS and CAD, pointing to a strong future for VR in an urban planning context.

  15. Role of virtual reality for cerebral palsy management.

    PubMed

    Weiss, Patrice L Tamar; Tirosh, Emanuel; Fehlings, Darcy

    2014-08-01

Virtual reality is the use of interactive simulations to present users with opportunities to perform in virtual environments that appear, sound, and less frequently, feel similar to real-world objects and events. Interactive computer play refers to the use of a game where a child interacts and plays with virtual objects in a computer-generated environment. Because of their distinctive attributes that provide ecologically realistic and motivating opportunities for active learning, these technologies have been used in pediatric rehabilitation over the past 15 years. The ability of virtual reality to create opportunities for active repetitive motor/sensory practice adds to their potential for neuroplasticity and learning in individuals with neurologic disorders. The objectives of this article are to provide an overview of how virtual reality and gaming are used clinically, to present the results of several example studies that demonstrate their use in research, and to briefly remark on future developments.

  16. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars

    PubMed Central

    Patil, Shashidhar; Chintalapalli, Harinadha Reddy; Kim, Dubeom; Chai, Youngho

    2015-01-01

    In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database using hand gesture sequences from a single inertial motion sensor. PMID:26094629
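The quaternion-based complementary filter named in the abstract is a standard sensor-fusion technique: integrate the gyro for short-term accuracy and nudge the result toward the accelerometer's gravity-derived tilt to cancel drift. The following is a minimal sketch of that idea, not the authors' implementation; the gain `alpha` and the roll/pitch conventions are assumptions.

```python
import math

def quat_mul(q, r):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def normalize(q):
    n = math.sqrt(sum(c*c for c in q))
    return tuple(c/n for c in q)

def accel_tilt_quat(ax, ay, az):
    """Roll/pitch orientation implied by the measured gravity vector.
    Yaw is unobservable from the accelerometer and is left at zero."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    cr, sr = math.cos(roll/2), math.sin(roll/2)
    cp, sp = math.cos(pitch/2), math.sin(pitch/2)
    return normalize((cp*cr, cp*sr, sp*cr, -sp*sr))

def complementary_update(q, gyro, accel, dt, alpha=0.02):
    """One filter step: integrate the gyro rate quaternion, then blend
    slightly toward the accelerometer tilt estimate to correct drift."""
    wx, wy, wz = gyro
    dq = (1.0, 0.5*wx*dt, 0.5*wy*dt, 0.5*wz*dt)  # small-angle update
    q = normalize(quat_mul(q, dq))
    qa = accel_tilt_quat(*accel)
    # Linear blend + renormalize approximates slerp for small corrections.
    return normalize(tuple((1-alpha)*a + alpha*b for a, b in zip(q, qa)))
```

With zero gyro rates and gravity along +z, the estimate stays at the identity quaternion; a sustained tilt in the accelerometer pulls the estimate toward the corresponding orientation over a few hundred steps.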

  17. The Cognitive Apprenticeship Theory for the Teaching of Mathematics in an Online 3D Virtual Environment

    ERIC Educational Resources Information Center

    Bouta, Hara; Paraskeva, Fotini

    2013-01-01

    Research spanning two decades shows that there is a continuing development of 3D virtual worlds and investment in such environments for educational purposes. Research stresses the need for these environments to be well-designed and for suitable pedagogies to be implemented in the teaching practice in order for these worlds to be fully effective.…

  18. Socialisation for Learning at a Distance in a 3-D Multi-User Virtual Environment

    ERIC Educational Resources Information Center

    Edirisingha, Palitha; Nie, Ming; Pluciennik, Mark; Young, Ruth

    2009-01-01

    This paper reports findings of a pilot study that examined the pedagogical potential of "Second Life" (SL), a popular three-dimensional multi-user virtual environment (3-D MUVE) developed by the Linden Lab. The study is part of a 1-year research and development project titled "Modelling of Secondlife Environments" (http://www.le.ac.uk/moose)…

  19. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars.

    PubMed

    Patil, Shashidhar; Chintalapalli, Harinadha Reddy; Kim, Dubeom; Chai, Youngho

    2015-01-01

    In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database using hand gesture sequences from a single inertial motion sensor. PMID:26094629

  20. Laying the Groundwork for Socialisation and Knowledge Construction within 3D Virtual Worlds

    ERIC Educational Resources Information Center

    Minocha, Shailey; Roberts, Dave

    2008-01-01

    The paper reports the theoretical underpinnings for the pedagogical role and rationale for adopting 3D virtual worlds for socialisation and knowledge creation in distance education. Socialisation or "knowing one another" in remote distributed environments can be achieved through synchronous technologies such as instant messaging, audio and…

  1. Design and Implementation of a 3D Multi-User Virtual World for Language Learning

    ERIC Educational Resources Information Center

    Ibanez, Maria Blanca; Garcia, Jose Jesus; Galan, Sergio; Maroto, David; Morillo, Diego; Kloos, Carlos Delgado

    2011-01-01

    The best way to learn is by having a good teacher and the best language learning takes place when the learner is immersed in an environment where the language is natively spoken. 3D multi-user virtual worlds have been claimed to be useful for learning, and the field of exploiting them for education is becoming more and more active thanks to the…

  2. Supporting Distributed Team Working in 3D Virtual Worlds: A Case Study in Second Life

    ERIC Educational Resources Information Center

    Minocha, Shailey; Morse, David R.

    2010-01-01

    Purpose: The purpose of this paper is to report on a study into how a three-dimensional (3D) virtual world (Second Life) can facilitate socialisation and team working among students working on a team project at a distance. This models the situation in many commercial sectors where work is increasingly being conducted across time zones and between…

  3. Teaching Digital Natives: 3-D Virtual Science Lab in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Franklin, Teresa J.

    2008-01-01

    This paper presents the development of a 3-D virtual environment in Second Life for the delivery of standards-based science content for middle school students in the rural Appalachian region of Southeast Ohio. A mixed method approach in which quantitative results of improved student learning and qualitative observations of implementation within…

  4. Virtual rough samples to test 3D nanometer-scale scanning electron microscopy stereo photogrammetry

    NASA Astrophysics Data System (ADS)

    Villarrubia, J. S.; Tondare, V. N.; Vladár, A. E.

    2016-03-01

The combination of scanning electron microscopy for high spatial resolution, images from multiple angles to provide 3D information, and commercially available stereo photogrammetry software for 3D reconstruction offers promise for nanometer-scale dimensional metrology in 3D. A method is described to test 3D photogrammetry software by the use of virtual samples—mathematical samples from which simulated images are made for use as inputs to the software under test. The virtual sample is constructed by wrapping a rough skin with any desired power spectral density around a smooth near-trapezoidal line with rounded top corners. Reconstruction is performed with images simulated from different angular viewpoints. The software's reconstructed 3D model is then compared to the known geometry of the virtual sample. Three commercial photogrammetry software packages were tested. Two of them produced results for line height and width that were within about 1 nm of the correct values. All of the packages exhibited some difficulty in reconstructing details of the surface roughness.
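The "rough skin with any desired power spectral density" can be illustrated with a 1D analogue: sum cosine modes with random phases and power-law amplitudes, so mode k contributes power ~ k**(-beta). This is a sketch under assumed conventions (the exponent `beta`, the RMS normalization), not the paper's 2D skin generator.

```python
import math, random

def rough_profile(n, beta=2.0, rms=1.0, seed=0):
    """Synthesize a 1D rough profile of n samples whose power spectral
    density falls off as f**(-beta): amplitude of mode k ~ k**(-beta/2),
    with independent random phases."""
    rng = random.Random(seed)
    z = [0.0] * n
    for k in range(1, n // 2):
        amp = k ** (-beta / 2.0)
        phase = rng.uniform(0.0, 2.0 * math.pi)
        for i in range(n):
            z[i] += amp * math.cos(2.0 * math.pi * k * i / n + phase)
    # Center the profile and rescale to the requested RMS roughness.
    mean = sum(z) / n
    cur = math.sqrt(sum((v - mean) ** 2 for v in z) / n)
    return [(v - mean) * rms / cur for v in z]
```

A 2D version of the same construction, wrapped around the smooth trapezoidal line, would yield a virtual sample with a known, controllable roughness spectrum to compare reconstructions against.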

  5. 3D global estimation and augmented reality visualization of intra-operative X-ray dose.

    PubMed

    Rodas, Nicolas Loy; Padoy, Nicolas

    2014-01-01

    The growing use of image-guided minimally-invasive surgical procedures is confronting clinicians and surgical staff with new radiation exposure risks from X-ray imaging devices. The accurate estimation of intra-operative radiation exposure can increase staff awareness of radiation exposure risks and enable the implementation of well-adapted safety measures. The current surgical practice of wearing a single dosimeter at chest level to measure radiation exposure does not provide a sufficiently accurate estimation of radiation absorption throughout the body. In this paper, we propose an approach that combines data from wireless dosimeters with the simulation of radiation propagation in order to provide a global radiation risk map in the area near the X-ray device. We use a multi-camera RGBD system to obtain a 3D point cloud reconstruction of the room. The positions of the table, C-arm and clinician are then used 1) to simulate the propagation of radiation in a real-world setup and 2) to overlay the resulting 3D risk-map onto the scene in an augmented reality manner. By using real-time wireless dosimeters in our system, we can both calibrate the simulation and validate its accuracy at specific locations in real-time. We demonstrate our system in an operating room equipped with a robotised X-ray imaging device and validate the radiation simulation on several X-ray acquisition setups. PMID:25333145

  6. GEARS a 3D Virtual Learning Environment and Virtual Social and Educational World Used in Online Secondary Schools

    ERIC Educational Resources Information Center

    Barkand, Jonathan; Kush, Joseph

    2009-01-01

    Virtual Learning Environments (VLEs) are becoming increasingly popular in online education environments and have multiple pedagogical advantages over more traditional approaches to education. VLEs include 3D worlds where students can engage in simulated learning activities such as Second Life. According to Claudia L'Amoreaux at Linden Lab, "at…

  7. Using Immersive Virtual Reality for Electrical Substation Training

    ERIC Educational Resources Information Center

    Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana

    2015-01-01

    Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…

  8. Virtual Reality as a Tool in the Education

    ERIC Educational Resources Information Center

    Piovesan, Sandra Dutra; Passerino, Liliana Maria; Pereira, Adriana Soares

    2012-01-01

    The virtual reality is being more and more used in the education, enabling the student to find out, to explore and to build his own knowledge. This paper presents an Educational Software for presence or distance education, for subjects of Formal Language, where the student can manipulate virtually the target that must be explored, analyzed and…

  9. Soldier evaluation of the virtual reality Iraq.

    PubMed

    Reger, Greg M; Gahm, Gregory A; Rizzo, Albert A; Swanson, Robert; Duma, Susan

    2009-01-01

    Repeated combat deployments to Iraq and Afghanistan are resulting in increased rates of posttraumatic stress disorder (PTSD) among military personnel. Although exposure therapy is an effective treatment for this disorder, some personnel do not significantly respond to treatment, possibly due to poor activation of the trauma memory or a lack of emotional engagement during therapy. In addition, some service members do not seek mental healthcare due to treatment stigma. Researchers recently developed a virtual reality (VR) Iraq to attempt to improve activation of the traumatic memory during exposure therapy and to provide a treatment approach that may be more appealing to some service members, relative to traditional face-to-face talk therapy. Initial validation of the application requires an assessment of how well it represents the experiences of previously deployed service members. This study evaluated the realism of the VR Iraq application according to the subjective evaluation of 93 U.S. Army soldiers who returned from Iraq in the last year. Those screening negative for PTSD used and evaluated a VR tactical convoy and a VR dismounted patrol in a simulated Middle Eastern city. Results indicated that 86% of soldiers rated the overall realism of the VR convoy as ranging from adequate to excellent. Eighty-two percent of soldiers reported adequate-to-excellent overall realism of the city environment. Results provide evidence that the VR Iraq presents a realistic context in which VR exposure therapy can be conducted. However, clinical trials are needed to assess the efficacy of VR exposure therapy for Iraq veterans with PTSD. PMID:19199854

  10. Virtual reality and telepresence for military medicine.

    PubMed

    Satava, R M

    1997-01-01

For decades, warfighters have been putting in place a sophisticated "digital battlefield", an electronic communication and information system to support advanced technology. Medicine is now in a position to leverage these technologies to produce a fundamental revolution, and the keystone is the digital physician. Today nearly all information about a patient can be acquired electronically, and with the new technologies of teleoperation and telesurgery we can provide remote treatment and even surgery through telemedicine. The following framework for military medicine will leverage the current electronic battlefield. A personnel status monitor (PSM) will have a global positioning locator to report the position of each soldier and a suite of vital signs sensors. When a soldier is wounded, the medic will instantly know the location of the soldier and how serious the casualty is. This will permit the medic to locate the most critically wounded soldier first. Once stabilised, the soldier will be placed in a critical care pod, a fully automated intensive care unit in a stretcher, which will monitor his vital signs, administer fluids and medications, and provide environmental protection. If immediate surgery is needed, a remote telepresence surgery vehicle will come to the wounded soldier, the medic will place him in the vehicle, and a surgeon will operate remotely using telepresence surgery from a distant Mobile Advance Surgical Hospital (MASH) into the combat zone. Also, the expertise of any specialist will be available from the rear echelons, as far back as the home country. For education and training in combat casualty care, virtual reality simulators are being implemented. This same scenario can be utilised in civilian health care, especially in providing care to patients in remote areas who do not currently have access to simple, let alone sophisticated, health care. PMID:9140589

  11. An Interactive 3D Virtual Anatomy Puzzle for Learning and Simulation - Initial Demonstration and Evaluation.

    PubMed

    Messier, Erik; Wilcox, Jascha; Dawson-Elli, Alexander; Diaz, Gabriel; Linte, Cristian A

    2016-01-01

To inspire young students (grades 6-12) to become medical practitioners and biomedical engineers, it is necessary to expose them to key concepts of the field in a way that is both exciting and informative. Recent advances in medical image acquisition, manipulation, processing, visualization, and display have revolutionized the approach in which the human body and internal anatomy can be seen and studied. It is now possible to collect 3D, 4D, and 5D medical images of patient specific data, and display that data to the end user using consumer level 3D stereoscopic display technology. Despite such advancements, traditional 2D modes of content presentation such as textbooks and slides are still the standard didactic equipment used to teach young students anatomy. More sophisticated methods of display can help to elucidate the complex 3D relationships between structures that are so often missed when viewing only 2D media, and can instill in students an appreciation for the interconnection between medicine and technology. Here we describe the design, implementation, and preliminary evaluation of a 3D virtual anatomy puzzle dedicated to helping users learn the anatomy of various organs and systems by manipulating 3D virtual data. The puzzle currently comprises several components of the human anatomy and can be easily extended to include additional organs and systems. The 3D virtual anatomy puzzle game was implemented and piloted using three display paradigms - a traditional 2D monitor, a 3D TV with active shutter glasses, and the DK2 version of the Oculus Rift - as well as two different user interaction devices - a space mouse and traditional keyboard controls. PMID:27046584

  12. Design and application of real-time visual attention model for the exploration of 3D virtual environments.

    PubMed

    Hillaire, Sébastien; Lécuyer, Anatole; Regia-Corte, Tony; Cozot, Rémi; Royan, Jérôme; Breton, Gaspard

    2012-03-01

    This paper studies the design and application of a novel visual attention model designed to compute user's gaze position automatically, i.e., without using a gaze-tracking system. The model we propose is specifically designed for real-time first-person exploration of 3D virtual environments. It is the first model adapted to this context which can compute in real time a continuous gaze point position instead of a set of 3D objects potentially observed by the user. To do so, contrary to previous models which use a mesh-based representation of visual objects, we introduce a representation based on surface-elements. Our model also simulates visual reflexes and the cognitive processes which take place in the brain such as the gaze behavior associated to first-person navigation in the virtual environment. Our visual attention model combines both bottom-up and top-down components to compute a continuous gaze point position on screen that hopefully matches the user's one. We conducted an experiment to study and compare the performance of our method with a state-of-the-art approach. Our results are found significantly better with sometimes more than 100 percent of accuracy gained. This suggests that computing a gaze point in a 3D virtual environment in real time is possible and is a valid approach, compared to object-based approaches. Finally, we expose different applications of our model when exploring virtual environments. We present different algorithms which can improve or adapt the visual feedback of virtual environments based on gaze information. We first propose a level-of-detail approach that heavily relies on multiple-texture sampling. We show that it is possible to use the gaze information of our visual attention model to increase visual quality where the user is looking, while maintaining a high-refresh rate. 
Second, we introduce the use of the visual attention model in three visual effects inspired by the human visual system, namely: depth-of-field blur, camera…
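The combination of bottom-up and top-down components into a single on-screen gaze point can be caricatured as a weighted blend over a saliency map. The actual model uses a surface-element representation and simulated visual reflexes; the dict representation and weight `w_top` below are illustrative assumptions.

```python
def gaze_from_saliency(bottom_up, top_down, w_top=0.5):
    """Blend a bottom-up saliency map with top-down (task-driven) weights
    and return the highest-scoring pixel as the estimated gaze point.
    Both maps are dicts mapping (x, y) pixels to scores in [0, 1]."""
    pixels = set(bottom_up) | set(top_down)
    score = {p: (1 - w_top) * bottom_up.get(p, 0.0)
                + w_top * top_down.get(p, 0.0)
             for p in pixels}
    return max(score, key=score.get)
```

Applications such as gaze-contingent level-of-detail then simply allocate more texture samples near the returned point.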

  13. Design and application of real-time visual attention model for the exploration of 3D virtual environments.

    PubMed

    Hillaire, Sébastien; Lécuyer, Anatole; Regia-Corte, Tony; Cozot, Rémi; Royan, Jérôme; Breton, Gaspard

    2012-03-01

    This paper studies the design and application of a novel visual attention model designed to compute user's gaze position automatically, i.e., without using a gaze-tracking system. The model we propose is specifically designed for real-time first-person exploration of 3D virtual environments. It is the first model adapted to this context which can compute in real time a continuous gaze point position instead of a set of 3D objects potentially observed by the user. To do so, contrary to previous models which use a mesh-based representation of visual objects, we introduce a representation based on surface-elements. Our model also simulates visual reflexes and the cognitive processes which take place in the brain such as the gaze behavior associated to first-person navigation in the virtual environment. Our visual attention model combines both bottom-up and top-down components to compute a continuous gaze point position on screen that hopefully matches the user's one. We conducted an experiment to study and compare the performance of our method with a state-of-the-art approach. Our results are found significantly better with sometimes more than 100 percent of accuracy gained. This suggests that computing a gaze point in a 3D virtual environment in real time is possible and is a valid approach, compared to object-based approaches. Finally, we expose different applications of our model when exploring virtual environments. We present different algorithms which can improve or adapt the visual feedback of virtual environments based on gaze information. We first propose a level-of-detail approach that heavily relies on multiple-texture sampling. We show that it is possible to use the gaze information of our visual attention model to increase visual quality where the user is looking, while maintaining a high-refresh rate. 
Second, we introduce the use of the visual attention model in three visual effects inspired by the human visual system namely: depth-of-field blur, camera
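The fusion of bottom-up and top-down components described in the abstract can be sketched as a weighted sum of normalized attention maps, with the gaze estimate taken as the maximum of the fused map. This is an illustrative sketch only: the map contents and the 0.5/0.5 weighting are assumptions, not the authors' implementation.

```python
# Illustrative sketch: fuse a bottom-up saliency map with a top-down
# (task-driven) weight map and take the peak of the fused map as the
# gaze estimate. Maps and weights here are made-up illustration values.

def normalize(grid):
    """Scale a 2D map of non-negative values into [0, 1]."""
    peak = max(max(row) for row in grid)
    if peak == 0:
        return [[0.0 for _ in row] for row in grid]
    return [[v / peak for v in row] for row in grid]

def gaze_point(bottom_up, top_down, w_bu=0.5, w_td=0.5):
    """Return (row, col) of the most attended cell in the fused map."""
    bu, td = normalize(bottom_up), normalize(top_down)
    best, best_pos = -1.0, (0, 0)
    for r in range(len(bu)):
        for c in range(len(bu[r])):
            score = w_bu * bu[r][c] + w_td * td[r][c]
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

bottom_up = [[0, 2, 0],
             [1, 0, 4],
             [0, 1, 0]]   # e.g. image contrast
top_down  = [[0, 0, 0],
             [3, 0, 1],
             [0, 6, 0]]   # e.g. navigation-task relevance
print(gaze_point(bottom_up, top_down))
```

In the real model the bottom-up map would come from screen-space saliency features and the top-down map from the navigation context, but the fusion step has this general shape.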

  14. A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills

    NASA Astrophysics Data System (ADS)

    Choi, Kup-Sze

    This paper presents a virtual reality based simulator prototype for learning phacoemulsification in cataract surgery, focusing on the skills required to sculpt a cross-shaped trench in a cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools, and a haptic device serves as the 3D user interface. Phaco-sculpting is simulated by interactively deleting the constituent tetrahedra of the lens model. Collisions between the virtual probe and the lens are identified efficiently by hierarchically partitioning the space containing the lens with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing a trainee's performance objectively. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
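The hierarchical space partitioning mentioned above can be illustrated with a minimal octree that answers "which stored points lie near this probe position?" without testing every point. This is a sketch under assumptions: the capacity and depth limits are arbitrary, and a phaco simulator would store tetrahedron indices in the leaves rather than raw points.

```python
# Minimal octree sketch for collision queries: recursively partition an
# axis-aligned cube so a probe only tests points in nearby sub-cubes.
# Capacity/depth limits are illustrative; a surgical simulator would
# store tetrahedron ids in the leaves instead of raw points.

class Octree:
    def __init__(self, center, half, depth=4, capacity=4):
        self.center, self.half = center, half
        self.depth, self.capacity = depth, capacity
        self.points = []
        self.children = None  # becomes 8 sub-cubes once split

    def _child_index(self, p):
        cx, cy, cz = self.center
        return (p[0] >= cx) | ((p[1] >= cy) << 1) | ((p[2] >= cz) << 2)

    def _split(self):
        cx, cy, cz = self.center
        h = self.half / 2
        self.children = [
            Octree((cx + (h if i & 1 else -h),
                    cy + (h if i & 2 else -h),
                    cz + (h if i & 4 else -h)),
                   h, self.depth - 1, self.capacity)
            for i in range(8)
        ]
        for p in self.points:
            self.children[self._child_index(p)].insert(p)
        self.points = []

    def insert(self, p):
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.capacity and self.depth > 0:
                self._split()
        else:
            self.children[self._child_index(p)].insert(p)

    def query(self, p, radius):
        """Collect stored points within `radius` of p (max-norm)."""
        # Prune sub-cubes that cannot intersect the query region.
        if any(abs(p[i] - self.center[i]) > self.half + radius
               for i in range(3)):
            return []
        if self.children is None:
            return [q for q in self.points
                    if max(abs(q[i] - p[i]) for i in range(3)) <= radius]
        return [q for c in self.children for q in c.query(p, radius)]

tree = Octree(center=(0, 0, 0), half=10)
for p in [(1, 1, 1), (9, 9, 9), (-5, 2, 0), (1, 2, 1), (8, 8, 8)]:
    tree.insert(p)
print(tree.query((1, 1, 1), radius=1.5))
```

The pruning test in `query` is what makes the hierarchy pay off: whole octants of the lens volume are skipped when the probe is far from them.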

  15. Evaluating Virtual Reality and Augmented Reality Training for Industrial Maintenance and Assembly Tasks

    ERIC Educational Resources Information Center

    Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco

    2015-01-01

    The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…

  16. Two Innovative Steps for Training on Maintenance: 'VIRMAN' Spanish Project based on Virtual Reality 'STARMATE' European Project based on Augmented Reality

    SciTech Connect

    Gonzalez Anez, Francisco

    2002-07-01

    This paper presents two development projects (STARMATE and VIRMAN) focused on supporting maintenance training. Both projects aim at specifying, designing, developing, and demonstrating prototypes that allow computer-guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. Its objective is to create a computer tool for building maintenance training courses and delivering training based on 3D virtual reality models of complex components. The training delivery includes 3D recorded displays of maintenance procedures with all the complementary information needed to understand the intervention. Users are asked to perform the maintenance intervention while following the procedure, and can be evaluated on the level of knowledge achieved. Instructors can check the evaluation records left during the training sessions. VIRMAN is simple software that runs on a regular computer and can be used over the Internet. STARMATE is a step forward in the area of virtual reality. STARMATE is a European Commission project within the 'Information Societies Technologies' framework. A consortium of five companies and one research institute shares its expertise in this new technology. STARMATE provides two main functionalities: (1) user assistance for performing assembly/disassembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, a growing area of Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene generated by a computer, augmenting reality with additional information. The user interface consists of see-through goggles, headphones, a microphone and an optical tracking system. All these devices are integrated in a helmet connected to two regular computers. 
The user has his hands free for performing the maintenance intervention and he can navigate in the virtual

  17. Virtual Reality and Learning: Where Is the Pedagogy?

    ERIC Educational Resources Information Center

    Fowler, Chris

    2015-01-01

    The aim of this paper was to build upon Dalgarno and Lee's model or framework of learning in three-dimensional (3-D) virtual learning environments (VLEs) and to extend their road map for further research in this area. The enhanced model shares the common goal with Dalgarno and Lee of identifying the learning benefits from using 3-D VLEs. The…

  18. EXPLORING ENVIRONMENTAL DATA IN A HIGHLY IMMERSIVE VIRTUAL REALITY ENVIRONMENT

    EPA Science Inventory

    Geography inherently fills a 3D space, and yet we struggle to display geography using, primarily, 2D display devices. Virtual environments offer a more realistically dimensioned display space, and this is being realized in the expanding area of research on 3D Geographic Infor...

  19. Algorithm for simulation of craniotomies assisted by peripheral for 3D virtual navigation.

    PubMed

    Duque, Sara I; Ochoa, John F; Botero, Andrés F; Ramirez, Mateo

    2015-01-01

    Neurosurgical procedures require high precision and accurate localization of structures. For that reason, and owing to advances in 3D visualization, software for planning and training neurosurgery has become an important tool for neurosurgeons and students, but manipulating 3D structures is not always easy for staff who usually work with 2D images. This paper describes a system, developed with open-source software, that allows a virtual craniotomy (a common neurosurgical procedure that provides access to intracranial lesions) to be performed in 3D Slicer; the system includes a peripheral input device that permits manipulation of the 3D structures via camera movements and guides the movement of the craniotomy tool. PMID:26737914

  20. Comparative analysis of video processing and 3D rendering for cloud video games using different virtualization technologies

    NASA Astrophysics Data System (ADS)

    Bada, Adedayo; Alcaraz-Calero, Jose M.; Wang, Qi; Grecos, Christos

    2014-05-01

    This paper describes a comprehensive empirical performance evaluation of 3D video processing employing the physical/virtual architecture implemented in a cloud environment. Different virtualization technologies, virtual video cards and various 3D benchmarks tools have been utilized in order to analyse the optimal performance in the context of 3D online gaming applications. This study highlights 3D video rendering performance under each type of hypervisors, and other factors including network I/O, disk I/O and memory usage. Comparisons of these factors under well-known virtual display technologies such as VNC, Spice and Virtual 3D adaptors reveal the strengths and weaknesses of the various hypervisors with respect to 3D video rendering and streaming.
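Per-hypervisor comparisons of rendering performance like the one above rest on repeatable frame-rate measurements. A minimal sketch of such a harness follows; the workload function is a CPU stand-in, not the 3D benchmark tools used in the study.

```python
# Minimal sketch of a frame-rate benchmark harness of the kind used to
# compare rendering performance across virtualization setups.
# `render_frame` is a placeholder workload; a real benchmark would
# drive the guest's 3D pipeline and also record network/disk/memory use.
import time

def render_frame(width=320, height=240):
    # Placeholder per-frame CPU workload standing in for one render.
    return sum((x * y) % 255 for y in range(height) for x in range(width))

def measure_fps(frames=50):
    """Average frames per second over a fixed number of frames."""
    start = time.perf_counter()
    for _ in range(frames):
        render_frame()
    elapsed = time.perf_counter() - start
    return frames / elapsed

fps = measure_fps()
print(f"average: {fps:.1f} frames/s")
```

Running the same harness inside each hypervisor/virtual-display configuration, with the other I/O factors logged alongside, is what makes the comparison in the study meaningful.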

  1. vPresent: A cloud based 3D virtual presentation environment for interactive product customization

    NASA Astrophysics Data System (ADS)

    Nan, Xiaoming; Guo, Fei; He, Yifeng; Guan, Ling

    2013-09-01

    In modern society, many companies offer product customization services to their customers. There are two major issues in providing customized products. First, product manufacturers need to present their products effectively to customers who may be located in any geographical area. Second, customers need to be able to give feedback on the product in real time. The traditional presentation approaches, however, cannot effectively convey sufficient information about the product or efficiently adjust the product design according to customers' real-time feedback. To address these issues, we propose vPresent, a cloud-based 3D virtual presentation environment. In vPresent, a product expert can show the 3D virtual product to remote customers and dynamically customize it based on their feedback, while customers can give their opinions in real time as they view a vivid 3D visualization of the product. Since vPresent is a cloud-based system, customers are able to access the customized virtual products from anywhere at any time, via desktop, laptop, or even smartphone. The proposed vPresent is expected to deliver 3D visual information to customers effectively and to provide an interactive design platform for the development of customized products.

  2. Valorisation of Cultural Heritage Through Virtual Visit and Augmented Reality: the Case of the Abbey of Epau (france)

    NASA Astrophysics Data System (ADS)

    Simonetto, E.; Froment, C.; Labergerie, E.; Ferré, G.; Séchet, B.; Chédorge, H.; Cali, J.; Polidori, L.

    2013-07-01

    Terrestrial Laser Scanning (TLS), 3-D modeling and Web visualization are the three key steps needed to store cultural heritage and to grant free, wide access to it, as highlighted in many recent examples. The goal of this study is to set up 3-D Web resources for "virtually" visiting the exterior of the Abbaye de l'Epau, an old French abbey with both a rich history and delicate architecture. Virtuality is considered in two ways: smooth navigation around the abbey in a virtual reality environment, and a game activity using augmented reality. First, the data acquisition consists of a GPS and tacheometry survey, terrestrial laser scanning and photographic acquisition. After data pre-processing, the meshed and textured 3-D model is generated using the 3D Reshaper commercial software. The virtual reality visit and augmented reality animation are then created using the Unity software. This work shows the value of such tools in bringing out regional cultural heritage and making it attractive to the public.

  3. Implementing Virtual Reality Technology as an Effective WEB Based KIOSK: Darulaman's Teacher Training College Tour (IPDA VR Tour)

    ERIC Educational Resources Information Center

    Azman, Fadzil

    2004-01-01

    At present, the development of Virtual Reality (VR) technology is expanding due to the importance of and need to use 3D elements and 360-degree panoramas in conveying a clearer picture to consumers in various fields such as education, the military, medicine and entertainment. In line with this development, the web based VR kiosk project in…

  4. Implementing Virtual Reality Technology as an Effective Web Based Kiosk: Darulaman's Teacher Training College Tour (Ipda Vr Tour)

    ERIC Educational Resources Information Center

    Fadzil, Azman

    2006-01-01

    At present, the development of Virtual Reality (VR) technology is expanding due to the importance of and need to use 3D elements and 360-degree panoramas in conveying a clearer picture to consumers in various fields such as education, the military, medicine and entertainment. The web based VR kiosk project in Darulaman's Teacher Training…

  5. fVisiOn: glasses-free tabletop 3D display to provide virtual 3D media naturally alongside real media

    NASA Astrophysics Data System (ADS)

    Yoshida, Shunsuke

    2012-06-01

    A novel glasses-free tabletop 3D display, named fVisiOn, floats virtual 3D objects on an empty, flat tabletop surface and enables multiple viewers to observe the raised 3D images from any angle around 360°. Our glasses-free 3D image reproduction method employs a combination of an optical device and an array of projectors, and produces continuous horizontal parallax along a circular path located above the table. The optical device is shaped as a hollow cone and works as an anisotropic diffuser. The circularly arranged projectors cast numerous rays into the optical device. Each ray passes through a corresponding point on a virtual object's surface and is oriented toward a viewing area around the table. At any viewpoint on the ring-shaped viewing area, both eyes collect fractional images from different projectors, and all viewers around the table can perceive the scene as 3D from their own perspectives because the images include binocular disparity. The entire mechanism is installed beneath the table, so the tabletop area remains clear and no ordinary tabletop activities are disturbed. Many people can naturally share the 3D images displayed together with real objects on the table. In our latest prototype, we employed a handmade optical device and an array of over 100 tiny projectors. This configuration reproduces static and animated 3D scenes for a 130° viewing area and allows 5-cm-tall virtual characters to play soccer and dance on the table.

  6. Interactive graphical model building using telepresence and virtual reality

    SciTech Connect

    Cooke, C.; Stansfield, S.

    1993-10-01

    This paper presents a prototype system developed at Sandia National Laboratories to create and verify computer-generated graphical models of remote physical environments. The goal of the system is to create an interface between an operator and a computer vision system so that graphical models can be created interactively. Virtual reality and telepresence are used to allow interaction between the operator, computer, and remote environment. A stereo view of the remote environment is produced by two CCD cameras. The cameras are mounted on a three degree-of-freedom platform which is slaved to a mechanically-tracked, stereoscopic viewing device. This gives the operator a sense of immersion in the physical environment. The stereo video is enhanced by overlaying the graphical model onto it. Overlay of the graphical model onto the stereo video allows visual verification of graphical models. Creation of a graphical model is accomplished by allowing the operator to assist the computer in modeling. The operator controls a 3-D cursor to mark objects to be modeled. The computer then automatically extracts positional and geometric information about the object and creates the graphical model.

  7. Representing 3D virtual objects: interaction between visuo-spatial ability and type of exploration.

    PubMed

    Meijer, Frank; van den Broek, Egon L

    2010-03-17

    We investigated individual differences in interactively exploring 3D virtual objects. Thirty-six participants explored 24 simple and 24 difficult objects (composed of three and five Biederman geons, respectively) actively, passively, or not at all. Both their 3D mental representation of the objects and their visuo-spatial ability (VSA) were assessed. Results show that, regardless of an object's complexity, people with a low VSA benefit from active exploration of objects, whereas people with a medium or high VSA do not. These findings extend and refine earlier research on interactively learning visuo-spatial information and underline the importance of taking individual differences into account. PMID:20116394

  8. Representing 3D virtual objects: interaction between visuo-spatial ability and type of exploration.

    PubMed

    Meijer, Frank; van den Broek, Egon L

    2010-03-17

    We investigated individual differences in interactively exploring 3D virtual objects. Thirty-six participants explored 24 simple and 24 difficult objects (composed of three and five Biederman geons, respectively) actively, passively, or not at all. Both their 3D mental representation of the objects and their visuo-spatial ability (VSA) were assessed. Results show that, regardless of an object's complexity, people with a low VSA benefit from active exploration of objects, whereas people with a medium or high VSA do not. These findings extend and refine earlier research on interactively learning visuo-spatial information and underline the importance of taking individual differences into account.

  9. Alleviating travel anxiety through virtual reality and narrated video technology.

    PubMed

    Ahn, J C; Lee, O

    2013-01-01

    This study presents empirical evidence of the benefit of narrated video clips embedded in hotels' virtual reality websites for relieving travel anxiety. Even though virtual reality functions have been shown to provide some relief of travel anxiety, a stronger virtual reality website can be built by adding video clips with narration about important aspects of the hotel. We posit that these important aspects are (1) the escape route and (2) information about the surrounding neighborhood, both derived from existing research on anxiety disorders as well as travel anxiety. We therefore created one video clip showing and narrating the escape route from the hotel room, and another showing and narrating the surrounding neighborhood. We then conducted experiments with this enhanced virtual reality hotel website by having human subjects use the website and fill out a questionnaire. The results confirm our hypothesis that there is a statistically significant relationship between the degree of travel anxiety and the psychological relief produced by the embedded virtual reality functions with narrated video clips on a hotel website (Tab. 2, Fig. 3, Ref. 26).

  10. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments

    NASA Astrophysics Data System (ADS)

    Portalés, Cristina; Lerma, José Luis; Navarro, Santiago

    2010-01-01

    Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and to interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to boost human interaction and real-life navigation. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction, far beyond traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application that integrates buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated into real (physical) urban worlds. The augmented environment presented herein requires a see-through video head-mounted display (HMD) for visualization, whereas the user's movement in the real world is tracked with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper deals with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. Some remaining software and complexity issues are also discussed in the paper.

  11. Virtual reality exposure in the treatment of social phobia.

    PubMed

    Klinger, Evelyne; Légeron, Patrick; Roy, Stéphane; Chemin, Isabelle; Lauer, Françoise; Nugues, Pierre

    2004-01-01

    Social phobia is one of the most frequent psychiatric disorders and is accessible to two forms of scientifically validated treatment: anti-depressant drugs and cognitive-behavioral therapies. Graded exposure to feared social situations (either in vivo or by imagining the situations) is fundamental to obtaining an improvement of the anxious symptoms. Virtual reality (VR) may be an alternative to these standard exposure techniques and seems to bring significant advantages by allowing exposure to numerous and varied situations. Moreover, studies have shown that human subjects are appropriately sensitive to virtual environments. This chapter reports the definition of a VR-based clinical protocol and a study to treat social phobia using virtual reality techniques. The virtual environments used in the treatment reproduce the four situations that social phobics find the most threatening: performance, intimacy, scrutiny and assertiveness. With the help of the therapist, the patient learns adapted cognitions and behaviors when coping with social situations, with the aim of reducing her or his anxiety in the corresponding real-life situations. Some studies have been carried out using virtual reality in the treatment of fear of public speaking, which is only a small part of the symptomatology of most social phobic patients. The novelty of our work is to address a larger group of situations that phobic patients experience with high anxiety. In our protocol, the efficacy of the virtual reality treatment is compared to a well-established and well-validated group cognitive-behavioral treatment. PMID:15295148

  12. Versatile, Immersive, Creative and Dynamic Virtual 3-D Healthcare Learning Environments: A Review of the Literature

    PubMed Central

    2008-01-01

    The author provides a critical overview of three-dimensional (3-D) virtual worlds and “serious gaming” that are currently being developed and used in healthcare professional education and medicine. The relevance of this e-learning innovation for teaching students and professionals is debated, and variables influencing its adoption by academics, healthcare professionals, and business executives, such as increased knowledge, self-directed learning, and peer collaboration, are examined in the context of various Web 2.0/3.0 applications. More empirical research is needed to unearth the pedagogical outcomes and advantages associated with this e-learning technology. A brief description of Rogers’ Diffusion of Innovations Theory and Siemens’ Connectivism Theory for today’s learners is presented as potential underlying pedagogical tenets to support the use of virtual 3-D learning environments in higher education and healthcare. PMID:18762473

  13. Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.

    1999-01-01

    We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated on Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display, which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information in a way that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to perform measurements interactively from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.

  14. 3D augmented reality for improving social acceptance and public participation in wind farms planning

    NASA Astrophysics Data System (ADS)

    Grassi, S.; Klein, T. M.

    2016-09-01

    Wind energy is one of the most important sources of renewable energy; it has grown significantly in recent decades and makes an increasingly relevant contribution to the energy supply. One of the main obstacles to faster integration of wind energy into the energy mix is the visual impact of wind turbines on the landscape. In addition, the siting of massive new infrastructure can threaten a community's well-being if new projects are perceived as unfair. The public perception of the impact of wind turbines on the landscape is also crucial for their acceptance. The implementation of wind energy projects is often hampered by a lack of planning or communication tools enabling a more transparent and efficient interaction between all stakeholders involved in the projects (i.e. developers, local communities and administrations, NGOs, etc.). Concerning the visual assessment of wind farms, a critical gap lies in effective visualization tools to improve the public perception of alternative wind turbine layouts. In this paper, we describe the advantages of a 3D dynamic and interactive augmented reality visualization platform to support wind energy planners and enhance the social acceptance of new wind energy projects.

  15. Future directions for the development of virtual reality within an automotive manufacturer.

    PubMed

    Lawson, Glyn; Salanitri, Davide; Waterfield, Brian

    2016-03-01

    Virtual Reality (VR) can reduce time and costs, and lead to increases in quality, in the development of a product. Given the pressure on car companies to reduce time-to-market and to continually improve quality, the automotive industry has championed the use of VR across a number of applications, including design, manufacturing, and training. This paper describes interviews with 11 engineers and employees of allied disciplines from an automotive manufacturer about their current physical and virtual properties and processes. The results guided a review of research findings and scientific advances from the academic literature, which formed the basis of recommendations for future developments of VR technologies and applications. These include: develop a greater range of virtual contexts; use multi-sensory simulation; address perceived differences between virtual and real cars; improve motion capture capabilities; implement networked 3D technology; and use VR for market research. PMID:26164106

  16. Future directions for the development of virtual reality within an automotive manufacturer.

    PubMed

    Lawson, Glyn; Salanitri, Davide; Waterfield, Brian

    2016-03-01

    Virtual Reality (VR) can reduce time and costs, and lead to increases in quality, in the development of a product. Given the pressure on car companies to reduce time-to-market and to continually improve quality, the automotive industry has championed the use of VR across a number of applications, including design, manufacturing, and training. This paper describes interviews with 11 engineers and employees of allied disciplines from an automotive manufacturer about their current physical and virtual properties and processes. The results guided a review of research findings and scientific advances from the academic literature, which formed the basis of recommendations for future developments of VR technologies and applications. These include: develop a greater range of virtual contexts; use multi-sensory simulation; address perceived differences between virtual and real cars; improve motion capture capabilities; implement networked 3D technology; and use VR for market research.

  17. The potential of 3-D virtual worlds in professional nursing education.

    PubMed

    Hansen, Margaret M; Murray, Peter J; Erdley, W Scott

    2009-01-01

    Three-dimensional (3-D) virtual worlds (VWs), such as Second Life, are actively being explored for their potential use in health care and nursing professional education and even for practice. The relevance of this e-learning innovation on a large scale for teaching students and professionals is yet to be demonstrated and variables influencing adoption, such as increased knowledge, self-directed learning, and peer collaboration, by academics, and health care professionals requires empirical research. PMID:19592909

  18. [Parallel virtual reality visualization of extremely large medical datasets].

    PubMed

    Tang, Min

    2010-04-01

    On the basis of a brief description of grid computing, the essence and critical techniques of parallel visualization of extremely large medical datasets are discussed in the context of hospital intranets and commodity computers. The paper introduces several kernel techniques, including the hardware structure, software framework, load balancing and virtual reality visualization. The Maximum Intensity Projection algorithm is parallelized on a common PC cluster. In the virtual reality world, three-dimensional models can be rotated, zoomed, translated and cut interactively and conveniently through a control panel built with the Virtual Reality Modeling Language (VRML). Experimental results demonstrate that this method provides promising real-time results and can serve as a good assistant in making clinical diagnoses. PMID:20481303
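Maximum Intensity Projection keeps, for each output pixel, the brightest voxel along its viewing ray; for an axis-aligned view this reduces to a maximum over the depth axis. A minimal single-process sketch follows (the cluster version described in the abstract would split slabs of slices across nodes and merge the partial projections with a second element-wise maximum, which is what makes MIP easy to parallelize).

```python
# Maximum Intensity Projection along the depth axis of a small volume:
# each output pixel is the brightest voxel on its viewing ray. Because
# max is associative, a PC cluster can compute partial projections over
# slabs of slices and merge them with another element-wise maximum.

def mip(volume):
    """volume[z][y][x] -> 2D projection[y][x] of per-ray maxima."""
    depth, rows, cols = len(volume), len(volume[0]), len(volume[0][0])
    return [[max(volume[z][y][x] for z in range(depth))
             for x in range(cols)]
            for y in range(rows)]

volume = [
    [[0, 1], [2, 3]],   # slice z=0
    [[5, 0], [1, 9]],   # slice z=1
    [[2, 4], [8, 0]],   # slice z=2
]
print(mip(volume))
```

For the toy volume above the projection is [[5, 4], [8, 9]]: e.g. the top-left ray passes voxel values 0, 5 and 2, and keeps 5.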

  19. The cognitive apprenticeship theory for the teaching of mathematics in an online 3D virtual environment

    NASA Astrophysics Data System (ADS)

    Bouta, Hara; Paraskeva, Fotini

    2013-03-01

    Research spanning two decades shows that there is a continuing development of 3D virtual worlds and investment in such environments for educational purposes. Research stresses the need for these environments to be well-designed and for suitable pedagogies to be implemented in the teaching practice in order for these worlds to be fully effective. To this end, we propose a pedagogical framework based on the cognitive apprenticeship for deriving principles and guidelines to inform the design, development and use of a 3D virtual environment. This study examines how the use of a 3D virtual world facilitates the teaching of mathematics in primary education by combining design principles and guidelines based on the Cognitive Apprenticeship Theory and the teaching methods that this theory introduces. We focus specifically on 5th and 6th grade students' engagement (behavioral, affective and cognitive) while learning fractional concepts over a period of two class sessions. Quantitative and qualitative analyses indicate considerable improvement in the engagement of the students who participated in the experiment. This paper presents the findings regarding students' cognitive engagement in the process of comprehending basic fractional concepts - notoriously hard for students to master. The findings are encouraging and suggestions are made for further research.

  20. 3D Virtual Worlds as Art Media and Exhibition Arenas: Students' Responses and Challenges in Contemporary Art Education

    ERIC Educational Resources Information Center

    Lu, Lilly

    2013-01-01

    3D virtual worlds (3D VWs) are considered one of the emerging learning spaces of the 21st century; however, few empirical studies have investigated educational applications and student learning aspects in art education. This study focused on students' responses to and challenges with 3D VWs in both aspects. The findings show that most…

  1. Virtual 3D bladder reconstruction for augmented medical records from white light cystoscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Lurie, Kristen L.; Zlatev, Dimitar V.; Angst, Roland; Liao, Joseph C.; Ellerbee, Audrey K.

    2016-02-01

    Bladder cancer has a high recurrence rate that necessitates lifelong surveillance to detect mucosal lesions. Examination with white light cystoscopy (WLC), the standard of care, is inherently subjective and data storage limited to clinical notes, diagrams, and still images. A visual history of the bladder wall can enhance clinical and surgical management. To address this clinical need, we developed a tool to transform in vivo WLC videos into virtual 3-dimensional (3D) bladder models using advanced computer vision techniques. WLC videos from rigid cystoscopies (1280 x 720 pixels) were recorded at 30 Hz followed by immediate camera calibration to control for image distortions. Video data were fed into an automated structure-from-motion algorithm that generated a 3D point cloud followed by a 3D mesh to approximate the bladder surface. The highest quality cystoscopic images were projected onto the approximated bladder surface to generate a virtual 3D bladder reconstruction. In intraoperative WLC videos from 36 patients undergoing transurethral resection of suspected bladder tumors, optimal reconstruction was achieved from frames depicting well-focused vasculature, when the bladder was maintained at constant volume with minimal debris, and when regions of the bladder wall were imaged multiple times. A significant innovation of this work is the ability to perform the reconstruction using video from a clinical procedure collected with standard equipment, thereby facilitating rapid clinical translation, application to other forms of endoscopy and new opportunities for longitudinal studies of cancer recurrence.
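The camera-calibration step mentioned in the abstract corrects lens distortion before reconstruction. A minimal sketch of that step uses the Brown radial model with a single coefficient k1, an assumed simplification: real cystoscope calibration estimates more distortion terms plus the full intrinsic matrix. Undistortion inverts the model by fixed-point iteration on normalized image coordinates.

```python
# Sketch of lens-distortion correction prior to 3D reconstruction,
# using the Brown radial model with one coefficient k1 (an assumed
# simplification of real cystoscope calibration). Coordinates are
# normalized image coordinates; k1 and the test point are made up.

def distort(x, y, k1):
    """Forward model: where an ideal point lands on the distorted image."""
    r2 = x * x + y * y
    s = 1 + k1 * r2
    return x * s, y * s

def undistort(xd, yd, k1, iterations=20):
    """Invert the radial model by fixed-point iteration."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        s = 1 + k1 * r2
        x, y = xd / s, yd / s
    return x, y

xd, yd = distort(0.30, -0.20, k1=-0.15)   # simulate barrel distortion
xu, yu = undistort(xd, yd, k1=-0.15)      # recover the ideal point
print(round(xu, 6), round(yu, 6))
```

After a few iterations the estimate converges back to the original (0.30, -0.20); applying this correction to every pixel is what keeps the subsequent structure-from-motion geometry consistent.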

  2. Elderly Healthcare Monitoring Using an Avatar-Based 3D Virtual Environment

    PubMed Central

    Pouke, Matti; Häkkilä, Jonna

    2013-01-01

    Homecare systems for elderly people are becoming increasingly important due to both economic reasons and patients’ preferences. Sensor-based surveillance technologies are an expected future trend, but research so far has devoted little attention to the User Interface (UI) design of such systems and the user-centric design approach. In this paper, we explore the possibilities of an avatar-based 3D visualization system, which exploits wearable sensors and human activity simulations. We present a technical prototype and the evaluation of alternative concept designs for UIs based on a 3D virtual world. The evaluation was conducted with homecare providers through focus groups and an online survey. Our results show firstly that systems taking advantage of 3D virtual world visualization techniques have potential especially due to the privacy preserving and simplified information presentation style, and secondly that simple representations and glanceability should be emphasized in the design. The identified key use cases highlight that avatar-based 3D presentations can be helpful if they provide an overview as well as details on demand. PMID:24351747

  3. Elderly healthcare monitoring using an avatar-based 3D virtual environment.

    PubMed

    Pouke, Matti; Häkkilä, Jonna

    2013-12-17

    Homecare systems for elderly people are becoming increasingly important due to both economic reasons and patients' preferences. Sensor-based surveillance technologies are an expected future trend, but research so far has devoted little attention to the User Interface (UI) design of such systems and the user-centric design approach. In this paper, we explore the possibilities of an avatar-based 3D visualization system, which exploits wearable sensors and human activity simulations. We present a technical prototype and the evaluation of alternative concept designs for UIs based on a 3D virtual world. The evaluation was conducted with homecare providers through focus groups and an online survey. Our results show firstly that systems taking advantage of 3D virtual world visualization techniques have potential especially due to the privacy preserving and simplified information presentation style, and secondly that simple representations and glanceability should be emphasized in the design. The identified key use cases highlight that avatar-based 3D presentations can be helpful if they provide an overview as well as details on demand.

  4. 3D Audio System

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Ames Research Center research into virtual reality led to the development of the Convolvotron, a high speed digital audio processing system that delivers three-dimensional sound over headphones. It consists of a two-card set designed for use with a personal computer. The Convolvotron's primary application is presentation of 3D audio signals over headphones. Four independent sound sources are filtered with large time-varying filters that compensate for motion. The perceived location of the sound remains constant. Possible applications are in air traffic control towers or airplane cockpits, hearing and perception research and virtual reality development.

  5. Architecture of web services in the enhancement of real-time 3D video virtualization in cloud environment

    NASA Astrophysics Data System (ADS)

    Bada, Adedayo; Wang, Qi; Alcaraz-Calero, Jose M.; Grecos, Christos

    2016-04-01

    This paper proposes a new approach to improving the application of 3D video rendering and streaming by jointly exploring and optimizing both cloud-based virtualization and web-based delivery. The proposed web service architecture firstly establishes a software virtualization layer based on QEMU (Quick Emulator), an open-source virtualization software that has been able to virtualize system components except for 3D rendering, which is still in its infancy. The architecture then explores the cloud environment to boost the speed of the rendering at the QEMU software virtualization layer. The capabilities and inherent limitations of Virgil 3D, which is one of the most advanced 3D virtual Graphics Processing Units (GPUs) available, are analyzed through benchmarking experiments and integrated into the architecture to further speed up the rendering. Experimental results are reported and analyzed to demonstrate the benefits of the proposed approach.

  6. Fast extraction of minimal paths in 3D images and applications to virtual endoscopy.

    PubMed

    Deschamps, T; Cohen, L D

    2001-12-01

    The aim of this article is to build trajectories for virtual endoscopy inside 3D medical images in as automatic a way as possible. Usually the construction of this trajectory is left to the clinician, who must define some points on the path manually using three orthogonal views. But for a complex structure such as the colon, those views give little information on the shape of the object of interest. The path construction in 3D images becomes a very tedious task and precise a priori knowledge of the structure is needed to determine a suitable trajectory. We propose a more automatic path tracking method to overcome those drawbacks: we are able to build a path, given only one or two end points and the 3D image as inputs. This work is based on previous work by Cohen and Kimmel [Int. J. Comp. Vis. 24 (1) (1997) 57] for extracting paths in 2D images using the Fast Marching algorithm. Our original contribution is twofold. On the one hand, we present a general technical contribution which extends minimal paths to 3D images and gives new improvements of the approach that are relevant in 2D as well as in 3D to extract linear structures in images. It includes techniques to make the path extraction scheme faster and easier, by reducing the user interaction. We also develop a new method to extract a centered path in tubular structures. Synthetic and real medical images are used to illustrate each contribution. On the other hand, we show that our method can be efficiently applied to the problem of finding a centered path in tubular anatomical structures with minimum interactivity, and that this path can be used for virtual endoscopy. Results are shown in various anatomical regions (colon, brain vessels, arteries) with different 3D imaging protocols (CT, MR). PMID:11731307
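
    The continuous Fast Marching front propagation used above has a simple discrete analogue: Dijkstra's algorithm on a grid whose node costs play the role of the potential. The sketch below is that discrete stand-in, not the authors' sub-pixel Fast Marching implementation; the 4-connected grid and the cost values are illustrative only.

    ```python
    import heapq

    def minimal_path(cost, start, goal):
        """Minimal-cost path between two pixels of a 2D cost map via
        Dijkstra's algorithm (a discrete analogue of Fast Marching).
        cost[y][x] > 0 is the local travel cost, low inside the
        structure of interest; start/goal are (y, x) tuples."""
        h, w = len(cost), len(cost[0])
        dist = {start: cost[start[0]][start[1]]}
        prev = {}
        heap = [(dist[start], start)]
        while heap:
            d, node = heapq.heappop(heap)
            if node == goal:
                break
            if d > dist.get(node, float("inf")):
                continue  # stale queue entry
            y, x = node
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w:
                    nd = d + cost[ny][nx]
                    if nd < dist.get((ny, nx), float("inf")):
                        dist[(ny, nx)] = nd
                        prev[(ny, nx)] = node
                        heapq.heappush(heap, (nd, (ny, nx)))
        # back-track from goal to start to recover the path
        path, node = [goal], goal
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1], dist[goal]
    ```

    On a map where the structure's interior has low cost, the extracted path hugs the structure, which is the same behaviour the paper exploits for centerline extraction.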

  7. Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters

    NASA Astrophysics Data System (ADS)

    Apostolellis, Panagiotis; Daradoumis, Thanasis

    Informal learning in public spaces like museums, science centers and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like Virtual Reality systems. By conducting an extensive literature review we have come to the conclusion that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments presents a promising approach to support effective learning in groups of school-aged children.

  8. Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation.

    PubMed

    Wang, Junchen; Suenaga, Hideyuki; Liao, Hongen; Hoshi, Kazuto; Yang, Liangjing; Kobayashi, Etsuko; Sakuma, Ichiro

    2015-03-01

    Autostereoscopic 3D image overlay for augmented reality (AR) based surgical navigation has been studied and reported many times. For the purpose of surgical overlay, the 3D image is expected to have the same geometric shape as the original organ, and to be transformable to a specified location for image overlay. However, how to generate a 3D image with high geometric fidelity, and how to quantitatively evaluate a 3D image's geometric accuracy, have not been addressed. This paper proposes a graphics processing unit (GPU) based computer-generated integral imaging pipeline for real-time autostereoscopic 3D display, and an automatic closed-loop 3D image calibration paradigm for displaying undistorted 3D images. Based on the proposed methods, a novel AR device for 3D image surgical overlay is presented, which mainly consists of a 3D display, an AR window, a stereo camera for 3D measurement, and a workstation for information processing. The evaluation of the 3D image rendering performance with 2560×1600 elemental image resolution shows rendering speeds of 50-60 frames per second (fps) for surface models, and 5-8 fps for large medical volumes. The evaluation of the undistorted 3D image after the calibration yields sub-millimeter geometric accuracy. A phantom experiment simulating oral and maxillofacial surgery was also performed to evaluate the proposed AR overlay device in terms of the image registration accuracy, 3D image overlay accuracy, and the visual effects of the overlay. The experimental results show satisfactory image registration and image overlay accuracy, and confirm the system usability.

  9. Virtual Boutique: a 3D modeling and content-based management approach to e-commerce

    NASA Astrophysics Data System (ADS)

    Paquet, Eric; El-Hakim, Sabry F.

    2000-12-01

    The Virtual Boutique is made up of three modules: the decor, the market and the search engine. The decor is the physical space occupied by the Virtual Boutique. It can reproduce any existing boutique. For this purpose, photogrammetry is used. A set of pictures of a real boutique or space is taken and a virtual 3D representation of this space is calculated from them. Calculations are performed with software developed at NRC. This representation consists of meshes and texture maps. The camera used in the acquisition process determines the resolution of the texture maps. Decorative elements are added, like paintings, computer-generated objects and scanned objects. The objects are scanned with a laser scanner developed at NRC. This scanner allows simultaneous acquisition of range and color information based on white laser beam triangulation. The second module, the market, is made up of all the merchandise and the manipulators, which are used to manipulate and compare the objects. The third module, the search engine, can search the inventory based on an object shown by the customer in order to retrieve similar objects based on shape and color. The items of interest are displayed in the boutique by reconfiguring the market space, which means that the boutique can be continuously customized according to the customer's needs. The Virtual Boutique is entirely written in Java 3D, can run in mono and stereo mode, and has been optimized to allow high quality rendering.
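
    The abstract does not specify how the search engine compares shape and color; a common baseline for the color half of such content-based retrieval (assumed here for illustration, not necessarily NRC's method) is a quantized color histogram compared by histogram intersection:

    ```python
    def color_histogram(pixels, bins=4):
        """Quantized RGB histogram of an iterable of (r, g, b) tuples
        with 0-255 channels, normalized so the bins sum to 1."""
        hist = [0.0] * (bins ** 3)
        step = 256 // bins
        for r, g, b in pixels:
            idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
            hist[idx] += 1.0
        n = float(len(pixels))
        return [h / n for h in hist]

    def histogram_intersection(h1, h2):
        """Similarity in [0, 1]; 1 means identical color distributions."""
        return sum(min(a, b) for a, b in zip(h1, h2))
    ```

    Retrieval then ranks the inventory by intersection score against the histogram of the object shown by the customer.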

  10. CT virtual endoscopy and 3D stereoscopic visualisation in the evaluation of coronary stenting.

    PubMed

    Sun, Z; Lawrence-Brown

    2009-10-01

    The aim of this case report is to present the additional value provided by CT virtual endoscopy and 3D stereoscopic visualisation when compared with 2D visualisations in the assessment of coronary stenting. A 64-year-old patient had been treated with left coronary stenting 8 years earlier and was recently followed up with multidetector row CT angiography. An in-stent restenosis of the left coronary artery was suspected based on 2D axial and multiplanar reformatted images. 3D virtual endoscopy was generated to demonstrate the smooth intraluminal surface of the coronary artery wall, and there was no evidence of restenosis or intraluminal irregularity. A virtual fly-through of the coronary artery was produced to examine the entire length of the vessel with the aim of demonstrating the intraluminal changes following placement of the coronary stent. In addition, stereoscopic views were generated to show the relationship between coronary artery branches and the coronary stent. In comparison with traditional 2D visualisations, virtual endoscopy was useful for assessment of the intraluminal appearance of the coronary artery wall following coronary stent implantation, while stereoscopic visualisation improved observers' understanding of the complex cardiac structures. Thus, both methods could be used as complementary tools in cardiac imaging.

  11. Thermal feedback in virtual reality and telerobotic systems

    NASA Technical Reports Server (NTRS)

    Zerkus, Mike; Becker, Bill; Ward, Jon; Halvorsen, Lars

    1994-01-01

    A new concept has been developed that allows temperature to be part of the virtual world. The Displaced Temperature Sensing System (DTSS) can 'display' temperature in a virtual reality system. The DTSS can also serve as a feedback device for telerobotics. For virtual reality applications, the virtual world software would be required to have a temperature map of its world. By whatever means (magnetic tracker, ultrasound tracker, etc.), the hand and fingers, which have been instrumented with thermodes, would be tracked. The temperature associated with the current position would be transmitted to the DTSS via a serial data link. The DTSS would provide that temperature to the fingers. For telerobotic operation, the function of the DTSS is to transmit a temperature from a remote location to the fingers, where the temperature can be felt.

  12. A Calligraphy Mastering Support System Using Virtual Reality Technology and its Learning Effects

    NASA Astrophysics Data System (ADS)

    Muranaka, Noriaki; Yamamoto, Takafumi; Imanishi, Shigeru

    Virtual reality is one of the key intelligent information media technologies. As an application of virtual reality to the field of education, we examine a support system for mastering calligraphy. The purpose of this system is to realize a model in which information about the stroke in progress can be displayed in real time. As a result, learners can practice calligraphy casually, without being limited by place or time. We use 3-D computer graphics (CG) for the virtual imagery to reduce memory requirements, and we examine the learning effect of this system. Because the tablet pen of a conventional system cannot convey the feel of a writing brush, we developed a brush-type input device that comes close to that feel. Moreover, we developed automatic animation generation using 3-D CG. Using this animation, the learner can experience the subtle stroke progression of calligraphy as seen through the eyes of a calligraphy teacher. Instead of the head-mounted display (HMD) generally used in VR, we use a semitransparent screen and a half mirror. Learners improve in a short time by practicing with the brush-type input pen synchronized with the virtual writing brush.

  13. Web GIS in practice VII: stereoscopic 3-D solutions for online maps and virtual globes

    PubMed Central

    Boulos, Maged N Kamel; Robinson, Larry R

    2009-01-01

    Because our pupils are about 6.5 cm apart, each eye views a scene from a different angle and sends a unique image to the visual cortex, which then merges the images from both eyes into a single picture. The slight difference between the right and left images allows the brain to properly perceive the 'third dimension' or depth in a scene (stereopsis). However, when a person views a conventional 2-D (two-dimensional) image representation of a 3-D (three-dimensional) scene on a conventional computer screen, each eye receives essentially the same information. Depth in such cases can only be approximately inferred from visual cues in the image, such as perspective, as only one image is offered to both eyes. The goal of stereoscopic 3-D displays is to project a slightly different image into each eye to achieve a much truer and realistic perception of depth, of different scene planes, and of object relief. This paper presents a brief review of a number of stereoscopic 3-D hardware and software solutions for creating and displaying online maps and virtual globes (such as Google Earth) in "true 3D", with costs ranging from almost free to multi-thousand pounds sterling. A practical account is also given of the experience of the USGS BRD UMESC (United States Geological Survey's Biological Resources Division, Upper Midwest Environmental Sciences Center) in setting up a low-cost, full-colour stereoscopic 3-D system. PMID:19849837
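
    The depth perception described here follows the pinhole stereo relation Z = f * B / d: the larger the disparity d between the two eye (or camera) images, the nearer the point. A minimal sketch, with focal length, baseline and disparity values that are illustrative rather than taken from the paper:

    ```python
    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        """Depth (metres) of a scene point from stereo disparity, using
        the pinhole stereo relation Z = f * B / d: f is the focal length
        in pixels, B the camera (or eye) baseline in metres, and d the
        disparity in pixels between the left and right images."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a finite depth")
        return focal_px * baseline_m / disparity_px
    ```

    A stereoscopic display works this relation in reverse: it synthesizes the per-eye disparity that a point at depth Z would produce, so the brain reconstructs the intended depth.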

  14. A Voice and Mouse Input Interface for 3D Virtual Environments

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Bryson, Steve T.

    2003-01-01

    There have been many success stories about how 3D input devices can be fully integrated into an immersive virtual environment. Electromagnetic trackers, optical trackers, gloves, and flying mice are just some of these input devices. Though we can use existing 3D input devices that are commonly used for VR applications, there are several factors that prevent us from choosing these input devices for our applications. One main factor is that most of these tracking devices are not suitable for prolonged use due to the human fatigue associated with using them. A second factor is that many of them would occupy additional office space. Another factor is that many of the 3D input devices are expensive due to the unusual hardware that is required. For our VR applications, we want a user interface that works naturally with standard equipment. In this paper, we demonstrate applications of our proposed multimodal interface using a 3D dome display. We also show that effective data analysis can be achieved while scientists view their data rendered inside the dome display and perform user interactions simply using mouse and voice input. Though the spherical coordinate grid seems ideal for interaction using a 3D dome display, we can also use other non-spherical grids as well.

  15. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
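
    The scalability analysis mentioned at the end rests on Amdahl's law, S(n) = 1 / ((1 - p) + p/n), where p is the parallelizable fraction of the sequential runtime and n the number of cores. A minimal sketch; the example values in the test are not the paper's measurements:

    ```python
    def amdahl_speedup(parallel_fraction, n_cores):
        """Amdahl's-law speedup on n cores when a fraction p of the
        sequential runtime is perfectly parallelizable:
        S(n) = 1 / ((1 - p) + p / n)."""
        p = parallel_fraction
        return 1.0 / ((1.0 - p) + p / n_cores)
    ```

    The serial fraction (1 - p) bounds the achievable speedup: even with unlimited cores, S can never exceed 1 / (1 - p), which is why the platform's measured 12-fold gain on 12 cores implies a very small serial fraction.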

  16. Web GIS in practice VII: stereoscopic 3-D solutions for online maps and virtual globes

    USGS Publications Warehouse

    Boulos, Maged N.K.; Robinson, Larry R.

    2009-01-01

    Because our pupils are about 6.5 cm apart, each eye views a scene from a different angle and sends a unique image to the visual cortex, which then merges the images from both eyes into a single picture. The slight difference between the right and left images allows the brain to properly perceive the 'third dimension' or depth in a scene (stereopsis). However, when a person views a conventional 2-D (two-dimensional) image representation of a 3-D (three-dimensional) scene on a conventional computer screen, each eye receives essentially the same information. Depth in such cases can only be approximately inferred from visual cues in the image, such as perspective, as only one image is offered to both eyes. The goal of stereoscopic 3-D displays is to project a slightly different image into each eye to achieve a much truer and realistic perception of depth, of different scene planes, and of object relief. This paper presents a brief review of a number of stereoscopic 3-D hardware and software solutions for creating and displaying online maps and virtual globes (such as Google Earth) in "true 3D", with costs ranging from almost free to multi-thousand pounds sterling. A practical account is also given of the experience of the USGS BRD UMESC (United States Geological Survey's Biological Resources Division, Upper Midwest Environmental Sciences Center) in setting up a low-cost, full-colour stereoscopic 3-D system.

  17. Blood Pool Segmentation Results in Superior Virtual Cardiac Models than Myocardial Segmentation for 3D Printing.

    PubMed

    Farooqi, Kanwal M; Lengua, Carlos Gonzalez; Weinberg, Alan D; Nielsen, James C; Sanz, Javier

    2016-08-01

    The method of cardiac magnetic resonance (CMR) three-dimensional (3D) image acquisition and post-processing which should be used to create optimal virtual models for 3D printing has not been studied systematically. Patients (n = 19) who had undergone CMR including both 3D balanced steady-state free precession (bSSFP) imaging and contrast-enhanced magnetic resonance angiography (MRA) were retrospectively identified. Post-processing for the creation of virtual 3D models involved using both myocardial (MS) and blood pool (BP) segmentation, resulting in four groups: Group 1-bSSFP/MS, Group 2-bSSFP/BP, Group 3-MRA/MS and Group 4-MRA/BP. The models created were assessed by two raters for overall quality (1-poor; 2-good; 3-excellent) and ability to identify predefined vessels (1-5: superior vena cava, inferior vena cava, main pulmonary artery, ascending aorta and at least one pulmonary vein). A total of 76 virtual models were created from 19 patient CMR datasets. The mean overall quality scores for Raters 1/2 were 1.63 ± 0.50/1.26 ± 0.45 for Group 1, 2.12 ± 0.50/2.26 ± 0.73 for Group 2, 1.74 ± 0.56/1.53 ± 0.61 for Group 3 and 2.26 ± 0.65/2.68 ± 0.48 for Group 4. The numbers of identified vessels for Raters 1/2 were 4.11 ± 1.32/4.05 ± 1.31 for Group 1, 4.90 ± 0.46/4.95 ± 0.23 for Group 2, 4.32 ± 1.00/4.47 ± 0.84 for Group 3 and 4.74 ± 0.56/4.63 ± 0.49 for Group 4. Models created using BP segmentation (Groups 2 and 4) received significantly higher ratings than those created using MS for both overall quality and number of vessels visualized (p < 0.05), regardless of the acquisition technique. There were no significant differences between Groups 1 and 3. The ratings for Raters 1 and 2 had good correlation for overall quality (ICC = 0.63) and excellent correlation for the total number of vessels visualized (ICC = 0.77). The intra-rater reliability was good for Rater A (ICC = 0.65). Three models were successfully printed

  19. Serious games for screening pre-dementia conditions: from virtuality to reality? A pilot project

    PubMed Central

    Zucchella, Chiara; Sinforiani, Elena; Tassorelli, Cristina; Cavallini, Elena; Tost-Pardell, Daniela; Grau, Sergi; Pazzi, Stefania; Puricelli, Stefano; Bernini, Sara; Bottiroli, Sara; Vecchi, Tomaso; Sandrini, Giorgio; Nappi, Giuseppe

    2014-01-01

    Summary Conventional cognitive assessment is based on a pencil-and-paper neuropsychological evaluation, which is time consuming, expensive and requires the involvement of several professionals. Information and communication technology could be exploited to allow the development of tools that are easy to use, reduce the amount of data processing, and provide controllable test conditions. Serious games (SGs) have the potential to be new and effective tools in the management and treatment of cognitive impairments in the elderly. Moreover, by adopting SGs in 3D virtual reality settings, cognitive functions might be evaluated using tasks that simulate daily activities, increasing the “ecological validity” of the assessment. In this commentary we report our experience in the creation of the Smart Aging platform, a 3D SG- and virtual-environment-based platform for the early identification and characterization of mild cognitive impairment. PMID:25473734

  20. Dynamic WIFI-Based Indoor Positioning in 3D Virtual World

    NASA Astrophysics Data System (ADS)

    Chan, S.; Sohn, G.; Wang, L.; Lee, W.

    2013-11-01

    A web-based system based on the 3DTown project was proposed using a Google Earth plug-in that brings information from indoor positioning devices and real-time sensors into an integrated 3D indoor and outdoor virtual world to visualize the dynamics of urban life within the 3D context of a city. We addressed limitations of the 3DTown project with particular emphasis on the video surveillance camera used for indoor tracking purposes. The proposed solution was to utilize wireless local area network (WLAN) WiFi as a replacement technology for localizing objects of interest due to the widespread availability and large coverage area of WiFi in indoor building spaces. Indoor positioning was performed using WiFi without modifying existing building infrastructure or introducing additional access points (APs). A hybrid probabilistic approach was used for indoor positioning based on a previously recorded WiFi fingerprint database in the Petrie Science and Engineering building at York University. In addition, we have developed a 3D building modeling module that allows for efficient reconstruction of outdoor building models to be integrated with indoor building models; a sensor module for receiving, distributing, and visualizing real-time sensor data; and a web-based visualization module for users to explore the dynamic urban life in a virtual world. In order to solve the problems in the implementation of the proposed system, we introduce approaches for integration of indoor building models with indoor positioning data, as well as real-time sensor information and visualization on the web-based system. In this paper we report the preliminary results of our prototype system, demonstrating the system's capability for implementing a dynamic 3D indoor and outdoor virtual world that is composed of discrete modules connected through pre-determined communication protocols.
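
    The hybrid probabilistic positioning itself is not reproduced here; as an illustration of the underlying fingerprinting idea, the sketch below uses the simplest deterministic variant, nearest-neighbour matching of an observed RSSI vector against a recorded fingerprint database. The location names and signal values are invented for the example.

    ```python
    import math

    def locate(observed, fingerprints):
        """Nearest-neighbour WiFi fingerprint positioning: return the
        reference location whose recorded RSSI vector (dBm, one value
        per access point) is closest in Euclidean distance to the
        observed vector. `fingerprints` maps location -> RSSI vector."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(fingerprints, key=lambda loc: dist(observed, fingerprints[loc]))
    ```

    A probabilistic variant would replace the hard minimum with a likelihood over locations, which smooths out RSSI noise at the cost of needing signal-strength distributions per reference point.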

  1. Comparing two types of navigational interfaces for Virtual Reality.

    PubMed

    Teixeira, Luís; Vilar, Elisângela; Duarte, Emília; Rebelo, Francisco; da Silva, Fernando Moreira

    2012-01-01

    Previous studies suggest significant differences between navigating virtual environments in a life-like walking manner (i.e., using treadmills or walk-in-place techniques) and virtual navigation (i.e., flying while actually standing still). The latter option, which usually involves hand-centric devices (e.g., joysticks), is the most common in Virtual Reality-based studies, mostly due to low cost and lower space and technology demands. However, new interaction devices originally conceived for videogames have recently become available, offering interesting potential for research. This study aimed to explore the potential of the Nintendo Wii Balance Board as a navigation interface in a Virtual Environment presented in an immersive Virtual Reality system. Comparing participants' performance while engaged in a simulated emergency egress allows us to determine the adequacy of such an alternative navigation interface on the basis of empirical results. Forty university students participated in this study. Results show that participants were more efficient when performing navigation tasks using the Joystick than with the Balance Board. However, there were no significant differences in behavioral compliance with exit signs. Therefore, this study suggests that, at least for tasks similar to the one studied, the Balance Board has good potential to be used as a navigation interface for Virtual Reality systems.

  2. [A new concept in digestive surgery: the computer assisted surgical procedure, from virtual reality to telemanipulation].

    PubMed

    Marescaux, J; Clément, J M; Nord, M; Russier, Y; Tassetti, V; Mutter, D; Cotin, S; Ayache, N

    1997-11-01

    Surgical simulation increasingly appears to be an essential aspect of tomorrow's surgery. The development of a hepatic surgery simulator is an advanced concept calling for a new writing system which will transform the medical world: virtual reality. Virtual reality extends the perception of our five senses by representing more than the real state of things by means of computer science and robotics. It consists of three concepts: immersion, navigation and interaction. Three reasons have led us to develop this simulator: the first is to provide the surgeon with a comprehensive visualisation of the organ. The second is to allow for planning and surgical simulation that could be compared with the detailed flight plan for a commercial jet pilot. The third lies in the fact that virtual reality is an integrated part of the concept of the computer assisted surgical procedure. The project consists of a sophisticated simulator which has to meet five requirements: visual fidelity, interactivity, physical properties, physiological properties, and sensory input and output. In this report we describe how to obtain a realistic 3D model of the liver from two-dimensional (2D) medical images for anatomical and surgical training. The introduction of a tumor and the consequent planning and virtual resection are also described, as are force feedback and real-time interaction. PMID:9554121

  4. Virtual Reality Simulation of the International Space Welding Experiment

    NASA Technical Reports Server (NTRS)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer-simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it provides a sense of immersion. Good examples of virtual reality simulators are the flight simulators used by all branches of the military to train pilots for combat in high-performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment (ISWE). My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) in collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) in collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted display and tested software filters to correct the problem; (5) in collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR workstation, described below.

  5. Avalanche for shape and feature-based virtual screening with 3D alignment.

    PubMed

    Diller, David J; Connell, Nancy D; Welsh, William J

    2015-11-01

    This report introduces a new ligand-based virtual screening tool called Avalanche that incorporates both shape- and feature-based comparison with three-dimensional (3D) alignment between the query molecule and test compounds residing in a chemical database. Avalanche proceeds in two steps. The first step is an extremely rapid shape/feature based comparison which is used to narrow the focus from potentially millions or billions of candidate molecules and conformations to a more manageable number that are then passed to the second step. The second step is a detailed yet still rapid 3D alignment of the remaining candidate conformations to the query conformation. Using the 3D alignment, these remaining candidate conformations are scored, re-ranked and presented to the user as the top hits for further visualization and evaluation. To provide further insight into the method, the results from two prospective virtual screens are presented which show the ability of Avalanche to identify hits from chemical databases that would likely be missed by common substructure-based or fingerprint-based search methods. The Avalanche method is extended to enable patent landscaping, i.e., structural refinements to improve the patentability of hits for deployment in drug discovery campaigns. PMID:26458937
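    The two-step screen described above can be sketched as a generic filter-then-align pipeline. The scoring functions below are toy stand-ins (volume and feature-count comparisons), not the actual Avalanche descriptors, and all compound data are made up:

```python
# Hypothetical sketch of a two-stage, ligand-based virtual screen in the
# spirit of Avalanche: a cheap shape/feature filter first narrows the
# database, then a costlier 3D-alignment score re-ranks the survivors.
# The scoring functions are toy stand-ins, not the published method.

def shape_feature_score(query, conf):
    # Stage 1: fast descriptor comparison (toy: volume similarity only).
    return -abs(query["volume"] - conf["volume"])

def alignment_score(query, conf):
    # Stage 2: detailed alignment scoring (toy: feature agreement plus volume).
    return (-abs(query["features"] - conf["features"])
            - 0.1 * abs(query["volume"] - conf["volume"]))

def screen(query, database, keep=3, top=2):
    # Stage 1: prune a large conformer pool down to a manageable shortlist.
    shortlist = sorted(database,
                       key=lambda c: shape_feature_score(query, c),
                       reverse=True)[:keep]
    # Stage 2: align and re-rank only the shortlist, then report top hits.
    ranked = sorted(shortlist,
                    key=lambda c: alignment_score(query, c),
                    reverse=True)
    return [c["name"] for c in ranked[:top]]

query = {"volume": 250.0, "features": 4}
db = [
    {"name": "cmpd_A", "volume": 240.0, "features": 4},
    {"name": "cmpd_B", "volume": 500.0, "features": 1},
    {"name": "cmpd_C", "volume": 260.0, "features": 3},
    {"name": "cmpd_D", "volume": 255.0, "features": 4},
]
print(screen(query, db))  # ['cmpd_D', 'cmpd_A']
```

    The design point is that stage 1 must be cheap enough to run over the whole database, while stage 2 may be arbitrarily detailed because it only sees the shortlist.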

  6. Virtual Reality and Cyberspace: From Science Fiction to Science Fact.

    ERIC Educational Resources Information Center

    Stone, Robert J.

    1991-01-01

    Traces the history of virtual reality (VR), or cyberspace, and describes some of the research and development efforts currently being carried out in the United Kingdom, Europe, and the United States. Applications of VR in interactive computer-aided design (CAD), the military, leisure activities, spaceflight, teleconferencing, and medicine are…

  7. Virtual Reality: Teaching Tool of the Twenty-First Century?

    ERIC Educational Resources Information Center

    Hoffman, Helene; Vu, Dzung

    1997-01-01

    Virtual reality-based procedural and surgical simulations promise to revolutionize medical training. A wide range of simulations representing diverse content areas and varied implementation strategies are under development or in early use. The new systems will make broad-based training experiences available for students at all levels without risks…

  8. 2010 and Beyond: Virtual Reality and the Communication Classroom.

    ERIC Educational Resources Information Center

    Siddens, Paul J., III

    The use of virtual reality technology in the Communication discipline is a challenge that educators in the field should investigate thoroughly and begin to embrace as they move into the 21st century. Classrooms with access to the Internet allow students to move outside the physical boundaries of the classroom and suggest a significant change in…

  9. Virtual Reality Augmentation for Functional Assessment and Treatment of Stuttering

    ERIC Educational Resources Information Center

    Brundage, Shelley B.

    2007-01-01

    Stuttering characteristics, assessment, and treatment principles present challenges to assessment and treatment that can be addressed with virtual reality (VR) technology. This article describes how VR can be used to assist clinicians in meeting some of these challenges with adults who stutter. A review of current VR research at the Stuttering…

  10. Virtual Reality for Life Skills Education: Program Evaluation

    ERIC Educational Resources Information Center

    Vogel, Jennifer; Bowers, Clint; Meehan, Cricket; Hoeft, Raegan; Bradley, Kristy

    2004-01-01

    A program evaluation was completed for a Virtual Reality (VR) pilot project intended to aid deaf children in learning various life skills which they may be at risk of not adequately learning. Such skills include crossing the street safely, exiting a building during a fire drill, and avoiding situations in which strangers may harm them. The VR was…

  11. Advanced Virtual Reality Simulations in Aerospace Education and Research

    NASA Astrophysics Data System (ADS)

    Plotnikova, L.; Trivailo, P.

    2002-01-01

    Recent research developments in Aerospace Engineering at RMIT University have demonstrated great potential for using Virtual Reality simulations as a very effective tool in advanced structures and dynamics applications. They have also been extremely successful in the teaching of various undergraduate and postgraduate courses, presenting complex concepts in structural and dynamic design. Characteristic examples relate to classical orbital mechanics, spacecraft attitude dynamics and structural dynamics. Advanced simulations, reflecting current research by the authors, are mainly related to the implementation of various non-linear dynamic techniques, including the use of Kane's equations to study the dynamics of space tethered satellite systems and the co-rotational finite element method to study reconfigurable robotic systems undergoing large rotations and large translations. This article describes the numerical implementation of modern methods of dynamics and concentrates on the post-processing stage of the dynamic simulations. Numerous examples of building stand-alone Virtual Reality animations, designed by the authors, are discussed in detail. The striking feature of the developed technology is the use of standard mathematical packages, such as MATLAB, as a post-processing tool to generate Virtual Reality Modeling Language (VRML) files with interactive graphics and audio effects. These stand-alone demonstration files can be run under Netscape or Microsoft Internet Explorer and do not require MATLAB. This technology enables scientists to easily share their results with colleagues over the Internet, contributing to flexible learning development at schools and universities.
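    The post-processing idea, dumping simulation results from a math package into a VRML file for in-browser viewing, can be illustrated with a minimal generator. The Python stand-in and the orbit samples below are hypothetical, not the authors' MATLAB code:

```python
import math

# Minimal sketch of emitting a VRML 2.0 (VRML97) scene file from
# simulation output, analogous to the MATLAB post-processing step the
# abstract describes. The trajectory data below are made up.

def write_vrml(path, positions, radius=0.1):
    """Write one small sphere per trajectory sample point."""
    lines = ["#VRML V2.0 utf8"]  # mandatory VRML97 header line
    for x, y, z in positions:
        lines.append(
            "Transform { translation %.3f %.3f %.3f children [ "
            "Shape { geometry Sphere { radius %.2f } } ] }"
            % (x, y, z, radius))
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# Toy circular-orbit samples standing in for a dynamics simulation result.
orbit = [(math.cos(t), 0.0, math.sin(t)) for t in (0.0, 1.57, 3.14)]
write_vrml("orbit.wrl", orbit)
```

    Because the output is plain text, any package that can write files can produce it; the browser plug-in (or a modern X3D viewer) does the rendering.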

  12. Virtual Reality in the Teaching of the Technical Drawing.

    ERIC Educational Resources Information Center

    Gomez, Luis Alberto; Lemos, David; Carlos de Souza, Antonio; Speck, Henderson Jose

    This paper proposes the use of the Virtual Reality Modeling Language (VRML) for teaching technical drawing projections. Three-dimensional models and exercises are delivered to students over the Internet, substituting for the old wooden models in the classroom. An introduction to the VRML language is presented. A detailed description of how models are…

  13. Virtual Reality in Psychological, Medical and Pedagogical Applications

    ERIC Educational Resources Information Center

    Eichenberg, Christiane, Ed.

    2012-01-01

    This book has an aim to present latest applications, trends and developments of virtual reality technologies in three humanities disciplines: in medicine, psychology and pedagogy. Studies show that people in both educational as well as in the medical therapeutic range expect more and more that modern media are included in the corresponding demand…

  14. Immersive Training Systems: Virtual Reality and Education and Training.

    ERIC Educational Resources Information Center

    Psotka, Joseph

    1995-01-01

    Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR, analyzes cognitive variables connected to immersion, how it is generated in synthetic environments and its benefits. Discusses value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…

  15. Virtual Reality Hypermedia Design Frameworks for Science Instruction.

    ERIC Educational Resources Information Center

    Maule, R. William; Oh, Byron; Check, Rosa

    This paper reports on a study that conceptualizes a research framework to aid software design and development for virtual reality (VR) computer applications for instruction in the sciences. The framework provides methodologies for the processing, collection, examination, classification, and presentation of multimedia information within hyperlinked…

  16. Using Virtual Reality To Bring Your Instruction to Life.

    ERIC Educational Resources Information Center

    Gaddis, Tony

    Prepared by the manager of a virtual reality (VR) laboratory at North Carolina's Haywood Community College, the three papers collected in this document are designed to help instructors incorporate VR into their classes. The first paper reviews the characteristics of VR, defining it as a computer-generated simulation of a three-dimensional…

  17. Feasibility of Virtual Reality Environments for Adolescent Social Anxiety Disorder

    ERIC Educational Resources Information Center

    Parrish, Danielle E.; Oxhandler, Holly K.; Duron, Jacuelynn F.; Swank, Paul; Bordnick, Patrick

    2016-01-01

    Purpose: This study assessed the feasibility of virtual reality (VR) exposure as an assessment and treatment modality for youth with social anxiety disorder (SAD). Methods: Forty-one adolescents, 20 of whom were identified as having SAD, were recruited from a community sample. Youth with and without SAD were exposed to two social virtual…

  18. Issues Surrounding the Use of Virtual Reality in Geographic Education

    ERIC Educational Resources Information Center

    Lisichenko, Richard

    2015-01-01

    As with all classroom innovations intended to improve geographic education, the adoption of virtual reality (VR) poses issues for consideration prior to endorsing its use. Of these, effectiveness, implementation, and safe use need to be addressed. Traditionally, sense of place, geographic knowledge, and firsthand experiences provided by field…

  19. The Future of Virtual Reality in the Classroom

    ERIC Educational Resources Information Center

    Vance, Amelia

    2016-01-01

    As state boards of education and other state policymakers consider the future of schools, sorting fad technology from technology that accelerates learning is key. Virtual reality (VR) is one such technology with promise that seems unlikely to fizzle. Hailed as potentially transformative for education and still in the early stages of application,…

  20. Exploration through Virtual Reality: Encounters with the Target Culture

    ERIC Educational Resources Information Center

    O'Brien, Mary Grantham; Levy, Richard M.

    2008-01-01

    This paper presents the results of a study on the use of a virtual reality (VR) world in a German language classroom. After participating in a lesson on the use of commands, students experienced the language and culture through navigation in a VR world. It is argued that this new medium allows for students to be immersed in the target culture and…

  1. Teaching Marketing through a Micro-Economy in Virtual Reality

    ERIC Educational Resources Information Center

    Drake-Bridges, Erin; Strelzoff, Andrew; Sulbaran, Tulio

    2011-01-01

    Teaching retailing principles to students is a challenge because although real-world wholesale and retail decision making very heavily depends on dynamic conditions, classroom exercises are limited to abstract discussions and role-playing. This article describes two interlocking class projects taught using the virtual reality of secondlife.com,…

  2. A Virtual Reality Dance Training System Using Motion Capture Technology

    ERIC Educational Resources Information Center

    Chan, J. C. P.; Leung, H.; Tang, J. K. T.; Komura, T.

    2011-01-01

    In this paper, a new dance training system based on motion capture and virtual reality (VR) technologies is proposed. Our system is inspired by the traditional way of learning new movements: imitating the teacher's movements and listening to the teacher's feedback. A prototype of our proposed system is implemented, in which a student can imitate…

  3. Improving Weight Maintenance Using Virtual Reality (Second Life)

    ERIC Educational Resources Information Center

    Sullivan, Debra K.; Goetz, Jeannine R.; Gibson, Cheryl A.; Washburn, Richard A.; Smith, Bryan K.; Lee, Jaehoon; Gerald, Stephanie; Fincham, Tennille; Donnelly, Joseph E.

    2013-01-01

    Objective: Compare weight loss and maintenance between a face-to-face (FTF) weight management clinic and a clinic delivered via virtual reality (VR). Methods: Participants were randomized to 3 months of weight loss with a weekly clinic delivered via FTF or VR and then 6 months' weight maintenance delivered with VR. Data were collected at baseline…

  4. Language Learning in Virtual Reality Environments: Past, Present, and Future

    ERIC Educational Resources Information Center

    Lin, Tsun-Ju; Lan, Yu-Ju

    2015-01-01

    This study investigated the research trends in language learning in a virtual reality environment by conducting a content analysis of findings published in the literature from 2004 to 2013 in four top ranked computer-assisted language learning journals: "Language Learning & Technology," "CALICO Journal," "Computer…

  5. A Constructivist Approach to Virtual Reality for Experiential Learning

    ERIC Educational Resources Information Center

    Aiello, P.; D'Elia, F.; Di Tore, S.; Sibilio, M.

    2012-01-01

    Consideration of a possible use of virtual reality technologies in school contexts requires gathering together the suggestions of many scientific domains aimed at "understanding" the features of these same tools that let them offer valid support to the teaching-learning processes in educational settings. Specifically, the present study is aimed at…

  6. Virtual reality and haptic interface for cellular injection simulation.

    PubMed

    Banerjee, P Pat; Rizzi, Silvio; Luciano, Cristian

    2007-01-01

    This paper presents the application of virtual reality and haptics to the simulation of cellular micromanipulation for research, training and automation purposes. A collocated graphic/haptic working volume provides a realistic visual and force feedback to guide the user in performing a cell injection procedure. A preliminary experiment shows promising results.

  7. Virtual Reality: An Experiential Tool for Clinical Psychology

    ERIC Educational Resources Information Center

    Riva, Giuseppe

    2009-01-01

    Several Virtual Reality (VR) applications for the understanding, assessment and treatment of mental health problems have been developed in the last 15 years. Typically, in VR the patient learns to manipulate problematic situations related to his/her problem. In fact, VR can be described as an advanced form of human-computer interface that is able…

  8. Virtual reality as a distraction technique in chronic pain patients.

    PubMed

    Wiederhold, Brenda K; Gao, Kenneth; Sulea, Camelia; Wiederhold, Mark D

    2014-06-01

    We explored the use of virtual reality distraction techniques for use as adjunctive therapy to treat chronic pain. Virtual environments were specifically created to provide pleasant and engaging experiences where patients navigated on their own through rich and varied simulated worlds. Real-time physiological monitoring was used as a guide to determine the effectiveness and sustainability of this intervention. Human factors studies showed that virtual navigation is a safe and effective method for use with chronic pain patients. Chronic pain patients demonstrated significant relief in subjective ratings of pain that corresponded to objective measurements in peripheral, noninvasive physiological measures.

  9. Learning Science in a Virtual Reality Application: The Impacts of Animated-Virtual Actors' Visual Complexity

    ERIC Educational Resources Information Center

    Kartiko, Iwan; Kavakli, Manolya; Cheng, Ken

    2010-01-01

    As the technology in computer graphics advances, Animated-Virtual Actors (AVAs) in Virtual Reality (VR) applications become increasingly rich and complex. Cognitive Theory of Multimedia Learning (CTML) suggests that complex visual materials could hinder novice learners from attending to the lesson properly. On the other hand, previous studies have…

  10. Learning and Teaching in Virtual Worlds: Implications of Virtual Reality for Education.

    ERIC Educational Resources Information Center

    Moore, Paul

    1995-01-01

    Surveys the research into virtual reality (VR) and focuses on the implications of immersive virtual worlds for learning and teaching. Topics include how VR differs from other forms of interactive multimedia, VR and the development of educational theory and methodology, and case studies in educational VR research. (Author/LRW)

  11. Applications of virtual reality to nuclear safeguards and non-proliferation

    SciTech Connect

    Stansfield, S.

    1996-12-31

    This paper presents several applications of virtual reality relevant to the areas of nuclear safeguards and non-proliferation. Each of these applications was developed to the prototype stage at Sandia National Laboratories' Virtual Reality and Intelligent Simulation laboratory. These applications include the use of virtual reality for facility visualization, training of inspection personnel, and security and monitoring of nuclear facilities.

  12. Using Virtual Reality Environment to Improve Joint Attention Associated with Pervasive Developmental Disorder

    ERIC Educational Resources Information Center

    Cheng, Yufang; Huang, Ruowen

    2012-01-01

    The focus of this study is using a data glove to practice joint attention skills in a virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe environment for people with PDD. In particular, when they make errors during practice in the virtual reality environment, there is no suffering or…

  13. The Potential of Using Virtual Reality Technology in Physical Activity Settings

    ERIC Educational Resources Information Center

    Pasco, Denis

    2013-01-01

    In recent years, virtual reality technology has been successfully used for learning purposes. The purposes of the article are to examine current research on the role of virtual reality in physical activity settings and discuss potential application of using virtual reality technology to enhance learning in physical education. The article starts…

  14. Education about Hallucinations Using an Internet Virtual Reality System: A Qualitative Survey

    ERIC Educational Resources Information Center

    Yellowlees, Peter M.; Cook, James N.

    2006-01-01

    Objective: The authors evaluate an Internet virtual reality technology as an education tool about the hallucinations of psychosis. Method: This is a pilot project using Second Life, an Internet-based virtual reality system, in which a virtual reality environment was constructed to simulate the auditory and visual hallucinations of two patients…

  15. The role of presence in virtual reality exposure therapy

    PubMed Central

    Price, Matthew; Anderson, Page

    2013-01-01

    A growing body of literature suggests that virtual reality is a successful tool for exposure therapy in the treatment of anxiety disorders. Virtual reality (VR) researchers posit the construct of presence, defined as the interpretation of an artificial stimulus as if it were real, to be a presumed factor that enables anxiety to be felt during virtual reality exposure therapy (VRE). However, a handful of empirical studies on the relation between presence and anxiety in VRE have yielded mixed findings. The current study tested the following hypotheses about the relation between presence and anxiety in VRE with a clinical sample of fearful flyers: (1) presence is related to in-session anxiety; (2) presence mediates the extent that pre-existing (pre-treatment) anxiety is experienced during exposure with VR; (3) presence is positively related to the amount of phobic elements included within the virtual environment; (4) presence is related to treatment outcome. Results supported presence as a factor that contributes to the experience of anxiety in the virtual environment as well as a relation between presence and the phobic elements, but did not support a relation between presence and treatment outcome. The study suggests that presence may be a necessary but insufficient requirement for successful VRE. PMID:17145164

  16. An efficient 3D R-tree spatial index method for virtual geographic environments

    NASA Astrophysics Data System (ADS)

    Zhu, Qing; Gong, Jun; Zhang, Yeting

    A three-dimensional (3D) spatial index is required for real-time applications that integrate the organization and management of above-ground, underground, indoor and outdoor objects in virtual geographic environments. As one of the most promising methods, the R-tree spatial index has received increasing attention in 3D geospatial database management. Because existing R-tree methods are usually limited by low efficiency, owing to the critical overlap of sibling nodes and uneven node sizes, this paper introduces the k-means clustering method and employs the 3D overlap volume, 3D coverage volume and the minimum-bounding-box shape value of nodes as integrated grouping criteria. A new spatial cluster grouping algorithm and R-tree insertion algorithm are then proposed. Comparative performance analysis of spatial indexing shows that the new method drastically minimizes the overlap of R-tree sibling nodes and maintains a balance in node volumes.
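    The grouping idea can be sketched as follows: cluster boxes by centroid with k-means, then judge each group by the volume of its minimum bounding box, one of the criteria the paper combines with overlap volume and shape value. This is a pure-Python toy on assumed data, not the published algorithm:

```python
# Toy sketch of the node-grouping idea: cluster 3D bounding boxes by
# their centroids with k-means, then judge each group by the volume of
# its minimum bounding box. Not the published algorithm.

def centroid(box):
    (x0, y0, z0), (x1, y1, z1) = box
    return ((x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2)

def group_boxes(boxes, k, iters=10):
    centers = [centroid(b) for b in boxes[:k]]  # naive seeding
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for b in boxes:
            p = centroid(b)
            dists = [sum((a - c) ** 2 for a, c in zip(p, ctr))
                     for ctr in centers]
            groups[dists.index(min(dists))].append(b)
        # Recompute each center as the mean of its members' centroids.
        centers = [tuple(sum(v) / len(v)
                         for v in zip(*[centroid(b) for b in g]))
                   if g else centers[i]
                   for i, g in enumerate(groups)]
    return groups

def mbb_volume(boxes):
    # Volume of the minimum bounding box enclosing a group of boxes.
    lo = [min(b[0][i] for b in boxes) for i in range(3)]
    hi = [max(b[1][i] for b in boxes) for i in range(3)]
    return (hi[0] - lo[0]) * (hi[1] - lo[1]) * (hi[2] - lo[2])

boxes = [((0, 0, 0), (1, 1, 1)), ((0.5, 0, 0), (1.5, 1, 1)),
         ((10, 10, 10), (11, 11, 11)), ((10.5, 10, 10), (11.5, 11, 11))]
groups = group_boxes(boxes, 2)
print([len(g) for g in groups])  # two tight clusters of two boxes each
```

    Grouping spatially close boxes keeps each tree node's bounding box small, which is exactly what reduces sibling overlap during R-tree search.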

  17. 3D QSAR Studies, Pharmacophore Modeling and Virtual Screening on a Series of Steroidal Aromatase Inhibitors

    PubMed Central

    Xie, Huiding; Qiu, Kaixiong; Xie, Xiaoguang

    2014-01-01

    Aromatase is among the most important targets in the treatment of estrogen-dependent cancers. In order to search for potent steroidal aromatase inhibitors (SAIs) with fewer side effects and to overcome cellular resistance, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were performed on a series of SAIs to build 3D QSAR models. Reliable and predictive CoMFA and CoMSIA models were obtained (CoMFA: q^2 = 0.636, r^2_ncv = 0.988, r^2_pred = 0.658; CoMSIA: q^2 = 0.843, r^2_ncv = 0.989, r^2_pred = 0.601). This 3D QSAR approach provides significant insights that can be used to develop novel and potent SAIs. In addition, the genetic algorithm with linear assignment of hypermolecular alignment of database (GALAHAD) was used to derive 3D pharmacophore models. The selected pharmacophore model contains two acceptor atoms and four hydrophobic centers, and was used as a 3D query for virtual screening against the NCI2000 database. Six hit compounds were obtained and their biological activities were further predicted by the CoMFA and CoMSIA models, which is expected to aid the design of potent and novel SAIs. PMID:25405729

  18. Virtual Sculpting and 3D Printing for Young People with Disabilities.

    PubMed

    Mcloughlin, Leigh; Fryazinov, Oleg; Moseley, Mark; Sanchez, Mathieu; Adzhiev, Valery; Comninos, Peter; Pasko, Alexander

    2016-01-01

    The SHIVA project was designed to provide virtual sculpting tools for young people with complex disabilities, allowing them to engage with artistic and creative activities that they might otherwise never be able to access. Their creations are then physically built using 3D printing. To achieve this, the authors built a generic, accessible GUI and a suitable geometric modeling system and used these to produce two prototype modeling exercises. These tools were deployed in a school for students with complex disabilities and are now being used for a variety of educational and developmental purposes. This article presents the project's motivations, approach, and implementation details together with initial results, including 3D printed objects designed by young people with disabilities. PMID:26780761

  20. Water-friendly virtual reality pain control during wound care.

    PubMed

    Hoffman, Hunter G; Patterson, David R; Magula, Jeff; Carrougher, Gretchen J; Zeltzer, Karen; Dagadakis, Stephen; Sharar, Sam R

    2004-02-01

    Recent research suggests that entering an immersive virtual environment can serve as a powerful nonpharmacologic analgesic for severe burn pain. The present case study describes an attempt to use water-friendly virtual reality (VR) technology with a burn patient undergoing wound care in a hydrotherapy tub. The patient was a 40-year-old male with 19% total body surface area deep flame/flash burns to his legs, neck, back, and buttocks. The virtual reality treatment decreased the patient's sensory and affective pain ratings and decreased the amount of time he spent thinking about his pain during wound care. We believe that VR analgesia works by drawing attention away from the wound care, leaving less attention available to process incoming pain signals. The water-friendly VR helmet dramatically increases the number of patients with severe burns who could potentially be treated with VR (see http://www.vrpain.com).

  1. The assessment of virtual reality for human anatomy instruction

    NASA Technical Reports Server (NTRS)

    Benn, Karen P.

    1994-01-01

    This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, however, lack of financial resources limits anatomy instruction to textbooks and lectures. Human anatomy is three dimensional, unlike the two-dimensional depictions found in textbooks and on conventional computer screens. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.
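    As a minimal illustration of the four abdominal quadrants referenced above, a point can be classified by its position relative to the umbilicus placed at the origin. The axis convention here (+x toward the patient's right, +y superior) is an assumption for the sketch, not part of the study:

```python
# Toy illustration of the four abdominal quadrants: classify a point by
# its position relative to the umbilicus (origin). Assumed axes:
# +x toward the patient's right, +y toward the head (superior).

def quadrant(x, y):
    horiz = "right" if x >= 0 else "left"
    vert = "upper" if y >= 0 else "lower"
    return f"{horiz} {vert} quadrant"

# e.g., a structure below and to the patient's right of the umbilicus
print(quadrant(3.0, -5.0))  # right lower quadrant
```

    A virtual testing environment would pose the inverse task to the student: given an organ's rendered position, name its quadrant or anatomical division.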

  2. The use of virtual reality in acrophobia research and treatment.

    PubMed

    Coelho, Carlos M; Waters, Allison M; Hine, Trevor J; Wallis, Guy

    2009-06-01

    Acrophobia, or fear of heights, is a widespread and debilitating anxiety disorder affecting perhaps 1 in 20 adults. Virtual reality (VR) technology has been used in the psychological treatment of acrophobia since 1995, and has come to dominate the treatment of numerous anxiety disorders. It is now known that virtual reality exposure therapy (VRET) regimens are highly effective for acrophobia treatment. This paper reviews current theoretical understanding of acrophobia as well as the evolution of its common treatments from the traditional exposure therapies to the most recent virtually guided ones. In particular, the review focuses on recent innovations in the use of VR technology and discusses the benefits it may offer for examining the underlying causes of the disorder, allowing for the systematic assessment of interrelated factors such as the visual, vestibular and postural control systems.

  3. Envisioning the future of home care: applications of immersive virtual reality.

    PubMed

    Brennan, Patricia Flatley; Arnott Smith, Catherine; Ponto, Kevin; Radwin, Robert; Kreutz, Kendra

    2013-01-01

    Accelerating the design of technologies to support health in the home requires (1) a better understanding of how the household context shapes consumer health behaviors and (2) the opportunity for engineers, designers, and health professionals to systematically study the home environment. We developed the Living Environments Laboratory (LEL) with a fully immersive, six-sided virtual reality CAVE to enable recreation of a broad range of household environments. We have successfully developed a virtual apartment, including a kitchen, living space, and bathroom. Over 2000 people have visited the LEL CAVE. Participants use an electronic wand to activate common household affordances such as opening a refrigerator door or lifting a cup. Challenges currently being explored include creating natural gestures for interacting with virtual objects, developing robust, simple procedures for capturing actual living environments and rendering them in 3D visualizations, and devising systematic, stable terminologies to characterize home environments.

  4. Design Virtual Reality Scene Roam for Tour Animations Base on VRML and Java

    NASA Astrophysics Data System (ADS)

    Cao, Zaihui; hu, Zhongyan

    Virtual reality has been involved in a wide range of academic and commercial applications. It can give users a natural feeling of the environment by creating realistic virtual worlds. Implementing a virtual tour through a model of a tourist area on the web has become fashionable. In this paper, we present a web-based application that allows a user to walk through, see, and interact with a fully three-dimensional model of the tourist area. Issues regarding navigation and disorientation are addressed, and we suggest a combination of the metro map and an intuitive navigation system. Finally, we present a prototype which implements our ideas. The application of VR techniques integrates the visualization and animation of three-dimensional modelling into landscape analysis. The use of the VRML format makes it possible to obtain views of the 3D model and to explore it in real time. This is an important goal for the spatial information sciences.

  5. Going Virtual… or Not: Development and Testing of a 3D Virtual Astronomy Environment

    NASA Astrophysics Data System (ADS)

    Ruzhitskaya, L.; Speck, A.; Ding, N.; Baldridge, S.; Witzig, S.; Laffey, J.

    2013-04-01

    We present preliminary results of a pilot study of students' knowledge transfer of an astronomy concept into a new environment. We also share our discoveries on which aspects of a 3D environment students consider motivational or discouraging for their learning. This study was conducted among 64 non-science major students enrolled in an astronomy laboratory course. During the course, students learned the concept and applications of Kepler's laws using a 2D interactive environment. Later in the semester, the students were placed in a 3D environment in which they were asked to conduct observations and to answer a set of questions pertaining to Kepler's laws of planetary motion. In this study, we were interested in observing, scrutinizing, and assessing students' behavior: from the choices they made while creating their avatars (virtual representations), to the tools they chose to use, to their navigational patterns, to their levels of discourse in the environment. These observations helped us to identify which features of the 3D environment our participants found helpful and interesting and which tools created unnecessary clutter and distraction. The students' social behavior patterns in the virtual environment, together with their answers to the questions, helped us to determine how well they understood Kepler's laws, how well they could transfer the concepts to a new situation, and at what point a motivational tool such as a 3D environment becomes a disruption to constructive learning. Our findings confirmed that students construct deeper knowledge of a concept when they are fully immersed in the environment.

  6. Computer Based Training: Field Deployable Trainer and Shared Virtual Reality

    NASA Technical Reports Server (NTRS)

    Mullen, Terence J.

    1997-01-01

    Astronaut training has traditionally been conducted at specific sites with specialized facilities. Because of its size and nature, the training equipment is generally not portable. Efforts are now under way to develop training tools that can be taken to remote locations, including into orbit. Two of these efforts are the Field Deployable Trainer and Shared Virtual Reality projects. Field Deployable Trainer: NASA used the recent shuttle mission by astronaut Shannon Lucid to the Russian space station, Mir, as an opportunity to develop and test a prototype of an on-orbit computer training system. A laptop computer with a customized user interface, a set of specially prepared CDs, and video tapes were taken to the Mir by Ms. Lucid. Based upon the feedback following the Lucid flight, our team prepared materials for the next Mir visitor. Astronaut John Blaha will fly on NASA/Mir Long Duration Mission 3, set to launch in mid-September. He will take with him a customized hard disk drive and a package of compact disks containing training videos, references, and maps. The FDT team continues to explore and develop new and innovative ways to conduct offsite astronaut training using personal computers. Shared Virtual Reality Training: NASA's Space Flight Training Division has been investigating the use of virtual reality environments for astronaut training. Recent efforts have focused on activities requiring interaction by two or more people, called shared VR. Dr. Bowen Loftin, from the University of Houston, directs a virtual reality laboratory that conducts much of the NASA-sponsored research. I worked on a project involving the development of a virtual environment that can be used to train astronauts and others to operate a science unit called a Biological Technology Facility (BTF). Facilities like this will be used to house and control microgravity experiments on the space station.
It is hoped that astronauts and instructors will ultimately be able to share

  7. Analyzing Pathfinder data using virtual reality and superresolved imaging

    NASA Astrophysics Data System (ADS)

    Stoker, Carol R.; Zbinden, Eric; Blackmon, Theodore T.; Kanefsky, Bob; Hagen, Joel; Neveu, Charles; Rasmussen, Daryl; Schwehr, Kurt; Sims, Michael

    1999-04-01

    The Mars Pathfinder mission used a unique capability to rapidly generate and interactively display three-dimensional (3-D) photorealistic virtual reality (VR) models of the Martian surface. An interactive terrain visualization system creates and renders digital terrain models produced from stereo images taken by the Imager for Mars Pathfinder (IMP) camera. The stereo pipeline, an automated machine vision algorithm, correlates features between the left and right images to determine their disparity and computes the corresponding positions using the known camera geometry. These positions are connected to form a polygonal mesh upon which IMP images are overlaid as textures. During the Pathfinder mission, VR models were produced and displayed almost as fast as images were received. The VR models were viewed using MarsMap, an interface that allows the model to be viewed from any perspective driven by a standard three-button computer mouse. MarsMap incorporates graphical representations of the lander and rover and the sequence and spatial locations at which rover data were taken. Graphical models of the rover were placed in the model to indicate the rover position at the end of each day of the mission. Images taken by Sojourner cameras are projected into the model as 2-D "billboards" to show their proper perspective. Distance and angle measurements can be made on features viewed in the model using a mouse-driven 3-D cursor and a point-and-click interface. MarsMap was used to assist with archiving and planning Sojourner activities and to make detailed measurements of surface features, such as wind streaks and rock size and orientation, that are difficult to perform using 2-D images. Superresolution image processing is a computational method for improving image resolution by a factor of n^(1/2) by combining n independent images. This technique was used on Pathfinder to obtain better resolved images of Martian surface features. We show results from superresolving IMP camera
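
    The stereo pipeline described above recovers 3-D positions from feature disparity and the known camera geometry; for a rectified stereo pair this reduces to similar-triangle triangulation. A minimal sketch in Python — the focal length, baseline, and pixel coordinates below are illustrative assumptions, not actual IMP camera parameters:

```python
import numpy as np

def triangulate(disparity_px, x_px, y_px, focal_px, baseline_m, cx, cy):
    """Recover a 3-D point (meters, camera frame) from a rectified stereo pair.

    disparity_px -- horizontal pixel offset of the feature between the
                    left and right images (> 0 for a valid match)
    focal_px     -- focal length expressed in pixels
    baseline_m   -- separation of the two camera centers in meters
    cx, cy       -- principal point (optical center) in pixels
    """
    depth = focal_px * baseline_m / disparity_px   # Z from similar triangles
    x = (x_px - cx) * depth / focal_px             # back-project pixel to 3-D
    y = (y_px - cy) * depth / focal_px
    return np.array([x, y, depth])

# Hypothetical numbers: a feature with 20 px of disparity, seen by a camera
# with a 700 px focal length and a 15 cm baseline, lies 5.25 m away.
point = triangulate(disparity_px=20, x_px=420, y_px=260,
                    focal_px=700, baseline_m=0.15, cx=320, cy=240)
print(point)  # [0.75 0.15 5.25]
```

    Triangulated points like these, one per correlated feature, are what the pipeline connects into the textured polygonal mesh; in practice the feature-correlation step and the camera calibration dominate the accuracy of the resulting terrain model.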

  8. 3-D Imaging In Virtual Environment: A Scientific Clinical and Teaching Tool

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; DeVincenzi, Donald L. (Technical Monitor)

    1996-01-01

    The advent of powerful graphics workstations and computers has led to the advancement of scientific knowledge through three-dimensional (3-D) reconstruction and imaging of biological cells and tissues. The Biocomputation Center at NASA Ames Research Center pioneered the effort to produce an entirely computerized method for reconstruction of objects from serial sections studied in a transmission electron microscope (TEM). The software developed, ROSS (Reconstruction of Serial Sections), is now being distributed to users across the United States through Space Act Agreements. The software is used in widely disparate fields such as geology, botany, biology, and medicine. In the Biocomputation Center, ROSS serves as the basis for development of virtual environment technologies for scientific and medical use. This report will describe the Virtual Surgery Workstation Project that is ongoing with clinicians at Stanford University Medical Center, and the role of the Visible Human data in the project.

  9. NeuroVR: an open source virtual reality platform for clinical psychology and behavioral neurosciences.

    PubMed

    Riva, Giuseppe; Gaggioli, Andrea; Villani, Daniela; Preziosa, Alessandra; Morganti, Francesca; Corsi, Riccardo; Faletti, Gianluca; Vezzadini, Luca

    2007-01-01

    In the past decade, the use of virtual reality for clinical and research applications has become more widespread. However, the diffusion of this approach is still limited by three main issues: poor usability, lack of technical expertise among clinical professionals, and high costs. To address these challenges, we introduce NeuroVR (http://www.neurovr.org--http://www.neurotiv.org), a cost-free virtual reality platform based on open-source software, that allows non-expert users to adapt the content of a pre-designed virtual environment to meet the specific needs of the clinical or experimental setting. Using the NeuroVR Editor, the user can choose the appropriate psychological stimuli/stressors from a database of objects (both 2D and 3D) and videos, and easily place them into the virtual environment. The edited scene can then be visualized in the NeuroVR Player using either immersive or non-immersive displays. Currently, the NeuroVR library includes different virtual scenes (apartment, office, square, supermarket, park, classroom, etc.), covering two of the most studied clinical applications of VR: specific phobias and eating disorders. The NeuroVR Editor is based on Blender (http://www.blender.org), the open source, cross-platform suite of tools for 3D creation, and is available as a completely free resource. An interesting feature of the NeuroVR Editor is the possibility to add new objects to the database. This feature allows the therapist to enhance the patient's feeling of familiarity and intimacy with the virtual scene, i.e., by using photos or movies of objects/people that are part of the patient's daily life, thereby improving the efficacy of the exposure. The NeuroVR platform runs on standard personal computers with Microsoft Windows; the only requirement for the hardware is related to the graphics card, which must support OpenGL. PMID:17377310

  10. New technologies applied to surgical processes: Virtual Reality and rapid prototyping.

    PubMed

    Suárez-Mejías, Cristina; Gomez-Ciriza, Gorka; Valverde, Israel; Parra Calderón, Carlos; Gómez-Cía, Tomás

    2015-01-01

    AYRA is virtual reality software for training, planning, and optimizing surgical procedures. AYRA was developed under a research, development and innovation project financed by the Andalusian Ministry of Health, called VirSSPA. To date, AYRA has been successfully used in more than 1160 real cases and, after proving its efficiency, has been introduced into clinical practice at the Virgen del Rocío University Hospital. Furthermore, AYRA allows generating physical 3D biomodels using rapid prototyping technology. These are used for surgical planning support, intraoperative reference, or defect reconstruction. In this paper, some of these tools and some real cases are presented.

  11. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation, and gives students an environment in which to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.

  12. Towards a Transcription System of Sign Language for 3D Virtual Agents

    NASA Astrophysics Data System (ADS)

    Do Amaral, Wanessa Machado; de Martino, José Mario

    Accessibility is a growing concern in computer science. Since virtual information is mostly presented visually, it may seem that access for deaf people is not an issue. However, for prelingually deaf individuals, those who have been deaf since before acquiring and formally learning a language, written information is often less accessible than if presented in signing. Further, for this community, signing is their language of choice, and reading text in a spoken language is akin to using a foreign language. Sign language uses gestures and facial expressions and is widely used by deaf communities. To enable efficient production of signed content in virtual environments, it is necessary to make written records of signs. Transcription systems have been developed to describe sign languages in written form, but these systems have limitations. Since they were not originally designed with computer animation in mind, the recognition and reproduction of signs in these systems is, in general, an easy task only for those who know the system deeply. The aim of this work is to develop a transcription system to provide signed content in virtual environments. To animate a virtual avatar, a transcription system requires sufficiently explicit information, such as movement speed, sign concatenation, the sequence of each hold and movement, and facial expressions, so that articulation approaches reality. Although many important studies of sign languages have been published, the transcription problem remains a challenge. Thus, a notation to describe, store, and play signed content in virtual environments offers a multidisciplinary study and research tool, which may help linguistic studies to understand sign language structure and grammar.

  13. Astronauts Prepare for Mission With Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Astronauts John M. Grunsfeld (left), STS-109 payload commander, and Nancy J. Currie, mission specialist, use the virtual reality lab at Johnson Space Center to train for upcoming duties aboard the Space Shuttle Columbia. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team to perform its duties for the fourth Hubble Space Telescope Servicing mission. The most familiar form of virtual reality technology is some form of headpiece, which fits over your eyes and displays a three dimensional computerized image of another place. Turn your head left and right, and you see what would be to your sides; turn around, and you see what might be sneaking up on you. An important part of the technology is some type of data glove that you use to propel yourself through the virtual world. Currently, the medical community is using the new technologies in four major ways: To see parts of the body more accurately, for study, to make better diagnosis of disease and to plan surgery in more detail; to obtain a more accurate picture of a procedure during surgery; to perform more types of surgery with the most noninvasive, accurate methods possible; and to model interactions among molecules at a molecular level.

  14. INTERACTIVITY INFLUENCES THE MAGNITUDE OF VIRTUAL REALITY ANALGESIA

    PubMed Central

    Wender, Regina; Hoffman, Hunter G.; Hunner, Harley H.; Seibel, Eric J.; Patterson, David R.; Sharar, Sam R.

    2009-01-01

    Despite medication with opioids and other powerful pharmacologic pain medications, most patients rate their pain during severe burn wound care as severe to excruciating. Excessive pain is a widespread medical problem in a wide range of patient populations. Immersive virtual reality (VR) distraction may help reduce pain associated with medical procedures. Recent research manipulating immersiveness has shown that a high tech VR helmet reduces pain more effectively than a low tech VR helmet. The present study explores the effect of interactivity on the analgesic effectiveness of virtual reality. Using a double blind design, in the present study, twenty-one volunteers were randomly assigned to one of two groups, and received a thermal pain stimulus during either interactive VR, or during non-interactive VR. Subjects in both groups individually glided through the virtual world, but one group could look around and interact with the environment using the trackball, whereas participants in the other group had no trackball. Afterwards, each participant provided subjective 0–10 ratings of cognitive, sensory and affective components of pain, and the amount of fun during the pain stimulus. Compared to the non-interactive VR group, participants in the interactive VR group showed 75% more reduction in pain unpleasantness (p < .005) and 74% more reduction in worst pain (p < .005). Interactivity increased the analgesic effectiveness of immersive virtual reality. PMID:20390047

  16. Virtual reality simulation in neurosurgery: technologies and evolution.

    PubMed

    Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H

    2013-01-01

    Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery. PMID:23254804

  17. Designing 3 Dimensional Virtual Reality Using Panoramic Image

    NASA Astrophysics Data System (ADS)

    Wan Abd Arif, Wan Norazlinawati; Wan Ahmad, Wan Fatimah; Nordin, Shahrina Md.; Abdullah, Azrai; Sivapalan, Subarna

    There is high demand to improve the quality of presentation in the knowledge-sharing field in order to keep pace with rapidly growing technology. The need for technology-based learning and training led to the idea of developing an Oil and Gas Plant Virtual Environment (OGPVE) for the benefit of our future. A panoramic virtual reality learning environment is essential to help educators overcome the limitations of traditional technical writing lessons. Virtual reality helps users understand better by providing simulations of real-world and hard-to-reach environments with a high degree of realism and interactivity. Thus, in order to create courseware which will achieve this objective, accurate images of the intended scenarios must be acquired. The panorama shows the OGPVE and helps users generate ideas about what they have learnt. This paper discusses part of the development of the panoramic virtual reality. The important phases in developing a successful panoramic image are image acquisition and image stitching, or mosaicing. The combination of wide field-of-view (FOV) and close-up images used in this panoramic development is also discussed.

  18. Simulation Of Assembly Processes With Technical Of Virtual Reality

    NASA Astrophysics Data System (ADS)

    García García, Manuel; Arenas Reina, José Manuel; Lite, Alberto Sánchez; Sebastián Pérez, Miguel Ángel

    2009-11-01

    The use of virtual reality techniques in industrial processes provides a realistic approach to the product life cycle. For manual assembly of components, the use of virtual surroundings facilitates simultaneous engineering in which variables such as human factors and productivity play a real role. Moreover, the current phase of industrial competition requires rapid adjustment to client needs and to the market situation. In this work, the assembly of the front components of a vehicle is analyzed using virtual reality tools, following a product-process design methodology which covers every life-cycle stage. This study is based on workstation design, taking productive and human factors into account from an ergonomic point of view by implementing a postural study of every assembly operation, leaving the remaining stages for a later study. Design is optimized by applying this methodology together with virtual reality tools. A 15% reduction in assembly time and a 90% reduction in musculoskeletal disorders across assembly operations were also achieved.

  19. Dynamic 3-D virtual fixtures for minimally invasive beating heart procedures.

    PubMed

    Ren, Jing; Patel, Rajni V; McIsaac, Kenneth A; Guiraudon, Gerard; Peters, Terry M

    2008-08-01

    Two-dimensional or 3-D visual guidance is often used for minimally invasive cardiac surgery and diagnosis. This visual guidance suffers from several drawbacks such as limited field of view, intermittent loss of signal, and, in some cases, difficulty of interpretation. These limitations become more evident in beating-heart procedures, when the surgeon has to perform a surgical procedure in the presence of heart motion. In this paper, we propose dynamic 3-D virtual fixtures (DVFs) to augment the visual guidance system with haptic feedback, providing the surgeon with more helpful guidance by constraining the surgeon's hand motions and thereby protecting sensitive structures. DVFs can be generated from preoperative dynamic magnetic resonance (MR) or computed tomography (CT) images and then mapped to the patient during surgery. We have validated the feasibility of the proposed method on several simulated surgical tasks using a volunteer's cardiac image dataset. Validation results show that the integration of visual and haptic guidance can permit a user to perform surgical tasks more easily and with a reduced error rate. We believe this is the first work presented in the field of virtual fixtures that explicitly considers heart motion.
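
    A forbidden-region virtual fixture of the kind described can be sketched as a repulsive spring that activates when the tool tip crosses a safety margin around an anatomical surface; in the dynamic case, the region is re-positioned every control cycle from the gated preoperative MR/CT model. The spherical region, stiffness, and margin below are a hypothetical toy model, not the authors' formulation:

```python
import numpy as np

def fixture_force(tool_pos, region_center, region_radius,
                  margin=0.005, stiffness=400.0):
    """Repulsive spring keeping a tool tip outside a spherical forbidden
    region (all lengths in meters, force in newtons).

    Returns a zero vector while the tool respects the safety margin,
    otherwise a Hooke's-law force along the outward surface normal.
    """
    offset = tool_pos - region_center
    dist = np.linalg.norm(offset)
    penetration = (region_radius + margin) - dist
    if penetration <= 0.0:          # outside the margin: no constraint
        return np.zeros(3)
    normal = offset / dist          # outward surface normal
    return stiffness * penetration * normal

# Tool tip 2 mm inside the 5 mm margin of a 20 mm-radius region:
f = fixture_force(np.array([0.0, 0.0, 0.023]), np.zeros(3), 0.020)
print(f)  # ~[0, 0, 0.8] N, pushing the tool away from the surface
```

    In a dynamic fixture, `region_center` (and in general the whole constraint geometry) would be updated each haptic servo cycle to track the beating-heart surface reconstructed from the gated image sequence.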

  20. Factory of Realities: On the Emergence of Virtual Spatiotemporal Structures

    NASA Astrophysics Data System (ADS)

    Zapatrin, Romàn R.

    The ubiquitous nature of modern Information Retrieval (IR) and Virtual Worlds gives rise to new realities. To what extent are these `realities' real? Which `physics' should be applied to describe them quantitatively? In this chapter, I dwell on a few examples. The first is adaptive neural networks, which are neither networks nor neural, but still provide a service similar to classical artificial neural networks (ANNs) in an extended fashion. The second is the emergence of objects resembling Einsteinian space-time, which describe the behavior of an Internet surfer as geodesic motion. The third is the demonstration of nonclassical and even stronger-than-quantum probabilities in IR, their use...

  1. Augmented reality navigation with automatic marker-free image registration using 3-D image overlay for dental surgery.

    PubMed

    Wang, Junchen; Suenaga, Hideyuki; Hoshi, Kazuto; Yang, Liangjing; Kobayashi, Etsuko; Sakuma, Ichiro; Liao, Hongen

    2014-04-01

    Computer-assisted oral and maxillofacial surgery (OMS) has been rapidly evolving since the last decade. State-of-the-art surgical navigation in OMS still suffers from bulky tracking sensors, troublesome image registration procedures, patient movement, loss of depth perception in visual guidance, and low navigation accuracy. We present an augmented reality navigation system with automatic marker-free image registration using 3-D image overlay and stereo tracking for dental surgery. A customized stereo camera is designed to track both the patient and instrument. Image registration is performed by patient tracking and real-time 3-D contour matching, without requiring any fiducial and reference markers. Real-time autostereoscopic 3-D imaging is implemented with the help of a consumer-level graphics processing unit. The resulting 3-D image of the patient's anatomy is overlaid on the surgical site by a half-silvered mirror using image registration and IP-camera registration to guide the surgeon by exposing hidden critical structures. The 3-D image of the surgical instrument is also overlaid over the real one for an augmented display. The 3-D images present both stereo and motion parallax from which depth perception can be obtained. Experiments were performed to evaluate various aspects of the system; the overall image overlay error of the proposed system was 0.71 mm.

  2. Utilising a Collaborative Macro-Script to Enhance Student Engagement: A Mixed Method Study in a 3D Virtual Environment

    ERIC Educational Resources Information Center

    Bouta, Hara; Retalis, Symeon; Paraskeva, Fotini

    2012-01-01

    This study examines the effect of using an online 3D virtual environment in teaching Mathematics in Primary Education. In particular, it explores the extent to which student engagement--behavioral, affective and cognitive--is fostered by such tools in order to enhance collaborative learning. For the study we used a purpose-created 3D virtual…

  3. The cranial nerve skywalk: A 3D tutorial of cranial nerves in a virtual platform.

    PubMed

    Richardson-Hatcher, April; Hazzard, Matthew; Ramirez-Yanez, German

    2014-01-01

    Visualization of the complex courses of the cranial nerves by students in the health-related professions is challenging through either diagrams in books or plastic models in the gross laboratory. Furthermore, dissection of the cranial nerves in the gross laboratory is an extremely meticulous task. Teaching and learning the cranial nerve pathways is difficult using two-dimensional (2D) illustrations alone. Three-dimensional (3D) models aid the teacher in describing intricate and complex anatomical structures and help students visualize them. The study of the cranial nerves can be supplemented with 3D, which permits the students to fully visualize their distribution within the craniofacial complex. This article describes the construction and usage of a virtual anatomy platform in Second Life™, which contains 3D models of the cranial nerves III, V, VII, and IX. The Cranial Nerve Skywalk features select cranial nerves and the associated autonomic pathways in an immersive online environment. This teaching supplement was introduced to groups of pre-healthcare professional students in gross anatomy courses at both institutions and student feedback is included.

  5. Virtual bronchoscopic approach for combining 3D CT and endoscopic video

    NASA Astrophysics Data System (ADS)

    Sherbondy, Anthony J.; Kiraly, Atilla P.; Austin, Allen L.; Helferty, James P.; Wan, Shu-Yen; Turlington, Janice Z.; Yang, Tao; Zhang, Chao; Hoffman, Eric A.; McLennan, Geoffrey; Higgins, William E.

    2000-04-01

    To improve the care of lung-cancer patients, we are devising a diagnostic paradigm that ties together three-dimensional (3D) high-resolution computed-tomographic (CT) imaging and bronchoscopy. The system expands upon the new concept of virtual endoscopy that has seen recent application to the chest, colon, and other anatomical regions. Our approach applies computer-graphics and image-processing tools to the analysis of 3D CT chest images and complementary bronchoscopic video. It assumes a two-stage assessment of a lung-cancer patient. During Stage 1 (CT assessment), the physician interacts with a number of visual and quantitative tools to evaluate the patient's 'virtual anatomy' (3D CT scan). Automatic analysis gives navigation paths through major airways and to pre-selected suspect sites. These paths provide useful guidance during Stage-1 CT assessment. While interacting with these paths and other software tools, the user builds a multimedia Case Study, capturing telling snapshot views, movies, and quantitative data. The Case Study contains a report on the CT scan and also provides planning information for subsequent bronchoscopic evaluation. During Stage 2 (bronchoscopy), the physician uses (1) the original CT data, (2) software graphical tools, (3) the Case Study, and (4) a standard bronchoscopy suite to have an augmented vision for bronchoscopic assessment and treatment. To use the two data sources (CT and bronchoscopic video) simultaneously, they must be registered. We perform this registration using both manual interaction and an automated matching approach based on mutual information. We demonstrate our overall progress to date using human CT cases and CT-video from a bronchoscopy-training device.
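The automated CT-to-video matching described above is based on mutual information. As a rough illustration of the underlying measure (not the authors' implementation; the quantization and function names here are hypothetical), mutual information between two intensity sequences can be estimated from joint histograms:

```python
import math
from collections import Counter

def mutual_information(img_a, img_b, bins=8):
    """Estimate mutual information (in bits) between two equal-length
    grayscale intensity sequences via coarse joint histograms."""
    assert len(img_a) == len(img_b)
    # Quantize 0-255 intensities into coarse bins.
    qa = [min(v * bins // 256, bins - 1) for v in img_a]
    qb = [min(v * bins // 256, bins - 1) for v in img_b]
    n = len(qa)
    pa, pb = Counter(qa), Counter(qb)
    pab = Counter(zip(qa, qb))
    mi = 0.0
    for (a, b), c in pab.items():
        p_ab = c / n
        mi += p_ab * math.log2(p_ab / ((pa[a] / n) * (pb[b] / n)))
    return mi
```

A registration loop would search over candidate camera poses, render the virtual CT view for each, and keep the pose whose rendering maximizes this score against the video frame; identical images score at the image entropy, while unrelated images score near zero.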

  6. Integration of the virtual model of a Stewart platform with the avatar of a vehicle in a virtual reality

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2016-08-01

    The development of computer-aided design and engineering methods makes it possible to conduct virtual tests, including motion simulation of technical means. The paper presents a method of integrating a virtual model of a Stewart platform with an avatar of a vehicle moving in a virtual environment. The problem area includes issues related to the fidelity with which the operation of the analyzed technical means is mapped. The main object of investigation is a 3D model of a Stewart platform, which is a subsystem of a simulator designed to teach driving to disabled persons. The platform model, prepared for motion simulation, was created in the “Motion Simulation” module of the CAD/CAE system Siemens PLM NX, whereas the virtual environment in which the avatar of the passenger car moves was developed in the VR system EON Studio. The element integrating the two software environments is a developed application that reads information from the virtual reality (VR) concerning the current position of the car avatar and then, based on the adopted algorithm, sends control signals to the respective joints of the Stewart platform model (CAD).
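The mapping from the car avatar's pose to platform joint commands is, in the general case, the standard Stewart-platform inverse kinematics: each actuator length is the distance between a base attachment point and the corresponding (rotated, translated) platform attachment point. A minimal sketch, with an illustrative symmetric geometry that is not taken from the paper:

```python
import math

# Hypothetical geometry: six base and platform attachment points (metres).
BASE = [(math.cos(math.radians(d)), math.sin(math.radians(d)), 0.0)
        for d in (0, 60, 120, 180, 240, 300)]
PLAT = [(0.5 * math.cos(math.radians(d)), 0.5 * math.sin(math.radians(d)), 0.0)
        for d in (30, 90, 150, 210, 270, 330)]

def leg_lengths(pos, yaw):
    """Inverse kinematics: required actuator length for each of the six
    legs, given the platform centre position and a yaw angle (radians)."""
    px, py, pz = pos
    c, s = math.cos(yaw), math.sin(yaw)
    lengths = []
    for (ax, ay, az), (bx, by, bz) in zip(BASE, PLAT):
        # Rotate the platform attachment by yaw, then translate by pos.
        wx = px + c * bx - s * by
        wy = py + s * bx + c * by
        wz = pz + bz
        lengths.append(math.dist((wx, wy, wz), (ax, ay, az)))
    return lengths
```

In a control loop, the integrating application would sample the avatar's pose from the VR system each frame, evaluate something like `leg_lengths`, and send the resulting setpoints to the joint drives; a full 6-DOF version would use a complete rotation matrix rather than yaw alone.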

  7. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    PubMed

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  8. Virtual reality and telerobotics applications of an Address Recalculation Pipeline

    NASA Technical Reports Server (NTRS)

    Regan, Matthew; Pose, Ronald

    1994-01-01

    The technology described in this paper was designed to reduce latency in response to user interactions in immersive virtual reality environments. It is also ideally suited to telerobotic applications such as interaction with remote robotic manipulators in space or in deep-sea operations. In such circumstances, the significant latency in response to user stimulus caused by communications delays, and the disturbing jerkiness caused by low and unpredictable frame rates in compressed-video feedback or computationally limited virtual worlds, can be masked by our techniques. The user is provided with highly responsive visual feedback independent of communication or computational delays in providing physical video feedback or in rendering virtual-world images. Virtual and physical environments can be combined seamlessly using these techniques.

  9. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    PubMed Central

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  10. Immersive virtual reality and environmental noise assessment: An innovative audio–visual approach

    SciTech Connect

    Ruotolo, Francesco; Maffei, Luigi; Di Gabriele, Maria; Iachini, Tina; Masullo, Massimiliano; Ruggiero, Gennaro; Senese, Vincenzo Paolo

    2013-07-15

    Several international studies have shown that traffic noise has a negative impact on people's health and that people's annoyance does not depend only on noise energy levels, but rather on multi-perceptual factors. The combination of virtual reality technology and audio rendering techniques allows us to experiment with a new approach to environmental noise assessment that can help to investigate in advance the potential negative effects of noise associated with a specific project, which in turn can help designers make educated decisions. In the present study, the audio–visual impact of a new motorway project on people was assessed by means of immersive virtual reality technology. In particular, participants were exposed to 3D reconstructions of an actual landscape without the projected motorway (ante operam condition) and of the same landscape with the projected motorway (post operam condition). Furthermore, individuals' reactions to noise were assessed by means of objective cognitive measures (short-term verbal memory and executive functions) and subjective evaluations (noise and visual annoyance). Overall, the results showed that the introduction of a projected motorway in the environment can have immediate detrimental effects on people's well-being, depending on the distance from the noise source. In particular, noise due to the new infrastructure seems to exert a negative influence on short-term verbal memory and to increase both visual and noise annoyance. The theoretical and practical implications of these findings are discussed. -- Highlights: ► Impact of traffic noise on people's well-being depends on multi-perceptual factors. ► A multisensory virtual reality technology is used to simulate a projected motorway. ► Effects on short-term memory and auditory and visual subjective annoyance were found. ► The closer the distance from the motorway the stronger was the effect. ► Multisensory virtual reality methodologies can be used to study

  11. Virtual Reality Technologies for Research and Education in Obesity and Diabetes: Research Needs and Opportunities

    PubMed Central

    Ershow, Abby G; Peterson, Charles M; Riley, William T; Rizzo, Albert “Skip”; Wansink, Brian

    2011-01-01

    The rising rates, high prevalence, and adverse consequences of obesity and diabetes call for new approaches to the complex behaviors needed to prevent and manage these conditions. Virtual reality (VR) technologies, which provide controllable, multisensory, interactive three-dimensional (3D) stimulus environments, are a potentially valuable means of engaging patients in interventions that foster more healthful eating and physical activity patterns. Furthermore, the capacity of VR technologies to motivate, record, and measure human performance represents a novel and useful modality for conducting research. This article summarizes background information and discussions for a joint July 2010 National Institutes of Health – Department of Defense workshop entitled Virtual Reality Technologies for Research and Education in Obesity and Diabetes. The workshop explored the research potential of VR technologies as tools for behavioral and neuroscience studies in diabetes and obesity, and the practical potential of VR in fostering more effective utilization of diabetes- and obesity-related nutrition and lifestyle information. Virtual reality technologies were considered especially relevant for fostering desirable health-related behaviors through motivational reinforcement, personalized teaching approaches, and social networking. Virtual reality might also be a means of extending the availability and capacity of health care providers. Progress in the field will be enhanced by further developing available platforms and taking advantage of VR’s capabilities as a research tool for well-designed hypothesis-testing behavioral science. Multidisciplinary collaborations are needed between the technology industry and academia, and among researchers in biomedical, behavioral, pedagogical, and computer science disciplines. Research priorities and funding opportunities for use of VR to improve prevention and management of obesity and diabetes can be found at agency websites (National

  12. Virtual reality technologies for research and education in obesity and diabetes: research needs and opportunities.

    PubMed

    Ershow, Abby G; Peterson, Charles M; Riley, William T; Rizzo, Albert Skip; Wansink, Brian

    2011-03-01

    The rising rates, high prevalence, and adverse consequences of obesity and diabetes call for new approaches to the complex behaviors needed to prevent and manage these conditions. Virtual reality (VR) technologies, which provide controllable, multisensory, interactive three-dimensional (3D) stimulus environments, are a potentially valuable means of engaging patients in interventions that foster more healthful eating and physical activity patterns. Furthermore, the capacity of VR technologies to motivate, record, and measure human performance represents a novel and useful modality for conducting research. This article summarizes background information and discussions for a joint July 2010 National Institutes of Health - Department of Defense workshop entitled Virtual Reality Technologies for Research and Education in Obesity and Diabetes. The workshop explored the research potential of VR technologies as tools for behavioral and neuroscience studies in diabetes and obesity, and the practical potential of VR in fostering more effective utilization of diabetes- and obesity-related nutrition and lifestyle information. Virtual reality technologies were considered especially relevant for fostering desirable health-related behaviors through motivational reinforcement, personalized teaching approaches, and social networking. Virtual reality might also be a means of extending the availability and capacity of health care providers. Progress in the field will be enhanced by further developing available platforms and taking advantage of VR's capabilities as a research tool for well-designed hypothesis-testing behavioral science. Multidisciplinary collaborations are needed between the technology industry and academia, and among researchers in biomedical, behavioral, pedagogical, and computer science disciplines. Research priorities and funding opportunities for use of VR to improve prevention and management of obesity and diabetes can be found at agency websites (National

  14. Instructors' Perceptions of Three-Dimensional (3D) Virtual Worlds: Instructional Use, Implementation and Benefits for Adult Learners

    ERIC Educational Resources Information Center

    Stone, Sophia Jeffries

    2009-01-01

    The purpose of this dissertation research study was to explore instructors' perceptions of the educational application of three-dimensional (3D) virtual worlds in a variety of academic discipline areas and to assess the strengths and limitations this virtual environment presents for teaching adult learners. The guiding research question for this…

  15. Using a Quest in a 3D Virtual Environment for Student Interaction and Vocabulary Acquisition in Foreign Language Learning

    ERIC Educational Resources Information Center

    Kastoudi, Denise

    2011-01-01

    The gaming and interactional nature of the virtual environment of Second Life offers opportunities for language learning beyond the traditional pedagogy. This study case examined the potential of 3D virtual quest games to enhance vocabulary acquisition through interaction, negotiation of meaning and noticing. Four adult students of English at…

  16. Human Factors in Virtual Reality Development

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Proffitt, Dennis R.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    This half-day tutorial will provide an overview of basic perceptual functioning as it relates to the design of virtual environment systems. The tutorial consists of three parts. First, basic issues in visual perception will be presented, including discussions of the visual sensations of brightness and color, and the visual perception of depth relationships in three-dimensional space (with a special emphasis on motion-specified depth). The second section will discuss the importance of conducting human-factors user studies and evaluations. Examples and suggestions on how best to get help with user studies will be provided. Finally, we will discuss how, by drawing on their complementary competencies, perceptual psychologists and computer engineers can work as a team to develop optimal VR systems, technologies, and techniques.

  17. Virtual Reality Simulation of Gynecologic Laparoscopy

    PubMed

    Bernstein

    1996-08-01

    Realistic virtual simulation of gynecologic laparoscopy would permit the surgeon to practice any procedure, with any degree of pathology, at any time and as many times as necessary to achieve proficiency before attempting it in the operating room. Effective computer simulation requires accurate anatomy, realistic three-dimensional computer graphics, the ability to cut and deform tissue in response to instruments, and an appropriate hardware interface. The Visible Human Project from the National Library of Medicine has made available extremely accurate, three-dimensional, digital data that computer animation companies have begun to transform to three-dimensional graphic images. The problem of tissue deformation and movement is approached by a software package called TELEOS. Hardware consisting of two scissor-grip laparoscopic handles mounted on a sensor can interface with any simulation program to simulate a multiplicity of laparoscopic instruments. The next step will be to combine TELEOS with the three-dimensional anatomy data and configure it for gynecologic surgery.

  18. Treatment of complicated grief using virtual reality: a case report.

    PubMed

    Botella, C; Osma, J; Palacios, A García; Guillén, V; Baños, R

    2008-01-01

    This is the first work exploring the application of new technologies, concretely virtual reality, to facilitate emotional processing in the treatment of Complicated Grief. Our research team has designed a virtual reality environment (EMMA's World) to foster the expression and processing of emotions. In this study the authors present a description of EMMA's World, the clinical protocol, and a case report. The treatment program was applied in eight sessions. We present a brief description of the session agendas including the techniques used. We offer short-term (from pre-test to post-test) and long-term (2-, 6- and 12-month follow-ups) efficacy data. Our results offer preliminary support of the use of EMMA's World for the treatment of Complicated Grief. PMID:18924294

  19. Application of Virtual, Augmented, and Mixed Reality to Urology

    PubMed Central

    2016-01-01

    Recent developments in virtual, augmented, and mixed reality have introduced a considerable number of new devices into the consumer market. This momentum is also affecting the medical and health care sector. Although many of the theoretical and practical foundations of virtual reality (VR) were already researched and experienced in the 1980s, the vastly improved features of displays, sensors, interactivity, and computing power currently available in devices offer a new field of applications to the medical sector and also to urology in particular. The purpose of this review article is to review the extent to which VR technology has already influenced certain aspects of medicine, the applications that are currently in use in urology, and the future development trends that could be expected. PMID:27706017

  20. Location and Longing: The Nicotine Craving Experience in Virtual Reality

    PubMed Central

    Carter, Brian L.; Bordnick, Patrick; Traylor, Amy; Day, Susan X.; Paris, Megan

    2008-01-01

    Considerable research suggests that cigarette craving is complex, with psychological, emotional, cognitive, and behavioral aspects that are inadequately captured by typical craving assessments that focus on level of severity. That is, the experience of craving, for cigarette smokers, remains poorly understood. This study immersed smokers in different virtual reality (VR) scenarios (with and without cigarette cues present), collected detailed craving assessments, and analyzed the data using a multidimensional analytic approach. Non-treatment-seeking, nicotine dependent smokers (N = 22) experienced two different virtual reality scenarios, one with cigarette cues and one without, and rated 24 descriptors related to craving. Multidimensional scaling (MDS) models demonstrate that smokers’ experience of craving is qualitatively, structurally different under VR smoking cue conditions versus neutral conditions. This finding sheds new light on the complexity of craving as well as implications for its measurement. PMID:18243586
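The multidimensional scaling used in the study above embeds the craving descriptors so that pairwise dissimilarities become distances in a low-dimensional space. As a generic illustration (classical Torgerson MDS, not necessarily the variant the authors used), with all data hypothetical:

```python
import numpy as np

def classical_mds(dist, k=2):
    """Classical (Torgerson) MDS: embed n items in k dimensions so that
    Euclidean distances approximate the given n-by-n dissimilarity matrix."""
    n = dist.shape[0]
    d2 = dist ** 2
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ d2 @ j                    # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(b)           # ascending eigenpairs
    order = np.argsort(vals)[::-1][:k]       # keep the top-k
    lam = np.clip(vals[order], 0.0, None)
    return vecs[:, order] * np.sqrt(lam)
```

Applied to a descriptor-by-descriptor dissimilarity matrix computed separately for cue and neutral conditions, differences in the resulting configurations are what reveal a structural change in the craving experience.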

  1. Polymer-based actuators for virtual reality devices

    NASA Astrophysics Data System (ADS)

    Bolzmacher, Christian; Hafez, Moustapha; Benali Khoudja, Mohamed; Bernardoni, Paul; Dubowsky, Steven

    2004-07-01

    Virtual Reality (VR) is gaining importance in our society. For many years, VR was limited to entertainment applications; today, practical applications such as training and prototyping give it a promising future, and there is an increasing demand for low-cost, lightweight haptic devices for VR environments. Electroactive polymers appear to be a potential actuation technology that could satisfy these requirements. Dielectric polymers developed over the past few years have shown large displacements (more than 300%). This feature, together with their muscle-like behaviour, makes them quite interesting for integration in haptic devices. Polymer actuators are flexible and lightweight compared to traditional actuators, and using stacks with several layers of elastomeric film increases the force without limiting the output displacement. The paper discusses design methods for a linear dielectric polymer actuator for VR devices, and experimental results of the actuator's performance are presented.
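The actuation principle behind dielectric polymer actuators is commonly modelled with the Maxwell stress: the electrostatic pressure squeezing the film scales with the square of the electric field. A back-of-the-envelope sketch (illustrative numbers, not the authors' design equations):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def maxwell_pressure(voltage, thickness, eps_r):
    """Electrostatic (Maxwell) pressure compressing a dielectric
    elastomer film: p = eps0 * eps_r * (V / t)^2, in pascals."""
    return EPS0 * eps_r * (voltage / thickness) ** 2

# Example: 3 kV across a 50-micron film with relative permittivity 3
# gives roughly 96 kPa of actuation pressure.
p = maxwell_pressure(3000.0, 50e-6, 3.0)
```

The quadratic dependence on field strength is why thin films and high voltages are used, and why stacking many layers electrically in parallel multiplies the output force without reducing the achievable stroke.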

  2. Collaborative virtual reality environments for computational science and design.

    SciTech Connect

    Papka, M. E.

    1998-02-17

    The authors are developing a networked, multi-user, virtual-reality-based collaborative environment coupled to one or more petaFLOPs computers, enabling the interactive simulation of 10⁹-atom systems. The purpose of this work is to explore the requirements for this coupling. Through the design, development, and testing of such systems, they hope to gain knowledge that allows computational scientists to discover and analyze their results more quickly and in a more intuitive manner.

  3. Virtual reality in rhinology-a new dimension of clinical experience.

    PubMed

    Klapan, Ivica; Raos, Pero; Galeta, Tomislav; Kubat, Goranka

    2016-07-01

    There is often a need to more precisely identify the extent of pathology and the fine elements of intracranial anatomic features during the diagnostic process and during many operations in the nose, sinus, orbit, and skull base region. In two case reports, we describe the methods used in the diagnostic workup and surgical therapy in the nose and paranasal sinus region. Besides baseline x-ray, multislice computed tomography, and magnetic resonance imaging, operative field imaging was performed via a rapid prototyping model, virtual endoscopy, and 3-D imaging. Different head tissues were visualized in different colors, showing their anatomic interrelations and the extent of pathologic tissue within the operative field. This approach has not yet been used as a standard preoperative or intraoperative procedure in otorhinolaryngology. In this way, we tried to understand the new, visualized "world of anatomic relations within the patient's head" by creating an impression of perception (virtual perception) of the given position of all elements in a particular anatomic region of the head, which does not exist in the real world (virtual world). This approach was aimed at upgrading the diagnostic workup and surgical therapy by ensuring a faster, safer and, above all, simpler operative procedure. In conclusion, any ENT specialist can provide virtual reality support in implementing surgical procedures, with additional control of risks and within the limits of normal tissue, without additional trauma to the surrounding tissue in the anatomic region. At the same time, the virtual reality support provides an impression of the virtual world as the specialist navigates through it and manipulates virtual objects. PMID:27434481

  5. Dynamic concision for three-dimensional reconstruction of human organ built with virtual reality modelling language (VRML).

    PubMed

    Yu, Zheng-yang; Zheng, Shu-sen; Chen, Lei-ting; He, Xiao-qian; Wang, Jian-jun

    2005-07-01

    This research studies the process of 3D reconstruction and dynamic concision based on 2D medical digital images using virtual reality modelling language (VRML) and JavaScript, with a focus on how to realize the dynamic concision of a 3D medical model with script and sensor nodes in VRML. The 3D reconstructions and concisions of internal body organs built this way are of higher quality than those obtained from traditional methods. With the function of dynamic concision, the VRML browser can offer better windows for man-computer interaction in a real-time environment than ever before. 3D reconstruction and dynamic concision with VRML can be used to meet the requirements of medical observation of 3D reconstructions and have a promising prospect in the field of medical imaging.

  6. Rapid prototyping--when virtual meets reality.

    PubMed

    Beguma, Zubeda; Chhedat, Pratik

    2014-01-01

    Rapid prototyping (RP) describes the customized production of solid models using 3D computer data. Over the past decade, advances in RP have continued to evolve, resulting in the development of new techniques that have been applied to the fabrication of various prostheses. RP fabrication technologies include stereolithography (SLA), fused deposition modeling (FDM), computer numerical controlled (CNC) milling, and, more recently, selective laser sintering (SLS). The applications of RP techniques for dentistry include wax pattern fabrication for dental prostheses, dental (facial) prostheses mold (shell) fabrication, and removable dental prostheses framework fabrication. In the past, a physical plastic shape of the removable partial denture (RPD) framework was produced using an RP machine, and then used as a sacrificial pattern. Yet with the advent of the selective laser melting (SLM) technique, RPD metal frameworks can be directly fabricated, thereby omitting the casting stage. This new approach can also generate the wax pattern for facial prostheses directly, thereby reducing labor-intensive laboratory procedures. Many people stand to benefit from these new RP techniques for producing various forms of dental prostheses, which in the near future could transform traditional prosthodontic practices.

  8. Building a 3D Virtual Liver: Methods for Simulating Blood Flow and Hepatic Clearance on 3D Structures.

    PubMed

    White, Diana; Coombe, Dennis; Rezania, Vahid; Tuszynski, Jack

    2016-01-01

In this paper, we develop a spatio-temporal modeling approach to describe blood and drug flow, as well as drug uptake and elimination, on an approximation of the liver. Extending previously developed computational approaches, we generate an approximation of a liver consisting of a portal and hepatic vein vasculature structure embedded in the surrounding liver tissue. The vasculature is generated via constrained constructive optimization and then converted to a spatial grid of a selected grid size. Estimates for the surrounding upscaled lobule tissue properties, appropriate to the same grid size, are then presented. Simulation of fluid flow and drug metabolism (hepatic clearance) is completed using discretized forms of the relevant convective-diffusive-reactive partial differential equations for these processes. This results in a single-stage, uniformly consistent method to simulate equations for blood and drug flow, as well as drug metabolism, on a 3D structure representative of a liver. PMID:27649537
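The convective-diffusive-reactive transport described above can be illustrated, in one dimension only, by an explicit upwind finite-difference step for a drug concentration field; this is a generic sketch with invented parameter values and boundary conditions, not the authors' solver or grid:

```python
import numpy as np

def transport_step(C, v, D, k, dx, dt):
    """One explicit step of dC/dt = -v dC/dx + D d2C/dx2 - k C
    (convection, diffusion, first-order hepatic clearance).
    Periodic boundaries via np.roll, for brevity only."""
    convection = -v * (C - np.roll(C, 1)) / dx               # upwind, v > 0
    diffusion = D * (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx ** 2
    clearance = -k * C
    return C + dt * (convection + diffusion + clearance)

C = np.zeros(100)
C[:5] = 1.0                       # drug bolus entering at the inlet
for _ in range(200):              # dt chosen to satisfy CFL and diffusion limits
    C = transport_step(C, v=1.0, D=0.01, k=0.05, dx=0.1, dt=0.01)
```

Total drug mass decays through the clearance term while the bolus advects and spreads; the paper couples many such equations on 3D vascular and lobule grids rather than a single 1D line.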

  9. Building a 3D Virtual Liver: Methods for Simulating Blood Flow and Hepatic Clearance on 3D Structures

    PubMed Central

    Rezania, Vahid; Tuszynski, Jack

    2016-01-01

In this paper, we develop a spatio-temporal modeling approach to describe blood and drug flow, as well as drug uptake and elimination, on an approximation of the liver. Extending previously developed computational approaches, we generate an approximation of a liver consisting of a portal and hepatic vein vasculature structure embedded in the surrounding liver tissue. The vasculature is generated via constrained constructive optimization and then converted to a spatial grid of a selected grid size. Estimates for the surrounding upscaled lobule tissue properties, appropriate to the same grid size, are then presented. Simulation of fluid flow and drug metabolism (hepatic clearance) is completed using discretized forms of the relevant convective-diffusive-reactive partial differential equations for these processes. This results in a single-stage, uniformly consistent method to simulate equations for blood and drug flow, as well as drug metabolism, on a 3D structure representative of a liver. PMID:27649537

  10. Mixed reality virtual pets to reduce childhood obesity.

    PubMed

    Johnsen, Kyle; Ahn, Sun Joo; Moore, James; Brown, Scott; Robertson, Thomas P; Marable, Amanda; Basu, Aryabrata

    2014-04-01

Novel approaches are needed to reduce the high rates of childhood obesity in the developed world. While multifactorial in cause, a major factor is an increasingly sedentary lifestyle of children. Our research shows that a mixed reality system that is of interest to children can be a powerful motivator of healthy activity. We designed and constructed a mixed reality system that allowed children to exercise, play with, and train a virtual pet using their own physical activity as input. The health, happiness, and intelligence of each virtual pet grew as its associated child owner exercised more, reached goals, and interacted with their pet. We report results of a research study involving 61 children from a local summer camp that shows a large increase in recorded and observed activity, alongside observational evidence that the virtual pet was responsible for that change. These results, and the ease with which the system integrated into the camp environment, demonstrate the practical potential to impact the exercise behaviors of children with mixed reality. PMID:24650979
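The pet-growth mechanic the abstract describes (attributes grow as the child exercises and reaches goals) can be sketched as a simple state update; the attribute names, rates, and goal threshold below are hypothetical, not the study's implementation:

```python
def update_pet(pet, steps_today, goal=10000):
    """Grow health/happiness with recorded activity; award an
    intelligence bonus when the daily goal is reached.
    All rules and constants are illustrative, not from the study."""
    pet["health"] += steps_today / 1000.0          # 1 point per 1000 steps
    pet["happiness"] += min(steps_today, goal) / 2000.0
    if steps_today >= goal:
        pet["intelligence"] += 1                   # goal-attainment bonus
    return pet

pet = {"health": 10.0, "happiness": 10.0, "intelligence": 0}
pet = update_pet(pet, 12000)                       # an active day at camp
```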

  11. Mixed reality virtual pets to reduce childhood obesity.

    PubMed

    Johnsen, Kyle; Ahn, Sun Joo; Moore, James; Brown, Scott; Robertson, Thomas P; Marable, Amanda; Basu, Aryabrata

    2014-04-01

Novel approaches are needed to reduce the high rates of childhood obesity in the developed world. While multifactorial in cause, a major factor is an increasingly sedentary lifestyle of children. Our research shows that a mixed reality system that is of interest to children can be a powerful motivator of healthy activity. We designed and constructed a mixed reality system that allowed children to exercise, play with, and train a virtual pet using their own physical activity as input. The health, happiness, and intelligence of each virtual pet grew as its associated child owner exercised more, reached goals, and interacted with their pet. We report results of a research study involving 61 children from a local summer camp that shows a large increase in recorded and observed activity, alongside observational evidence that the virtual pet was responsible for that change. These results, and the ease with which the system integrated into the camp environment, demonstrate the practical potential to impact the exercise behaviors of children with mixed reality.

  12. Role of computer vision in augmented virtual reality

    NASA Astrophysics Data System (ADS)

    Sharma, Rajeev; Molineros, Jose

    1995-03-01

    An important issue in augmented virtual reality is making the virtual world sensitive to the current state of the surrounding real world as the user interacts with it--changing gaze, manipulating an object, etc. For providing the right virtual stimulus at the right position and time, the system needs some sensor to interpret the surrounding scene. Computer vision holds great potential in providing the necessary interpretation of the scene. We present the preliminary design of a computer vision-based augmented reality system for helping a human in assembling an industrial part from its components. The context of assembly helps in keeping the computer vision task simple by exploiting the geometric model of the assembly components for recognition and pose estimation. The augmentation stimuli include labeling of objects in the scene, helping with sequencing using an assembly planner, visualization of assembly at different stages, handling errors by the human operator, etc. Such a system would have potential applications in assembling complex parts, maintenance, and education. We will present an overview of the design of the system and discuss some of the issues involved in computer vision-based augmented reality.
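Overlaying a label on a recognized assembly component requires registering 3D model points in the image, i.e. pinhole-camera projection x = K[R|t]X; the intrinsics and part geometry below are invented for illustration, and a real system would first estimate R and t by pose estimation from the recognized geometry:

```python
import numpy as np

def project(points_3d, K, R, t):
    """Project 3D model points into the image with a pinhole camera,
    x = K [R|t] X, to place virtual labels over the real part."""
    P = K @ np.hstack([R, t.reshape(3, 1)])               # 3x4 projection matrix
    X = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    x = (P @ X.T).T
    return x[:, :2] / x[:, 2:3]                           # divide by depth

K = np.array([[800.0, 0.0, 320.0],                        # made-up intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])                             # part 2 m ahead
corners = np.array([[0.1, 0.1, 0.0], [-0.1, 0.1, 0.0]])   # model points (m)
pix = project(corners, K, R, t)                           # label anchor pixels
```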

  13. Radiofrequency ablation of hepatic tumors: simulation, planning, and contribution of virtual reality and haptics.

    PubMed

    Villard, Caroline; Soler, Luc; Gangi, Afshin

    2005-08-01

For radiofrequency ablation (RFA) of liver tumors, evaluating the vascular architecture, predicting post-RFA necrosis, and choosing a suitable needle placement strategy remain difficult with conventional radiological techniques. In an attempt to enhance the safety of RFA, a 3D simulation, treatment planning, and training tool has been developed that simulates the insertion of the needle and the necrosis of the treated area, and proposes an optimal needle placement. The 3D scenes are automatically reconstructed from enhanced spiral CT scans. The simulator takes into account the cooling effect of local vessels greater than 3 mm in diameter, making necrosis shapes more realistic. Optimal needle positioning can be generated automatically by the software to produce complete destruction of the tumor while sparing as much healthy liver as possible and avoiding all major structures. We also studied how virtual reality and haptic devices are valuable in making simulation and training realistic and effective. PMID:16298844

  14. The dynamics of student learning within a high school virtual reality design class

    NASA Astrophysics Data System (ADS)

    Morales, Teresa M.

This mixed-method study investigated knowledge and skill development of high school students in a project-based VR design class, in which 3-D projects were developed within a student-centered, student-directed environment. The investigation focused on student content learning and problem solving. Additionally, the social dynamics of the class and the role of peer mentoring were examined to determine how these factors influenced student behavior and learning. Finally, parents' and teachers' perceptions of the influence of the class were examined. The participants included freshman through senior students, parents, teachers, and the high school principal. Student interviews and classroom observations were used to collect data from students, while teachers and parents completed surveys. The results of this study suggested that this application of a virtual reality (VR) learning environment promoted the development of meaningful cognitive experiences, creativity, leadership, global socialization, problem solving, and a deeper understanding of academic content. The theoretical implications for 3-D virtual reality technology are promising and warrant additional research and development of VR as an instructional tool for practical use.

  15. Sensor Spatial Distortion, Visual Latency, and Update Rate Effects on 3D Tracking in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Ellis, S. R.; Adelstein, B. D.; Baumeler, S.; Jense, G. J.; Jacoby, R. H.; Trejo, Leonard (Technical Monitor)

    1998-01-01

Several common defects that we have sought to minimize in immersive virtual environments are: static sensor spatial distortion, visual latency, and low update rates. Human performance within our environments during large-amplitude 3D tracking was assessed by objective and subjective methods in the presence and absence of these defects. Results show that 1) removal of our relatively small spatial sensor distortion had minor effects on the tracking activity, 2) an Adapted Cooper-Harper controllability scale proved the most sensitive subjective indicator of the degradation of dynamic fidelity caused by increasing latency and decreasing frame rates, and 3) performance, as measured by normalized RMS tracking error or subjective impressions, was more markedly influenced by changing visual latency than by update rate.
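The normalized RMS tracking error in result 3 can plausibly be computed as the RMS of the tracking error divided by the RMS of the target motion; this is one common normalization, not necessarily the authors' exact definition, and the traces below are toy data:

```python
import math

def normalized_rms_error(target, response):
    """RMS of (response - target), normalized by the RMS of the target."""
    n = len(target)
    rms_err = math.sqrt(sum((r - t) ** 2 for t, r in zip(target, response)) / n)
    rms_tgt = math.sqrt(sum(t ** 2 for t in target) / n)
    return rms_err / rms_tgt

# toy tracking trace: the response lags the target by 3 samples,
# mimicking the effect of added visual latency
target = [math.sin(0.1 * i) for i in range(100)]
response = [math.sin(0.1 * (i - 3)) for i in range(100)]
nrmse = normalized_rms_error(target, response)   # grows with the lag
```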

  16. Auditory-visual virtual reality as a diagnostic and therapeutic tool for cynophobia.

    PubMed

    Suied, Clara; Drettakis, George; Warusfel, Olivier; Viaud-Delmon, Isabelle

    2013-02-01

    Traditionally, virtual reality (VR) exposure-based treatment concentrates primarily on the presentation of a high-fidelity visual experience. However, adequately combining the visual and the auditory experience provides a powerful tool to enhance sensory processing and modulate attention. We present the design and usability testing of an auditory-visual interactive environment for investigating VR exposure-based treatment for cynophobia. The specificity of our application involves 3D sound, allowing the presentation and spatial manipulations of a fearful stimulus in the auditory modality and in the visual modality. We conducted an evaluation test with 10 participants who fear dogs to assess the capacity of our auditory-visual virtual environment (VE) to generate fear reactions. The specific perceptual characteristics of the dog model that were implemented in the VE were highly arousing, suggesting that VR is a promising tool to treat cynophobia. PMID:23425570

  17. Virtual Reality-based Telesurgery via Teleprogramming Scheme Combined with Semi-autonomous Control.

    PubMed

    Zhijiang, Du; Zhiheng, Jia; Minxiu, Kong

    2005-01-01

Telesurgery systems have long suffered from variable and unpredictable Internet communication time delays, operator fatigue, and other drawbacks. Based on virtual reality technology, a teleprogramming scheme combined with semi-autonomous control is introduced to guarantee the robustness and efficiency of teleoperation of HIT-RAOS, a robot-assisted orthopedic surgery system. In this system, the operator interacts only with a virtual environment, unaffected by time delay, that provides real-time 3D vision, stereophonic sound, and tactile and force feedback rendered by a parallel master manipulator. Several tasks can be managed simultaneously via semi-autonomous control. Finally, the method is demonstrated in an experiment on the locking of intramedullary nails and is shown to provide stability and good performance. PMID:17282656

  18. Virtual Reality-based Telesurgery via Teleprogramming Scheme Combined with Semi-autonomous Control.

    PubMed

    Zhijiang, Du; Zhiheng, Jia; Minxiu, Kong

    2005-01-01

Telesurgery systems have long suffered from variable and unpredictable Internet communication time delays, operator fatigue, and other drawbacks. Based on virtual reality technology, a teleprogramming scheme combined with semi-autonomous control is introduced to guarantee the robustness and efficiency of teleoperation of HIT-RAOS, a robot-assisted orthopedic surgery system. In this system, the operator interacts only with a virtual environment, unaffected by time delay, that provides real-time 3D vision, stereophonic sound, and tactile and force feedback rendered by a parallel master manipulator. Several tasks can be managed simultaneously via semi-autonomous control. Finally, the method is demonstrated in an experiment on the locking of intramedullary nails and is shown to provide stability and good performance.

  19. Combinatorial Pharmacophore-Based 3D-QSAR Analysis and Virtual Screening of FGFR1 Inhibitors

    PubMed Central

    Zhou, Nannan; Xu, Yuan; Liu, Xian; Wang, Yulan; Peng, Jianlong; Luo, Xiaomin; Zheng, Mingyue; Chen, Kaixian; Jiang, Hualiang

    2015-01-01

The fibroblast growth factor/fibroblast growth factor receptor (FGF/FGFR) signaling pathway plays crucial roles in cell proliferation, angiogenesis, migration, and survival. Aberration in FGFRs correlates with several malignancies and disorders. FGFRs have proved to be attractive targets for therapeutic intervention in cancer, and it is of high interest to find FGFR inhibitors with novel scaffolds. In this study, a combinatorial three-dimensional quantitative structure-activity relationship (3D-QSAR) model was developed based on previously reported FGFR1 inhibitors with diverse structural skeletons. This model was evaluated for its prediction performance on a diverse test set containing 232 FGFR inhibitors, and it yielded an SD value of 0.75 pIC50 units from measured inhibition affinities and a Pearson's correlation coefficient R2 of 0.53. This result suggests that the combinatorial 3D-QSAR model could be used to search for new FGFR1 hit structures and predict their potential activity. To further evaluate the performance of the model, a decoy set validation was used to measure the efficiency of the model by calculating the EF (enrichment factor). Based on the combinatorial pharmacophore model, a virtual screening against the SPECS database was performed. Nineteen novel active compounds were successfully identified, which provide new chemical starting points for further structural optimization of FGFR1 inhibitors. PMID:26110383
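The enrichment factor (EF) mentioned above is conventionally the hit rate among the top-ranked fraction of the screened library divided by the hit rate of the whole library; a minimal sketch with toy scores and labels (the paper's screening setup is not reproduced here):

```python
def enrichment_factor(scores, labels, fraction=0.01):
    """EF = (actives in top fraction / compounds selected)
         / (total actives / total compounds)."""
    ranked = sorted(zip(scores, labels), key=lambda p: p[0], reverse=True)
    n_sel = max(1, int(len(ranked) * fraction))
    hits_sel = sum(label for _, label in ranked[:n_sel])
    hit_rate_sel = hits_sel / n_sel
    hit_rate_all = sum(labels) / len(labels)
    return hit_rate_sel / hit_rate_all

# toy library: 2 actives (label 1) among 10 compounds, both scored highest
scores = [0.9, 0.8, 0.5, 0.4, 0.3, 0.2, 0.15, 0.1, 0.05, 0.0]
labels = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
ef20 = enrichment_factor(scores, labels, fraction=0.2)  # top 2 of 10
```

An EF above 1 means the model ranks actives ahead of a random selection; here the top 20% contains every active, giving the maximum possible enrichment for this library.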

  20. Combinatorial Pharmacophore-Based 3D-QSAR Analysis and Virtual Screening of FGFR1 Inhibitors.

    PubMed

    Zhou, Nannan; Xu, Yuan; Liu, Xian; Wang, Yulan; Peng, Jianlong; Luo, Xiaomin; Zheng, Mingyue; Chen, Kaixian; Jiang, Hualiang

    2015-06-11

The fibroblast growth factor/fibroblast growth factor receptor (FGF/FGFR) signaling pathway plays crucial roles in cell proliferation, angiogenesis, migration, and survival. Aberration in FGFRs correlates with several malignancies and disorders. FGFRs have proved to be attractive targets for therapeutic intervention in cancer, and it is of high interest to find FGFR inhibitors with novel scaffolds. In this study, a combinatorial three-dimensional quantitative structure-activity relationship (3D-QSAR) model was developed based on previously reported FGFR1 inhibitors with diverse structural skeletons. This model was evaluated for its prediction performance on a diverse test set containing 232 FGFR inhibitors, and it yielded an SD value of 0.75 pIC50 units from measured inhibition affinities and a Pearson's correlation coefficient R2 of 0.53. This result suggests that the combinatorial 3D-QSAR model could be used to search for new FGFR1 hit structures and predict their potential activity. To further evaluate the performance of the model, a decoy set validation was used to measure the efficiency of the model by calculating the EF (enrichment factor). Based on the combinatorial pharmacophore model, a virtual screening against the SPECS database was performed. Nineteen novel active compounds were successfully identified, which provide new chemical starting points for further structural optimization of FGFR1 inhibitors.

  1. 3D modeling of the Strasbourg's Cathedral basements for interdisciplinary research and virtual visits

    NASA Astrophysics Data System (ADS)

    Landes, T.; Kuhnle, G.; Bruna, R.

    2015-08-01

On the occasion of the millennium celebration of Strasbourg Cathedral, a transdisciplinary research group composed of archaeologists, surveyors, architects, art historians, and a stonemason revisited the 1966-1972 excavations under the St. Lawrence's Chapel of the Cathedral, which contain remains of Roman and medieval masonry. The 3D modeling of the Chapel was realized by combining conventional surveying techniques for the network creation, laser scanning for the model creation, and photogrammetric techniques for texturing a few parts. According to the requirements and the end-user of the model, the level of detail and level of accuracy were adapted and assessed for every floor. The basement was acquired and modeled with more detail and higher accuracy than the other parts. Thanks to this modeling work, archaeologists can compare their assumptions with those of other disciplines by simulating constructions of other worship edifices on the massive stones composing the basement. The virtual reconstructions provided evidence in support of these assumptions and served for communication via virtual visits.

  2. Virtual Reality for the Psychophysiological Assessment of Phobic Fear: Responses during Virtual Tunnel Driving

    ERIC Educational Resources Information Center

    Muhlberger, Andreas; Bulthoff, Heinrich H.; Wiedemann, Georg; Pauli, Paul

    2007-01-01

    An overall assessment of phobic fear requires not only a verbal self-report of fear but also an assessment of behavioral and physiological responses. Virtual reality can be used to simulate realistic (phobic) situations and therefore should be useful for inducing emotions in a controlled, standardized way. Verbal and physiological fear reactions…

  3. 3D-ANTLERS: Virtual Reconstruction and Three-Dimensional Measurement

    NASA Astrophysics Data System (ADS)

    Barba, S.; Fiorillo, F.; De Feo, E.

    2013-02-01

In the ARTEC digital mock-up, for example, the individual frames, already polygonal and geo-referenced at the time of capture, can be selected; however, automated texturing is not possible, unlike in the low-cost environment, which produces a good graphic definition. Once the final 3D models were obtained, we proceeded to a geometric and graphic comparison of the results. In order to provide an accuracy requirement and an assessment for the 3D reconstruction, we took into account the following benchmarks: cost, captured points, noise (local and global), shadows and holes, operability, degree of definition, quality, and accuracy. Following these empirical studies of the virtual reconstructions, a 3D documentation procedure was codified, endorsing the use of terrestrial sensors for the documentation of antlers. The results were compared with the standards set by the current provisions (see the "Manual de medición" of the Government of Andalusia, Spain); to date, identification is based on data such as length, volume, colour, texture, openness, tips, and structure. Such data, currently captured only with traditional instruments such as a tape measure, would be well served by a process of virtual reconstruction and cataloguing.

  4. Astronaut Prepares for Mission With Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2001-01-01

Astronaut John M. Grunsfeld, STS-109 payload commander, uses virtual reality hardware at Johnson Space Center to rehearse some of his duties prior to the STS-109 mission. The most familiar form of virtual reality technology is some form of headpiece, which fits over your eyes and displays a three-dimensional computerized image of another place. Turn your head left and right, and you see what would be to your sides; turn around, and you see what might be sneaking up on you. An important part of the technology is some type of data glove that you use to propel yourself through the virtual world. This technology allows NASA astronauts to practice International Space Station work missions in advance. Currently, the medical community is using the new technologies in four major ways: to see parts of the body more accurately for study, to make better diagnoses of disease, and to plan surgery in more detail; to obtain a more accurate picture of a procedure during surgery; to perform more types of surgery with the most noninvasive, accurate methods possible; and to model interactions among molecules at a molecular level.

  5. Fire training in a virtual-reality environment

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Jurgen; Bucken, Arno

    2005-03-01

Although fire is very common in our daily environment - as a source of energy at home or as a tool in industry - most people cannot estimate the danger of a conflagration. It is therefore important to train people in combating fire. Besides training with propane simulators or with real fires and real extinguishers, fire training can be performed in virtual reality, a pollution-free and fast way of training. In this paper we describe how to enhance a virtual-reality environment with real-time fire simulation and visualisation in order to establish a realistic emergency-training system. The presented approach supports extinguishing the virtual fire, including recordable performance data as needed in teletraining environments. We show how to get realistic impressions of fire using advanced particle simulation and how to use the advantages of particles to trigger states in a modified cellular automaton used for simulating fire behaviour. Using particle systems that interact with cellular automata, it is possible to simulate a developing, spreading fire and its reaction to different extinguishing agents such as water, CO2, or oxygen. The methods proposed in this paper have been implemented and successfully tested on Cosimir, a commercial robot and VR simulation system.
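The coupling described above, where particles trigger states in a modified cellular automaton, can be illustrated with a bare-bones probabilistic fire-spread automaton; the states, neighborhood, and ignition probability are invented for this sketch and are far simpler than the paper's model:

```python
import random

EMPTY, FUEL, BURNING, BURNT = 0, 1, 2, 3

def step(grid, p_ignite=0.6):
    """Fuel adjacent to a burning cell ignites with probability p_ignite;
    burning cells burn out after one step (a deliberately crude rule)."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == BURNING:
                new[r][c] = BURNT
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < rows and 0 <= cc < cols
                            and grid[rr][cc] == FUEL
                            and random.random() < p_ignite):
                        new[rr][cc] = BURNING
    return new

random.seed(1)
grid = [[FUEL] * 10 for _ in range(10)]
grid[5][5] = BURNING          # e.g. ignited by a simulated spark particle
for _ in range(20):
    grid = step(grid)
```

Extinguishing agents could be modeled the same way, as particles that flip BURNING cells back to BURNT or lower p_ignite locally.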

  6. Laparoscopic Varicocelectomy: Virtual Reality Training and Learning Curve

    PubMed Central

    Wang, Zheng; Ni, Yuhua; Jin, Xunbo; Xia, Qinghua; Wang, Hanbo

    2014-01-01

Objectives: To explore the role that virtual reality training might play in the learning curve of laparoscopic varicocelectomy. Methods: A total of 1326 laparoscopic varicocelectomy cases performed by 16 participants from July 2005 to June 2012 were retrospectively analyzed. The participants were divided into 2 groups: group A was trained on laparoscopic trainer boxes; group B was trained with a virtual reality training course preoperatively. The operation time curves were plotted, and the learning, improving, and plateau stages were identified and statistically confirmed. The operation time and number of cases in the learning and improving stages of both groups were compared. Testicular artery sparing failure and postoperative hydrocele rates were statistically analyzed to confirm the learning curve. Results: The learning curve of laparoscopic varicocelectomy was 15 cases, and after 14 more cases, it reached the plateau stage. The number of cases in the learning stages of both groups showed no statistical difference (P = .49), but the operation time of group B in the learning stage was less than that of group A (P < .00001). The number of cases of group B in the improving stage was significantly less than that of group A (P = .005), but the operation time of both groups in the improving stage showed no difference (P = .30). The difference in testicular artery sparing failure rates among these 3 stages was significant (P < .0001); the postoperative hydrocele rate showed no statistical difference (P = .60). Conclusions: The virtual reality training shortened the operation time in the learning stage and hastened the trainees' progress in the improving stage, but did not shorten the learning curve as expected. PMID:25392625

  7. Virtual reality applications in improving postural control and minimizing falls.

    PubMed

    Virk, Sumandeep; McConville, Kristiina M Valter

    2006-01-01

Maintaining balance under all conditions is an absolute requirement for humans. Orientation in space and balance maintenance require inputs from the vestibular, visual, proprioceptive, and somatosensory systems. All the cues coming from these systems are integrated by the central nervous system (CNS) to employ different strategies for orientation and balance. How the CNS integrates all the inputs and makes cognitive decisions about balance strategies has long been an area of interest for biomedical engineers. More interesting is the fact that in the absence of one or more cues, or when the input from one of the sensors is skewed, the CNS "adapts" to the new environment and gives less weight to the conflicting inputs [1]. The focus of this paper is a review of different strategies and models put forward by researchers to explain the integration of these sensory cues. The paper also compares the different approaches used by young and old adults in maintaining balance. Since the musculoskeletal, visual, and vestibular systems deteriorate with age, older subjects have to compensate for these impaired sensory cues to maintain postural stability. The paper also discusses the applications of virtual reality in rehabilitation programs, not only for balance in the elderly but also in occupational falls. Virtual reality has profound applications in the field of balance rehabilitation and training because of its relatively low cost. Studies will be conducted to evaluate the effectiveness of virtual reality training in modifying head and eye movement strategies, and to determine the role of these responses in the maintenance of balance. PMID:17946975
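The cue reweighting described above, where the CNS gives less weight to conflicting or degraded inputs, is often formalized as inverse-variance weighting of the individual sensory estimates; a minimal sketch with invented numbers, not a model from the cited work:

```python
def fuse(estimates, variances):
    """Weight each sensory estimate by the inverse of its variance,
    so noisier (less reliable) channels count for less."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, estimates)) / total

# hypothetical body-sway estimates (degrees) from three channels; the
# visual channel is 'skewed', so we model it as much noisier
vestibular, visual, proprioceptive = 1.0, 4.0, 1.2
fused = fuse([vestibular, visual, proprioceptive],
             [0.5, 5.0, 0.5])       # variances: visual is down-weighted
```

The fused estimate stays close to the two reliable channels despite the conflicting visual input, which is the qualitative behavior the review attributes to CNS adaptation.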

  8. NanTroSEIZE in 3-D: Creating a Virtual Research Experience in Undergraduate Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Reed, D. L.; Bangs, N. L.; Moore, G. F.; Tobin, H.

    2009-12-01

    Marine research programs, both large and small, have increasingly added a web-based component to facilitate outreach to K-12 and the public, in general. These efforts have included, among other activities, information-rich websites, ship-to-shore communication with scientists during expeditions, blogs at sea, clips on YouTube, and information about daily shipboard activities. Our objective was to leverage a portion of the vast collection of data acquired through the NSF-MARGINS program to create a learning tool with a long lifespan for use in undergraduate geoscience courses. We have developed a web-based virtual expedition, NanTroSEIZE in 3-D, based on a seismic survey associated with the NanTroSEIZE program of NSF-MARGINS and IODP to study the properties of the plate boundary fault system in the upper limit of the seismogenic zone off Japan. The virtual voyage can be used in undergraduate classes at anytime, since it is not directly tied to the finite duration of a specific seagoing project. The website combines text, graphics, audio and video to place learning in an experiential framework as students participate on the expedition and carry out research. Students learn about the scientific background of the program, especially the critical role of international collaboration, and meet the chief scientists before joining the sea-going expedition. Students are presented with the principles of 3-D seismic imaging, data processing and interpretation while mapping and identifying the active faults that were the likely sources of devastating earthquakes and tsunamis in Japan in 1944 and 1948. They also learn about IODP drilling that began in 2007 and will extend through much of the next decade. The website is being tested in undergraduate classes in fall 2009 and will be distributed through the NSF-MARGINS website (http://www.nsf-margins.org/) and the MARGINS Mini-lesson section of the Science Education Resource Center (SERC) (http

  9. One's Colonies: a virtual reality environment of oriental residences

    NASA Astrophysics Data System (ADS)

    Chi, Catherine

    2013-03-01

This paper is a statement about my virtual reality environment project, One's Colonies, and a description of the creative process behind it. I was inspired by the buildings in my hometown in Taiwan, whose style differs markedly from the architectural style of the United States. By analyzing the unique style of dwellings in Taiwan, I demonstrate how differences in geography, weather, and culture change the appearance of living space. Through this project I want to express the relationship between architectural style and cultural difference, and how the emotional condition or characteristics of residents are affected by their residences.

  10. Three-Dimensional User Interfaces for Immersive Virtual Reality

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1997-01-01

    The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.

  11. A virtual reality browser for Space Station models

    NASA Technical Reports Server (NTRS)

    Goldsby, Michael; Pandya, Abhilash; Aldridge, Ann; Maida, James

    1993-01-01

    The Graphics Analysis Facility at NASA/JSC has created a visualization and learning tool by merging its database of detailed geometric models with a virtual reality system. The system allows an interactive walk-through of models of the Space Station and other structures, providing detailed realistic stereo images. The user can activate audio messages describing the function and connectivity of selected components within his field of view. This paper presents the issues and trade-offs involved in the implementation of the VR system and discusses its suitability for its intended purposes.

  12. Virtual reality applications to automated rendezvous and capture

    NASA Technical Reports Server (NTRS)

    Hale, Joseph; Oneil, Daniel

    1991-01-01

Virtual Reality (VR) is a rapidly developing Human/Computer Interface (HCI) technology. The evolution of high-speed graphics processors and the development of specialized anthropomorphic user interface devices, which more fully involve the human senses, have enabled VR technology. Recently, the maturity of this technology has reached a level where it can be used as a tool in a variety of applications. This paper provides an overview of VR technology, VR activities at Marshall Space Flight Center (MSFC), and applications of VR to Automated Rendezvous and Capture (AR&C), and identifies areas of VR technology that require further development.

  13. Research on distributed virtual reality system in electronic commerce

    NASA Astrophysics Data System (ADS)

    Xue, Qiang; Wang, Jiening; Sun, Jizhou

    2004-03-01

    In this paper, Distributed Virtual Reality (DVR) technology applied in Electronic Commerce (EC) is discussed. DVR provides a new means for human beings to recognize, analyze, and resolve large-scale, complex problems, which has driven its rapid development in the EC field. The technologies of CSCW (Computer Supported Cooperative Work) and middleware are introduced into the development of the EC-DVR system to provide the necessary cooperation and communication services and to avoid repeatedly developing the basic modules. Finally, the paper presents a platform structure for the EC-DVR system.

  14. Transforming Experience: The Potential of Augmented Reality and Virtual Reality for Enhancing Personal and Clinical Change

    PubMed Central

    Riva, Giuseppe; Baños, Rosa M.; Botella, Cristina; Mantovani, Fabrizia; Gaggioli, Andrea

    2016-01-01

    During life, many personal changes occur. These include changing house, school, work, and even friends and partners. However, the daily experience shows clearly that, in some situations, subjects are unable to change even if they want to. The recent advances in psychology and neuroscience are now providing a better view of personal change, the change affecting our assumptive world: (a) the focus of personal change is reducing the distance between self and reality (conflict); (b) this reduction is achieved through (1) an intense focus on the particular experience creating the conflict or (2) an internal or external reorganization of this experience; (c) personal change requires a progression through a series of different stages that however happen in discontinuous and non-linear ways; and (d) clinical psychology is often used to facilitate personal change when subjects are unable to move forward. Starting from these premises, the aim of this paper is to review the potential of virtuality for enhancing the processes of personal and clinical change. First, the paper focuses on the two leading virtual technologies – augmented reality (AR) and virtual reality (VR) – exploring their current uses in behavioral health and the outcomes of the 28 available systematic reviews and meta-analyses. Then the paper discusses the added value provided by VR and AR in transforming our external experience by focusing on the high level of personal efficacy and self-reflectiveness generated by their sense of presence and emotional engagement. Finally, it outlines the potential future use of virtuality for transforming our inner experience by structuring, altering, and/or replacing our bodily self-consciousness. The final outcome may be a new generation of transformative experiences that provide knowledge that is epistemically inaccessible to the individual until he or she has that experience, while at the same time transforming the individual’s worldview. PMID:27746747

  15. Chavir: Virtual reality simulation for interventions in nuclear installations

    SciTech Connect

    Thevenon, J. B.; Tirel, O.; Lopez, L.; Chodorge, L.; Desbats, P.

    2006-07-01

    Companies involved in the nuclear industry have to prepare for interventions by precisely analyzing the radiological risks and rapidly evaluating the consequences of their operational choices. They also need to consolidate the experiences gained in the field with greater responsiveness and lower costs. This paper brings out the advantages of using virtual reality technology to meet the demands in the industry. The CHAVIR software allows the operators to prepare (and repeat) all the operations they would have to do in a safe virtual world, before performing the actual work inside the facilities. Since the decommissioning or maintenance work is carried out in an environment where there is radiation, the amount of radiation that the operator would be exposed to is calculated and integrated into the simulator. (authors)

  16. Virtual reality robotic telesurgery simulations using MEMICA haptic system

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Mavroidis, Constantinos; Bouzit, Mourad; Dolgin, Benjamin; Harm, Deborah L.; Kopchok, George E.; White, Rodney

    2001-01-01

    The authors conceived a haptic mechanism called MEMICA (Remote Mechanical Mirroring using Controlled stiffness and Actuators) that enables the design of high-dexterity, rapid-response, and large-workspace haptic systems. Novel MEMICA gloves and virtual reality models are being developed to allow simulation of telesurgery and other applications. The MEMICA gloves are being designed to provide intuitive mirroring of the conditions at a virtual site where a robot simulates the presence of a human operator. The key components of MEMICA are miniature electrically controlled stiffness (ECS) elements and electrically controlled force and stiffness (ECFS) actuators that are based on the use of Electro-Rheological Fluids (ERF). In this paper the design of the MEMICA system and initial experimental results are presented.

  17. Virtual Reality of Sound Generated from Vibrating Structures

    NASA Astrophysics Data System (ADS)

    KIM, S. J.; SONG, J. Y.

    2002-11-01

    The advancement of virtual reality (VR) technology in cyberspace is remarkable, but its development has concentrated mainly on the visual part. In this paper, the development of VR technology to produce sound based on exact physics is studied. Our main concern is the sound generated from vibrating structures. This may be useful, for example, in apprehending the sound field characteristics of an aircraft cabin at the design stage. To calculate the sound pressure from the curved surface of a structure, a new integration scheme is developed for the boundary element method. Several example problems are solved to confirm the integration scheme. The pressure distributions on a uniformly driven sphere and cylinders are computed and compared with analytic solutions, and the radiation efficiency of a vibrating plate under one-dimensional flow is also calculated. To realize sound through computer simulation, two concepts, "structure-oriented analysis" and "human-oriented analysis", are proposed. Using these concepts, the virtual sound field of an aircraft cabin is created.

  18. Dots and dashes: art, virtual reality, and the telegraph

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia; Chang, Ben

    2009-02-01

    Dots and Dashes is a virtual reality artwork that explores online romance over the telegraph, based on Ella Cheever Thayer's novel Wired Love - a Romance in Dots and Dashes (an Old Story Told in a New Way). The uncanny similarities between this story and the world of today's virtual environments provide the springboard for an exploration of a wealth of anxieties and dreams, including the construction of identities in an electronically mediated environment, the shifting boundaries between the natural and machine worlds, and the spiritual dimensions of science and technology. In this paper we examine the parallels between the telegraph networks and our current conceptions of cyberspace, as well as the unique social and cultural impacts specific to the telegraph. These include the new opportunities and roles available to women in the telegraph industry and the connection between the telegraph and the Spiritualist movement. We discuss the development of the artwork, its structure and aesthetics, and the technical development of the work.

  19. Auditory cues increase the hippocampal response to unimodal virtual reality.

    PubMed

    Andreano, Joseph; Liang, Kevin; Kong, Lingjun; Hubbard, David; Wiederhold, Brenda K; Wiederhold, Mark D

    2009-06-01

    Previous research suggests that the effectiveness of virtual reality exposure therapy should increase as the experience becomes more immersive. However, the neural mechanisms underlying the experience of immersion are not yet well understood. To address this question, neural activity during exposure to two virtual worlds was measured by functional magnetic resonance imaging (fMRI). Two levels of immersion were used: unimodal (video only) and multimodal (video plus audio). The results indicated increased activity in both auditory and visual sensory cortices during multimodal presentation. Additionally, multimodal presentation elicited increased activity in the hippocampus, a region well known to be involved in learning and memory. The implications of this finding for exposure therapy are discussed. PMID:19500000

  20. Heard on The Street: GIS-Guided Immersive 3D Models as an Augmented Reality for Team Collaboration

    NASA Astrophysics Data System (ADS)

    Quinn, B. B.

    2007-12-01

    Grid computing can be configured to run physics simulations for spatially contiguous virtual 3D model spaces. Each cell is run by a single processor core simulating 1/16 square kilometer of surface and can contain up to 15,000 objects. In this work, a model of one urban block was constructed in the commercial 3D online digital world Second Life http://secondlife.com to prove concept that GIS data can guide the build of an accurate in-world model. Second Life simulators support terrain modeling at two-meter grid intervals. Access to the Second Life grid is worldwide if connections to the US-based servers are possible. This immersive 3D model allows visitors to explore the space at will, with physics simulated for object collisions, gravity, and wind forces about 40 times per second. Visitors view this world as renderings by their 3-D display card of graphic objects and raster textures that are streamed from the simulator grid to the Second Life client, based on that client's instantaneous field of view. Visitors to immersive 3D models experience a virtual world that engages their innate abilities to relate to the real immersive 3D world in which humans have evolved. These abilities enable far more complex and dynamic 3D environments to be quickly and accurately comprehended by more visitors than most non-immersive 3D environments. Objects of interest at ground surface and below can be walked around, possibly entered, viewed at arm's length or flown over at 500 meters above. Videos of renderings have been recorded (as machinima) to share a visit as part of public presentations. Key to this experience is that dozens of simultaneous visitors can experience the model at the same time, each exploring it at will and seeing (if not colliding with) one another---like twenty geology students on a virtual outcrop, where each student might fly if they chose to. This work modeled the downtown Berkeley, CA, transit station in the Second Life region "Gualala" near [170, 35, 35

  1. Evaluation of neuroanatomical training using a 3D visual reality model.

    PubMed

    Brewer, Danielle N; Wilson, Timothy D; Eagleson, Roy; de Ribaupierre, Sandrine

    2012-01-01

    As one of the more difficult components of any curricula, neuroanatomy poses many challenges to students - not only because of the numerous discrete structures, but also due to the complicated spatial relations between them, which must be learned. Traditional anatomical education uses 2D images with a focus on dissection. This approach tends to underestimate the cognitive leaps required between textbook, lecture, and dissection cases. With reduced anatomical teaching time available, and varying student spatial abilities, new techniques are needed for training. The goal of this study is to assess the improvement of trainee understanding of 3D brain anatomy, orientation, visualization, and navigation through the use of digital training regimes in comparison with current methods. Two subsets of health science and medical students were tested individually after being given a group lecture and either a pre- or post-dissection digital lab. Results suggest that exposure to a 3D digital lab may improve knowledge acquisition and understanding by the students, particularly for first time learners. PMID:22356963

  2. Assessing endocranial variations in great apes and humans using 3D data from virtual endocasts.

    PubMed

    Bienvenu, Thibaut; Guy, Franck; Coudyzer, Walter; Gilissen, Emmanuel; Roualdès, Georges; Vignaud, Patrick; Brunet, Michel

    2011-06-01

    Modern humans are characterized by their large, complex, and specialized brain. Human brain evolution can be addressed through direct evidence provided by fossil hominid endocasts (i.e. paleoneurology), or through indirect evidence of extant species comparative neurology. Here we use the second approach, providing an extant comparative framework for hominid paleoneurological studies. We explore endocranial size and shape differences among great apes and humans, as well as between sexes. We virtually extracted 72 endocasts, sampling all extant great ape species and modern humans, and digitized 37 landmarks on each for 3D generalized Procrustes analysis. All species can be differentiated by their endocranial shape. Among great apes, endocranial shapes vary from short (orangutans) to long (gorillas), perhaps in relation to different facial orientations. Endocranial shape differences among African apes are partly allometric. Major endocranial traits distinguishing humans from great apes are endocranial globularity, reflecting neurological reorganization, and features linked to structural responses to posture and bipedal locomotion. Human endocasts are also characterized by posterior location of foramina rotunda relative to optic canals, which could be correlated to lesser subnasal prognathism compared to living great apes. Species with larger brains (gorillas and humans) display greater sexual dimorphism in endocranial size, while sexual dimorphism in endocranial shape is restricted to gorillas, differences between males and females being at least partly due to allometry. Our study of endocranial variations in extant great apes and humans provides a new comparative dataset for studies of fossil hominid endocasts.
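    The superimposition step described here (37 landmarks per endocast, aligned by 3D generalized Procrustes analysis before shape comparison) can be sketched in a few lines of NumPy. This is an illustrative reconstruction under standard GPA conventions, not the authors' code; the "specimens" below are hypothetical, randomly rotated and scaled copies of a single landmark configuration.

```python
import numpy as np

def procrustes_align(X, Y):
    """Rotate configuration Y onto X (both centered, unit-scaled),
    via SVD of the cross-covariance (orthogonal Procrustes)."""
    U, _, Vt = np.linalg.svd(Y.T @ X)
    return Y @ (U @ Vt)

def gpa(configs, iters=10):
    """Generalized Procrustes analysis on landmark configurations
    of shape (n_specimens, n_landmarks, n_dims)."""
    # Remove position and size: center each configuration, then
    # scale it to unit centroid size (Frobenius norm).
    X = configs - configs.mean(axis=1, keepdims=True)
    X = X / np.linalg.norm(X, axis=(1, 2), keepdims=True)
    mean = X[0]
    for _ in range(iters):
        # Rotate every specimen onto the current mean, then update it.
        X = np.stack([procrustes_align(mean, x) for x in X])
        mean = X.mean(axis=0)
        mean = mean / np.linalg.norm(mean)
    return X, mean

rng = np.random.default_rng(0)
base = rng.normal(size=(37, 3))      # 37 landmarks in 3D, as in the study
specs = []
for _ in range(5):
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal map
    specs.append(base @ Q * rng.uniform(0.5, 2.0)) # rotated, rescaled copy
aligned, mean = gpa(np.stack(specs))
# After superimposition, identical shapes collapse onto one another
print(np.allclose(aligned, aligned[0], atol=1e-6))
```

After superimposition, the residual coordinate differences reflect shape alone, which is what downstream analyses of endocranial shape operate on.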

  3. The effects of task difficulty on visual search strategy in virtual 3D displays.

    PubMed

    Pomplun, Marc; Garaas, Tyler W; Carrasco, Marisa

    2013-01-01

    Analyzing the factors that determine our choice of visual search strategy may shed light on visual behavior in everyday situations. Previous results suggest that increasing task difficulty leads to more systematic search paths. Here we analyze observers' eye movements in an "easy" conjunction search task and a "difficult" shape search task to study visual search strategies in stereoscopic search displays with virtual depth induced by binocular disparity. Standard eye-movement variables, such as fixation duration and initial saccade latency, as well as new measures proposed here, such as saccadic step size, relative saccadic selectivity, and x-y target distance, revealed systematic effects on search dynamics in the horizontal-vertical plane throughout the search process. We found that in the "easy" task, observers start with the processing of display items in the display center immediately after stimulus onset and subsequently move their gaze outwards, guided by extrafoveally perceived stimulus color. In contrast, the "difficult" task induced an initial gaze shift to the upper-left display corner, followed by a systematic left-right and top-down search process. The only consistent depth effect was a trend of initial saccades in the easy task with smallest displays to the items closest to the observer. The results demonstrate the utility of eye-movement analysis for understanding search strategies and provide a first step toward studying search strategies in actual 3D scenarios. PMID:23986539
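    Two of the measures proposed in this abstract can be illustrated from raw fixation coordinates. The definitions below are plausible readings, not the authors' published formulas: "saccadic step size" is taken as the distance between successive fixations, and "x-y target distance" as the distance from each fixation to the target; the scanpath data are hypothetical.

```python
import numpy as np

def saccadic_step_sizes(fixations):
    """Euclidean distance between successive fixation positions;
    one plausible reading of 'saccadic step size'."""
    fix = np.asarray(fixations, dtype=float)
    return np.linalg.norm(np.diff(fix, axis=0), axis=1)

def xy_target_distance(fixations, target):
    """Distance from each fixation to the target in the
    horizontal-vertical plane ('x-y target distance')."""
    fix = np.asarray(fixations, dtype=float)
    return np.linalg.norm(fix - np.asarray(target, dtype=float), axis=1)

# Hypothetical scanpath (pixels): a central start, then outward moves
scanpath = [(100, 100), (103, 104), (200, 100), (200, 160)]
steps = saccadic_step_sizes(scanpath)           # 3 inter-fixation distances
dists = xy_target_distance(scanpath, (200, 160))  # shrinks as gaze homes in
```

Tracking how such measures evolve over the course of a trial is what lets the authors distinguish a center-outward strategy from a systematic left-right, top-down scan.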

  5. Finite element visualization in the cave virtual reality environment

    SciTech Connect

    Plaskacz, E.J.; Kuhn, M.A.

    1996-03-01

    Through the use of the post-processing software Virtual Reality visualization (VRviz) and the Cave Automatic Virtual Environment (CAVE), finite element representations can be viewed as they would be in real life. VRviz is a program written in ANSI C that translates the mathematical results generated by finite element analysis programs into a virtual representation. This virtual representation is projected into the CAVE environment and the results are animated. The animation is fully controllable: a user can translate the image, rotate it about any axis, and scale it at any time. The user can also freeze the animation at any time step and control the image update rate. This allows the user to navigate around, or even inside, the image in order to effectively analyze possible failure points and redesign as necessary. Through the use of the CAVE and the realistic imagery produced by VRviz, engineers are able to save considerable time, money, and effort in the design process.

  6. Cue reactivity in virtual reality: the role of context.

    PubMed

    Paris, Megan M; Carter, Brian L; Traylor, Amy C; Bordnick, Patrick S; Day, Susan X; Armsworth, Mary W; Cinciripini, Paul M

    2011-07-01

    Cigarette smokers in laboratory experiments readily respond to smoking stimuli with increased craving. As an alternative to traditional cue-reactivity methods (e.g., exposure to cigarette photos), virtual reality (VR) has been shown to be a viable cue presentation method to elicit and assess cigarette craving within complex virtual environments. However, it remains poorly understood whether contextual cues from the environment contribute to craving increases in addition to specific cues, like cigarettes. This study examined the role of contextual cues in a VR environment in evoking craving. Smokers were exposed to a virtual convenience store devoid of any specific cigarette cues, followed by exposure to the same convenience store with specific cigarette cues added. Smokers reported increased craving following exposure to the virtual convenience store without specific cues, and significantly greater craving following the convenience store with cigarette cues added. However, the increased craving recorded after the second convenience store may have been due to the pre-exposure to the first convenience store. This study offers evidence that an environmental context where cigarette cues are normally present, but absent here, elicits significant craving even without specific cigarette cues. This finding suggests that VR may have stronger ecological validity than traditional cue-reactivity exposure methods by exposing smokers to the full range of cigarette-related environmental stimuli, in addition to the specific cigarette cues, that smokers typically experience in their daily lives. PMID:21349649

  7. Virtual Reality: Developing a VR space for Academic activities

    NASA Astrophysics Data System (ADS)

    Kaimaris, D.; Stylianidis, E.; Karanikolas, N.

    2014-05-01

    Virtual reality (VR) is extensively used in various applications in industry, academia, and business, and is becoming increasingly affordable for end users. At the same time, more and more applications are being developed in academia and higher education, in fields such as medicine and engineering, and students expect to be well prepared for their professional lives by the end of their studies. VR offers the added benefit of improving skills as well as the understanding of space. This paper presents the methodology used during a course, namely "Geoinformatics applications" at the School of Spatial Planning and Development (Eng.), Aristotle University of Thessaloniki, to create a virtual School space. The course design focuses on the methods and techniques used to develop the virtual environment. In addition, the project aspires to become increasingly useful to students and to provide a realistic virtual environment with information of value not only to students but also to any citizen interested in the academic life of the School.

  8. Game controller modification for fMRI hyperscanning experiments in a cooperative virtual reality environment

    PubMed Central

    Trees, Jason; Snider, Joseph; Falahpour, Maryam; Guo, Nick; Lu, Kun; Johnson, Douglas C.; Poizner, Howard; Liu, Thomas T.

    2014-01-01

    Hyperscanning, an emerging technique in which data from multiple interacting subjects' brains are simultaneously recorded, has become an increasingly popular way to address complex topics, such as "theory of mind." However, most previous fMRI hyperscanning experiments have been limited to abstract social interactions (e.g., phone conversations). Our new method utilizes a virtual reality (VR) environment used for military training, Virtual Battlespace 2 (VBS2), to create realistic avatar-avatar interactions and cooperative tasks. To control the virtual avatar, subjects use an MRI-compatible PlayStation 3 game controller, modified by removing all extraneous metal components and replacing any necessary ones with 3D-printed plastic models. Control of both scanners' operation is initiated by a VBS2 plugin that syncs scanner time to the known time within the VR environment. Our modifications include: modification of the game controller to be MRI compatible; design of the VBS2 virtual environment for cooperative interactions; and syncing of two MRI machines for simultaneous recording. PMID:26150964

  10. Virtual reality anatomy: is it comparable with traditional methods in the teaching of human forearm musculoskeletal anatomy?

    PubMed

    Codd, Anthony M; Choudhury, Bipasha

    2011-01-01

    The use of cadavers to teach anatomy is well established, but limitations with this approach have led to the introduction of alternative teaching methods. One such method is the use of three-dimensional virtual reality computer models. An interactive, three-dimensional computer model of human forearm anterior compartment musculoskeletal anatomy was produced using the open source 3D imaging program "Blender." The aim was to evaluate the use of 3D virtual reality when compared with traditional anatomy teaching methods. Three groups were identified from the University of Manchester second year Human Anatomy Research Skills Module class: a "control" group (no prior knowledge of forearm anatomy), a "traditional methods" group (taught using dissection and textbooks), and a "model" group (taught solely using the e-resource). The groups were assessed on anatomy of the forearm by a ten-question practical examination. ANOVA analysis showed the model group mean test score to be significantly higher than the control group (mean 7.25 vs. 1.46, P < 0.001) and not significantly different to the traditional methods group (mean 6.87, P > 0.5). Feedback from all users of the e-resource was positive. Virtual reality anatomy learning can be used to complement traditional teaching methods effectively.

  11. Surviving sepsis--a 3D integrative educational simulator.

    PubMed

    Ježek, Filip; Tribula, Martin; Kulhánek, Tomáš; Mateják, Marek; Privitzer, Pavol; Šilar, Jan; Kofránek, Jiří; Lhotská, Lenka

    2015-08-01

    Computer technology offers greater educational possibilities, notably simulation and virtual reality. This paper presents a technology which serves to integrate multiple modalities, namely 3D virtual reality, a node-based simulator, the Physiomodel explorer, and explanatory physiological simulators employing the Modelica language and the Unity3D platform. This emerging tool chain should allow the authors to concentrate more on educational content instead of application development. The technology is demonstrated through the Surviving sepsis educational scenario, targeting the Microsoft Windows Store platform. PMID:26737091

  13. VEVI: A Virtual Reality Tool For Robotic Planetary Explorations

    NASA Technical Reports Server (NTRS)

    Piguet, Laurent; Fong, Terry; Hine, Butler; Hontalas, Phil; Nygren, Erik

    1994-01-01

    The Virtual Environment Vehicle Interface (VEVI), developed by the NASA Ames Research Center's Intelligent Mechanisms Group, is a modular operator interface for direct teleoperation and supervisory control of robotic vehicles. Virtual environments enable the efficient display and visualization of complex data. This characteristic allows operators to perceive and control complex systems in a natural fashion, utilizing the highly evolved human sensory system. VEVI utilizes real-time, interactive 3D graphics and position/orientation sensors to produce a range of interface modalities, from flat-panel (windowed or stereoscopic) screen displays to head-mounted, head-tracking stereo displays. The interface provides generic video control capability and has been used to control wheeled, legged, air-bearing, and underwater vehicles in a variety of different environments. VEVI was designed and implemented to be modular, distributed, and easily operated over long-distance communication links, using a communication paradigm called SYNERGY.

  14. Hybrid virtual reality and telepresence utilizing mobile phone technology

    NASA Astrophysics Data System (ADS)

    Mair, Gordon M.; Clark, J.; Fryer, R.; Hardiman, R.; MacGregor, D.; Retik, A.; Retik, N.; Revie, Kenneth

    1998-12-01

    An overview of the design and application of a unique mobile hybrid telepresence and virtual reality system is first provided. This is followed by a description of each of the integrated sub-systems. These include the telepresence and teleoperation sub-system, comprising display, control, and communication elements together with camera platforms and a mobile vehicle; a virtual reality module capable of modeling a 4D civil engineering environment, in this case a construction site; and the image compression and decompression techniques which allow video from the remote site to be transmitted across a very low bandwidth mobile phone network. The mobile telepresence system can be located on a real-world construction site to observe work in progress. This video information can be accessed by a user from any remote location and compared with the VR model of planned progress. The user can then guide the vehicle and camera system to any desired viewpoint. Illustrations of the first trials of the full system, comments on problems experienced, and suggestions for further work are provided.

  15. Low-Cost, Portable, Multi-Wall Virtual Reality

    NASA Technical Reports Server (NTRS)

    Miller, Samuel A.; Misch, Noah J.; Dalton, Aaron J.

    2005-01-01

    Virtual reality systems make compelling outreach displays, but some such systems, like the CAVE, have design features that make their use for that purpose inconvenient. In the case of the CAVE, the equipment is difficult to disassemble, transport, and reassemble, and typically CAVEs can only be afforded by large-budget research facilities. We implemented a system like the CAVE that costs less than $30,000, weighs about 500 pounds, and fits into a fifteen-passenger van. A team of six people have unpacked, assembled, and calibrated the system in less than two hours. This cost reduction versus similar virtual-reality systems stems from the unique approach we took to stereoscopic projection. We used an assembly of optical chopper wheels and commodity LCD projectors to create true active stereo at less than a fifth of the cost of comparable active-stereo technologies. The screen and frame design also optimized portability; the frame assembles in minutes with only two fasteners, and both it and the screen pack into small bundles for easy and secure shipment.

  16. Validation of a Novel Virtual Reality Simulator for Robotic Surgery

    PubMed Central

    Schreuder, Henk W. R.; Persson, Jan E. U.; Wolswijk, Richard G. H.; Ihse, Ingmar; Schijven, Marlies P.; Verheijen, René H. M.

    2014-01-01

    Objective. With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for use in training for robot-assisted surgery. Methods. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Results. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were “time to complete” and “economy of motion” (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialists starting with robotic surgery. Conclusions. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery. PMID:24600328
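    Construct validity here rests on discriminating groups by performance metrics such as "time to complete." A minimal sketch of such a two-group comparison, using Welch's t statistic on hypothetical timings (the study itself compared three experience groups, and its data are not reproduced here):

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with
    possibly unequal variances (e.g., experts vs. novices)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variance
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical "time to complete" values in seconds
experts = [52, 48, 55, 50, 47]
novices = [80, 95, 88, 102, 91]
t = welch_t(experts, novices)   # strongly negative: experts are faster
```

A metric establishes construct validity precisely when such a statistic separates the experience groups reliably, as "time to complete" and "economy of motion" did in this study.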

  17. An Australian and New Zealand Scoping Study on the Use of 3D Immersive Virtual Worlds in Higher Education

    ERIC Educational Resources Information Center

    Dalgarno, Barney; Lee, Mark J. W.; Carlson, Lauren; Gregory, Sue; Tynan, Belinda

    2011-01-01

    This article describes the research design of, and reports selected findings from, a scoping study aimed at examining current and planned applications of 3D immersive virtual worlds at higher education institutions across Australia and New Zealand. The scoping study is the first of its kind in the region, intended to parallel and complement a…

  18. The Effect of 3D Virtual Learning Environment on Secondary School Third Grade Students' Attitudes toward Mathematics

    ERIC Educational Resources Information Center

    Simsek, Irfan

    2016-01-01

    This research aims to reveal the effects on student attitudes toward mathematics courses of design activities in Second Life, a three-dimensional online virtual world, which enable third-grade secondary school students (seventh grade of primary education) to see the 3D objects in mathematics courses in a…

  19. Virtual Superheroes: Using Superpowers in Virtual Reality to Encourage Prosocial Behavior

    PubMed Central

    Rosenberg, Robin S.; Baughman, Shawnee L.; Bailenson, Jeremy N.

    2013-01-01

    Background Recent studies have shown that playing prosocial video games leads to greater subsequent prosocial behavior in the real world. However, immersive virtual reality allows people to occupy avatars that are different from them in a perceptually realistic manner. We examine how occupying an avatar with the superhero ability to fly increases helping behavior. Principal Findings Using a two-by-two design, participants were either given the power of flight (their arm movements were tracked to control their flight akin to Superman’s flying ability) or rode as a passenger in a helicopter, and were assigned one of two tasks, either to help find a missing diabetic child in need of insulin or to tour a virtual city. Participants in the “super-flight” conditions helped the experimenter pick up spilled pens after their virtual experience significantly more than those who were virtual passengers in a helicopter. Conclusion The results indicate that having the “superpower” of flight leads to greater helping behavior in the real world, regardless of how participants used that power. A possible mechanism for this result is that having the power of flight primed concepts and prototypes associated with superheroes (e.g., Superman). This research illustrates the potential of using experiences in virtual reality technology to increase prosocial behavior in the physical world. PMID:23383029

  20. From Vesalius to virtual reality: How embodied cognition facilitates the visualization of anatomy

    NASA Astrophysics Data System (ADS)

    Jang, Susan

    This study examines the facilitative effects of embodiment of a complex internal anatomical structure through three-dimensional ("3-D") interactivity in a virtual reality ("VR") program. Since Shepard and Metzler's influential 1971 study, it has been known that 3-D objects (e.g., multi-armed cubes or external body parts) are visually and motorically embodied in our minds. For example, people take longer to mentally rotate an image of their hand not only when there is a greater degree of rotation, but also when the images are presented in a manner incompatible with their natural body movement (Parsons, 1987a, 1994; Cooper & Shepard, 1975; Sekiyama, 1983). Such findings confirm the notion that our mental images, and our rotations of those images, are in fact constrained by the laws of physics and biomechanics, because we perceive, think, and reason in an embodied fashion. With the advancement of new technologies, virtual reality programs for medical education now enable users to interact directly in a 3-D environment with internal anatomical structures. Given that such structures are not readily viewable to users, and thus not previously susceptible to embodiment, and that the VR environment affords all possible degrees of rotation, how people learn from these programs raises new questions. If we embody external anatomical parts we can see, such as our hands and feet, can we embody internal anatomical parts we cannot see? Does manipulating the anatomical part in virtual space facilitate the user's embodiment of that structure, and therefore the ability to visualize the structure mentally? Medical students grouped in yoked pairs were tasked with mastering the spatial configuration of an internal anatomical structure; only one group was allowed to manipulate the images of this anatomical structure in a 3-D VR environment, whereas the other group could only view the manipulation. The manipulation group outperformed the visual group, suggesting that the interactivity