Science.gov

Sample records for 3-d virtual reality

  1. 3D Virtual Reality Check: Learner Engagement and Constructivist Theory

    ERIC Educational Resources Information Center

    Bair, Richard A.

    2013-01-01

    The inclusion of three-dimensional (3D) virtual tools has created a need to communicate the engagement of 3D tools and to specify the learning gains that educators and the institutions funding 3D tools can expect. A review of the literature demonstrates that specific models and theories for 3D Virtual Reality (VR) learning do not exist "per…

  2. Augmented Reality vs Virtual Reality for 3D Object Manipulation.

    PubMed

    Krichenbauer, Max; Yamamoto, Goshiro; Taketomi, Takafumi; Sandor, Christian; Kato, Hirokazu

    2017-01-25

    Virtual Reality (VR) Head-Mounted Displays (HMDs) are on the verge of becoming commodity hardware available to the average user and feasible to use as a tool for 3D work. Some HMDs include front-facing cameras, enabling Augmented Reality (AR) functionality. Apart from avoiding collisions with the environment, interaction with virtual objects may also be affected by seeing the real environment. However, whether these effects are positive or negative has not yet been studied extensively. For most tasks it is unknown whether AR has any advantage over VR. In this work we present the results of a user study in which we compared user performance, measured as task completion time, on a 9-degrees-of-freedom object selection and transformation task performed either in AR or VR, both with a 3D input device and a mouse. Our results show faster task completion times in AR than in VR. When using a 3D input device, a purely VR environment increased task completion time by 22.5% on average compared to AR (p < 0.024). Surprisingly, a similar effect occurred when using a mouse: users were about 17.3% slower in VR than in AR (p < 0.04). The mouse and the 3D input device produced similar task completion times within each condition (AR or VR). We further found no differences in reported comfort.

  3. Organizational Learning Goes Virtual?: A Study of Employees' Learning Achievement in Stereoscopic 3D Virtual Reality

    ERIC Educational Resources Information Center

    Lau, Kung Wong

    2015-01-01

    Purpose: This study aims to deepen understanding of the use of stereoscopic 3D technology (stereo3D) in facilitating organizational learning. The emergence of advanced virtual technologies, in particular stereo3D virtual reality, has fundamentally changed the ways in which organizations train their employees. However, in academic or…

  4. Virtual reality 3D headset based on DMD light modulators

    SciTech Connect

    Bernacki, Bruce E.; Evans, Allan; Tang, Edward

    2014-06-13

    We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micro-mirror devices (DMD). Our approach leverages silicon micro-mirrors offering 720p-resolution displays in a small form factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high resolution and low power consumption. Applications include night driving, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. We describe a design in which light from the DMD is imaged to infinity, so that the user's own eye lens forms a real image on the retina.
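
    The imaging condition described above can be summarized with the thin-lens relation; the sketch below is illustrative and not taken from the paper. Placing the DMD at the focal plane of the eyepiece pushes its image to infinity:

    $$\frac{1}{s_i} = \frac{1}{f_{\text{eyepiece}}} - \frac{1}{s_o}, \qquad s_o \to f_{\text{eyepiece}} \;\Rightarrow\; s_i \to \infty .$$

    With the DMD at the eyepiece focal plane, the rays leaving the headset are collimated, and the relaxed eye then converges them to a real image on the retina, as the abstract states.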

  5. The virtual reality 3D city of Ningbo

    NASA Astrophysics Data System (ADS)

    Chen, Weimin; Wu, Dun

    2010-11-01

    In 2005, the Ningbo Design Research Institute of Mapping & Surveying started the development of concepts and an implementation of the Virtual Reality Ningbo System (VRNS). VRNS is being developed under the digital-city technological framework and is well supported by computing advances, space technologies, and commercial innovations. It has become the best solution for integrating, managing, presenting, and distributing complex city information. VRNS is not only a 3D-GIS launch project but also a technology innovation. The traditional domain of surveying and mapping has changed greatly in Ningbo. Geo-information systems are evolving toward more reality-based, three-dimensional, Service-Oriented-Architecture-based systems. VRNS uses technologies such as 3D modeling, user interface design, view-scene modeling, real-time rendering and interactive roaming within a virtual environment. Two applications of VRNS already in use are city planning and high-rise buildings' security management. The final purpose is to develop VRNS into a powerful public information platform on which heterogeneous city information resources can be shared.

  6. Sensorized Garment Augmented 3D Pervasive Virtual Reality System

    NASA Astrophysics Data System (ADS)

    Gulrez, Tauseef; Tognetti, Alessandro; de Rossi, Danilo

    Virtual reality (VR) technology has matured to a point where humans can navigate in virtual scenes; however, providing them with a comfortable, fully immersive role in VR remains a challenge. Currently available sensing solutions do not provide ease of deployment, particularly in the seated position due to sensor placement restrictions over the body, and optical sensing requires a restricted indoor environment to track body movements. Here we present a 52-sensor-laden garment interfaced with VR, which offers both portability and unencumbered user movement in a VR environment. This chapter addresses the systems engineering aspects of our pervasive computing solution of the interactive sensorized 3D VR and presents the initial results and future research directions. Participants navigated in a virtual art gallery using natural body movements that were detected by their wearable sensor shirt and then mapped to electrical control signals responsible for VR scene navigation. The initial results are positive, and offer many opportunities for use in computationally intelligent man-machine multimedia control.

  7. 3-D Sound for Virtual Reality and Multimedia

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Trejo, Leonard J. (Technical Monitor)

    2000-01-01

    Technology and applications for the rendering of virtual acoustic spaces are reviewed. Chapter 1 deals with acoustics and psychoacoustics. Chapters 2 and 3 cover cues to spatial hearing and review psychoacoustic literature. Chapter 4 covers signal processing and systems overviews of 3-D sound systems. Chapter 5 covers applications to computer workstations, communication systems, aeronautics and space, and sonic arts. Chapter 6 lists resources. This TM is a reprint of the 1994 book from Academic Press.

  8. Virtual reality and 3D animation in forensic visualization.

    PubMed

    Ma, Minhua; Zheng, Huiru; Lallie, Harjinder

    2010-09-01

    Computer-generated three-dimensional (3D) animation is an ideal medium for accurately visualizing crime or accident scenes for viewers and in courtrooms. Based upon factual data, forensic animations can reproduce the scene and demonstrate the activity at various points in time. The use of computer animation techniques to reconstruct crime scenes is beginning to replace the traditional illustrations, photographs, and verbal descriptions, and is becoming popular in today's forensics. This article integrates work in the areas of 3D graphics, computer vision, motion tracking, natural language processing, and forensic computing, to investigate the state-of-the-art in forensic visualization. It identifies and reviews areas where new applications of 3D digital technologies and artificial intelligence could be used to enhance particular phases of forensic visualization to create 3D models and animations automatically and quickly. Having discussed the relationships between major crime types and level-of-detail in corresponding forensic animations, we recognized that high level-of-detail animation involving human characters, which is appropriate for many major crime types but has had limited use in courtrooms, could be useful for crime investigation.

  9. Anesthesiology training using 3D imaging and virtual reality

    NASA Astrophysics Data System (ADS)

    Blezek, Daniel J.; Robb, Richard A.; Camp, Jon J.; Nauss, Lee A.

    1996-04-01

    Current training for regional nerve block procedures by anesthesiology residents requires expert supervision and the use of cadavers; both of which are relatively expensive commodities in today's cost-conscious medical environment. We are developing methods to augment and eventually replace these training procedures with real-time and realistic computer visualizations and manipulations of the anatomical structures involved in anesthesiology procedures, such as nerve plexus injections (e.g., celiac blocks). The initial work is focused on visualizations: both static images and rotational renderings. From the initial results, a coherent paradigm for virtual patient and scene representation will be developed.

  10. The Learner Characteristics, Features of Desktop 3D Virtual Reality Environments, and College Chemistry Instruction: A Structural Equation Modeling Analysis

    ERIC Educational Resources Information Center

    Merchant, Zahira; Goetz, Ernest T.; Keeney-Kennicutt, Wendy; Kwok, Oi-man; Cifuentes, Lauren; Davis, Trina J.

    2012-01-01

    We examined a model of the impact of a 3D desktop virtual reality environment on the learner characteristics (i.e. perceptual and psychological variables) that can enhance chemistry-related learning achievements in an introductory college chemistry class. The relationships between the 3D virtual reality features and the chemistry learning test as…

  11. MAT3D: a virtual reality modeling language environment for the teaching and learning of mathematics.

    PubMed

    Pasqualotti, Adriano; dal Sasso Freitas, Carla Maria

    2002-10-01

    Virtual Reality Modeling Language (VRML) is a platform-independent language that allows the creation of nonimmersive virtual environments (VEs) and their use through the Internet. In these VEs, the viewer may navigate and interact with virtual objects, moving around and visualizing them from different angles. Students can benefit from this technology because it gives them access to objects that describe the topics covered in their studies, in addition to oral and written information. In this work, we investigate the aspects involved in the use of VEs in teaching and learning and propose a conceptual model, called MAT3D, as a learning environment that can be used for the teaching and learning of mathematics. A case study is also presented, in which students use a virtual environment modeled in VRML. Data resulting from this study are analyzed statistically to evaluate the impact of this prototype when applied to the actual teaching and learning of mathematics.

  12. Early pregnancy placental bed and fetal vascular volume measurements using 3-D virtual reality.

    PubMed

    Reus, Averil D; Klop-van der Aa, Josine; Rifouna, Maria S; Koning, Anton H J; Exalto, Niek; van der Spek, Peter J; Steegers, Eric A P

    2014-08-01

    In this study, a new 3-D Virtual Reality (3D VR) technique for examining placental and uterine vasculature was investigated. The validity of placental bed vascular volume (PBVV) and fetal vascular volume (FVV) measurements was assessed and associations of PBVV and FVV with embryonic volume, crown-rump length, fetal birth weight and maternal parity were investigated. One hundred thirty-two patients were included in this study, and measurements were performed in 100 patients. Using V-Scope software, 100 3-D Power Doppler data sets of 100 pregnancies at 12 wk of gestation were analyzed with 3D VR in the I-Space Virtual Reality system. Volume measurements were performed with semi-automatic, pre-defined parameters. The inter-observer and intra-observer agreement was excellent with all intra-class correlation coefficients >0.93. PBVVs of multiparous women were significantly larger than the PBVVs of primiparous women (p = 0.008). In this study, no other associations were found. In conclusion, V-Scope offers a reproducible method for measuring PBVV and FVV at 12 wk of gestation, although we are unsure whether the volume measured represents the true volume of the vasculature. Maternal parity influences PBVV.

  13. Using the CAVE virtual-reality environment as an aid to 3-D electromagnetic field computation

    SciTech Connect

    Turner, L.R.; Levine, D.; Huang, M.; Papka, M.; Kettunen, L.

    1995-08-01

    One of the major problems in three-dimensional (3-D) field computation is visualizing the resulting 3-D field distributions. A virtual-reality environment, such as the CAVE (CAVE Automatic Virtual Environment), is helping to overcome this problem, thus making the results of computation more usable for designers and users of magnets and other electromagnetic devices. As a demonstration of the capabilities of the CAVE, the elliptical multipole wiggler (EMW), an insertion device being designed for the Advanced Photon Source (APS) now being commissioned at Argonne National Laboratory (ANL), was made visible, along with its fields and beam orbits. Other uses of the CAVE in preprocessing and postprocessing computation for electromagnetic applications are also discussed.

  14. Effects of 3D Virtual Reality of Plate Tectonics on Fifth Grade Students' Achievement and Attitude toward Science

    ERIC Educational Resources Information Center

    Kim, Paul

    2006-01-01

    This study examines the effects of a teaching method using 3D virtual reality simulations on achievement and attitude toward science. An experiment was conducted with fifth-grade students (N = 41) to examine the effects of 3D simulations, designed to support inquiry-based science curriculum. An ANOVA analysis revealed that the 3D group scored…

  15. 3D Visualization of Cultural Heritage Artefacts with Virtual Reality devices

    NASA Astrophysics Data System (ADS)

    Gonizzi Barsanti, S.; Caruso, G.; Micoli, L. L.; Covarrubias Rodriguez, M.; Guidi, G.

    2015-08-01

    Although 3D models are useful for preserving information about historical artefacts, the potential of this digital content is not fully realized until it is used to interactively communicate its significance to non-specialists. Starting from this consideration, a new way to provide museum visitors with more information was investigated. The research is aimed at valorising and making more accessible the Egyptian funerary objects exhibited in the Sforza Castle in Milan. The results of the research will be used for the renewal of the current exhibition, at the Archaeological Museum in Milan, by making it more attractive. A 3D virtual interactive scenario regarding the "path of the dead", an important ritual in ancient Egypt, was realized to augment the experience and the comprehension of the public through interactivity. Four important artefacts were considered for this scope: two ushabty, a wooden sarcophagus and a heart scarab. The scenario was realized by integrating low-cost Virtual Reality technologies, such as the Oculus Rift DK2 and the Leap Motion controller, and by implementing specific software using Unity. The 3D models were enhanced by adding responsive points of interest in relation to important symbols or features of the artefacts. This allows highlighting single parts of an artefact in order to better identify the hieroglyphs and provide their translation. The paper describes the process for optimizing the 3D models, the implementation of the interactive scenario and the results of some tests that have been carried out in the lab.

  16. Load Assembly of the Ignitor Machine with 3D Interactive Virtual Reality

    NASA Astrophysics Data System (ADS)

    Migliori, S.; Pierattini, S.

    2003-10-01

    The main purpose of this work is to assist the Ignitor team in every phase of the project using the new Virtual Reality (VR) technology. Through VR it is possible to see, plan and test the machine assembly sequence and the total layout. We are also planning to simulate the remote handling systems in VR. The complexity of the system requires a large and powerful graphical device. ENEA's "Advanced Visualization Technology" team has implemented a repository file data structure integrated with the CATIA drawings coming from the designer of Ignitor. The 3D virtual mockup software is used to view and analyze all objects that compose the mockup and also to analyze the correct assembly sequences. ENEA's 3D immersive system and software are fully integrated into ENEA's supercomputing GRID infrastructure. At any time, all members of the Ignitor Project can view the status of the mockup in 3D (draft and/or final objects) through the net. During the conference, examples of the assembly sequence and load assembly structure will be presented.

  17. Design and implementation of a 3D ocean virtual reality and visualization engine

    NASA Astrophysics Data System (ADS)

    Chen, Ge; Li, Bo; Tian, Fenglin; Ji, Pengbo; Li, Wenqing

    2012-12-01

    In this study, a 3D virtual reality and visualization engine for rendering the ocean, named VV-Ocean, is designed for marine applications. The design goals of VV-Ocean are high-fidelity simulation of the ocean environment, visualization of massive and multidimensional marine data, and imitation of marine life. VV-Ocean is composed of five modules, i.e. a memory management module, a resources management module, a scene management module, a rendering process management module and an interaction management module. There are three core functions in VV-Ocean: reconstructing vivid virtual ocean scenes, visualizing real data dynamically in real time, and imitating and simulating marine life intuitively. Based on VV-Ocean, we establish a sea-land integration platform which can reproduce the drifting and diffusion processes of an oil spill from the sea bottom to the surface. Environmental factors such as the ocean current and the wind field have been considered in this simulation. On this platform the oil spilling process can be abstracted as the movement of abundant oil particles. The results show that the oil particles blend well with the water and that the platform meets the requirements for real-time and interactive rendering. VV-Ocean can be widely used in ocean applications such as demonstrating marine operations, facilitating maritime communications, developing ocean games, reducing marine hazards, forecasting the weather over oceans, serving marine tourism, and so on. Finally, further technological improvements of VV-Ocean are discussed.
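
    The particle abstraction of the oil spill described above can be illustrated with a minimal advection sketch; the current and wind fields, the buoyant rise speed, and the 3% wind-drift factor below are illustrative assumptions, not VV-Ocean parameters.

    ```python
    import numpy as np

    # Minimal sketch of oil-spill particle advection under a current and a wind
    # field, in the spirit of the abstract; all fields and coefficients are
    # illustrative assumptions.
    rng = np.random.default_rng(0)
    n_particles = 1000
    pos = np.zeros((n_particles, 3))          # x, y (m) and depth z (m), released at the origin
    pos[:, 2] = -50.0                         # start 50 m below the surface

    def current(p):
        """Toy horizontal current (m/s) that weakens with depth."""
        decay = np.exp(p[:, 2:3] / 100.0)     # z is negative below the surface
        return np.hstack([0.3 * decay, 0.1 * decay, np.zeros_like(decay)])

    wind = np.array([5.0, 2.0, 0.0])          # surface wind (m/s)
    buoyancy = 0.02                           # rise speed of oil droplets (m/s)

    dt, steps = 60.0, 360                     # 60 s steps, 6 hours in total
    for _ in range(steps):
        drift = current(pos)
        at_surface = pos[:, 2] >= -1.0
        drift[at_surface, :2] += 0.03 * wind[:2]       # ~3% wind drift once at the surface
        drift[:, 2] += buoyancy                        # droplets rise toward the surface
        diffusion = rng.normal(scale=0.5, size=pos.shape) * np.sqrt(dt)
        pos += drift * dt + diffusion
        pos[:, 2] = np.minimum(pos[:, 2], 0.0)         # clamp particles at the surface

    print("mean surface position after 6 h:", pos[pos[:, 2] >= -1.0, :2].mean(axis=0))
    ```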

  18. Virtual Reality

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This video presentation discusses how virtual reality enables scientists to 'explore' other worlds without leaving the laboratory. The applicability of virtual reality for scientific visualization is also discussed.

  19. Re-Dimensional Thinking in Earth Science: From 3-D Virtual Reality Panoramas to 2-D Contour Maps

    ERIC Educational Resources Information Center

    Park, John; Carter, Glenda; Butler, Susan; Slykhuis, David; Reid-Griffin, Angelia

    2008-01-01

    This study examines the relationship of gender and spatial perception on student interactivity with contour maps and non-immersive virtual reality. Eighteen eighth-grade students elected to participate in a six-week activity-based course called "3-D GeoMapping." The course included nine days of activities related to topographic mapping.…

  20. A 3-D Virtual Reality Model of the Sun and the Moon for E-Learning at Elementary Schools

    ERIC Educational Resources Information Center

    Sun, Koun-Tem; Lin, Ching-Ling; Wang, Sheng-Min

    2010-01-01

    The relative positions of the sun, moon, and earth, their movements, and their relationships are abstract astronomical concepts that are difficult to understand in elementary school science. This study proposes a three-dimensional (3-D) virtual reality (VR) model named the "Sun and Moon System." This e-learning resource was designed by…

  1. The Input-Interface of Webcam Applied in 3D Virtual Reality Systems

    ERIC Educational Resources Information Center

    Sun, Huey-Min; Cheng, Wen-Lin

    2009-01-01

    Our research explores a virtual reality application based on a Web camera (Webcam) input-interface. The interface can replace the mouse, inferring the user's intended direction by the method of frame difference. We divide each Webcam frame into nine grids and make use of background registration to compute the moving object. In order to…
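
    A minimal sketch of the kind of nine-grid frame-difference interface described above is given below, using OpenCV; the grid-to-direction mapping and the threshold value are illustrative assumptions, not the authors' parameters.

    ```python
    import cv2
    import numpy as np

    # Nine-grid frame-difference input sketch: compare each frame against a
    # registered background, split the motion mask into a 3x3 grid, and report
    # the cell with the most motion as the intended direction.
    DIRECTIONS = np.array([["up-left", "up", "up-right"],
                           ["left", "center", "right"],
                           ["down-left", "down", "down-right"]])

    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    background = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)     # registered background

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, background)                  # frame difference
        _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)

        h, w = mask.shape
        mask = mask[: h - h % 3, : w - w % 3]                 # crop so the frame splits into 3x3 cells
        cells = mask.reshape(3, mask.shape[0] // 3, 3, mask.shape[1] // 3).sum(axis=(1, 3))
        row, col = np.unravel_index(np.argmax(cells), cells.shape)
        if cells[row, col] > 0:
            print("move:", DIRECTIONS[row, col])              # would steer the VR viewpoint

        cv2.imshow("motion", mask)
        if cv2.waitKey(30) & 0xFF == 27:                      # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()
    ```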

  2. Enhancing Time-Connectives with 3D Immersive Virtual Reality (IVR)

    ERIC Educational Resources Information Center

    Passig, David; Eden, Sigal

    2010-01-01

    This study sought to test the most efficient representation mode with which children with hearing impairment could express a story while producing connectives indicating relations of time and of cause and effect. Using Bruner's (1973, 1986, 1990) representation stages, we tested the comparative effectiveness of Virtual Reality (VR) as a mode of…

  3. The Effect Of 3D Audio And Other Audio Techniques On Virtual Reality Experience.

    PubMed

    Brinkman, Willem-Paul; Hoekstra, Allart R D; van Egmond, René

    2015-01-01

    Three studies were conducted to examine the effect of audio on people's experience in a virtual world. The first study showed that people could distinguish between mono, stereo, Dolby surround and 3D audio of a wasp. The second study found significant effects for audio techniques on people's self-reported anxiety, presence, and spatial perception. The third study found that adding sound to a visual virtual world had a significant effect on people's experience (including heart rate), while it found no difference in experience between stereo and 3D audio.

  4. Exploring 3-D Virtual Reality Technology for Spatial Ability and Chemistry Achievement

    ERIC Educational Resources Information Center

    Merchant, Z.; Goetz, E. T.; Keeney-Kennicutt, W.; Cifuentes, L.; Kwok, O.; Davis, T. J.

    2013-01-01

    We investigated the potential of Second Life® (SL), a three-dimensional (3-D) virtual world, to enhance undergraduate students' learning of a vital chemistry concept. A quasi-experimental pre-posttest control group design was used to conduct the study. A total of 387 participants completed three assignment activities either in SL or using…

  5. Generic precise augmented reality guiding system and its calibration method based on 3D virtual model.

    PubMed

    Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua

    2016-05-30

    Augmented reality systems can be applied to provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. Meanwhile, the proposed system is realized with a digital projector, and the general back-projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. The corresponding calibration method is also designed for the proposed system to obtain the parameters of the projector. To validate the proposed back-projection model, coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final projecting indication accuracy of the system is verified with a subpixel pattern projection technique.
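
    The core relationship such a back-projection model has to capture can be sketched with a simple pinhole-style projection from the digitized 3D model into projector pixels; the intrinsic and extrinsic values below are illustrative assumptions, not the paper's calibration results.

    ```python
    import numpy as np

    # Minimal sketch of projecting a point of a digitized 3D model into
    # projector pixel coordinates with a pinhole-style model; during calibration
    # the rotation R and translation t (extrinsics) would be optimized so that
    # projected indications land on the measured 3D positions.
    K = np.array([[1500.0,    0.0, 960.0],    # projector "intrinsics" (focal lengths, principal point)
                  [   0.0, 1500.0, 540.0],
                  [   0.0,    0.0,   1.0]])

    R = np.eye(3)                              # extrinsic rotation: world -> projector frame
    t = np.array([0.0, 0.0, 2.0])              # extrinsic translation (m)

    def project(X_world):
        """Map a 3D world point to projector pixel coordinates."""
        X_proj = R @ X_world + t               # into the projector frame
        u, v, w = K @ X_proj
        return np.array([u / w, v / w])        # perspective division

    # A point on the digitized model, 2 m in front of the projector:
    print(project(np.array([0.1, -0.05, 0.0])))
    ```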

  6. M3D (Media 3D): a new programming language for web-based virtual reality in E-Learning and Edutainment

    NASA Astrophysics Data System (ADS)

    Chakaveh, Sepideh; Skaley, Detlef; Laine, Patricia; Haeger, Ralf; Maad, Soha

    2003-01-01

    Today, interactive multimedia educational systems are well established, as they have proved to be useful instruments for enhancing one's learning capabilities. Hitherto, the main difficulty with almost all E-Learning systems has lain in the rich-media implementation techniques: each system had to be created individually, since reusing the media, be it only a part or the whole content, was not directly possible and everything had to be applied mechanically, i.e. by hand. This makes E-Learning systems exceedingly expensive to generate, in terms of both time and money. Media-3D or M3D is a new platform-independent programming language, developed at the Fraunhofer Institute for Media Communication, to enable visualisation and simulation of E-Learning multimedia content. M3D is an XML-based language which is capable of distinguishing the 3D models from the 3D scenes, as well as handling provisions for animations, within the programme. Here we give a technical account of the M3D programming language and briefly describe two specific application scenarios where M3D is applied to create virtual reality E-Learning content for the training of technical personnel.

  7. Using virtual reality technology and hand tracking technology to create software for training surgical skills in 3D game

    NASA Astrophysics Data System (ADS)

    Zakirova, A. A.; Ganiev, B. A.; Mullin, R. I.

    2015-11-01

    The lack of accessible and approachable ways of training surgical skills is one of the main problems in medical education. Existing simulation training devices are not designed to teach students and are not widely available due to the high cost of the equipment. Using modern technologies such as virtual reality and hand-tracking, we want to create an innovative method for learning the techniques of conducting operations in a 3D game format, which can make the education process interesting and effective. Creating a 3D virtual simulator will solve several conceptual problems at once: practical skills can be improved without time limits and without risk to the patient; the operating environment and anatomical body structures are highly realistic; game mechanics ease information perception and accelerate the memorization of methods; and the program is accessible.

  8. Hand Controlled Manipulation of Single Molecules via a Scanning Probe Microscope with a 3D Virtual Reality Interface.

    PubMed

    Leinen, Philipp; Green, Matthew F B; Esat, Taner; Wagner, Christian; Tautz, F Stefan; Temirov, Ruslan

    2016-10-02

    Considering organic molecules as the functional building blocks of future nanoscale technology, the question of how to arrange and assemble such building blocks in a bottom-up approach is still open. The scanning probe microscope (SPM) could be a tool of choice; however, SPM-based manipulation was until recently limited to two dimensions (2D). Binding the SPM tip to a molecule at a well-defined position opens an opportunity of controlled manipulation in 3D space. Unfortunately, 3D manipulation is largely incompatible with the typical 2D-paradigm of viewing and generating SPM data on a computer. For intuitive and efficient manipulation we therefore couple a low-temperature non-contact atomic force/scanning tunneling microscope (LT NC-AFM/STM) to a motion capture system and fully immersive virtual reality goggles. This setup permits "hand controlled manipulation" (HCM), in which the SPM tip is moved according to the motion of the experimenter's hand, while the tip trajectories as well as the response of the SPM junction are visualized in 3D. HCM paves the way to the development of complex manipulation protocols, potentially leading to a better fundamental understanding of nanoscale interactions acting between molecules on surfaces. Here we describe the setup and the steps needed to achieve successful hand-controlled molecular manipulation within the virtual reality environment.
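
    A minimal sketch of the hand-to-tip mapping at the heart of such a setup is given below; the scale factor, origins, and tip excursion limits are illustrative assumptions, not values from the LT NC-AFM/STM system described.

    ```python
    import numpy as np

    # Minimal sketch of mapping tracked hand coordinates (metres) to SPM tip
    # coordinates (metres, nanometre scale): scale the hand displacement down
    # and clamp the result to a safe tip range.
    HAND_TO_TIP_SCALE = 1e-8               # 1 cm of hand motion -> 0.1 nm of tip motion
    TIP_RANGE = np.array([[-5e-9, 5e-9],   # allowed tip excursion in x, y, z (m)
                          [-5e-9, 5e-9],
                          [ 0.0,  2e-9]])

    def hand_to_tip(hand_pos, hand_origin, tip_origin):
        """Translate a hand position into a clamped tip position."""
        delta = (np.asarray(hand_pos) - hand_origin) * HAND_TO_TIP_SCALE
        tip = tip_origin + delta
        return np.clip(tip, TIP_RANGE[:, 0], TIP_RANGE[:, 1])

    # Hand moved 2 cm to the right and 1 cm up from where tracking started:
    print(hand_to_tip([0.02, 0.0, 0.01], hand_origin=np.zeros(3), tip_origin=np.zeros(3)))
    ```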

  9. 3D graphics, virtual reality, and motion-onset visual evoked potentials in neurogaming.

    PubMed

    Beveridge, R; Wilson, S; Coyle, D

    2016-01-01

    A brain-computer interface (BCI) offers movement-free control of a computer application and is achieved by reading and translating the cortical activity of the brain into semantic control signals. Motion-onset visual evoked potentials (mVEP) are neural potentials employed in BCIs and occur when motion-related stimuli are attended visually. mVEP dynamics are correlated with the position and timing of the moving stimuli. To investigate the feasibility of utilizing the mVEP paradigm with video games of various graphical complexities, including those of commercial quality, we conducted three studies over four separate sessions comparing the performance of classifying five mVEP responses with variations in graphical complexity and style, in-game distractions, and display parameters surrounding the mVEP stimuli. To investigate the feasibility of utilizing contemporary presentation modalities in neurogaming, one of the studies compared mVEP classification performance when stimuli were presented using the Oculus Rift virtual reality headset. Results from 31 independent subjects were analyzed offline. The results show classification performances ranging up to 90%, with variations in graphical complexity having limited effect on mVEP performance, thus demonstrating the feasibility of using the mVEP paradigm within BCI-based neurogaming.
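
    Offline classification of evoked-potential epochs of the kind reported above can be sketched as follows; the synthetic epochs and the simple LDA pipeline are illustrative assumptions, not the authors' classifier or data.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Minimal sketch of offline epoch classification: synthetic EEG epochs with
    # an mVEP-like deflection added to the "target" class, flattened and scored
    # with cross-validated linear discriminant analysis.
    rng = np.random.default_rng(3)
    n_epochs, n_channels, n_samples = 200, 8, 128      # 1 s epochs at 128 Hz
    X = rng.normal(size=(n_epochs, n_channels, n_samples))
    y = rng.integers(0, 2, size=n_epochs)              # target vs non-target stimulus
    X[y == 1, :, 30:60] += 0.5                         # add an "mVEP-like" deflection to targets

    features = X.reshape(n_epochs, -1)                 # flatten channels x time into one vector
    scores = cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5)
    print(f"mean cross-validated accuracy: {scores.mean():.2f}")
    ```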

  10. 3D Elevation Program—Virtual USA in 3D

    USGS Publications Warehouse

    Lukas, Vicki; Stoker, J.M.

    2016-04-14

    The U.S. Geological Survey (USGS) 3D Elevation Program (3DEP) uses a laser system called 'lidar' (light detection and ranging) to create a highly accurate virtual reality map of the Nation. 3D maps have many uses, with new ones being discovered all the time.

  11. Three‐dimensional immersive virtual reality for studying cellular compartments in 3D models from EM preparations of neural tissues

    PubMed Central

    Baghabra, Jumana; Boges, Daniya J.; Holst, Glendon R.; Kreshuk, Anna; Hamprecht, Fred A.; Srinivasan, Madhusudhanan; Lehväslaiho, Heikki

    2016-01-01

    Advances in the application of electron microscopy (EM) to serial imaging are opening doors to new ways of analyzing cellular structure. New and improved algorithms and workflows for manual and semiautomated segmentation allow us to observe the spatial arrangement of the smallest cellular features with unprecedented detail in full three dimensions. From larger samples, higher-complexity models can be generated; however, they pose new challenges to data management and analysis. Here we review some currently available solutions and present our approach in detail. We use the fully immersive virtual reality (VR) environment CAVE (cave automatic virtual environment), a room in which we are able to project a cellular reconstruction and visualize it in 3D, to step into a world created with Blender, a free, fully customizable 3D modeling software with NeuroMorph plug-ins for visualization and analysis of EM preparations of brain tissue. Our workflow allows for full and fast reconstructions of volumes of brain neuropil using ilastik, a software tool for semiautomated segmentation of EM stacks. With this visualization environment, we can walk into the model containing neuronal and astrocytic processes to study the spatial distribution of glycogen granules, a major energy source that is selectively stored in astrocytes. The use of the CAVE was key to the observation of a nonrandom distribution of glycogen, and led us to develop tools to quantitatively analyze glycogen clustering and proximity to other subcellular features. J. Comp. Neurol. 524:23-38, 2016. PMID:26179415
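
    The kind of proximity and clustering analysis mentioned above can be sketched with nearest-neighbour queries; the synthetic granule and feature coordinates below are illustrative placeholders, not data from the reconstructions.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    # Minimal sketch of proximity analysis: nearest-neighbour distances from
    # glycogen granules to another set of subcellular features, plus a crude
    # clustering measure among the granules themselves.
    rng = np.random.default_rng(1)
    glycogen = rng.uniform(0, 5.0, size=(500, 3))    # granule centroids (micrometres)
    features = rng.uniform(0, 5.0, size=(40, 3))     # positions of another feature class

    # Distance from every granule to its closest feature
    dist, _ = cKDTree(features).query(glycogen)
    print(f"median granule-to-feature distance: {np.median(dist):.2f} um")

    # Mean nearest-neighbour spacing among granules (k=2: the first hit is the point itself)
    d_nn, _ = cKDTree(glycogen).query(glycogen, k=2)
    print(f"mean nearest-neighbour spacing:     {d_nn[:, 1].mean():.2f} um")
    ```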

  12. 3D chromosome rendering from Hi-C data using virtual reality

    NASA Astrophysics Data System (ADS)

    Zhu, Yixin; Selvaraj, Siddarth; Weber, Philip; Fang, Jennifer; Schulze, Jürgen P.; Ren, Bing

    2015-01-01

    Most genome browsers display DNA linearly, using single-dimensional depictions that are useful for examining certain epigenetic mechanisms such as DNA methylation. However, these representations are insufficient to visualize intrachromosomal interactions and relationships between distal genome features. Relationships between DNA regions may be difficult to decipher or missed entirely if those regions are distant in one dimension but spatially proximal when mapped to three-dimensional space. For example, the visualization of enhancers folding over genes can only be fully expressed in three-dimensional space. Thus, to accurately understand DNA behavior during gene expression, a means to model chromosomes is essential. Using coordinates generated from Hi-C interaction frequency data, we have created interactive 3D models of whole chromosome structures and their respective domains. We have also rendered information on genomic features such as genes, CTCF binding sites, and enhancers. The goal of this article is to present the procedure, findings, and conclusions of our models and renderings.
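
    Rendering per-bin 3D coordinates of the kind derived from Hi-C data can be sketched as below; the random-walk coordinates and the gene annotation are illustrative placeholders, not coordinates inferred from real interaction frequencies.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Minimal sketch of rendering a chromosome as a 3D polyline from per-bin
    # coordinates, with one genomic feature highlighted along the backbone.
    rng = np.random.default_rng(2)
    n_bins = 300
    coords = np.cumsum(rng.normal(size=(n_bins, 3)), axis=0)   # one 3D point per genomic bin

    genes = np.zeros(n_bins, dtype=bool)
    genes[100:140] = True                                      # bins overlapping a gene of interest

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.plot(*coords.T, color="lightgray", linewidth=1)          # chromosome backbone
    ax.scatter(*coords[genes].T, color="crimson", s=10, label="annotated bins")
    ax.legend()
    plt.show()
    ```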

  13. A Learner-Centered Approach for Training Science Teachers through Virtual Reality and 3D Visualization Technologies: Practical Experience for Sharing

    ERIC Educational Resources Information Center

    Yeung, Yau-Yuen

    2004-01-01

    This paper presentation will report on how some science educators at the Science Department of The Hong Kong Institute of Education have successfully employed an array of innovative learning media such as three-dimensional (3D) and virtual reality (VR) technologies to create seven sets of resource kits, most of which are being placed on the…

  14. Visualization and Interpretation in 3D Virtual Reality of Topographic and Geophysical Data from the Chicxulub Impact Crater

    NASA Astrophysics Data System (ADS)

    Rosen, J.; Kinsland, G. L.; Borst, C.

    2011-12-01

    We have assembled Shuttle Radar Topography Mission (SRTM) data (Borst and Kinsland, 2005), gravity data (Bedard, 1977), horizontal gravity gradient data (Hildebrand et al., 1995), magnetic data (Pilkington et al., 2000) and GPS topography data (Borst and Kinsland, 2005) from the Chicxulub Impact Crater buried on the Yucatan Peninsula of Mexico. These data sets are imaged as gridded surfaces and are all georegistered, within an interactive 3D virtual reality (3DVR) visualization and interpretation system created and maintained in the Center for Advanced Computer Studies at the University of Louisiana at Lafayette. We are able to view and interpret the data sets individually or together and to scale and move the data or to move our physical head position so as to achieve the best viewing perspective for interpretation. A feature which is especially valuable for understanding the relationships between the various data sets is our ability to "interlace" the 3D images. "Interlacing" is a technique we have developed whereby the data surfaces are moved along a common axis so that they interpenetrate. This technique leads to rapid and positive identification of spatially corresponding features in the various data sets. We present several images from the 3D system, which demonstrate spatial relationships amongst the features in the data sets. Some of the anomalies in gravity are very nearly coincident with anomalies in the magnetic data as one might suspect if the causal bodies are the same. Other gravity and magnetic anomalies are not spatially coincident indicating different causal bodies. Topographic anomalies display a strong spatial correspondence with many gravity anomalies. In some cases small gravity anomalies and topographic valleys are caused by shallow dissolution within the Tertiary cover along faults or fractures propagated upward from the buried structure. In other cases the sources of the gravity anomalies are in the more deeply buried structure from which
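
    The "interlacing" idea described above, sliding georegistered surfaces along a common axis until they interpenetrate, can be sketched as follows; the synthetic gravity and topography grids and the offset value are illustrative only.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # Minimal sketch of interlacing two georegistered gridded data sets: render
    # both as surfaces and slide one along the shared vertical axis so that
    # spatially corresponding features interpenetrate.
    x, y = np.meshgrid(np.linspace(-1, 1, 80), np.linspace(-1, 1, 80))
    gravity = np.exp(-(x**2 + y**2) / 0.1)                        # a circular gravity anomaly
    topography = 0.8 * np.exp(-((x - 0.05)**2 + y**2) / 0.1)      # a nearly coincident relief feature

    offset = 0.3                                                  # slide applied along the common axis
    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.plot_surface(x, y, gravity, cmap="viridis", alpha=0.7)
    ax.plot_surface(x, y, topography + offset, cmap="magma", alpha=0.7)
    ax.set_zlabel("common axis (offset surfaces)")
    plt.show()
    ```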

  15. Transforming Clinical Imaging and 3D Data for Virtual Reality Learning Objects: HTML5 and Mobile Devices Implementation

    ERIC Educational Resources Information Center

    Trelease, Robert B.; Nieder, Gary L.

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android…

  16. Virtual Reality in Denmark

    DTIC Science & Technology

    2005-12-01

    The following technologies are available in CAVI: a 3D Panorama Cinema with a curved screen, active stereo glasses, and tracking (Figure 3: 3D Panorama Cinema at CAVI). The Panorama cinema is a cylinder-shaped screen placed in a room that seats approximately 15-20 persons. The size and shape of the screen mean that the visual angle of the spectators is almost covered by the screen. Models are displayed

  17. The use of a low-cost visible light 3D scanner to create virtual reality environment models of actors and objects

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    A low-cost 3D scanner has been developed with a parts cost of approximately USD $5,000. This scanner uses visible light sensing to capture both structural as well as texture and color data of a subject. This paper discusses the use of this type of scanner to create 3D models for incorporation into a virtual reality environment. It describes the basic scanning process (which takes under a minute for a single scan), which can be repeated to collect multiple positions, if needed for actor model creation. The efficacy of visible light versus other scanner types is also discussed.

  18. A New Approach to Improve Cognition, Muscle Strength, and Postural Balance in Community-Dwelling Elderly with a 3-D Virtual Reality Kayak Program.

    PubMed

    Park, Junhyuck; Yim, JongEun

    2016-01-01

    Aging is usually accompanied by deterioration of physical abilities, such as muscular strength, sensory sensitivity, and functional capacity. Recently, intervention methods using virtual reality have been introduced, providing an enjoyable therapy for the elderly. The aim of this study was to investigate whether a 3-D virtual reality kayak program could improve the cognitive function, muscle strength, and balance of community-dwelling elderly. Importantly, kayaking involves most of the upper-body musculature and requires balance control. Seventy-two participants were randomly allocated to the kayak program group (n = 36) or the control group (n = 36). The two groups were well matched with respect to general characteristics at baseline. The participants in both groups performed a conventional exercise program for 30 min, after which the 3-D virtual reality kayak program was performed in the kayak program group for 20 min, two times a week for 6 weeks. Cognitive function was measured using the Montreal Cognitive Assessment. Muscle strength was measured using the arm curl and handgrip strength tests. Standing and sitting balance was measured using the Good Balance system. The post-test was performed in the same manner as the pre-test; the overall outcomes, such as cognitive function (p < 0.05), muscle strength (p < 0.05), and balance (standing and sitting balance, p < 0.05), were significantly improved in the kayak program group compared to the control group. We propose that the 3-D virtual reality kayak program is a promising intervention method for improving the cognitive function, muscle strength, and balance of the elderly.

  19. On the Usability and Usefulness of 3d (geo)visualizations - a Focus on Virtual Reality Environments

    NASA Astrophysics Data System (ADS)

    Çöltekin, A.; Lokka, I.; Zahner, M.

    2016-06-01

    Whether and when we should show data in 3D is an ongoing debate in communities conducting visualization research. Strong opposition exists in the information visualization (Infovis) community, and seemingly unnecessary or unwarranted use of 3D, e.g., in plots, bar or pie charts, is heavily criticized. The scientific visualization (Scivis) community, on the other hand, is more supportive of the use of 3D as it allows 'seeing' invisible phenomena, or designing and printing things that are used in, e.g., surgeries, educational settings, etc. Geographic visualization (Geovis) stands between the Infovis and Scivis communities. In geographic information science, most visuo-spatial analyses have been sufficiently conducted in 2D or 2.5D, including analyses related to terrain and much of the urban phenomena. On the other hand, there has always been a strong interest in 3D, with similar motivations as in the Scivis community. Among many types of 3D visualizations, a popular one that is exploited both for visual analysis and visualization is the highly realistic (geo)virtual environment. Such environments may be engaging and memorable for the viewers because they offer highly immersive experiences. However, it is not yet well established whether we should opt to show the data in 3D; and if yes, a) what type of 3D we should use, b) for what task types, and c) for whom. In this paper, we identify some of the central arguments for and against the use of 3D visualizations around these three considerations in a concise interdisciplinary literature review.

  20. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurements of the maximum reach of occupants of a microgravity environment provide knowledge about maximum functional placement for tasking situations. Calculations for a full-body functional reach envelope for microgravity environments are imperative. To this end, three-dimensional computer-modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full-body functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  1. 3D augmented reality with integral imaging display

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-06-01

    In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be converted into an identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for augmented reality display is generated. The newly generated elemental images contain both the virtual objects and the real-world scene with the desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.
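
    Merging a virtual elemental-image array into a captured real-scene array can be sketched as a per-pixel blend, as below; the array sizes and the constant transparency value are illustrative assumptions and do not reproduce the authors' merging algorithm.

    ```python
    import numpy as np

    # Minimal sketch of merging two elemental-image arrays (real scene and a
    # virtual object) into one array for an integral-imaging display.
    n_lens, ei_h, ei_w = 10, 64, 64                 # 10x10 lenslets, 64x64-pixel elemental images
    real = np.random.rand(n_lens, n_lens, ei_h, ei_w, 3)       # captured real-world elemental images
    virtual = np.zeros_like(real)
    virtual[:, :, 20:44, 20:44, 1] = 1.0            # a green virtual patch in every elemental image
    alpha = np.where(virtual.any(axis=-1, keepdims=True), 0.6, 0.0)  # transparency of the virtual object

    merged = alpha * virtual + (1.0 - alpha) * real # per-pixel blend, elemental image by elemental image
    display = merged.transpose(0, 2, 1, 3, 4).reshape(n_lens * ei_h, n_lens * ei_w, 3)
    print(display.shape)                            # tiled array sent to the integral-imaging display
    ```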

  2. Transforming clinical imaging and 3D data for virtual reality learning objects: HTML5 and mobile devices implementation.

    PubMed

    Trelease, Robert B; Nieder, Gary L

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android tablets. This article describes complementary methods for creating comparable, multiplatform VR learning objects in the new HTML5 standard format, circumventing platform-specific limitations imposed by the QuickTime VR multimedia file format. Multiple types or "dimensions" of anatomical information can be embedded in such learning objects, supporting different kinds of online learning applications, including interactive atlases, examination questions, and complex, multi-structure presentations. Such HTML5 VR learning objects are usable on new mobile devices that do not support QuickTime VR, as well as on personal computers. Furthermore, HTML5 VR learning objects can be embedded in "ebook" document files, supporting the development of new types of electronic textbooks on mobile devices that are increasingly popular and self-adopted for mobile learning.

  3. 3D Virtual Reality Applied in Tectonic Geomorphic Study of the Gombori Range of Greater Caucasus Mountains

    NASA Astrophysics Data System (ADS)

    Sukhishvili, Lasha; Javakhishvili, Zurab

    2016-04-01

    Gombori Range represents the southern part of the young Greater Caucasus Mountains and stretches from NW to SE. The range separates the Alazani and Iori basins within the eastern Georgian province of Kakheti. The active phase of the Caucasian orogeny started in the Pliocene, but according to the alluvial sediments of the Gombori range (mapped in the Soviet geologic map), we observe its uplift to be a Quaternary event. The highest peak of the Gombori range has an absolute elevation of 1991 m, while the neighboring Alazani valley reaches only 400 m. We assume the range has a very fast uplift rate that could have triggered reversals of stream flow direction in the Quaternary. To check these preliminary assumptions we are going to use tectonic and fluvial geomorphic and stratigraphic approaches, including paleocurrent analyses and various affordable absolute dating techniques, to detect evidence of river course reversals and date them. For these purposes we have selected the river Turdo outcrop. The river flows northwards from the Gombori range and, near the region's main city of Telavi, generates a 30-40 m high continuous outcrop along a 1 km section. The Turdo outcrop has very steep walls and requires special climbing skills to work on it. The goal of this particular study is to avoid the time- and resource-consuming ground survey of this steep, high and wide outcrop and to test 3D aerial and ground-based photogrammetric modelling and analysis approaches in the initial stage of the tectonic geomorphic study. Using this type of remote sensing and virtual lab analysis of the 3D outcrop model, we roughly delineated stratigraphic layers, selected exact locations for applying various research techniques, and planned safe and suitable climbing routes for reaching the investigation sites.

  4. Computer Vision Assisted Virtual Reality Calibration

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1999-01-01

    A computer vision assisted semi-automatic virtual reality (VR) calibration technology has been developed that can accurately match a virtual environment of graphically simulated three-dimensional (3-D) models to the video images of the real task environment.

  5. New weather depiction technology for night vision goggle (NVG) training: 3D virtual/augmented reality scene-weather-atmosphere-target simulation

    NASA Astrophysics Data System (ADS)

    Folaron, Michelle; Deacutis, Martin; Hegarty, Jennifer; Vollmerhausen, Richard; Schroeder, John; Colby, Frank P.

    2007-04-01

    US Navy and Marine Corps pilots receive Night Vision Goggle (NVG) training as part of their overall training to maintain the superiority of our forces. This training must incorporate realistic targets, backgrounds, and representative atmospheric and weather effects they may encounter under operational conditions. One approach for pilot NVG training is to use the Night Imaging and Threat Evaluation Laboratory (NITE Lab) concept. The NITE Labs utilize a 10' by 10' static terrain model equipped with both natural and cultural lighting that is used to demonstrate various illumination conditions and visual phenomena which might be experienced when utilizing night vision goggles. With this technology, the military can safely, systematically, and reliably expose pilots to the large number of potentially dangerous environmental conditions that will be experienced in their NVG training flights. A previous SPIE presentation described our work for NAVAIR to add realistic atmospheric and weather effects to the NVG NITE Lab training facility using the NVG-WDT (Weather Depiction Technology) system (Colby, et al.). NVG-WDT consists of a high-end multiprocessor server with weather simulation software, and several fixed and goggle-mounted Heads-Up Displays (HUDs). Atmospheric and weather effects are simulated using state-of-the-art computer codes such as the WRF (Weather Research and Forecasting) model and the US Air Force Research Laboratory MODTRAN radiative transport model. Imagery for a variety of natural and man-made obscurations (e.g. rain, clouds, snow, dust, smoke, chemical releases) is being calculated and injected into the scene observed through the NVG via the fixed and goggle-mounted HUDs. This paper expands on the work described in the previous presentation and will describe the 3D Virtual/Augmented Reality Scene - Weather - Atmosphere - Target Simulation part of the NVG-WDT. The 3D virtual reality software is a complete simulation system to generate realistic

  6. Virtual reality exposure therapy.

    PubMed

    Rothbaum, B O; Hodges, L; Kooper, R

    1997-01-01

    It has been proposed that virtual reality (VR) exposure may be an alternative to standard in vivo exposure. Virtual reality integrates real-time computer graphics, body tracking devices, visual displays, and other sensory input devices to immerse a participant in a computer-generated virtual environment. Virtual reality exposure is potentially an efficient and cost-effective treatment of anxiety disorders. VR exposure therapy reduced the fear of heights in the first controlled study of virtual reality in treatment of a psychiatric disorder. A case study supported the efficacy of VR exposure therapy for the fear of flying. The potential for virtual reality exposure treatment for these and other disorders is explored, and therapeutic issues surrounding the delivery of VR exposure are discussed.

  7. Virtual Reality Enhanced Instructional Learning

    ERIC Educational Resources Information Center

    Nachimuthu, K.; Vijayakumari, G.

    2009-01-01

    Virtual Reality (VR) is a creation of virtual 3D world in which one can feel and sense the world as if it is real. It is allowing engineers to design machines and Educationists to design AV [audiovisual] equipment in real time but in 3-dimensional hologram as if the actual material is being made and worked upon. VR allows a least-cost (energy…

  8. Virtual sound for virtual reality

    SciTech Connect

    Blattner, M.M.; Papp, A.L., III

    1993-02-01

    The computational limitations of real-time interactive computing do not meet our requirements for producing realistic images for virtual reality in a convincing manner. Regardless of the real-time restrictions on virtual reality interfaces, the representations can be no better than the graphics. Computer graphics is still limited in its ability to generate complex objects such as landscapes and humans. Nevertheless, useful and convincing visualizations are made through a variety of techniques. The central theme of this article is that a similar situation is true with sound for virtual reality. It is beyond our ability to create interactive soundscapes that create a faithful reproduction of real-world sounds; however, by choosing one's application carefully and using sound to enhance a display rather than only mimic real-world scenes, a very effective use of sound can be made.

  9. Virtual sound for virtual reality

    SciTech Connect

    Blattner, M.M. (Cancer Center, Houston, TX, Dept. of Biomathematics; Lawrence Livermore National Lab., CA; California Univ., Davis, CA); Papp, A.L. III (Lawrence Livermore National Lab., CA)

    1993-02-01

    The computational limitations of real-time interactive computing do not meet our requirements for producing realistic images for virtual reality in a convincing manner. Regardless of the real-time restrictions on virtual reality interfaces, the representations can be no better than the graphics. Computer graphics is still limited in its ability to generate complex objects such as landscapes and humans. Nevertheless, useful and convincing visualizations are made through a variety of techniques. The central theme of this article is that a similar situation is true with sound for virtual reality. It is beyond our ability to create interactive soundscapes that create a faithful reproduction of real-world sounds; however, by choosing one's application carefully and using sound to enhance a display rather than only mimic real-world scenes, a very effective use of sound can be made.

  10. Inertial Motion-Tracking Technology for Virtual 3-D

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In the 1990s, NASA pioneered virtual reality research. The concept was present long before, but, prior to this, the technology did not exist to make a viable virtual reality system. Scientists had theories and ideas, and they knew that the concept had potential, but the computers of the 1970s and 1980s were not fast enough, sensors were heavy and cumbersome, and people had difficulty blending fluidly with the machines. Scientists at Ames Research Center built upon the research of previous decades and put the necessary technology behind them, making the theories of virtual reality a reality. Virtual reality systems depend on complex motion-tracking sensors to convey information between the user and the computer to give the user the feeling of operating in the real world. These motion-tracking sensors measure and report an object's position and orientation as it changes. A simple example of motion tracking would be the cursor on a computer screen moving in correspondence to the shifting of the mouse. Tracking in 3-D, necessary to create virtual reality, however, is much more complex. To be successful, the perspective of the virtual image seen on the computer must be an accurate representation of what is seen in the real world. As the user's head or camera moves, turns, or tilts, the computer-generated environment must change accordingly with no noticeable lag, jitter, or distortion. Historically, the lack of smooth and rapid tracking of the user's motion has thwarted the widespread use of immersive 3-D computer graphics. NASA uses virtual reality technology for a variety of purposes, mostly training of astronauts. The actual missions are costly and dangerous, so any opportunity the crews have to practice their maneuvering in accurate situations before the mission is valuable and instructive. For that purpose, NASA has funded a great deal of virtual reality research, and benefited from the results.
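
    The orientation half of the tracking problem described above can be sketched as integrating gyroscope angular-rate samples into a rotation; the constant test rate and sample interval below are illustrative, and a real head tracker would add drift correction from other sensors.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Minimal sketch of inertial orientation tracking: integrate gyroscope
    # angular-rate samples into a rotation matrix via the matrix exponential.
    def skew(w):
        """Cross-product (skew-symmetric) matrix of an angular-rate vector."""
        return np.array([[0.0, -w[2], w[1]],
                         [w[2], 0.0, -w[0]],
                         [-w[1], w[0], 0.0]])

    R = np.eye(3)                                     # sensor orientation in the world frame
    dt = 0.01                                         # 100 Hz gyro samples
    omega = np.array([0.0, 0.0, np.deg2rad(90.0)])    # steady 90 deg/s yaw (test signal)

    for _ in range(100):                              # integrate one second of samples
        R = R @ expm(skew(omega) * dt)                # rotate by the small angle covered in dt

    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    print(f"estimated yaw after 1 s: {yaw:.1f} deg")  # ~90 deg
    ```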

  11. Virtual reality via photogrammetry

    NASA Astrophysics Data System (ADS)

    Zahrt, John D.; Papcun, George; Childers, Randy A.; Rubin, Naama

    1996-03-01

    We wish to walk into a photograph just as Alice walked into the looking glass. From a mathematical perspective, this problem is exceedingly ill-posed (e.g. is that a large, distant object or a small, nearby object?). A human expert can supply a large amount of a priori information that can function as mathematical constraints. The constrained problem can then be attacked with photogrammetry to obtain a great deal of quantitative information which is otherwise only qualitatively apparent. The user determines whether the object to be analyzed contains two or three vanishing points, then selects an appropriate number of points from the photograph to enable the code to compute the locations of the vanishing points. Using this information and the standard photogrammetric geometric algorithms, the location of the camera, relative to the structure, is determined. The user must also enter information regarding an absolute sense of scale. As the vectors from the camera to the various points chosen from the photograph are determined, the vector components (coordinates) are handed to a virtual reality software package. Once the objects are entered, the appropriate surfaces of the 3D object are 'wallpapered' with the surface from the photograph. The user is then able to move through the virtual scene. A video will demonstrate our work.
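
    One step of the pipeline described above, computing a vanishing point from user-selected points, can be sketched with homogeneous coordinates; the pixel coordinates below are illustrative.

    ```python
    import numpy as np

    # Minimal sketch of finding a vanishing point as the intersection of two
    # image lines that are parallel in the 3D scene, using homogeneous
    # coordinates (lines and intersections via cross products).
    def line_through(p, q):
        """Homogeneous line through two image points (x, y)."""
        return np.cross([*p, 1.0], [*q, 1.0])

    def intersection(l1, l2):
        """Intersection of two homogeneous lines, returned as (x, y)."""
        x, y, w = np.cross(l1, l2)
        return np.array([x / w, y / w])

    # Two edges of a building that are parallel in 3D but converge in the photo:
    edge_a = line_through((100.0, 400.0), (300.0, 350.0))
    edge_b = line_through((120.0, 600.0), (320.0, 500.0))

    vp = intersection(edge_a, edge_b)
    print("vanishing point (pixels):", vp)   # fed to the camera-pose computation
    ```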

  12. Virtual Reality Lab Assistant

    NASA Technical Reports Server (NTRS)

    Saha, Hrishikesh; Palmer, Timothy A.

    1996-01-01

    The Virtual Reality Lab Assistant (VRLA) demonstration model is designed for engineering and materials science experiments performed by undergraduate and graduate students, serving as a pre-lab simulation experience in their courses. This helps students get a preview of how to use the lab equipment and run experiments before working with the actual lab hardware and software. The quality of the time available for laboratory experiments can be significantly improved through the use of virtual reality technology.

  13. Learning in Virtual Reality.

    ERIC Educational Resources Information Center

    Bricken, William

    The essence of the computer revolution is yet to come, for computers are essentially generators of realities. Virtual reality (VR) is the next step in the evolutionary path; the user is placed inside the image and becomes a participant within the computational space. A VR computer generates a direct experience of the computational environment. The…

  14. Spacecraft 3D Augmented Reality Mobile App

    NASA Technical Reports Server (NTRS)

    Hussey, Kevin J.; Doronila, Paul R.; Kumanchik, Brian E.; Chan, Evan G.; Ellison, Douglas J.; Boeck, Andrea; Moore, Justin M.

    2013-01-01

    The Spacecraft 3D application allows users to learn about and interact with iconic NASA missions in a new and immersive way using common mobile devices. Using Augmented Reality (AR) techniques to project 3D renditions of the mission spacecraft into real-world surroundings, users can interact with and learn about Curiosity, GRAIL, Cassini, and Voyager. Additional updates on future missions, animations, and information will be ongoing. Using a printed AR Target and camera on a mobile device, users can get up close with these robotic explorers, see how some move, and learn about these engineering feats, which are used to expand knowledge and understanding about space. The software receives input from the mobile device's camera to recognize the presence of an AR marker in the camera's field of view. It then displays a 3D rendition of the selected spacecraft in the user's physical surroundings, on the mobile device's screen, while it tracks the device's movement in relation to the physical position of the spacecraft's 3D image on the AR marker.
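
    The marker-based pipeline described above can be illustrated with a brief, hedged sketch (not JPL's actual code): it assumes the opencv-contrib "aruco" module with its classic (pre-4.7) API, a webcam as a stand-in for the mobile camera, and placeholder camera intrinsics; the recovered marker pose is what a renderer would use to anchor the spacecraft model in the user's surroundings.

      # Sketch of marker detection and pose estimation with OpenCV's aruco module.
      import cv2
      import numpy as np

      camera_matrix = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # placeholder intrinsics
      dist_coeffs = np.zeros(5)                                                # assume negligible distortion
      aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

      cap = cv2.VideoCapture(0)          # camera feed standing in for the mobile device
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
          if ids is not None:
              # Pose of each detected marker relative to the camera; a renderer
              # would use rvec/tvec to place the 3D spacecraft model on the marker.
              rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
                  corners, 0.05, camera_matrix, dist_coeffs)   # assumed 5 cm marker
              cv2.aruco.drawDetectedMarkers(frame, corners, ids)
          cv2.imshow("ar", frame)
          if cv2.waitKey(1) == 27:       # Esc to quit
              break
      cap.release()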

  15. A Comparative Analysis of 2D and 3D Tasks for Virtual Reality Therapies Based on Robotic-Assisted Neurorehabilitation for Post-stroke Patients

    PubMed Central

    Lledó, Luis D.; Díez, Jorge A.; Bertomeu-Motos, Arturo; Ezquerro, Santiago; Badesa, Francisco J.; Sabater-Navarro, José M.; García-Aracil, Nicolás

    2016-01-01

    Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to create a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. In contrast, 2D virtual environments represent the tasks with a low degree of realism using bidimensional graphics techniques. However, the type of visualization can influence the quality of perception of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in patterns of kinematic movements when post-stroke patients performed a reaching task while viewing a virtual therapeutic game with two different types of virtual environment visualization: 2D and 3D. Nine post-stroke patients participated in the study, receiving virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aim of the tasks, which consisted of reaching peripheral or perspective targets depending on the virtual environment shown. Various parameters, such as maximum speed, reaction time, path length, and initial movement, were analyzed from the data acquired objectively by the robotic device to evaluate the influence of the task visualization. At the end of the study, a usability survey was provided to each patient to analyze his/her satisfaction level. For all patients, the movement trajectories were enhanced when they completed the therapy, suggesting that the patients' motor recovery increased. Despite the similarity in the majority of the kinematic parameters, differences in reaction time and path length were higher using the 3D task. Regarding the success rates
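
    A simplified sketch of how such kinematic measures might be derived from a sampled end-effector trajectory is shown below; this is an illustration, not the study's analysis code, and the sampling rate, speed threshold, and synthetic trajectory are assumptions.

      # Sketch: path length, peak speed, and reaction time from a sampled trajectory.
      import numpy as np

      def kinematic_measures(t, xy, speed_threshold=0.05):
          """t: (N,) timestamps in s; xy: (N, 2) planar positions in m."""
          d = np.diff(xy, axis=0)
          step_len = np.linalg.norm(d, axis=1)
          path_length = step_len.sum()
          speed = step_len / np.diff(t)
          peak_speed = speed.max()
          # Reaction time: first sample whose speed exceeds the threshold (m/s)
          moving = np.nonzero(speed > speed_threshold)[0]
          reaction_time = t[moving[0] + 1] - t[0] if moving.size else np.nan
          return {"path_length": path_length,
                  "peak_speed": peak_speed,
                  "reaction_time": reaction_time}

      # Example with synthetic data sampled at 100 Hz: movement starts at t = 0.3 s
      t = np.linspace(0, 2, 200)
      xy = np.column_stack([np.clip(t - 0.3, 0, None) * 0.2, np.zeros_like(t)])
      print(kinematic_measures(t, xy))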

  16. A Comparative Analysis of 2D and 3D Tasks for Virtual Reality Therapies Based on Robotic-Assisted Neurorehabilitation for Post-stroke Patients.

    PubMed

    Lledó, Luis D; Díez, Jorge A; Bertomeu-Motos, Arturo; Ezquerro, Santiago; Badesa, Francisco J; Sabater-Navarro, José M; García-Aracil, Nicolás

    2016-01-01

    Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to create a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. In contrast, 2D virtual environments represent the tasks with a low degree of realism using bidimensional graphics techniques. However, the type of visualization can influence the quality of perception of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in patterns of kinematic movements when post-stroke patients performed a reaching task while viewing a virtual therapeutic game with two different types of virtual environment visualization: 2D and 3D. Nine post-stroke patients participated in the study, receiving virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aim of the tasks, which consisted of reaching peripheral or perspective targets depending on the virtual environment shown. Various parameters, such as maximum speed, reaction time, path length, and initial movement, were analyzed from the data acquired objectively by the robotic device to evaluate the influence of the task visualization. At the end of the study, a usability survey was provided to each patient to analyze his/her satisfaction level. For all patients, the movement trajectories were enhanced when they completed the therapy, suggesting that the patients' motor recovery increased. Despite the similarity in the majority of the kinematic parameters, differences in reaction time and path length were higher using the 3D task. Regarding the success rates

  17. Virtual reality for emergency training

    SciTech Connect

    Altinkemer, K.

    1995-12-31

    Virtual reality is a sequence of scenes generated by a computer in response to the five different senses: sight, sound, taste, touch, and smell. Other senses that can be used in virtual reality include balance, pheromonal, and immunological senses. Application areas include leisure and entertainment, medicine, architecture, engineering, manufacturing, and training. Virtual reality is especially important when it is used for emergency training and management of natural disasters, including earthquakes, floods, tornadoes, and other situations which are hard to emulate. Classical training methods for these extraordinary environments lack the realistic surroundings that virtual reality can provide. In order for virtual reality to be a successful training tool, the design needs to address certain aspects, such as how realistic the virtual reality should be and how much fixed cost is entailed in setting up the virtual reality trainer. There are also pricing questions regarding the price per training session on the virtual reality trainer and the appropriate training time length(s).

  18. Virtual Reality Calibration for Telerobotic Servicing

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1994-01-01

    A virtual reality calibration technique of matching a virtual environment of simulated graphics models in 3-D geometry and perspective with actual camera views of the remote site task environment has been developed to enable high-fidelity preview/predictive displays with calibrated graphics overlay on live video.
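
    The underlying idea, registering the graphics model to the camera view from a few known 3-D/2-D correspondences, can be sketched as follows; this is a hedged illustration rather than the JPL system, and the fiducial coordinates, pixel locations, and intrinsics are hypothetical.

      # Sketch: recover the camera pose from known 3D points and their image
      # locations so a graphics overlay can be rendered in the same perspective.
      import cv2
      import numpy as np

      # Hypothetical fiducial points on the remote task fixture, in metres (model frame)
      object_pts = np.array([[0, 0, 0], [0.2, 0, 0], [0.2, 0.1, 0], [0, 0.1, 0]],
                            dtype=np.float64)
      # Their pixel locations identified in the camera view (assumed values)
      image_pts = np.array([[410, 300], [530, 305], [528, 240], [408, 238]],
                           dtype=np.float64)
      camera_matrix = np.array([[900.0, 0, 320], [0, 900.0, 240], [0, 0, 1]])
      dist_coeffs = np.zeros(5)

      ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, camera_matrix, dist_coeffs)
      if ok:
          # Reproject the model points to check calibration quality; the same
          # rvec/tvec would drive the calibrated graphics overlay on live video.
          reproj, _ = cv2.projectPoints(object_pts, rvec, tvec, camera_matrix, dist_coeffs)
          err = np.linalg.norm(reproj.reshape(-1, 2) - image_pts, axis=1).mean()
          print("mean reprojection error (px):", err)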

  19. Magical Stories: Blending Virtual Reality and Artificial Intelligence.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    Artificial intelligence (AI) techniques and virtual reality (VR) make possible powerful interactive stories, and this paper focuses on examples of virtual characters in three dimensional (3-D) worlds. Waldern, a virtual reality game designer, has theorized about and implemented software design of virtual teammates and opponents that incorporate AI…

  20. Virtual reality systems

    NASA Technical Reports Server (NTRS)

    Johnson, David W.

    1992-01-01

    Virtual realities are a type of human-computer interface (HCI) and as such may be understood from a historical perspective. In the earliest era, the computer was a very simple, straightforward machine. Interaction was human manipulation of an inanimate object, little more than the provision of an explicit instruction set to be carried out without deviation. In short, control resided with the user. In the second era of HCI, some level of intelligence and control was imparted to the system to enable a dialogue with the user. Simple context sensitive help systems are early examples, while more sophisticated expert system designs typify this era. Control was shared more equally. In this, the third era of the HCI, the constructed system emulates a particular environment, constructed with rules and knowledge about 'reality'. Control is, in part, outside the realm of the human-computer dialogue. Virtual reality systems are discussed.

  1. Virtual Reality in the Classroom.

    ERIC Educational Resources Information Center

    Pantelidis, Veronica S.

    1993-01-01

    Considers the concept of virtual reality; reviews its history; describes general uses of virtual reality, including entertainment, medicine, and design applications; discusses classroom uses of virtual reality, including a software program called Virtus WalkThrough for use with a computer monitor; and suggests future possibilities. (34 references)…

  2. [A survey of virtual reality research: From technology to psychology].

    PubMed

    Sakurai, K

    1995-10-01

    Virtual reality technology enables us to immerse ourselves into 3D synthesized environments. In this paper, I review recent research on virtual reality, focusing on (a) the terminology used in this research area, (b) technological approaches to setting up the different components of virtual reality: autonomy, interaction, and presence, (c) objective measures and subjective ratings of a viewer's sense of presence in virtual environments, and (d) present applications of virtual reality in different fields and their relation to pictorial communication. This review concludes that intermodality conflict and measurement of the sense of presence are the crucial perceptual and cognitive topics in virtual reality research.

  3. Virtual Reality Hysteroscopy

    PubMed

    Levy

    1996-08-01

    New interactive computer technologies are having a significant influence on medical education, training, and practice. The newest innovation in computer technology, virtual reality, allows an individual to be immersed in a dynamic computer-generated, three-dimensional environment and can provide realistic simulations of surgical procedures. A new virtual reality hysteroscope passes through a sensing device that synchronizes movements with a three-dimensional model of a uterus. Force feedback is incorporated into this model, so the user actually experiences the collision of an instrument against the uterine wall or the sensation of the resistance or drag of a resectoscope as it cuts through a myoma in a virtual environment. A variety of intrauterine pathologies and procedures are simulated, including hyperplasia, cancer, resection of a uterine septum, polyp, or myoma, and endometrial ablation. This technology will be incorporated into comprehensive training programs that will objectively assess hand-eye coordination and procedural skills. It is possible that by incorporating virtual reality into hysteroscopic training programs, a decrease in the learning curve and the number of complications presently associated with the procedures may be realized. Prospective studies are required to assess these potential benefits.

  4. Virtual Representations in 3D Learning Environments

    ERIC Educational Resources Information Center

    Shonfeld, Miri; Kritz, Miki

    2013-01-01

    This research explores the extent to which virtual worlds can serve as online collaborative learning environments for students by increasing social presence and engagement. 3D environments enable learning, which simulates face-to-face encounters while retaining the advantages of online learning. Students in Education departments created avatars…

  5. Virtual reality in radiology: virtual intervention

    NASA Astrophysics Data System (ADS)

    Harreld, Michael R.; Valentino, Daniel J.; Duckwiler, Gary R.; Lufkin, Robert B.; Karplus, Walter J.

    1995-04-01

    Intracranial aneurysms are the primary cause of non-traumatic subarachnoid hemorrhage. Morbidity and mortality remain high even with current endovascular intervention techniques. It is presently impossible to identify which aneurysms will grow and rupture; however, hemodynamics are thought to play an important role in aneurysm development. With this in mind, we have simulated blood flow in laboratory animals using three-dimensional computational fluid dynamics software. The data output from these simulations is three-dimensional, complex, and transient. Visualization of 3D flow structures with a standard 2D display is cumbersome, and may be better performed using a virtual reality system. We are developing a VR-based system for visualization of the computed blood flow and stress fields. This paper presents the progress to date and future plans for our clinical VR-based intervention simulator. The ultimate goal is to develop a software system that will be able to accurately model an aneurysm detected on clinical angiography, visualize this model in virtual reality, predict its future behavior, and give insight into the type of treatment necessary. An associated database will give historical and outcome information on prior aneurysms (including dynamic, structural, and categorical data) that will be matched to any current case, and assist in treatment planning (e.g., natural history vs. treatment risk, surgical vs. endovascular treatment risks, cure prediction, complication rates).

  6. Art in virtual reality 2010

    NASA Astrophysics Data System (ADS)

    Chang, Ben

    2010-01-01

    For decades, virtual reality artwork has existed in a small but highly influential niche in the world of electronic and new media art. Since the early 1990's, virtual reality installations have come to define an extreme boundary point of both aesthetic experience and technological sophistication. Classic virtual reality artworks have an almost mythological stature - powerful, exotic, and often rarely exhibited. Today, art in virtual environments continues to evolve and mature, encompassing everything from fully immersive CAVE experiences to performance art in Second Life to the use of augmented and mixed reality in public space. Art in Virtual Reality 2010 is a public exhibition of new artwork that showcases the diverse ways that contemporary artists use virtual environments to explore new aesthetic ground and investigate the continually evolving relationship between our selves and our virtual worlds.

  7. Virtual reality at work

    NASA Technical Reports Server (NTRS)

    Brooks, Frederick P., Jr.

    1991-01-01

    The utility of virtual reality computer graphics in telepresence applications is not hard to grasp and promises to be great. When the virtual world is entirely synthetic, as opposed to real but remote, the utility is harder to establish. Vehicle simulators for aircraft, vessels, and motor vehicles are proving their worth every day. Entertainment applications such as Disney World's StarTours are technologically elegant, good fun, and economically viable. Nevertheless, some of us have no real desire to spend our lifework serving the entertainment craze of our sick culture; we want to see this exciting technology put to work in medicine and science. The topics covered include the following: testing a force display for scientific visualization -- molecular docking; and testing a head-mounted display for scientific and medical visualization.

  8. Virtual VMASC: A 3D Game Environment

    NASA Technical Reports Server (NTRS)

    Manepalli, Suchitra; Shen, Yuzhong; Garcia, Hector M.; Lawsure, Kaleen

    2010-01-01

    The advantages of creating interactive 3D simulations that allow viewing, exploring, and interacting with land improvements, such as buildings, in digital form are manifold and range from allowing individuals from anywhere in the world to explore those virtual land improvements online, to training military personnel in dealing with war-time environments, and to making those land improvements available in virtual worlds such as Second Life. While we haven't fully explored the true potential of such simulations, we have identified a requirement within our organization to use simulations like those to replace our front-desk personnel and allow visitors to query, navigate, and communicate virtually with various entities within the building. We implemented the Virtual VMASC 3D simulation of the Virginia Modeling Analysis and Simulation Center (VMASC) office building to not only meet our front-desk requirement but also to evaluate the effort required in designing such a simulation and, thereby, leverage the experience we gained in future projects of this kind. This paper describes the goals we set for our implementation, the software approach taken, the modeling contribution made, and the technologies used such as XNA Game Studio, .NET framework, Autodesk software packages, and, finally, the applicability of our implementation on a variety of architectures including Xbox 360 and PC. This paper also summarizes the result of our evaluation and the lessons learned from our effort.

  9. When Rural Reality Goes Virtual.

    ERIC Educational Resources Information Center

    Husain, Dilshad D.

    1998-01-01

    In rural towns where sparse population and few businesses are barriers, virtual reality may be the only way to bring work-based learning to students. A partnership between a small-town high school, the Ohio Supercomputer Center, and a high-tech business will enable students to explore the workplace using virtual reality. (JOW)

  10. Virtual Reality, Combat, and Communication.

    ERIC Educational Resources Information Center

    Thrush, Emily Austin; Bodary, Michael

    2000-01-01

    Presents a brief examination of the evolution of virtual reality devices that illustrates how the development of this new medium is influenced by emerging technologies and by marketing pressures. Notes that understanding these influences may help prepare for the role of technical communicators in building virtual reality applications for education…

  11. Reality and Surreality of 3-D Displays: Holodeck and Beyond

    DTIC Science & Technology

    2000-01-01

    Holodeck is the reality that significantly better 3D display systems are possible. Keywords: true 3D displays, multiplexed 2D display (autostereoscopic ... displays still do not use them in their own offices. Thus, 3D approaches that are autostereoscopic (that is, no head gear is required) are preferred. A ... challenges noted throughout the aforegoing sections of this paper will be steadily overcome. True 3D, autostereoscopic (no head gear) monitors with usable

  12. Surgery applications of virtual reality

    NASA Technical Reports Server (NTRS)

    Rosen, Joseph

    1994-01-01

    Virtual reality is a computer-generated technology which allows information to be displayed in a simulated, but lifelike, environment. In this simulated 'world', users can move and interact as if they were actually a part of that world. This new technology will be useful in many different fields, including the field of surgery. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and perform surgical procedures (telesurgery), and predict the outcomes of surgery. The authors of this paper describe the basic components of a virtual reality surgical system. These components include: the virtual world, the virtual tools, the anatomical model, the software platform, the host computer, the interface, and the head-coupled display. In the chapter, they also review the progress towards using virtual reality for surgical training, planning, telesurgery, and predicting outcomes. Finally, the authors present a training system being developed for the practice of new procedures in abdominal surgery.

  13. VIRTUAL REALITY HYPNOSIS

    PubMed Central

    Askay, Shelley Wiechman; Patterson, David R.; Sharar, Sam R.

    2010-01-01

    Scientific evidence for the viability of hypnosis as a treatment for pain has flourished over the past two decades (Rainville, Duncan, Price, Carrier and Bushnell, 1997; Montgomery, DuHamel and Redd, 2000; Lang and Rosen, 2002; Patterson and Jensen, 2003). However its widespread use has been limited by factors such as the advanced expertise, time and effort required by clinicians to provide hypnosis, and the cognitive effort required by patients to engage in hypnosis. The theory in developing virtual reality hypnosis was to apply three-dimensional, immersive, virtual reality technology to guide the patient through the same steps used when hypnosis is induced through an interpersonal process. Virtual reality replaces many of the stimuli that the patients have to struggle to imagine via verbal cueing from the therapist. The purpose of this paper is to explore how virtual reality may be useful in delivering hypnosis, and to summarize the scientific literature to date. We will also explore various theoretical and methodological issues that can guide future research. In spite of the encouraging scientific and clinical findings, hypnosis for analgesia is not universally used in medical centres. One reason for the slow acceptance is the extensive provider training required in order for hypnosis to be an effective pain management modality. Training in hypnosis is not commonly offered in medical schools or even psychology graduate curricula. Another reason is that hypnosis requires far more time and effort to administer than an analgesic pill or injection. Hypnosis requires training, skill and patience to deliver in medical centres that are often fast-paced and highly demanding of clinician time. Finally, the attention and cognitive effort required for hypnosis may be more than patients in an acute care setting, who may be under the influence of opiates and benzodiazepines, are able to impart. It is a challenge to make hypnosis a standard part of care in this environment

  14. Augmented Virtual Reality Laboratory

    NASA Technical Reports Server (NTRS)

    Tully-Hanson, Benjamin

    2015-01-01

    Real-time motion tracking hardware has, for the most part, been cost prohibitive for research to take place regularly until recently. With the release of the Microsoft Kinect in November 2010, researchers now have access to a device that for a few hundred dollars is capable of providing red-green-blue (RGB), depth, and skeleton data. It is also capable of tracking multiple people in real time. For its originally intended purpose, i.e., gaming with the Xbox 360 and eventually the Xbox One, it performs quite well. However, researchers soon found that although the sensor is versatile, it has limitations in real world applications. I was brought aboard this summer by William Little in the Augmented Virtual Reality (AVR) Lab at Kennedy Space Center to find solutions to these limitations.

  15. Overestimation of heights in virtual reality is influenced more by perceived distal size than by the 2-D versus 3-D dimensionality of the display

    NASA Technical Reports Server (NTRS)

    Dixon, Melissa W.; Proffitt, Dennis R.; Kaiser, M. K. (Principal Investigator)

    2002-01-01

    One important aspect of the pictorial representation of a scene is the depiction of object proportions. Yang, Dixon, and Proffitt (1999 Perception 28 445-467) recently reported that the magnitude of the vertical-horizontal illusion was greater for vertical extents presented in three-dimensional (3-D) environments compared to two-dimensional (2-D) displays. However, because all of the 3-D environments were large and all of the 2-D displays were small, the question remains whether the observed magnitude differences were due solely to the dimensionality of the displays (2-D versus 3-D) or to the perceived distal size of the extents (small versus large). We investigated this question by comparing observers' judgments of vertical relative to horizontal extents on a large but 2-D display compared to the large 3-D and the small 2-D displays used by Yang et al (1999). The results confirmed that the magnitude differences for vertical overestimation between display media are influenced more by the perceived distal object size rather than by the dimensionality of the display.

  16. A Collaborative Virtual Environment for Situated Language Learning Using VEC3D

    ERIC Educational Resources Information Center

    Shih, Ya-Chun; Yang, Mau-Tsuen

    2008-01-01

    A 3D virtually synchronous communication architecture for situated language learning has been designed to foster communicative competence among undergraduate students who have studied English as a foreign language (EFL). We present an innovative approach that offers better e-learning than the previous virtual reality educational applications. The…

  17. Convergent validity and sex differences in healthy elderly adults for performance on 3D virtual reality navigation learning and 2D hidden maze tasks.

    PubMed

    Tippett, William J; Lee, Jang-Han; Mraz, Richard; Zakzanis, Konstantine K; Snyder, Peter J; Black, Sandra E; Graham, Simon J

    2009-04-01

    This study assessed the convergent validity of a virtual environment (VE) navigation learning task, the Groton Maze Learning Test (GMLT), and selected traditional neuropsychological tests performed in a group of healthy elderly adults (n = 24). The cohort was divided equally between males and females to explore performance variability due to sex differences, which were subsequently characterized and reported as part of the analysis. To facilitate performance comparisons, specific "efficiency" scores were created for both the VE navigation task and the GMLT. Men reached peak performance more rapidly than women during VE navigation and on the GMLT and significantly outperformed women on the first learning trial in the VE. Results suggest reasonable convergent validity across the VE task, GMLT, and selected neuropsychological tests for assessment of spatial memory.

  18. Telemedicine, virtual reality, and surgery

    NASA Technical Reports Server (NTRS)

    Mccormack, Percival D.; Charles, Steve

    1994-01-01

    Two types of synthetic experience are covered: virtual reality (VR) and surgery, and telemedicine. The topics are presented in viewgraph form and include the following: geometric models; physiological sensors; surgical applications; virtual cadaver; VR surgical simulation; telesurgery; VR Surgical Trainer; abdominal surgery pilot study; advanced abdominal simulator; examples of telemedicine; and telemedicine spacebridge.

  19. Learning in 3-D Virtual Worlds: Rethinking Media Literacy

    ERIC Educational Resources Information Center

    Qian, Yufeng

    2008-01-01

    3-D virtual worlds, as a new form of learning environments in the 21st century, hold great potential in education. Learning in such environments, however, demands a broader spectrum of literacy skills. This article identifies a new set of media literacy skills required in 3-D virtual learning environments by reviewing exemplary 3-D virtual…

  20. A specification of 3D manipulation in virtual environments

    NASA Technical Reports Server (NTRS)

    Su, S. Augustine; Furuta, Richard

    1994-01-01

    In this paper we discuss the modeling of three basic kinds of 3-D manipulations in the context of a logical hand device and our virtual panel architecture. The logical hand device is a useful software abstraction representing hands in virtual environments. The virtual panel architecture is the 3-D component of the 2-D window systems. Both of the abstractions are intended to form the foundation for adaptable 3-D manipulation.

  1. Real-time 3D human capture system for mixed-reality art and entertainment.

    PubMed

    Nguyen, Ta Huynh Duy; Qui, Tran Cong Thien; Xu, Ke; Cheok, Adrian David; Teo, Sze Lee; Zhou, ZhiYing; Mallawaarachchi, Asitha; Lee, Shang Ping; Liu, Wei; Teo, Hui Siang; Thang, Le Nam; Li, Yu; Kato, Hirokazu

    2005-01-01

    A real-time system for capturing humans in 3D and placing them into a mixed reality environment is presented in this paper. The subject is captured by nine cameras surrounding her. Looking through a head-mounted display with a camera in front pointing at a marker, the user can see the 3D image of this subject overlaid onto a mixed reality scene. The 3D images of the subject viewed from this viewpoint are constructed using a robust and fast shape-from-silhouette algorithm. The paper also presents several techniques to improve quality and speed up the whole system. The frame rate of our system is around 25 fps using only standard Intel processor-based personal computers. Besides a remote live 3D conferencing and collaborating system, we also describe an application of the system in art and entertainment, named Magic Land, which is a mixed reality environment where captured human avatars and 3D computer-generated virtual animations can form an interactive story and play with each other. This system demonstrates many technologies in human computer interaction: mixed reality, tangible interaction, and 3D communication. The result of the user study not only emphasizes the benefits, but also addresses some issues of these technologies.
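
    A conceptual sketch of the shape-from-silhouette (visual hull) step is given below; it is a plain voxel-carving illustration, not the paper's optimized real-time algorithm, and the projection matrices and silhouette masks are assumed inputs.

      # Sketch of visual-hull carving: keep only the voxels whose projection
      # falls inside the subject's silhouette in every camera view.
      import numpy as np

      def carve_visual_hull(voxels, cameras, silhouettes):
          """voxels: (N, 3) world points; cameras: list of 3x4 projection
          matrices; silhouettes: list of binary HxW masks (True = subject)."""
          keep = np.ones(len(voxels), dtype=bool)
          homog = np.hstack([voxels, np.ones((len(voxels), 1))])
          for P, mask in zip(cameras, silhouettes):
              proj = homog @ P.T                      # (N, 3) homogeneous pixel coords
              uv = proj[:, :2] / proj[:, 2:3]
              u = np.round(uv[:, 0]).astype(int)
              v = np.round(uv[:, 1]).astype(int)
              h, w = mask.shape
              inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
              in_sil = np.zeros(len(voxels), dtype=bool)
              in_sil[inside] = mask[v[inside], u[inside]]
              keep &= in_sil                          # carve away anything outside
          return voxels[keep]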

  2. Innovation Education Enabled through a Collaborative Virtual Reality Learning Environment

    ERIC Educational Resources Information Center

    Thorsteinsson, Gisli; Page, Tom; Lehtonen, Miika; Ha, Joong Gyu

    2006-01-01

    This article provides a descriptive account of the development of an approach to the support of design and technology education with 3D Virtual Reality (VR) technologies on an open and distance learning basis. This work promotes an understanding of the implications and possibilities of advanced virtual learning technologies in education for…

  3. Learning in 3D Virtual Environments: Collaboration and Knowledge Spirals

    ERIC Educational Resources Information Center

    Burton, Brian G.; Martin, Barbara N.

    2010-01-01

    The purpose of this case study was to determine if learning occurred within a 3D virtual learning environment by determining if elements of collaboration and Nonaka and Takeuchi's (1995) knowledge spiral were present. A key portion of this research was the creation of a Virtual Learning Environment. This 3D VLE utilized the Torque Game Engine…

  4. Molecular Rift: Virtual Reality for Drug Designers.

    PubMed

    Norrby, Magnus; Grebner, Christoph; Eriksson, Joakim; Boström, Jonas

    2015-11-23

    Recent advances in interaction design have created new ways to use computers. One example is the ability to create enhanced 3D environments that simulate physical presence in the real world--a virtual reality. This is relevant to drug discovery since molecular models are frequently used to obtain deeper understandings of, say, ligand-protein complexes. We have developed a tool (Molecular Rift), which creates a virtual reality environment steered with hand movements. Oculus Rift, a head-mounted display, is used to create the virtual settings. The program is controlled by gesture-recognition, using the gaming sensor MS Kinect v2, eliminating the need for standard input devices. The Open Babel toolkit was integrated to provide access to powerful cheminformatics functions. Molecular Rift was developed with a focus on usability, including iterative test-group evaluations. We conclude with reflections on virtual reality's future capabilities in chemistry and education. Molecular Rift is open source and can be downloaded from GitHub.

  5. Exploring Virtual Reality for Classroom Use: The Virtual Reality and Education Lab at East Carolina University.

    ERIC Educational Resources Information Center

    Auld, Lawrence W. S.; Pantelidis, Veronica S.

    1994-01-01

    Describes the Virtual Reality and Education Lab (VREL) established at East Carolina University to study the implications of virtual reality for elementary and secondary education. Highlights include virtual reality software evaluation; hardware evaluation; computer-based curriculum objectives which could use virtual reality; and keeping current…

  6. Embedding speech into virtual realities

    NASA Technical Reports Server (NTRS)

    Bohn, Christian-Arved; Krueger, Wolfgang

    1993-01-01

    In this work a speaker-independent speech recognition system is presented, which is suitable for implementation in Virtual Reality applications. The use of an artificial neural network in connection with a special compression of the acoustic input leads to a system which is robust, fast, easy to use, and needs no additional hardware besides common VR equipment.

  7. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.

  8. Virtual reality and stereoscopic telepresence

    SciTech Connect

    Mertens, E.P.

    1994-12-01

    Virtual reality technology is commonly thought to have few, if any, applications beyond the national research laboratories, the aerospace industry, and the entertainment world. A team at Westinghouse Hanford Company (WHC) is developing applications for virtual reality technology that make it a practical, viable, portable, and cost-effective business and training tool. The technology transfer is particularly applicable to the waste management industry and has become a tool that can serve the entire work force spectrum, from industrial sites to business offices. For three and a half years, a small team of WHC personnel has been developing an effective and practical method of bringing virtual reality technology to the job site. The applications are practical, the results are repeatable, and the equipment costs are within the range of present-day office machines. That combination can evolve into a competitive advantage for commercial business interests. The WHC team has contained system costs by using commercially available equipment and personal computers to create effective virtual reality work stations for less than $20,000.

  9. Virtual Reality: Ready or Not!

    ERIC Educational Resources Information Center

    Lewis, Joan E.

    1994-01-01

    Describes the development and current status of virtual reality (VR) and VR research. Market potentials for VR are discussed, including the entertainment industry, health care and medical training, flight and other simulators, and educational possibilities. A glossary of VR-related terms is included. (LRW)

  10. Immersive virtual reality simulations in nursing education.

    PubMed

    Kilmon, Carol A; Brown, Leonard; Ghosh, Sumit; Mikitiuk, Artur

    2010-01-01

    This article explores immersive virtual reality as a potential educational strategy for nursing education and describes an immersive learning experience now being developed for nurses. This pioneering project is a virtual reality application targeting speed and accuracy of nurse response in emergency situations requiring cardiopulmonary resuscitation. Other potential uses and implications for the development of virtual reality learning programs are discussed.

  11. Virtual Reality in Education and Training.

    ERIC Educational Resources Information Center

    Andolsek, Diane L.

    1995-01-01

    Provides an overview of virtual reality from an education perspective. Defines the technology in terms of equipment and participatory experience, examines the potential applications of virtual reality in education and training, and considers the concerns and limitations of the technology. Overall, research indicates that virtual reality offers…

  12. Virtual Reality: You Are There

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Telepresence, or "virtual reality," allows a person, with assistance from advanced technology devices, to figuratively project himself into another environment. This technology is marketed by several companies, among them Fakespace, Inc., a former Ames Research Center contractor. Fakespace developed a teleoperational motion platform for transmitting sounds and images from remote locations. The "Molly" matches the user's head motion and, when coupled with a stereo viewing device and appropriate software, creates the telepresence experience. Its companion piece is the BOOM, the user's viewing device that provides the sense of involvement in the virtual environment. Either system may be used alone. Because suits, gloves, headphones, etc. are not needed, a whole range of commercial applications is possible, including computer-aided design techniques and virtual reality visualizations. Customers include Sandia National Laboratories, Stanford Research Institute and Mattel Toys.

  13. Simulated maintenance a virtual reality

    SciTech Connect

    Lirvall, P.

    1995-10-01

    The article describes potential applications of personal computer-based virtual reality software. The applications are being investigated by Atomic Energy of Canada Limited's (AECL) Chalk River Laboratories for the Canadian deuterium-uranium (Candu) reactor. Objectives include: (1) reduction of outage duration and improved safety, (2) cost-effective and safe maintenance of equipment, (3) reduction of exposure times and identification of overexposure situations, (4) cost-effective training in a virtual control room simulator, (5) human factors evaluation of design interface, and (6) visualization of conceptual and detailed designs of critical nuclear field environments. A demonstration model of a typical reactor control room, the use of virtual reality in outage planning, and safety issues are outlined.

  14. Marshall Engineers Use Virtual Reality

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  15. Virtual Libraries: Service Realities.

    ERIC Educational Resources Information Center

    Novak, Jan

    2002-01-01

    Discussion of changes in society that have resulted from information and communication technologies focuses on changes in libraries and a new market for library services with new styles of clients. Highlights client service issues to be considered when transitioning to a virtual library situation. (Author/LRW)

  16. Virtual Libraries: Service Realities.

    ERIC Educational Resources Information Center

    Novak, Jan

    This paper discusses client service issues to be considered when transitioning to a virtual library situation. Themes related to the transitional nature of society in the knowledge era are presented, including: paradox and a contradictory nature; blurring of boundaries; networks, systems, and holistic thinking; process/not product, becoming/not…

  17. Virtual Reality--Learning by Immersion.

    ERIC Educational Resources Information Center

    Dunning, Jeremy

    1998-01-01

    Discusses the use of virtual reality in educational software. Topics include CAVE (Computer-Assisted Virtual Environments); cost-effective virtual environment tools including QTVR (Quick Time Virtual Reality); interactive exercises; educational criteria for technology-based educational tools; and examples of screen displays. (LRW)

  18. Virtual reality visualization of accelerator magnets

    SciTech Connect

    Huang, M.; Papka, M.; DeFanti, T.; Levine, D.; Turner, L.; Kettunen, L.

    1995-05-01

    The authors describe the use of the CAVE virtual reality visualization environment as an aid to the design of accelerator magnets. They have modeled an elliptical multipole wiggler magnet being designed for use at the Advanced Photon Source at Argonne National Laboratory. The CAVE environment allows the authors to explore and interact with the 3-D visualization of the magnet. Capabilities include changing the number of magnet periods displayed, changing the icons used for displaying the magnetic field, and changing the current in the electromagnet and observing the effect on the magnetic field and particle beam trajectory through the field.

  19. Direct Manipulation in Virtual Reality

    NASA Technical Reports Server (NTRS)

    Bryson, Steve

    2003-01-01

    Virtual Reality interfaces offer several advantages for scientific visualization such as the ability to perceive three-dimensional data structures in a natural way. The focus of this chapter is direct manipulation, the ability for a user in virtual reality to control objects in the virtual environment in a direct and natural way, much as objects are manipulated in the real world. Direct manipulation provides many advantages for the exploration of complex, multi-dimensional data sets, by allowing the investigator the ability to intuitively explore the data environment. Because direct manipulation is essentially a control interface, it is better suited for the exploration and analysis of a data set than for the publishing or communication of features found in that data set. Thus direct manipulation is most relevant to the analysis of complex data that fills a volume of three-dimensional space, such as a fluid flow data set. Direct manipulation allows the intuitive exploration of that data, which facilitates the discovery of data features that would be difficult to find using more conventional visualization methods. Using a direct manipulation interface in virtual reality, an investigator can, for example, move a data probe about in space, watching the results and getting a sense of how the data varies within its spatial volume.
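
    The data-probe idea can be sketched in a few lines: sample a gridded 3-D velocity field at whatever position the user drags the probe to. This is an illustration, not the chapter's system; the swirling flow field and grid resolution are made up for the example.

      # Sketch: interpolate a gridded 3-D velocity field at an arbitrary probe position.
      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      axis = np.linspace(0.0, 1.0, 32)
      X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
      u, v, w = -Y, X, 0.1 * np.ones_like(Z)          # simple assumed swirling flow

      interp_u = RegularGridInterpolator((axis, axis, axis), u)
      interp_v = RegularGridInterpolator((axis, axis, axis), v)
      interp_w = RegularGridInterpolator((axis, axis, axis), w)

      def sample_velocity(p):
          """Interpolated (u, v, w) at probe position p = (x, y, z)."""
          p = np.atleast_2d(p)
          return np.array([interp_u(p)[0], interp_v(p)[0], interp_w(p)[0]])

      # As the user drags the probe through the virtual volume, each new position
      # yields an interpolated vector that can be drawn immediately.
      print(sample_velocity([0.5, 0.25, 0.75]))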

  20. [Development of a software for 3D virtual phantom design].

    PubMed

    Zou, Lian; Xie, Zhao; Wu, Qi

    2014-02-01

    In this paper, we present a 3D virtual phantom design software, which was developed based on object-oriented programming methodology and dedicated to medical physics research. This software was named Magical Phantom (MPhantom), which is composed of a 3D visual builder module and a virtual CT scanner. The users can conveniently construct any complex 3D phantom, and then export the phantom as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful software for 3D phantom configuration, and has passed a real-scene application test. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and X-ray imaging reconstruction algorithm research.
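
    A toy sketch of the voxel-phantom idea (not MPhantom itself) is shown below: a simple ellipsoid of water-equivalent material is rasterized into a 3D array of CT-like numbers that could later be written out slice by slice with a DICOM library; the phantom dimensions and Hounsfield values are illustrative assumptions.

      # Sketch: rasterize an ellipsoid "water" insert inside an air background
      # as a 3D volume of CT-like numbers.
      import numpy as np

      def ellipsoid_phantom(shape=(64, 64, 64), center=(32, 32, 32),
                            radii=(20, 15, 10), inside_hu=0, outside_hu=-1000):
          z, y, x = np.indices(shape)
          cz, cy, cx = center
          rz, ry, rx = radii
          inside = (((z - cz) / rz) ** 2 + ((y - cy) / ry) ** 2
                    + ((x - cx) / rx) ** 2) <= 1.0
          volume = np.full(shape, outside_hu, dtype=np.int16)   # air background
          volume[inside] = inside_hu                            # water-equivalent insert
          return volume

      phantom = ellipsoid_phantom()
      print(phantom.shape, phantom.min(), phantom.max())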

  1. Motor rehabilitation using virtual reality

    PubMed Central

    Sveistrup, Heidi

    2004-01-01

    Virtual Reality (VR) provides a unique medium suited to the achievement of several requirements for effective rehabilitation intervention. Specifically, therapy can be provided within a functional, purposeful and motivating context. Many VR applications present opportunities for individuals to participate in experiences, which are engaging and rewarding. In addition to the value of the rehabilitation experience for the user, both therapists and users benefit from the ability to readily grade and document the therapeutic intervention using various systems. In VR, advanced technologies are used to produce simulated, interactive and multi-dimensional environments. Visual interfaces including desktop monitors and head-mounted displays (HMDs), haptic interfaces, and real-time motion tracking devices are used to create environments allowing users to interact with images and virtual objects in real-time through multiple sensory modalities. Opportunities for object manipulation and body movement through virtual space provide frameworks that, in varying degrees, are perceived as comparable to similar opportunities in the real world. This paper reviews current work on motor rehabilitation using virtual environments and virtual reality and where possible, compares outcomes with those achieved in real-world applications. PMID:15679945

  2. 3D Viewing: Odd Perception - Illusion? reality? or both?

    NASA Astrophysics Data System (ADS)

    Kisimoto, K.; Iizasa, K.

    2008-12-01

    We live in three-dimensional space, don't we? It could be at least four dimensions, but that is another story. Either way, our capability for 3D viewing is constrained by our 2D perception (our intrinsic tools of perception). I carried out a few visual experiments using topographic data to show our intrinsic (or biological) disability (or shortcoming) in 3D recognition of our world. Results of the experiments suggest: (1) a 3D surface model displayed on a 2D computer screen (or paper) always has two interpretations of the 3D surface geometry; if we choose one of the interpretations (in other words, if we are hooked by one perception of the two), we maintain that perception even if the 3D model changes its viewing perspective over time on the screen; (2) more interesting is that a real 3D solid object (e.g., made of clay) also gives the above-mentioned two interpretations of the geometry of the object if we observe the object with one eye. The most famous example of this viewing illusion comes from a magician, Jerry Andrus (who died in 2007), who made a super-cool paper-crafted dragon that causes a visual illusion for a one-eyed viewer. By these experiments, I confirmed this phenomenon in another perceptually persuasive (deceptive?) way. My conclusion is that this illusion is intrinsic, i.e. reality for humans, because, even though we live in 3D space, our perceptual tool (the eyes) is composed of 2D sensors whose information is reconstructed or processed to 3D by our experience-based brain. So, (3) when we observe a 3D surface model on the computer screen, we are always one eye short even if we use both eyes. One last suggestion from my experiments is that recent highly sophisticated 3D models might include more information than human perception can handle properly, i.e. we might not be understanding the 3D world (geospace) at all, just experiencing an illusion.

  3. Development of visual 3D virtual environment for control software

    NASA Technical Reports Server (NTRS)

    Hirose, Michitaka; Myoi, Takeshi; Amari, Haruo; Inamura, Kohei; Stark, Lawrence

    1991-01-01

    Virtual environments for software visualization may enable complex programs to be created and maintained. A typical application might be for control of regional electric power systems. As these encompass broader computer networks than ever, construction of such systems becomes very difficult. Conventional text-oriented environments are useful in programming individual processors. However, they are obviously insufficient to program a large and complicated system that includes large numbers of computers connected to each other; such programming is called 'programming in the large.' As a solution for this problem, the authors are developing a graphic programming environment wherein one can visualize complicated software in a virtual 3D world. One of the major features of the environment is the 3D representation of concurrent processes. 3D representation is used to supply both network-wide interprocess programming capability (capability for 'programming in the large') and real-time programming capability. The authors' idea is to fuse both the block diagram (which is useful to check relationships among a large number of processes or processors) and the time chart (which is useful to check precise timing for synchronization) into a single 3D space. The 3D representation gives us a capability for direct and intuitive planning or understanding of complicated relationships among many concurrent processes. To realize the 3D representation, a technology to enable easy handling of virtual 3D objects is a definite necessity. Using a stereo display system and a gesture input device (VPL DataGlove), our prototype of the virtual workstation has been implemented. The workstation can supply the 'sensation' of the virtual 3D space to a programmer. Software for the 3D programming environment is implemented on the workstation. According to preliminary assessments, a 50 percent reduction of programming effort is achieved by using the virtual 3D environment. The authors expect that the 3D

  4. Mixed reality orthognathic surgical simulation by entity model manipulation and 3D-image display

    NASA Astrophysics Data System (ADS)

    Shimonagayoshi, Tatsunari; Aoki, Yoshimitsu; Fushima, Kenji; Kobayashi, Masaru

    2005-12-01

    In orthognathic surgery, the framing of 3D surgical planning that considers the balance between the front and back positions and the symmetry of the jawbone, as well as the dental occlusion of the teeth, is essential. In this study, a support system for orthognathic surgery has been developed to visualize the changes in the mandible and the occlusal condition and to determine the optimum position in mandibular osteotomy. By integrating the operating portion, an entity tooth model that is manipulated to determine the optimum occlusal position, with the 3D-CT skeletal images (3D image display portion) that are simultaneously displayed in real time, the mandibular position and posture can be determined while taking the improvement of skeletal morphology and occlusal condition into consideration. The realistic operation of the entity model and the virtual 3D image display enabled the construction of a surgical simulation system that involves augmented reality.

  5. What Are the Learning Affordances of 3-D Virtual Environments?

    ERIC Educational Resources Information Center

    Dalgarno, Barney; Lee, Mark J. W.

    2010-01-01

    This article explores the potential learning benefits of three-dimensional (3-D) virtual learning environments (VLEs). Drawing on published research spanning two decades, it identifies a set of unique characteristics of 3-D VLEs, which includes aspects of their representational fidelity and aspects of the learner-computer interactivity they…

  6. ESL Teacher Training in 3D Virtual Worlds

    ERIC Educational Resources Information Center

    Kozlova, Iryna; Priven, Dmitri

    2015-01-01

    Although language learning in 3D Virtual Worlds (VWs) has become a focus of recent research, little is known about the knowledge and skills teachers need to acquire to provide effective task-based instruction in 3D VWs and the type of teacher training that best prepares instructors for such an endeavor. This study employs a situated learning…

  7. Educational Visualizations in 3D Collaborative Virtual Environments: A Methodology

    ERIC Educational Resources Information Center

    Fominykh, Mikhail; Prasolova-Forland, Ekaterina

    2012-01-01

    Purpose: Collaborative virtual environments (CVEs) have become increasingly popular in educational settings and the role of 3D content is becoming more and more important. Still, there are many challenges in this area, such as lack of empirical studies that provide design for educational activities in 3D CVEs and lack of norms of how to support…

  8. Three-dimensional scene capturing for the virtual reality display

    NASA Astrophysics Data System (ADS)

    Dong, Jingsheng; Sang, Xinzhu; Guo, Nan; Chen, Duo; Yan, Binbin; Wang, Kuiru; Dou, Wenhua; Xiao, Liquan

    2016-10-01

    A virtual reality shooting and display system based on a multiple-degrees-of-freedom camera rig is designed and demonstrated. Three-dimensional scene display and wide-angle display can be achieved easily and quickly with the proposed system. The range of the viewing scene can be broadened with the image stitching process, and the display in the demonstrated system can achieve a wide-angle effect for image mosaic applications. Meanwhile, the system can realize 3D scene display, which effectively reduces the complexity of 3D scene generation and provides a foundation for adding interactive characteristics to the 3D scene in the future. The system includes an adjustable bracket, computer software, and a virtual reality device. Multiple degrees of freedom of the adjustable bracket are developed to obtain 3D scene source images and mosaic source images easily. Five degrees of freedom are realized: rotation, lifting, translation, convergence, and pitching. To realize the generation and display of three-dimensional scenes, the two cameras are adjusted into a parallel state. After distortion elimination and calibration, the images are transferred to the virtual reality device for display. To realize wide-angle display, the cameras are adjusted into a "V" configuration. The preprocessing includes image matching and fusion to realize image stitching. The mosaic image is then transferred to the virtual reality device, whose image reading and display functions present it to the viewer. The wide-angle 3D scene display is realized by switching between the different camera states.
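
    The two display modes described above can be sketched roughly as follows (an illustration, not the authors' software): undistort a calibrated stereo pair for 3-D viewing, or stitch two overlapping views into a wide-angle mosaic; the intrinsics, distortion coefficients, and image file names are placeholders.

      # Sketch of the two modes: per-eye undistortion for 3-D display, and
      # panorama stitching for wide-angle display.
      import cv2
      import numpy as np

      # Hypothetical intrinsics from a prior chessboard calibration
      K = np.array([[1000.0, 0, 960], [0, 1000.0, 540], [0, 0, 1]])
      dist = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])

      left = cv2.imread("left.jpg")      # placeholder file names for two captured frames
      right = cv2.imread("right.jpg")

      # 3-D mode: remove lens distortion from each eye's view before sending
      # the pair to the virtual reality device.
      left_rect = cv2.undistort(left, K, dist)
      right_rect = cv2.undistort(right, K, dist)

      # Wide-angle mode: with the cameras toed into a "V", stitch the
      # overlapping views into a single panorama.
      stitcher = cv2.Stitcher_create()
      status, panorama = stitcher.stitch([left_rect, right_rect])
      if status == cv2.Stitcher_OK:
          cv2.imwrite("panorama.jpg", panorama)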

  9. Game-Like Language Learning in 3-D Virtual Environments

    ERIC Educational Resources Information Center

    Berns, Anke; Gonzalez-Pardo, Antonio; Camacho, David

    2013-01-01

    This paper presents our recent experiences with the design of game-like applications in 3-D virtual environments as well as its impact on student motivation and learning. Therefore our paper starts with a brief analysis of the motivational aspects of videogames and virtual worlds (VWs). We then go on to explore the possible benefits of both in the…

  10. Virtual reality: Avatars in human spaceflight training

    NASA Astrophysics Data System (ADS)

    Osterlund, Jeffrey; Lawrence, Brad

    2012-02-01

With the advancements in high spatial and temporal resolution graphics, along with advancements in 3D display capabilities to model, simulate, and analyze human-to-machine interfaces and interactions, virtual environments are being used to develop everything from gaming, movie special effects, and animations to the design of automobiles. The use of multiple-object motion capture technology and digital human tools in aerospace has been demonstrated to be a more cost-effective alternative to physical prototypes; it provides a more efficient, flexible, and responsive environment for changes in design and training, and supports early human factors considerations concerning the operation of a complex launch vehicle or spacecraft. United Space Alliance (USA) has deployed this technique and tool under Research and Development (R&D) activities on both spacecraft assembly and ground processing operations design and training on the Orion Crew Module. USA utilizes specialized products that were chosen based on functionality, including software and fixed-base hardware (e.g., infrared and visible red cameras), along with cyber gloves to ensure fine motor dexterity of the hands. The key findings of the R&D were: mock-ups should be built so as not to obstruct cameras from the markers being tracked; a mock-up toolkit should be assembled to facilitate dynamic design changes; markers should be placed in accurate positions on humans and flight hardware to help with tracking; 3D models used in the virtual environment should be stripped of non-essential data; highly capable computational workstations are required to handle the large model data sets; and Technology Interchange Meetings with vendors and other industries also utilizing virtual reality applications need to occur on a continual basis, enabling USA to maintain its leading edge within this technology. Parameters of interest and benefit in human spaceflight simulation training that utilizes virtual reality technologies are to

  11. Rapid prototyping 3D virtual world interfaces within a virtual factory environment

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.

  12. Integration of the virtual 3D model of a control system with the virtual controller

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2015-11-01

Nowadays the design process includes simulation analysis of the different components of a constructed object, which creates the need to integrate different virtual objects in order to simulate the whole investigated technical system. The paper presents issues related to the integration of a virtual 3D model of a chosen control system with a virtual controller. The goal of the integration is to verify the operation of the adopted object in accordance with the established control program. The object of the simulation work is the drive system of a tunneling machine for trenchless work. In the first stage of the work, an interactive visualization of the functioning of the 3D virtual model of the tunneling machine was created using VR (Virtual Reality) class software. In this interactive application, procedures were created for controlling the translatory-motion drive system, the rotary-motion drive system, and the drive system of a manipulator, together with a procedure for turning the crushing head, mounted on the last element of the manipulator, on and off. Procedures were also established for receiving input data from external software on the basis of dynamic data exchange (DDE), which allow the actuators of the particular control systems of the considered machine to be controlled. In the next stage of the work, the program for the virtual controller was created in the ladder diagram (LD) language, based on the adopted work cycle of the tunneling machine. The element integrating the virtual model of the tunneling machine for trenchless work with the virtual controller is an application written in a high-level language (Visual Basic). In this application, procedures were created that collect data from the virtual controller running in simulation mode and transfer them to the interactive application, in which the
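
    As a rough, hedged illustration of the integration layer just described, the Python sketch below polls the outputs of a simulated controller and forwards them as actuator states to the visualization. The real system uses DDE and a Visual Basic application; the dictionary-based "controller", the output names, and the scene keys here are purely hypothetical stand-ins.

```python
# Generic stand-in for the integration layer: poll the (simulated) controller
# outputs and forward them to the visualization as actuator commands.
# All names below are hypothetical; the original system exchanges data via DDE.
import time

def read_controller_outputs(plc):
    """Collect the output coils of the simulated controller (assumed layout)."""
    return {"translate": plc.get("Q0.0", False),
            "rotate":    plc.get("Q0.1", False),
            "head_on":   plc.get("Q0.2", False)}

def drive_visualization(scene, outputs):
    """Map controller outputs onto the drive systems of the 3D model."""
    scene["translation_drive"] = outputs["translate"]
    scene["rotary_drive"] = outputs["rotate"]
    scene["crushing_head"] = outputs["head_on"]

if __name__ == "__main__":
    plc = {"Q0.0": True, "Q0.1": False, "Q0.2": True}  # fake controller state
    scene = {}
    for _ in range(3):                 # a few polling cycles
        drive_visualization(scene, read_controller_outputs(plc))
        print(scene)
        time.sleep(0.1)
```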

  13. Virtual reality in laparoscopic surgery.

    PubMed

    Uranüs, Selman; Yanik, Mustafa; Bretthauer, Georg

    2004-01-01

    Although the many advantages of laparoscopic surgery have made it an established technique, training in laparoscopic surgery posed problems not encountered in conventional surgical training. Virtual reality simulators open up new perspectives for training in laparoscopic surgery. Under realistic conditions in real time, trainees can tailor their sessions with the VR simulator to suit their needs and goals, and can repeat exercises as often as they wish. VR simulators reduce the number of experimental animals needed for training purposes and are suited to the pursuit of research in laparoscopic surgery.

  14. Transforming Clinical Imaging Data for Virtual Reality Learning Objects

    ERIC Educational Resources Information Center

    Trelease, Robert B.; Rosset, Antoine

    2008-01-01

    Advances in anatomical informatics, three-dimensional (3D) modeling, and virtual reality (VR) methods have made computer-based structural visualization a practical tool for education. In this article, the authors describe streamlined methods for producing VR "learning objects," standardized interactive software modules for anatomical sciences…

  15. PC-Based Virtual Reality for CAD Model Viewing

    ERIC Educational Resources Information Center

    Seth, Abhishek; Smith, Shana S.-F.

    2004-01-01

    Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…

  16. New Desktop Virtual Reality Technology in Technical Education

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2008-01-01

    Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…

  17. 3D interactive augmented reality-enhanced digital learning systems for mobile devices

    NASA Astrophysics Data System (ADS)

    Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie

    2013-03-01

With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving an enhanced user experience (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. The UX of the real-world environment seen through the display can be effectively enhanced by adopting interactive AR technology, and such enhancement can be beneficial for digital learning systems. Existing research has applied AR to the design of e-learning systems; however, none of these works focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, a 3-D interactive augmented reality-enhanced learning (IARL) system is proposed to provide an enhanced UX for digital learning. The proposed IARL system consists of two major components: markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. A realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. The UX of digital learning can be greatly improved with the adoption of the proposed IARL system.
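
    The abstract names a velocity-based object tracking (VOT) component but gives no algorithmic detail, so the Python sketch below only illustrates the general idea of velocity-based tracking with a constant-velocity predictor; it is not the authors' algorithm, and all values are invented.

```python
# Illustration of velocity-based tracking: the next search position is
# predicted from the most recently estimated velocity (constant-velocity model).
import numpy as np

class VelocityTracker:
    def __init__(self, initial_xy):
        self.position = np.asarray(initial_xy, dtype=float)
        self.velocity = np.zeros(2)

    def predict(self, dt=1.0):
        """Constant-velocity prediction of the next 2D position."""
        return self.position + self.velocity * dt

    def update(self, measured_xy, dt=1.0):
        """Refresh position and velocity from a new detection."""
        measured = np.asarray(measured_xy, dtype=float)
        self.velocity = (measured - self.position) / dt
        self.position = measured

if __name__ == "__main__":
    tracker = VelocityTracker([100, 50])
    for obs in ([104, 52], [108, 54], [113, 57]):   # fake detections
        print("predicted:", tracker.predict(), "observed:", obs)
        tracker.update(obs)
```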

  18. Visualizing Compound Rotations with Virtual Reality

    ERIC Educational Resources Information Center

    Flanders, Megan; Kavanagh, Richard C.

    2013-01-01

    Mental rotations are among the most difficult of all spatial tasks to perform, and even those with high levels of spatial ability can struggle to visualize the result of compound rotations. This pilot study investigates the use of the virtual reality-based Rotation Tool, created using the Virtual Reality Modeling Language (VRML) together with…

  19. Sweaty Palms! Virtual Reality Applied to Training.

    ERIC Educational Resources Information Center

    Treiber, Karin

    A qualitative case study approach was used to identify the psychosocial effects of the high-fidelity, virtual reality simulation provided in the college-level air traffic control (ATC) training program offered at the Minnesota Air Traffic Control Training Center and to evaluate the applicability of virtual reality to academic/training situations.…

  20. The SEE Experience: Edutainment in 3D Virtual Worlds.

    ERIC Educational Resources Information Center

    Di Blas, Nicoletta; Paolini, Paolo; Hazan, Susan

    Shared virtual worlds are innovative applications where several users, represented by Avatars, simultaneously access via Internet a 3D space. Users cooperate through interaction with the environment and with each other, manipulating objects and chatting as they go. Apart from in the well documented online action games industry, now often played…

  1. Web-based Three-dimensional Virtual Body Structures: W3D-VBS

    PubMed Central

    Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex

    2002-01-01

    Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user’s progress through evaluation tools helps customize lesson plans. A self-guided “virtual tour” of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it. PMID:12223495

  2. Cognitive Aspects of Collaboration in 3d Virtual Environments

    NASA Astrophysics Data System (ADS)

    Juřík, V.; Herman, L.; Kubíček, P.; Stachoň, Z.; Šašinka, Č.

    2016-06-01

Human-computer interaction has entered the 3D era. Maps, the most important models for representing spatial information, are being transferred into 3D versions suited to the specific content to be displayed. Virtual worlds (VWs) have become a promising area of interest because content can be modified dynamically and users can cooperate on tasks regardless of physical presence. They can be used for sharing and elaborating information via virtual images or avatars. The attractiveness of VWs is further increased by the possibility of measuring operators' actions and complex strategies. Collaboration in 3D environments is a crucial issue in many areas where visualizations are important for group cooperation. Within a specific 3D user interface, operators' ability to manipulate the displayed content is explored with respect to phenomena such as situation awareness, cognitive workload, and human error. For this purpose, VWs offer a great number of tools for measuring operators' responses, such as recording virtual movement or spots of interest in the visual field. The study focuses on the methodological issues of measuring the usability of 3D VWs and comparing them with the existing principles of 2D maps. We explore operators' strategies for reaching and interpreting information with respect to the specific type of visualization and different levels of immersion.

  3. Measuring Knowledge Acquisition in 3D Virtual Learning Environments.

    PubMed

    Nunes, Eunice P dos Santos; Roque, Licínio G; Nunes, Fatima de Lourdes dos Santos

    2016-01-01

    Virtual environments can contribute to the effective learning of various subjects for people of all ages. Consequently, they assist in reducing the cost of maintaining physical structures of teaching, such as laboratories and classrooms. However, the measurement of how learners acquire knowledge in such environments is still incipient in the literature. This article presents a method to evaluate the knowledge acquisition in 3D virtual learning environments (3D VLEs) by using the learner's interactions in the VLE. Three experiments were conducted that demonstrate the viability of using this method and its computational implementation. The results suggest that it is possible to automatically assess learning in predetermined contexts and that some types of user interactions in 3D VLEs are correlated with the user's learning differential.
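
    As a toy illustration of the correlation the authors report between interaction types and the learning differential, the Python snippet below computes a Pearson correlation between invented interaction counts and invented pre/post-test differences; none of these numbers come from the study.

```python
# Toy analysis in the spirit of the abstract: correlate counts of a given
# interaction type in a 3D VLE with the learning differential (post - pre).
# All data below are invented for illustration only.
from scipy.stats import pearsonr

object_inspections = [3, 7, 12, 5, 9, 14, 2, 8]                  # per learner
learning_differential = [0.1, 0.3, 0.6, 0.2, 0.4, 0.7, 0.0, 0.35]

r, p = pearsonr(object_inspections, learning_differential)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```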

  4. An Onboard ISS Virtual Reality Trainer

    NASA Technical Reports Server (NTRS)

    Miralles, Evelyn

    2013-01-01

Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the Station to perform these specific repairs. With the retirement of the shuttle, this is no longer an available option. As such, the need for ISS crew members to review scenarios while in flight, either for tasks they already trained for on the ground or for contingency operations, has become a very critical issue. NASA astronauts prepare for Extra-Vehicular Activities (EVA), or spacewalks, through numerous training media, such as self-study, part-task training, underwater training in the Neutral Buoyancy Laboratory (NBL), hands-on hardware reviews, and training at the Virtual Reality Laboratory (VRLab). In many situations, the time between the last training session and an EVA task might be 6 to 8 months. EVA tasks are critical for a mission, and as time passes the crew members may lose proficiency on previously trained tasks, while their options to refresh or learn a new skill while in flight are limited to reading training materials and watching videos. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the Station ages. In order to help ISS crew members maintain EVA proficiency or train for contingency repairs during their mission, the Johnson Space Center's VRLab designed an immersive ISS Virtual Reality Trainer (VRT). The VRT incorporates a unique optical system that makes use of the already successful Dynamic On-board Ubiquitous Graphics (DOUG) software to assist crew members with procedure reviews and contingency EVAs while on board the Station. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not been trained and potentially have never seen before. The Virtual Reality Trainer (VRT

  5. Virtual image display as a backlight for 3D.

    PubMed

    Travis, Adrian; MacCrann, Niall; Emerton, Neil; Kollin, Joel; Georgiou, Andreas; Lanier, Jaron; Bathiche, Stephen

    2013-07-29

    We describe a device which has the potential to be used both as a virtual image display and as a backlight. The pupil of the emitted light fills the device approximately to its periphery and the collimated emission can be scanned both horizontally and vertically in the manner needed to illuminate an eye in any position. The aim is to reduce the power needed to illuminate a liquid crystal panel but also to enable a smooth transition from 3D to a virtual image as the user nears the screen.

  6. Virtual reality: an intuitive approach to robotics

    NASA Astrophysics Data System (ADS)

    Natonek, Emerico; Flueckiger, Lorenzo; Zimmerman, Thierry; Baur, Charles

    1995-12-01

Task definition for manipulators or robotic systems (conventional or mobile) usually suffers from poor performance and is sometimes impossible to accomplish. 'On-line' programming methods are often time consuming or risky for the human operator or the robot itself, while 'off-line' techniques are tedious and complex. In a virtual reality robotics environment (VRRE), users are not asked to write down complicated functions to specify robotic tasks. However, a VRRE is only effective if all the environment changes and object movements are fed back to the virtual manipulating system, so some kind of visual or multi-sensor feedback is needed. This paper describes a semi-autonomous robot system composed of an industrial 5-axis robot and its virtual equivalent. The user is immersed in a 3-D space built from models of the robot's environment and directly interacts with the virtual 'components' in an intuitive way, creating trajectories and tasks and dynamically optimizing them. A vision system recognizes the position and orientation of the objects in the real robot workspace and updates the VRRE through a bi-directional communication link. Once the tasks have been optimized in the VRRE, they are sent to the real robot, and a semi-autonomous process ensures their correct execution thanks to a camera mounted directly on the robot's end effector. Therefore, errors and drifts due to transmission delays can be processed locally and successfully avoided. The system can execute the tasks autonomously, independently of small environmental changes due to transmission delays. If the environmental changes are too large, the robot stops, re-actualizes the VRRE with the new environmental configuration, and waits for task redesign.

  7. Virtual reality training improves balance function

    PubMed Central

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-01-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function. PMID:25368651

  8. Virtual reality training improves balance function.

    PubMed

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-09-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function.

  9. Research of the Remote Experiment System Based on Virtual Reality

    NASA Astrophysics Data System (ADS)

    Lei, Liangyu; Liu, Jianjun; Yang, Xiufang

Remote education based on Virtual Reality technology is one of the leading directions of development in modern education. This paper analyzes the current state of research on applying VR technology to remote experiments and summarizes its characteristics. A remote experiment system is then designed, and the learning mode of the 3-D virtual experiment, the Internet-based virtual experiment model, and the functional modules of the virtual experiment system are studied. The network-based remote virtual experiment system is built with the programming languages VRML and JavaScript. Furthermore, a remote experiment system for the fatigue test of a drive axle is developed, and some key problems in the remote virtual experiment are resolved.

  10. STS-133 Crew Trains in Virtual Reality

    NASA Video Gallery

    In this episode of NASA "Behind the Scenes," STS-133 Pilot Eric Boe and space station Flight Director Royce Renfrew discuss how the virtual reality laboratory at the Johnson Space Center is helping...

  11. Using Virtual Reality For Outreach Purposes in Planetology

    NASA Astrophysics Data System (ADS)

    Civet, François; Le Mouélic, Stéphane; Le Menn, Erwan; Beaunay, Stéphanie

    2016-10-01

2016 was marked by a technological breakthrough: the availability, for the first time to the general public, of technologically mature virtual reality devices. Virtual reality consists in visually immersing a user in a 3D environment reproduced from real and/or imaginary data, with the possibility to move and eventually interact with the different elements. In planetology, most places will remain inaccessible to the public for a while, but a fleet of dedicated spacecraft such as orbiters, landers, and rovers makes it possible to reconstruct these environments virtually, using image processing, cartography, and photogrammetry. Virtual reality can then bridge the gap to virtually "send" any user to the place and let them enjoy the exploration. We are investigating several types of devices to render orbital or ground-based data of planetological interest, mostly from Mars. The simplest system consists of a "cardboard" headset in which the user's cellphone serves as the screen. A more comfortable experience is obtained with more complex systems such as the HTC Vive or Oculus Rift headsets, which include a tracking system that is important for minimizing motion sickness. The third environment that we have developed is based on the CAVE concept, where four 3D video projectors project onto three 2x3 m walls plus the ground. These systems can be used for scientific data analysis, but they also prove to be perfectly suited for outreach and education purposes.

  12. [Virtual reality therapy in anxiety disorders].

    PubMed

    Mitrousia, V; Giotakos, O

    2016-01-01

During the last decade a number of studies have been conducted to examine whether virtual reality exposure therapy can be an alternative form of therapy for the treatment of mental disorders, particularly anxiety disorders. Imaginal exposure therapy, one of the components of Cognitive Behavioral Therapy, cannot easily be applied to all patients, and in such cases virtual reality can be used as an alternative or supportive psychotherapeutic technique. Most studies using virtual reality have focused on anxiety disorders, mainly specific phobias, but some extend to other disorders such as eating disorders, drug dependence, pain control, palliative care, and rehabilitation. The main characteristics of virtual reality therapy are "interaction", "immersion", and "presence". High levels of "immersion" and "presence" are associated with an increased response to exposure therapy in virtual environments, as well as better therapeutic outcomes and sustained therapeutic gains. Typical devices used to achieve the patient's immersion are head-mounted displays (HMD), which are for individual use only, and the cave automatic virtual environment (CAVE), which is a multiuser system. The disadvantages of virtual reality therapy lie in the difficulties that arise from the specialized technology skills demanded, the cost of the devices, and side effects. Therapists' training is necessary so that they can operate the software and hardware and adjust them to each case's needs. The cost of the devices is high, but it decreases constantly as technology improves. Immersion during virtual reality therapy can induce mild and temporary side effects such as nausea, dizziness, or headache. Until today, however, experience shows that virtual reality offers several advantages. The patient's avoidance of exposure to phobic stimuli is reduced with virtual reality, since the patient is exposed to them as many times as he

  13. Virtual Reality at the PC Level

    NASA Technical Reports Server (NTRS)

    Dean, John

    1998-01-01

    The main objective of my research has been to incorporate virtual reality at the desktop level; i.e., create virtual reality software that can be run fairly inexpensively on standard PC's. The standard language used for virtual reality on PC's is VRML (Virtual Reality Modeling Language). It is a new language so it is still undergoing a lot of changes. VRML 1.0 came out only a couple years ago and VRML 2.0 came out around last September. VRML is an interpreted language that is run by a web browser plug-in. It is fairly flexible in terms of allowing you to create different shapes and animations. Before this summer, I knew very little about virtual reality and I did not know VRML at all. I learned the VRML language by reading two books and experimenting on a PC. The following topics are presented: CAD to VRML, VRML 1.0 to VRML 2.0, VRML authoring tools, VRML browsers, finding virtual reality applications, the AXAF project, the VRML generator program, web communities and future plans.

  14. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional data parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life, and more recently VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.) in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D game engine, coded in C#, JavaScript, and the Unity scripting language. This visualization tool can be used through a standard web browser or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects whose shapes, colors, sizes, and XYZ positions encode various dimensions of the parameter space and can be assigned interactively. Multiple users can navigate through this data space simultaneously, either with their own independent vantage points or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking on a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added. We expect to make this
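
    A minimal, static Python/matplotlib analogue of the encoding scheme described above: three data dimensions map to XYZ position, a fourth to color, and a fifth to marker size. The random "catalog" is invented for illustration and stands in for the interactive Unity-based environment.

```python
# Encode 5 data dimensions into a pseudo-3D scatter plot: XYZ position plus
# color and marker size. Data are random placeholders for a survey catalog.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.random((1000, 5))            # fake catalog with 5 dimensions

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(data[:, 0], data[:, 1], data[:, 2],   # dims 1-3 -> position
           c=data[:, 3],                          # dim 4   -> color
           s=10 + 90 * data[:, 4])                # dim 5   -> size
ax.set_xlabel("dim 1"); ax.set_ylabel("dim 2"); ax.set_zlabel("dim 3")
plt.show()
```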

  15. Brave New (Interactive) Worlds: A Review of the Design Affordances and Constraints of Two 3D Virtual Worlds as Interactive Learning Environments

    ERIC Educational Resources Information Center

    Dickey, Michele D.

    2005-01-01

    Three-dimensional virtual worlds are an emerging medium currently being used in both traditional classrooms and for distance education. Three-dimensional (3D) virtual worlds are a combination of desk-top interactive Virtual Reality within a chat environment. This analysis provides an overview of Active Worlds Educational Universe and Adobe…

  16. Virtual performer: single camera 3D measuring system for interaction in virtual space

    NASA Astrophysics Data System (ADS)

    Sakamoto, Kunio; Taneji, Shoto

    2006-10-01

The authors developed interaction media systems in 3D virtual space. In these systems, a musician virtually plays an instrument such as a theremin in the virtual space, or a performer puts on a show using a virtual character such as a puppet. The interactive virtual media system performs image capture, measurement of the performer's position, detection and recognition of motions, and synthesis of the video image on a personal computer. In this paper, we propose some applications of interaction media systems: a virtual musical instrument and a superimposed CG character. Moreover, the paper describes a method for measuring the positions of the performer, his/her head, and both eyes using a single camera.
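
    The abstract does not detail how the single-camera measurement works, so the Python sketch below is only an illustrative stand-in: it detects the performer's face and eyes with OpenCV's stock Haar cascades and reports their image coordinates. The input file name is hypothetical.

```python
# Stand-in for the single-camera measurement step: locate the performer's head
# and eyes in one frame using OpenCV's bundled Haar cascades.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_performer(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
        # eye coordinates are reported relative to the full frame
        results.append({"head": (x + w // 2, y + h // 2),
                        "eyes": [(x + ex + ew // 2, y + ey + eh // 2)
                                 for (ex, ey, ew, eh) in eyes]})
    return results

if __name__ == "__main__":
    frame = cv2.imread("performer.jpg")      # hypothetical input frame
    print(locate_performer(frame))
```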

  17. D3D augmented reality imaging system: proof of concept in mammography

    PubMed Central

    Douglas, David B; Petricoin, Emanuel F; Liotta, Lance; Wilson, Eugene

    2016-01-01

    Purpose The purpose of this article is to present images from simulated breast microcalcifications and assess the pattern of the microcalcifications with a technical development called “depth 3-dimensional (D3D) augmented reality”. Materials and methods A computer, head display unit, joystick, D3D augmented reality software, and an in-house script of simulated data of breast microcalcifications in a ductal distribution were used. No patient data was used and no statistical analysis was performed. Results The D3D augmented reality system demonstrated stereoscopic depth perception by presenting a unique image to each eye, focal point convergence, head position tracking, 3D cursor, and joystick fly-through. Conclusion The D3D augmented reality imaging system offers image viewing with depth perception and focal point convergence. The D3D augmented reality system should be tested to determine its utility in clinical practice. PMID:27563261

  18. Virtual reality and hallucination: a technoetic perspective

    NASA Astrophysics Data System (ADS)

    Slattery, Diana R.

    2008-02-01

Virtual Reality (VR), especially in a technologically focused discourse, is defined by a class of hardware and software, among them head-mounted displays (HMDs), navigation and pointing devices, and stereoscopic imaging. This presentation examines the experiential aspect of VR. Putting "virtual" in front of "reality" modifies the ontological status of a class of experience: that of "reality." Reality has also been modified [by artists, new media theorists, technologists and philosophers] as augmented, mixed, simulated, artificial, layered, and enhanced. Modifications of reality are closely tied to modifications of perception. Media theorist Roy Ascott creates a model of three "VR's": Verifiable Reality, Virtual Reality, and Vegetal (entheogenically induced) Reality. The ways in which we shift our perceptual assumptions, create and verify illusions, and enter "the willing suspension of disbelief" that allows us entry into imaginal worlds are central to the experience of VR worlds, whether those worlds are explicitly representational (robotic manipulations by VR) or explicitly imaginal (VR artistic creations). The early rhetoric surrounding VR was interwoven with psychedelics, a perception amplified by Timothy Leary's presence on the historic SIGGRAPH panel, and the Wall Street Journal's tag of VR as "electronic LSD." This paper discusses the philosophical, social-historical, and psychological-perceptual connections between these two domains.

  19. Research on 3D virtual campus scene modeling based on 3ds Max and VRML

    NASA Astrophysics Data System (ADS)

    Kang, Chuanli; Zhou, Yanliu; Liang, Xianyue

    2015-12-01

With the rapid development of modern technology, digital information management and virtual reality simulation have become research hotspots. A 3D model of a virtual campus can not only represent real-world objects naturally, realistically, and vividly, but also extend the real campus in time and space and combine the school environment with information. This paper mainly uses 3ds Max to create three-dimensional models of campus buildings, special land uses, and other features. Dynamic interactive functions are then realized by programming the object models exported from 3ds Max with VRML. The research focuses on virtual campus scene modeling technology and VRML scene design, and on optimization strategies for the real-time processing used throughout the scene design process, which preserve texture-map image quality while improving the running speed of image texture mapping. Based on the features and architecture of Guilin University of Technology, 3ds Max, AutoCAD, and VRML were used to model the different objects of the virtual campus. Finally, the resulting virtual campus scene is summarized.

  20. Virtual Reality: A Syllabus for a Course on Virtual Reality and Education.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    This document contains a slightly-revised syllabus for a Virtual Reality course taught in spring 1994. The syllabus begins with an introduction which contains information on the software used in the course and examples of schools that have introduced virtual reality technology in the curriculum. The remainder of the document is composed of the…

  1. Comparing 3D virtual methods for hemimandibular body reconstruction.

    PubMed

    Benazzi, Stefano; Fiorenza, Luca; Kozakowski, Stephanie; Kullmer, Ottmar

    2011-07-01

    Reconstruction of fractured, distorted, or missing parts in human skeleton presents an equal challenge in the fields of paleoanthropology, bioarcheology, forensics, and medicine. This is particularly important within the disciplines such as orthodontics and surgery, when dealing with mandibular defects due to tumors, developmental abnormalities, or trauma. In such cases, proper restorations of both form (for esthetic purposes) and function (restoration of articulation, occlusion, and mastication) are required. Several digital approaches based on three-dimensional (3D) digital modeling, computer-aided design (CAD)/computer-aided manufacturing techniques, and more recently geometric morphometric methods have been used to solve this problem. Nevertheless, comparisons among their outcomes are rarely provided. In this contribution, three methods for hemimandibular body reconstruction have been tested. Two bone defects were virtually simulated in a 3D digital model of a human hemimandible. Accordingly, 3D digital scaffolds were obtained using the mirror copy of the unaffected hemimandible (Method 1), the thin plate spline (TPS) interpolation (Method 2), and the combination between TPS and CAD techniques (Method 3). The mirror copy of the unaffected hemimandible does not provide a suitable solution for bone restoration. The combination between TPS interpolation and CAD techniques (Method 3) produces an almost perfect-fitting 3D digital model that can be used for biocompatible custom-made scaffolds generated by rapid prototyping technologies.
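
    The mirror-copy and thin plate spline steps compared in this abstract can be sketched compactly. The Python snippet below is a rough illustration rather than the authors' workflow: it reflects one hemimandible across an assumed midsagittal plane (in the spirit of Method 1) and warps a template surface onto target landmarks with a thin-plate-spline interpolator (akin to Method 2); all coordinates are random placeholders.

```python
# Sketch of two reconstruction strategies: (1) mirroring across an assumed
# midsagittal plane at x = 0, and (2) a thin-plate-spline warp of a template
# mesh onto target landmarks using scipy's RBF interpolator.
import numpy as np
from scipy.interpolate import RBFInterpolator

def mirror_across_midsagittal(vertices):
    """Method 1 analogue: reflect the unaffected hemimandible across x = 0."""
    mirrored = vertices.copy()
    mirrored[:, 0] *= -1.0
    return mirrored

def tps_warp(template_landmarks, target_landmarks, template_vertices):
    """Method 2 analogue: TPS warp of a template mesh onto target landmarks."""
    tps = RBFInterpolator(template_landmarks, target_landmarks,
                          kernel="thin_plate_spline")
    return tps(template_vertices)

if __name__ == "__main__":
    verts = np.random.rand(200, 3)             # placeholder mesh vertices
    src = np.random.rand(10, 3)                # placeholder landmarks (template)
    dst = src + 0.05 * np.random.rand(10, 3)   # placeholder landmarks (target)
    print(mirror_across_midsagittal(verts).shape, tps_warp(src, dst, verts).shape)
```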

  2. Virtual environment display for a 3D audio room simulation

    NASA Technical Reports Server (NTRS)

    Chapin, William L.; Foster, Scott H.

    1992-01-01

    The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations.

  3. Enhanced LOD Concepts for Virtual 3d City Models

    NASA Astrophysics Data System (ADS)

    Benner, J.; Geiger, A.; Gröger, G.; Häfele, K.-H.; Löwner, M.-O.

    2013-09-01

Virtual 3D city models contain digital three-dimensional representations of city objects like buildings, streets or technical infrastructure. Because the size and complexity of these models continuously grow, a Level of Detail (LoD) concept is indispensable that effectively supports the partitioning of a complete model into alternative models of different complexity and provides metadata addressing the informational content, complexity and quality of each alternative model. After a short overview of various LoD concepts, this paper discusses the existing LoD concept of the CityGML standard for 3D city models and identifies a number of deficits. Based on this analysis, an alternative concept is developed and illustrated with several examples. It differentiates, first, between a Geometric Level of Detail (GLoD) and a Semantic Level of Detail (SLoD), and second, between the interior of the building and its exterior shell. Finally, a possible implementation of the new concept is demonstrated by means of a UML model.

  4. Augmented reality system for oral surgery using 3D auto stereoscopic visualization.

    PubMed

    Tran, Huy Hoang; Suenaga, Hideyuki; Kuwana, Kenta; Masamune, Ken; Dohi, Takeyoshi; Nakajima, Susumu; Liao, Hongen

    2011-01-01

We present an augmented reality system for oral and maxillofacial surgery in this paper. Instead of being displayed on a separate screen, three-dimensional (3D) virtual presentations of osseous structures and soft tissues are projected onto the patient's body, providing surgeons with exact knowledge of the depth of high-risk tissues inside the bone. We employ a 3D integral imaging technique which produces motion parallax in both the horizontal and vertical directions over a wide viewing area. In addition, surgeons are able to check the progress of the operation in real time through an intuitive, content-rich, hardware-accelerated 3D interface. These features prevent surgeons from penetrating into high-risk areas and thus help improve the quality of the operation. Operational tasks such as hole drilling and screw fixation were performed using our system and showed an overall positional error of less than 1 mm. The feasibility of our system was also verified in an experiment with a human volunteer.

  5. Augmented Reality Imaging System: 3D Viewing of a Breast Cancer

    PubMed Central

    Douglas, David B.; Boone, John M.; Petricoin, Emanuel; Liotta, Lance; Wilson, Eugene

    2016-01-01

    Objective To display images of breast cancer from a dedicated breast CT using Depth 3-Dimensional (D3D) augmented reality. Methods A case of breast cancer imaged using contrast-enhanced breast CT (Computed Tomography) was viewed with the augmented reality imaging, which uses a head display unit (HDU) and joystick control interface. Results The augmented reality system demonstrated 3D viewing of the breast mass with head position tracking, stereoscopic depth perception, focal point convergence and the use of a 3D cursor and joy-stick enabled fly through with visualization of the spiculations extending from the breast cancer. Conclusion The augmented reality system provided 3D visualization of the breast cancer with depth perception and visualization of the mass's spiculations. The augmented reality system should be further researched to determine the utility in clinical practice. PMID:27774517

  6. Mobile Virtual Reality : A Solution for Big Data Visualization

    NASA Astrophysics Data System (ADS)

    Marshall, E.; Seichter, N. D.; D'sa, A.; Werner, L. A.; Yuen, D. A.

    2015-12-01

Pursuits in the geological sciences and other branches of quantitative science often require data visualization frameworks that are in continual need of improvement and new ideas. Virtual reality is a visualization medium with a large audience, originally designed for gaming purposes; it can also be delivered in CAVE-like environments, but these are unwieldy and expensive to maintain. Recent efforts by major companies such as Facebook have focused on a larger market, and the Oculus Rift is one of the first mobile devices of this kind. The Unity engine makes it possible to convert data files into a mesh of isosurfaces and render them in 3D. A user is immersed inside the virtual reality and is able to move within and around the data using arrow keys and other steering devices, similar to those employed with an Xbox. With the introduction of products like the Oculus Rift and HoloLens, combined with ever-increasing mobile computing strength, mobile virtual reality data visualization can be implemented for better analysis of 3D geological and mineralogical data sets. As more products like the Surface Pro 4 and other high-powered yet very mobile computers are introduced to the market, the RAM and graphics-card capacity necessary to run these models becomes more widely available, opening doors to this new reality. The computing requirements needed to run these models are a mere 8 GB of RAM and 2 GHz of CPU speed, which many mobile computers are starting to exceed. Using the Unity 3D software to create a virtual environment containing a visual representation of the data, any data set converted into FBX or OBJ format can be traversed while wearing the Oculus Rift device. This new method of analysis, in conjunction with 3D scanning, has potential applications in many fields, including the analysis of precious stones or jewelry. Using hologram technology to capture in high resolution the 3D shape, color, and imperfections of minerals and stones, detailed review and

  7. Development of a Virtual Museum Including a 4d Presentation of Building History in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Tschirschwitz, F.; Deggim, S.

    2017-02-01

    In the last two decades the definition of the term "virtual museum" changed due to rapid technological developments. Using today's available 3D technologies a virtual museum is no longer just a presentation of collections on the Internet or a virtual tour of an exhibition using panoramic photography. On one hand, a virtual museum should enhance a museum visitor's experience by providing access to additional materials for review and knowledge deepening either before or after the real visit. On the other hand, a virtual museum should also be used as teaching material in the context of museum education. The laboratory for Photogrammetry & Laser Scanning of the HafenCity University Hamburg has developed a virtual museum (VM) of the museum "Alt-Segeberger Bürgerhaus", a historic town house. The VM offers two options for visitors wishing to explore the museum without travelling to the city of Bad Segeberg, Schleswig-Holstein, Germany. Option a, an interactive computer-based, tour for visitors to explore the exhibition and to collect information of interest or option b, to immerse into virtual reality in 3D with the HTC Vive Virtual Reality System.

  8. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data ranges. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  9. Special Section: New Ways to Detect Colon Cancer 3-D virtual screening now being used

    MedlinePlus

    ... New Ways to Detect Colon Cancer 3-D virtual screening now being used Past Issues / Spring 2009 ... showcases a 3-D image generated by the virtual colonoscopy software he invented with a team of ...

  10. Learning Rationales and Virtual Reality Technology in Education.

    ERIC Educational Resources Information Center

    Chiou, Guey-Fa

    1995-01-01

    Defines and describes virtual reality technology and differentiates between virtual learning environment, learning material, and learning tools. Links learning rationales to virtual reality technology to pave conceptual foundations for application of virtual reality technology education. Constructivism, case-based learning, problem-based learning,…

  11. The virtues of virtual reality in exposure therapy.

    PubMed

    Gega, Lina

    2017-04-01

    Virtual reality can be more effective and less burdensome than real-life exposure. Optimal virtual reality delivery should incorporate in situ direct dialogues with a therapist, discourage safety behaviours, allow for a mismatch between virtual and real exposure tasks, and encourage self-directed real-life practice between and beyond virtual reality sessions.

  12. 3D Technology Selection for a Virtual Learning Environment by Blending ISO 9126 Standard and AHP

    ERIC Educational Resources Information Center

    Cetin, Aydin; Guler, Inan

    2011-01-01

    Web3D presents many opportunities for learners in a virtual world or virtual environment over the web. This is a great opportunity for open-distance education institutions to benefit from web3d technologies to create courses with interactive 3d materials. There are many open source and commercial products offering 3d technologies over the web…

  13. Virtual reality in rehabilitation and therapy.

    PubMed

    Matijević, Valentina; Secić, Ana; Masić, Valentina; Sunić, Martina; Kolak, Zeljka; Znika, Mateja

    2013-12-01

    This paper describes virtual reality and some of its potential applications in rehabilitation and therapy. Some aspects of this technology are discussed with respect to different problem areas (sensorimotor impairments, autism, learning difficulties), as well as previous research which investigated changes within some motor and motivation parameters in relation to rehabilitation of children with motor impairments. Emphasis is on the positive effects of virtual reality as a method in which rehabilitation and therapy can be offered and evaluated within a functional, purposeful and motivating context.

  14. From Vesalius to Virtual Reality: How Embodied Cognition Facilitates the Visualization of Anatomy

    ERIC Educational Resources Information Center

    Jang, Susan

    2010-01-01

    This study examines the facilitative effects of embodiment of a complex internal anatomical structure through three-dimensional ("3-D") interactivity in a virtual reality ("VR") program. Since Shepard and Metzler's influential 1971 study, it has been known that 3-D objects (e.g., multiple-armed cube or external body parts) are visually and…

  15. Using Virtual Reality Computer Models to Support Student Understanding of Astronomical Concepts

    ERIC Educational Resources Information Center

    Barnett, Michael; Yamagata-Lynch, Lisa; Keating, Tom; Barab, Sasha A.; Hay, Kenneth E.

    2005-01-01

    The purpose of this study was to examine how 3-dimensional (3-D) models of the Solar System supported student development of conceptual understandings of various astronomical phenomena that required a change in frame of reference. In the course described in this study, students worked in teams to design and construct 3-D virtual reality computer…

  16. ICCE/ICCAI 2000 Full & Short Papers (Virtual Reality in Education).

    ERIC Educational Resources Information Center

    2000

    This document contains the full text of the following full and short papers on virtual reality in education from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A CAL System for Appreciation of 3D Shapes by Surface Development (C3D-SD)" (Stephen C. F. Chan, Andy…

  17. Second Life, a 3-D Animated Virtual World: An Alternative Platform for (Art) Education

    ERIC Educational Resources Information Center

    Han, Hsiao-Cheng

    2011-01-01

    3-D animated virtual worlds are no longer only for gaming. With the advance of technology, animated virtual worlds not only are found on every computer, but also connect users with the internet. Today, virtual worlds are created not only by companies, but also through the collaboration of users. Online 3-D animated virtual worlds provide a new…

  18. Virtual 3D tumor marking-exact intraoperative coordinate mapping improve post-operative radiotherapy

    PubMed Central

    2011-01-01

    The quality of the interdisciplinary interface in oncological treatment between surgery, pathology and radiotherapy is mainly dependent on reliable anatomical three-dimensional (3D) allocation of specimen and their context sensitive interpretation which defines further treatment protocols. Computer-assisted preoperative planning (CAPP) allows for outlining macroscopical tumor size and margins. A new technique facilitates the 3D virtual marking and mapping of frozen sections and resection margins or important surgical intraoperative information. These data could be stored in DICOM format (Digital Imaging and Communication in Medicine) in terms of augmented reality and transferred to communicate patient's specific tumor information (invasion to vessels and nerves, non-resectable tumor) to oncologists, radiotherapists and pathologists. PMID:22087558

  19. Applied virtual reality at the Research Triangle Institute

    NASA Technical Reports Server (NTRS)

    Montoya, R. Jorge

    1994-01-01

Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric databases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object database is created using data captured by hand or electronically. The object's realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the database using software tools which also support interactions with and immersivity in the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and the maintenance training simulator for the National Guard.

  20. Importance of Virtual Reality to a Controlled Stimulus

    DTIC Science & Technology

    2013-10-01

Award Number: W81XWH-08-1-0755. Title: Importance of Virtual Reality to a Controlled Stimulus. The Virtual Reality (VR) simulator used in Virtual Reality Exposure Therapy (VRET) is the active component when using the technique to treat combat

  1. Virtual Reality Training Environments: Contexts and Concerns.

    ERIC Educational Resources Information Center

    Harmon, Stephen W.; Kenney, Patrick J.

    1994-01-01

    Discusses the contexts where virtual reality (VR) training environments might be appropriate; examines the advantages and disadvantages of VR as a training technology; and presents a case study of a VR training environment used at the NASA Johnson Space Center in preparation for the repair of the Hubble Space Telescope. (AEF)

  2. Are Learning Styles Relevant to Virtual Reality?

    ERIC Educational Resources Information Center

    Chen, Chwen Jen; Toh, Seong Chong; Ismail, Wan Mohd Fauzy Wan

    2005-01-01

    This study aims to investigate the effects of a virtual reality (VR)-based learning environment on learners with different learning styles. The findings of the aptitude-by-treatment interaction study have shown that learners benefit most from the VR (guided exploration) mode, irrespective of their learning styles. This shows that the VR-based…

  3. Virtual Reality: Visualization in Three Dimensions.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    Virtual reality is a newly emerging tool for scientific visualization that makes possible multisensory, three-dimensional modeling of scientific data. While the emphasis is on visualization, the other senses are added to enhance what the scientist can visualize. Researchers are working to extend the sensory range of what can be perceived in…

  4. Evaluation of Virtual Reality Training Using Affect

    ERIC Educational Resources Information Center

    Tichon, Jennifer

    2012-01-01

    Training designed to support and strengthen higher-order mental abilities now often involves immersion in Virtual Reality (VR) where dangerous real world scenarios can be safely replicated. However, despite the growing popularity of VR to train cognitive skills such as decision-making and situation awareness, methods for evaluating their use rely…

  5. Applications of Virtual Reality to Nuclear Safeguards

    SciTech Connect

    Stansfield, S.

    1998-11-03

    This paper explores two potential applications of Virtual Reality (VR) to international nuclear safeguards: training and information organization and navigation. The applications are represented by two existing prototype systems, one for training nuclear weapons dismantlement and one utilizing a VR model to facilitate intuitive access to related sets of information.

  6. Surgery, virtual reality, and the future.

    PubMed

    Vosburgh, Kirby G; Golby, Alexandra; Pieper, Steven D

    2013-01-01

    MMVR has provided the leading forum for the multidisciplinary interaction and development of the use of Virtual Reality (VR) techniques in medicine, particularly in surgical practice. Here we look back at the foundations of our field, focusing on the use of VR in Surgery and similar interventional procedures, sum up the current status, and describe the challenges and opportunities going forward.

  7. Physics and 3D in Flash Simulations: Open Source Reality

    NASA Astrophysics Data System (ADS)

    Harold, J. B.; Dusenbery, P.

    2009-12-01

    Over the last decade our ability to deliver simulations over the web has steadily advanced. The improvements in speed of the Adobe Flash engine, and the development of open source tools to expand it, allow us to deliver increasingly sophisticated simulation based games through the browser, with no additional downloads required. In this paper we will present activities we are developing as part of two asteroids education projects: Finding NEO (funded through NSF and NASA SMD), and Asteroids! (funded through NSF). The first activity is Rubble!, an asteroids deflection game built on the open source Box2D physics engine. This game challenges players to push asteroids into safe orbits before they crash into the Earth. The Box2D engine allows us to go well beyond simple 2-body orbital calculations and incorporate “rubble piles”. These objects, which are representative of many asteroids, are composed of 50 or more individual rocks which gravitationally bind and separate in realistic ways. Even bombs can be modeled with sufficient physical accuracy to convince players of the hazards of trying to “blow up” incoming asteroids. The ability to easily build games based on underlying physical models allows us to address physical misconceptions in a natural way: by having the player operate in a world that directly collides with those misconceptions. Rubble! provides a particularly compelling example of this due to the variety of well documented misconceptions regarding gravity. The second activity is a Light Curve challenge, which uses the open source PaperVision3D tools to analyze 3D asteroid models. The goal of this activity is to introduce the player to the concept of “light curves”, measurements of asteroid brightness over time which are used to calculate the asteroid’s period. These measurements can even be inverted to generate three dimensional models of asteroids that are otherwise too small and distant to directly image. Through the use of the Paper
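
    A worked sketch of the light-curve idea mentioned above: brightness samples over time are turned into a rotation-period estimate. This is an illustrative Python example on synthetic data, not the project's Flash/PaperVision3D code; the factor-of-two correction assumes an elongated asteroid with two brightness maxima per rotation.

      # Sketch: estimating an asteroid's rotation period from a light curve.
      # Synthetic, evenly sampled data; illustrative only.
      import numpy as np

      def estimate_period(t, brightness):
          """Return the rotation period implied by the dominant photometric frequency."""
          b = brightness - brightness.mean()           # remove the constant term
          freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
          power = np.abs(np.fft.rfft(b)) ** 2
          f_peak = freqs[np.argmax(power[1:]) + 1]     # skip the zero-frequency bin
          # An elongated asteroid shows two brightness maxima per rotation,
          # so the rotation period is twice the photometric period.
          return 2.0 / f_peak

      # Synthetic light curve: a 6-hour rotation sampled every 0.1 h for 48 h.
      t = np.arange(0.0, 48.0, 0.1)
      true_period = 6.0
      brightness = 1.0 + 0.3 * np.cos(2 * np.pi * (2.0 / true_period) * t)
      print(estimate_period(t, brightness))            # ~6.0 hours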

  8. Integrated Data Visualization and Virtual Reality Tool

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  9. Controlling social stress in virtual reality environments.

    PubMed

    Hartanto, Dwi; Kampmann, Isabel L; Morina, Nexhmedin; Emmelkamp, Paul G M; Neerincx, Mark A; Brinkman, Willem-Paul

    2014-01-01

    Virtual reality exposure therapy has been proposed as a viable alternative in the treatment of anxiety disorders, including social anxiety disorder. Therapists could benefit from extensive control of anxiety eliciting stimuli during virtual exposure. Two stimuli controls are studied in this study: the social dialogue situation, and the dialogue feedback responses (negative or positive) between a human and a virtual character. In the first study, 16 participants were exposed in three virtual reality scenarios: a neutral virtual world, blind date scenario, and job interview scenario. Results showed a significant difference between the three virtual scenarios in the level of self-reported anxiety and heart rate. In the second study, 24 participants were exposed to a job interview scenario in a virtual environment where the ratio between negative and positive dialogue feedback responses of a virtual character was systematically varied on-the-fly. Results yielded that within a dialogue the more positive dialogue feedback resulted in less self-reported anxiety, lower heart rate, and longer answers, while more negative dialogue feedback of the virtual character resulted in the opposite. The correlations between on the one hand the dialogue stressor ratio and on the other hand the means of SUD score, heart rate and audio length in the eight dialogue conditions showed a strong relationship: r(6) = 0.91, p = 0.002; r(6) = 0.76, p = 0.028 and r(6) = -0.94, p = 0.001 respectively. Furthermore, more anticipatory anxiety reported before exposure was found to coincide with more self-reported anxiety, and shorter answers during the virtual exposure. These results demonstrate that social dialogues in a virtual environment can be effectively manipulated for therapeutic purposes.
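
    The reported correlations of the form r(6) = … are plain Pearson correlations across the eight dialogue conditions (degrees of freedom n − 2 = 6). A minimal sketch, using invented numbers rather than the study's data:

      # Sketch: Pearson correlation between the dialogue stressor ratio and the
      # mean SUD score over eight dialogue conditions. Values are illustrative.
      from scipy.stats import pearsonr

      stressor_ratio = [0.0, 0.14, 0.29, 0.43, 0.57, 0.71, 0.86, 1.0]  # share of negative feedback
      mean_sud       = [2.1, 2.4, 3.0, 3.4, 4.1, 4.6, 5.2, 5.8]        # hypothetical mean SUD scores

      r, p = pearsonr(stressor_ratio, mean_sud)
      print(f"r(6) = {r:.2f}, p = {p:.3f}")   # df = n - 2 = 6 for eight conditions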

  10. Controlling Social Stress in Virtual Reality Environments

    PubMed Central

    Hartanto, Dwi; Kampmann, Isabel L.; Morina, Nexhmedin; Emmelkamp, Paul G. M.; Neerincx, Mark A.; Brinkman, Willem-Paul

    2014-01-01

    Virtual reality exposure therapy has been proposed as a viable alternative in the treatment of anxiety disorders, including social anxiety disorder. Therapists could benefit from extensive control of anxiety eliciting stimuli during virtual exposure. Two stimuli controls are studied in this study: the social dialogue situation, and the dialogue feedback responses (negative or positive) between a human and a virtual character. In the first study, 16 participants were exposed in three virtual reality scenarios: a neutral virtual world, blind date scenario, and job interview scenario. Results showed a significant difference between the three virtual scenarios in the level of self-reported anxiety and heart rate. In the second study, 24 participants were exposed to a job interview scenario in a virtual environment where the ratio between negative and positive dialogue feedback responses of a virtual character was systematically varied on-the-fly. Results yielded that within a dialogue the more positive dialogue feedback resulted in less self-reported anxiety, lower heart rate, and longer answers, while more negative dialogue feedback of the virtual character resulted in the opposite. The correlations between on the one hand the dialogue stressor ratio and on the other hand the means of SUD score, heart rate and audio length in the eight dialogue conditions showed a strong relationship: r(6) = 0.91, p = 0.002; r(6) = 0.76, p = 0.028 and r(6) = −0.94, p = 0.001 respectively. Furthermore, more anticipatory anxiety reported before exposure was found to coincide with more self-reported anxiety, and shorter answers during the virtual exposure. These results demonstrate that social dialogues in a virtual environment can be effectively manipulated for therapeutic purposes. PMID:24671006

  11. Virtual Reality Educational Tool for Human Anatomy.

    PubMed

    Izard, Santiago González; Juanes Méndez, Juan A; Palomera, Pablo Ruisoto

    2017-05-01

    Virtual Reality is becoming widespread in our society within very different areas, from industry to entertainment. It has many advantages in education as well, since it allows visualizing almost any object or going anywhere in a unique way. We will be focusing on medical education, and more specifically anatomy, where its use is especially interesting because it allows studying any structure of the human body by placing the user inside each one. By allowing virtual immersion in a body structure such as the interior of the cranium, stereoscopic vision goggles make these innovative teaching technologies a powerful tool for training in all areas of health sciences. The aim of this study is to illustrate the teaching potential of applying Virtual Reality in the field of human anatomy, where it can be used as a tool for education in medicine. A Virtual Reality software application was developed as an educational tool. This technological procedure is based entirely on software which runs in stereoscopic goggles to give users the sensation of being in a virtual environment, clearly showing the different bones and foramina which make up the cranium, accompanied by audio explanations. Throughout the results the structure of the cranium is described in detail from both inside and out. The importance of an exhaustive morphological knowledge of the cranial fossae is further discussed, as is its application to the design of microsurgery.

  12. Web Reference: A Virtual Reality.

    ERIC Educational Resources Information Center

    Foster, Janet

    1999-01-01

    Presents ideas and strategies to enhance digital reference services available via the Internet in public libraries. Describes print publications which include Web reference columns; subject guides, both print and online; and the resources of the Internet Public Library and other virtual reference desks. (LRW)

  13. Transportation planning: A virtual reality

    SciTech Connect

    Bradley, J.; Hefele, J.; Dolin, R.M.

    1994-07-01

    An important factor in the development of any base technology is generating it in such a way that these technologies will continue to be useful through systems upgrades and implementation philosophy metamorphoses. Base technologies of traffic engineering including transportation modeling, traffic impact forecasting, traffic operation management, emergency situation routing and re-routing, and signal systems optimization should all be designed with the future in mind. Advanced Traffic Engineering topics, such as Intelligent Vehicle Highway Systems, are designed with advanced engineering concepts such as rules-based design and artificial intelligence. All aspects of development of base technologies must include Total Quality Engineering as the primary factor in order to succeed. This philosophy for development of base technologies for the County of Los Alamos is being developed leveraging the resources of the Center for Advanced Engineering Technology (CAET) at the Los Alamos National Laboratory. The mission of the CAET is to develop next-generation engineering technology that supports the Los Alamos National Laboratory's mission and to transfer that technology to industry and academia. The CAET's goal is to promote industrial, academic, and government interactions in diverse areas of engineering technology, such as design, analysis, manufacturing, virtual enterprise, robotics, telepresence, rapid prototyping, and virtual environment technology. The Center is expanding, enhancing, and increasing core competencies at the Los Alamos National Laboratory. The CAET has three major thrust areas: development of base technologies, virtual environment technology applications, and educational outreach and training. Virtual environment technology immerses a user in a nonexistent or augmented environment for research or training purposes. Virtual environment technology illustrates the axiom, "The best way to learn is by doing."

  14. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: technical note.

    PubMed

    Abe, Yuichiro; Sato, Shigenobu; Kato, Koji; Hyakumachi, Takahiko; Yanagibashi, Yasushi; Ito, Manabu; Abumi, Kuniyoshi

    2013-10-01

    Augmented reality (AR) is an imaging technology by which virtual objects are overlaid onto images of real objects captured in real time by a tracking camera. This study aimed to introduce a novel AR guidance system called virtual protractor with augmented reality (VIPAR) to visualize a needle trajectory in 3D space during percutaneous vertebroplasty (PVP). The AR system used for this study comprised a head-mount display (HMD) with a tracking camera and a marker sheet. An augmented scene was created by overlaying the preoperatively generated needle trajectory path onto a marker detected on the patient using AR software, thereby providing the surgeon with augmented views in real time through the HMD. The accuracy of the system was evaluated by using a computer-generated simulation model in a spine phantom and also evaluated clinically in 5 patients. In the 40 spine phantom trials, the error of the insertion angle (EIA), defined as the difference between the attempted angle and the insertion angle, was evaluated using 3D CT scanning. Computed tomography analysis of the 40 spine phantom trials showed that the EIA in the axial plane significantly improved when VIPAR was used compared with when it was not used (0.96° ± 0.61° vs 4.34° ± 2.36°, respectively). The same held true for EIA in the sagittal plane (0.61° ± 0.70° vs 2.55° ± 1.93°, respectively). In the clinical evaluation of the AR system, 5 patients with osteoporotic vertebral fractures underwent VIPAR-guided PVP from October 2011 to May 2012. The postoperative EIA was evaluated using CT. The clinical results of the 5 patients showed that the EIA in all 10 needle insertions was 2.09° ± 1.3° in the axial plane and 1.98° ± 1.8° in the sagittal plane. There was no pedicle breach or leakage of polymethylmethacrylate. VIPAR was successfully used to assist in needle insertion during PVP by providing the surgeon with an ideal insertion point and needle trajectory through the HMD. The findings indicate
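
    The accuracy figure used above, the error of insertion angle (EIA), is simply the difference between the planned and achieved needle angles, summarised per plane as mean ± SD across trials. A minimal sketch with made-up angles, not the study's measurements:

      # Sketch: EIA = |attempted (planned) angle - achieved insertion angle|,
      # reported as mean +/- SD. Angles below are illustrative only.
      import numpy as np

      attempted_deg = np.array([15.0, 12.0, 18.0, 20.0, 16.0])   # planned trajectory angles
      achieved_deg  = np.array([15.8, 11.3, 18.9, 19.2, 16.5])   # angles measured on postoperative CT

      eia = np.abs(attempted_deg - achieved_deg)
      print(f"EIA: {eia.mean():.2f} deg +/- {eia.std(ddof=1):.2f} deg")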

  15. Objective and subjective quality assessment of geometry compression of reconstructed 3D humans in a 3D virtual room

    NASA Astrophysics Data System (ADS)

    Mekuria, Rufael; Cesar, Pablo; Doumanis, Ioannis; Frisiello, Antonella

    2015-09-01

    Compression of 3D object based video is relevant for 3D immersive applications. Nevertheless, the perceptual aspects of the degradation introduced by codecs for meshes and point clouds are not well understood. In this paper we evaluate the subjective and objective degradations introduced by such codecs in a state-of-the-art 3D immersive virtual room. In the 3D immersive virtual room, users are captured with multiple cameras, and their surfaces are reconstructed as photorealistic colored/textured 3D meshes or point clouds. To test the perceptual effect of compression and transmission, we render degraded versions with different frame rates in different contexts (near/far) in the scene. A quantitative subjective study with 16 users shows that negligible distortion of decoded surfaces compared to the original reconstructions can be achieved in the 3D virtual room. In addition, a qualitative task-based analysis in a full prototype field trial shows increased presence, emotion, user and state recognition of the reconstructed 3D human representation compared to animated computer avatars.
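
    Objective geometry quality for decoded point clouds is commonly reported as a point-to-point (D1) error converted to a PSNR against the bounding-box diagonal. The sketch below shows that generic metric as an illustration; it is an assumption, not necessarily the exact objective metric used in the paper.

      # Sketch: symmetric point-to-point geometric distortion, expressed as PSNR.
      import numpy as np
      from scipy.spatial import cKDTree

      def geometry_psnr(original, decoded):
          """Symmetric point-to-point RMS error between two point clouds, in dB."""
          def rms(a, b):
              d, _ = cKDTree(b).query(a)        # nearest-neighbour distances a -> b
              return np.sqrt(np.mean(d ** 2))
          e = max(rms(original, decoded), rms(decoded, original))
          peak = np.linalg.norm(original.max(axis=0) - original.min(axis=0))  # bbox diagonal
          return 20.0 * np.log10(peak / e)

      original = np.random.rand(5000, 3)
      decoded = original + np.random.normal(scale=1e-3, size=original.shape)  # simulated coding noise
      print(f"{geometry_psnr(original, decoded):.1f} dB")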

  16. Participatory Gis: Experimentations for a 3d Social Virtual Globe

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Zamboni, G.

    2013-08-01

    The dawn of GeoWeb 2.0, the geographic extension of Web 2.0, has opened new possibilities in terms of online dissemination and sharing of geospatial contents, thus laying the foundations for a fruitful development of Participatory GIS (PGIS). The purpose of the study is to investigate the extension of PGIS applications, which are quite mature in the traditional bi-dimensional framework, up to the third dimension. More in detail, the system should couple a powerful 3D visualization with an increase of public participation by means of a tool allowing data collecting from mobile devices (e.g. smartphones and tablets). The PGIS application, built using the open source NASA World Wind virtual globe, is focussed on the cultural and tourism heritage of Como city, located in Northern Italy. An authentication mechanism was implemented, which allows users to create and manage customized projects through cartographic mash-ups of Web Map Service (WMS) layers. Saved projects populate a catalogue which is available to the entire community. Together with historical maps and the current cartography of the city, the system is also able to manage geo-tagged multimedia data, which come from user field-surveys performed through mobile devices and report POIs (Points Of Interest). Each logged user can then contribute to POIs characterization by adding textual and multimedia information (e.g. images, audios and videos) directly on the globe. All in all, the resulting application allows users to create and share contributions as it usually happens on social platforms, additionally providing a realistic 3D representation enhancing the expressive power of data.
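
    The cartographic mash-ups described above are assembled from standard WMS GetMap requests layered onto the virtual globe. A minimal sketch of how such a request is formed; the endpoint and layer name are placeholders, not the project's actual services.

      # Sketch: building a WMS 1.3.0 GetMap URL for a map layer to overlay on a globe.
      from urllib.parse import urlencode

      def getmap_url(base_url, layer, bbox, width=512, height=512):
          params = {
              "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
              "LAYERS": layer, "STYLES": "",
              "CRS": "EPSG:4326",
              "BBOX": ",".join(str(v) for v in bbox),   # minLat,minLon,maxLat,maxLon for EPSG:4326 in WMS 1.3.0
              "WIDTH": width, "HEIGHT": height,
              "FORMAT": "image/png", "TRANSPARENT": "TRUE",
          }
          return f"{base_url}?{urlencode(params)}"

      # Hypothetical endpoint and layer covering the Como area:
      print(getmap_url("https://example.org/geoserver/wms", "como:historical_map",
                       (45.79, 9.06, 45.83, 9.11)))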

  17. VSViewer3D: a tool for interactive data mining of three-dimensional virtual screening data.

    PubMed

    Diller, Kyle I; Diller, David J

    2014-12-22

    The VSviewer3D is a simple Java tool for visual exploration of three-dimensional (3D) virtual screening data. The VSviewer3D brings together the ability to explore numerical data, such as calculated properties and virtual screening scores, structure depiction, interactive topological and 3D similarity searching, and 3D visualization. By doing so the user is better able to quickly identify outliers, assess tractability of large numbers of compounds, visualize hits of interest, annotate hits, and mix and match interesting scaffolds. We demonstrate the utility of the VSviewer3D by describing a use case in a docking based virtual screen.
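
    The interactive topological similarity searching mentioned above typically reduces to fingerprint comparison with a Tanimoto coefficient. A minimal sketch using RDKit, which is an assumed toolkit for illustration only; the tool itself is written in Java and its cheminformatics backend is not stated.

      # Sketch: Tanimoto similarity of Morgan fingerprints against a tiny toy library.
      from rdkit import Chem, DataStructs
      from rdkit.Chem import AllChem

      query = Chem.MolFromSmiles("c1ccccc1C(=O)O")          # benzoic acid as a query
      library = ["c1ccccc1C(=O)N", "CCO", "c1ccc(cc1)O"]    # toy screening set (SMILES)

      fp_q = AllChem.GetMorganFingerprintAsBitVect(query, 2, nBits=2048)
      for smi in library:
          fp = AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smi), 2, nBits=2048)
          print(smi, round(DataStructs.TanimotoSimilarity(fp_q, fp), 2))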

  18. Implementation of virtual models from sheet metal forming simulation into physical 3D colour models using 3D printing

    NASA Astrophysics Data System (ADS)

    Junk, S.

    2016-08-01

    Today the methods of numerical simulation of sheet metal forming offer a great diversity of possibilities for optimization in product development and in process design. However, the results from simulation are only available as virtual models. Because no forming tools are available during the early stages of product development, physical models that could represent the virtual results are lacking. Physical 3D models can be created using 3D printing and serve as an illustration, providing a better understanding of the simulation results. In this way, the results from the simulation can be made more “comprehensible” within a development team. This paper presents the possibilities of 3D colour printing with particular consideration of the requirements regarding the implementation of sheet metal forming simulation. Using concrete examples of sheet metal forming, the manufacturing of 3D colour models is expounded upon on the basis of simulation results.

  19. [3D virtual imaging of the upper airways].

    PubMed

    Ferretti, G; Coulomb, M

    2000-04-01

    The different three-dimensional reconstructions of the upper airways that can be obtained with spiral computed tomography (CT) are presented here. The parameters indispensable for achieving spiral CT images that are as realistic as possible are recalled, together with the advantages and disadvantages of the different techniques. Multislice reconstruction (MSR) produces slices in different planes of space with the high contrast of CT slices. They provide information similar to that obtained for the rare indications for thoracic MRI. Thick-slice reconstructions with maximum intensity projection (MIP) or minimum intensity projection (minIP) give projection views whose contrast can be modified by selecting the more dense (MIP) or less dense (minIP) voxels. They find their application in the exploration of the upper airways. Surface and volume external 3D reconstructions can be obtained. They give an overall view of the upper airways, similar to a bronchogram. Virtual endoscopy reproduces real endoscopic images but cannot provide information on the appearance of the mucosa or biopsy specimens. It offers possible applications for preparing, guiding and controlling interventional fibroscopy procedures.
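
    MIP and minIP, mentioned above, are simple per-pixel reductions over a slab of slices: keep the maximum (or minimum) attenuation value along the projection direction. A minimal sketch on a synthetic volume; a real workflow would load DICOM slices instead.

      # Sketch: maximum- and minimum-intensity projections over a slab of CT slices.
      import numpy as np

      volume = np.random.randint(-1000, 400, size=(40, 256, 256), dtype=np.int16)  # slices x rows x cols (HU)

      slab = volume[10:30]                 # a 20-slice slab
      mip = slab.max(axis=0)               # dense structures (vessels, bone) dominate
      minip = slab.min(axis=0)             # air-filled airways dominate
      print(mip.shape, minip.shape)        # both (256, 256) projection images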

  20. Fully Three-Dimensional Virtual-Reality System

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1994-01-01

    Proposed virtual-reality system presents visual displays to simulate free flight in three-dimensional space. System, virtual space pod, is testbed for control and navigation schemes. Unlike most virtual-reality systems, virtual space pod would not depend for orientation on ground plane, which hinders free flight in three dimensions. Space pod provides comfortable seating, convenient controls, and dynamic virtual-space images for virtual traveler. Controls include buttons plus joysticks with six degrees of freedom.

  1. Virtual reality for dragline planners

    SciTech Connect

    Cobcroft, T.

    2007-03-15

    3d-Dig is an invaluable mine planning and communication tool, developed by Earth Technology Pty Ltd, that makes it possible to easily communicate a mine plan through the use of animations and other graphics. An Australian company has been using it to plan pits and strips in detail for up to five years in advance; a US operator is using it to optimise dragline stripping around inside corners and to accurately plan the traverse of ramps. The new system offers a better prediction of rehandled volumes, linear coal advance and dig time within a strip. It is useful for optimising waste stripping and timing of uncovered coal to enhance blending and shipping reliability. It presents volumetric, spoil placement and positioning data while generating animations that communicate the plan. 5 figs.

  2. The Virtual Tablet: Virtual Reality as a Control System

    NASA Technical Reports Server (NTRS)

    Chronister, Andrew

    2016-01-01

    In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating issues with difficulty of visually determining the interface location and lack of tactile feedback discovered in the development of previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment. My contribution was to combine the pieces of
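
    The core placement step described above, rendering the tracked surrogate at the right spot relative to the user's headset, comes down to composing rigid-body transforms from the motion-capture and HMD poses. A minimal sketch with illustrative matrices; the lab's actual tracking APIs and mathematics are not given in the abstract.

      # Sketch: express the tablet's world pose relative to the HMD by composing
      # 4x4 homogeneous transforms. Poses below are placeholders.
      import numpy as np

      def pose(R, t):
          """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
          T = np.eye(4)
          T[:3, :3] = R
          T[:3, 3] = t
          return T

      world_T_hmd    = pose(np.eye(3), [0.0, 1.6, 0.0])     # HMD pose from the tracker
      world_T_tablet = pose(np.eye(3), [0.3, 1.2, -0.4])    # surrogate pose from mocap markers

      hmd_T_tablet = np.linalg.inv(world_T_hmd) @ world_T_tablet   # tablet as seen from the headset
      print(hmd_T_tablet[:3, 3])   # where to render the virtual tablet relative to the user's view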

  3. Virtual reality in behavioral neuroscience and beyond.

    PubMed

    Tarr, Michael J; Warren, William H

    2002-11-01

    Virtual reality (VR) has finally come of age for serious applications in the behavioral neurosciences. After capturing the public imagination a decade ago, enthusiasm for VR flagged due to hardware limitations, an absent commercial market and manufacturers who dropped the mass-market products that normally drive technological development. Recently, however, improvements in computer speed, quality of head-mounted displays and wide-area tracking systems have made VR attractive for both research and real-world applications in neuroscience, cognitive science and psychology. New and exciting applications for VR have emerged in research, training, rehabilitation, teleoperation, virtual archeology and tele-immersion.

  4. An optical tracking system for virtual reality

    NASA Astrophysics Data System (ADS)

    Hrimech, Hamid; Merienne, Frederic

    2009-03-01

    In this paper we present a low-cost 3D tracking system which we have developed and tested in order to move away from traditional 2D interaction techniques (keyboard and mouse) and to improve the user's experience while using a collaborative virtual environment (CVE). Such a tracking system is used to implement 3D interaction techniques that augment the user experience, promote the user's sense of transportation into the virtual world, and increase users' awareness of their partners. The tracking system is a passive optical tracking system using stereoscopy, a technique allowing the reconstruction of three-dimensional information from a pair of images. We have currently deployed our 3D tracking system on a collaborative research platform for investigating 3D interaction techniques in CVEs.
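
    Stereoscopic reconstruction, as used by the tracker above, recovers depth from the disparity between the two camera views of the same marker: Z = f·B/d for a rectified rig. A minimal sketch with illustrative camera parameters, not the authors' calibration.

      # Sketch: depth from disparity for a rectified stereo pair (pinhole model).
      def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
          """Z = f * B / d, with d the pixel disparity of the same marker."""
          disparity = x_left - x_right
          if disparity <= 0:
              raise ValueError("marker must be in front of both cameras")
          return focal_px * baseline_m / disparity

      # A marker seen at column 412 in the left image and 396 in the right image,
      # with an 800-pixel focal length and a 12 cm camera baseline:
      print(f"{depth_from_disparity(412, 396, focal_px=800, baseline_m=0.12):.2f} m")  # 6.00 m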

  5. Applying Virtual Reality to commercial Edutainment

    NASA Technical Reports Server (NTRS)

    Grissom, F.; Goza, Sharon P.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR), when defined as a computer-generated, immersive, three-dimensional graphics environment which provides varying degrees of interactivity, remains an expensive, highly specialized application, yet to find its way into the school, home, or business. As a novel approach to a theme park-type attraction, though, its use can be justified. This paper describes how a virtual reality 'tour of the human digestive system' was created for the Omniplex Science Museum of Oklahoma City, Oklahoma. The customer's main objectives were: (1) to educate; (2) to entertain; (3) to draw visitors; and (4) to generate revenue. The 'Edutainment' system ultimately delivered met these goals. As more such systems come into existence, the resulting library of licensable programs will greatly reduce development costs to individual institutions.

  6. Feedback from video for virtual reality Navigation

    SciTech Connect

    Tsap, L V

    2000-10-27

    Important preconditions for wide acceptance of virtual reality (VR) systems include their comfort, ease and naturalness of use. Most existing trackers suffer from discomfort-related issues. For example, body-based trackers (hand controllers, joysticks, helmet attachments, etc.) restrict spontaneity and naturalness of motion, while ground-based devices (e.g., hand controllers) limit the workspace by literally binding an operator to the ground. There are similar problems with controls. This paper describes using real-time video with registered depth information (from a commercially available camera) for virtual reality navigation. A camera-based setup can replace cumbersome trackers. The method includes selective depth processing for increased speed, and robust skin-color segmentation to account for illumination variations.
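
    Skin-color segmentation of the kind referred to above is often done by thresholding in a hue-saturation-value color space and cleaning the mask morphologically. The sketch below is a generic version of that idea; OpenCV and the threshold values are assumptions, not the paper's published method.

      # Sketch: HSV skin-colour segmentation with a morphological clean-up.
      import cv2
      import numpy as np

      def skin_mask(frame_bgr):
          hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
          lower = np.array([0, 40, 60], dtype=np.uint8)     # hue, saturation, value lower bounds
          upper = np.array([25, 180, 255], dtype=np.uint8)
          mask = cv2.inRange(hsv, lower, upper)
          return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))  # remove speckle

      frame = np.zeros((480, 640, 3), dtype=np.uint8)       # stand-in for a captured video frame
      print(skin_mask(frame).shape)                          # (480, 640) binary mask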

  7. Virtual reality disaster training: translation to practice.

    PubMed

    Farra, Sharon L; Miller, Elaine T; Hodgson, Eric

    2015-01-01

    Disaster training is crucial to the mitigation of both mortality and morbidity associated with disasters. Just as clinical practice needs to be grounded in evidence, effective disaster education is dependent upon the development and use of andragogic and pedagogic evidence. Educational research findings must be transformed into useable education strategies. Virtual reality simulation is a teaching methodology that has the potential to be a powerful educational tool. The purpose of this article is to translate research findings related to the use of virtual reality simulation in disaster training into education practice. The Ace Star Model serves as a valuable framework to translate the VRS teaching methodology and improve disaster training of healthcare professionals. Using the Ace Star Model as a framework to put evidence into practice, strategies for implementing a virtual reality simulation are addressed. Practice guidelines, implementation recommendations, integration to practice and evaluation are discussed. It is imperative that health educators provide more exemplars of how research evidence can be moved through the various stages of the model to advance practice and sustain learning outcomes.

  8. Sound For Animation And Virtual Reality

    NASA Technical Reports Server (NTRS)

    Hahn, James K.; Docter, Pete; Foster, Scott H.; Mangini, Mark; Myers, Tom; Wenzel, Elizabeth M.; Null, Cynthia (Technical Monitor)

    1995-01-01

    Sound is an integral part of the experience in computer animation and virtual reality. In this course, we will present some of the important technical issues in sound modeling, rendering, and synchronization as well as the "art" and business of sound that are being applied in animations, feature films, and virtual reality. The central theme is to bring leading researchers and practitioners from various disciplines to share their experiences in this interdisciplinary field. The course will give the participants an understanding of the problems and techniques involved in producing and synchronizing sounds, sound effects, dialogue, and music. The problem spans a number of domains including computer animation and virtual reality. Since sound has been an integral part of animations and films much longer than for computer-related domains, we have much to learn from traditional animation and film production. By bringing leading researchers and practitioners from a wide variety of disciplines, the course seeks to give the audience a rich mixture of experiences. It is expected that the audience will be able to apply what they have learned from this course in their research or production.

  9. Development of real-time motion capture system for 3D on-line games linked with virtual character

    NASA Astrophysics Data System (ADS)

    Kim, Jong Hyeong; Ryu, Young Kee; Cho, Hyung Suck

    2004-10-01

    With the development of 3-D virtual reality, motion tracking is becoming an essential part of entertainment, medical, sports, education and industrial applications. Virtual human characters in digital animation and game applications have been controlled by interfacing devices such as mice, joysticks and MIDI sliders. Those devices cannot make a virtual human character move smoothly and naturally. Furthermore, high-end human motion capture systems on the commercial market are expensive and complicated. In this paper, we propose a practical and fast motion capture system consisting of optical sensors, and link its data to a 3-D game character in real time. The prototype experimental setup was successfully applied to a boxing game, which requires very fast movement of the human character.

  10. ETeach3D: Designing a 3D Virtual Environment for Evaluating the Digital Competence of Preservice Teachers

    ERIC Educational Resources Information Center

    Esteve-Mon, Francesc M.; Cela-Ranilla, Jose María; Gisbert-Cervera, Mercè

    2016-01-01

    The acquisition of teacher digital competence is a key aspect in the initial training of teachers. However, most existing evaluation instruments do not provide sufficient evidence of this teaching competence. In this study, we describe the design and development process of a three-dimensional (3D) virtual environment for evaluating the teacher…

  11. Virtual reality simulators and training in laparoscopic surgery.

    PubMed

    Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos

    2015-01-01

    Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, the relevant evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance of advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to differ from the effect of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual-spatial perception and stress-coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research efforts should focus on the effect of virtual reality simulation on performance in advanced surgical procedures, on standardization of training, on the possible synergistic effect of virtual reality simulation training combined with mental training, and on personalized training.

  12. iVirtualWorld: A Domain-Oriented End-User Development Environment for Building 3D Virtual Chemistry Experiments

    ERIC Educational Resources Information Center

    Zhong, Ying

    2013-01-01

    Virtual worlds are well-suited for building virtual laboratories for educational purposes to complement hands-on physical laboratories. However, educators may face technical challenges because developing virtual worlds requires skills in programming and 3D design. Current virtual world building tools are developed for users who have programming…

  13. Selected Applications of Virtual Reality in Manufacturing

    NASA Astrophysics Data System (ADS)

    Novak-Marcincin, Jozef

    2011-01-01

    Virtual reality (VR) has become an important and useful tool in science and engineering. VR applications cover a wide range of industrial areas from product design to analysis, from product prototyping to manufacturing. The design and manufacturing of a product can be viewed, evaluated and improved in a virtual environment before its prototype is made, which is an enormous cost saving. Virtual Manufacturing (VM) is the use of computer models and simulations of manufacturing processes to aid in the design and production of manufactured products. VM is the use of manufacturing-based simulations to optimize the design of product and processes for a specific manufacturing goal such as: design for assembly; quality; lean operations; and/or flexibility.

  14. Virtual reality applications to agoraphobia: a protocol.

    PubMed

    Cárdenas, Georgina; Muñoz, Sandra; González, Maribel; Uribarren, Guillermo

    2006-04-01

    Recently, educators and instructional designers have focused on the development and implementation of virtual learning environments that effectively combine theoretical and applied knowledge to teach university students. One of the thrusts of the Psychology Virtual Teaching Laboratory, in collaboration with the IXTLI observatory, is to develop dissemination programs to promote the insertion of virtual reality (VR) technologies applied to rehabilitation into clinical practice. This paper describes the development of (1) agoraphobia VR learning objects to be used as teaching support tools in class and (2) a multimedia teaching program that incorporates digital video and VR scenarios, addressed to students in the field of mental health. Promotion among professors and students of the use of this technology will allow us to initiate research in our country as well as to validate applications contextualized for our culture, thereby contributing new advances to this field.

  15. Avatars, Affordances, and Interfaces: Virtual Reality Tools for Learning.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    Virtual reality technology offers the promise of interaction with a computer-based environment that engages visual, auditory, and tactile perception. Three interrelated virtual reality design topics are particularly relevant to visual literacy. The first is the concept of avatars. Avatars are agents that appear in a virtual world representing the…

  16. Intelligent virtual reality in the setting of fuzzy sets

    NASA Technical Reports Server (NTRS)

    Dockery, John; Littman, David

    1992-01-01

    The authors have previously introduced the concept of virtual reality worlds governed by artificial intelligence. Creation of an intelligent virtual reality was further proposed as a universal interface for the handicapped. This paper extends consideration of intelligent virtual reality to a context in which fuzzy set principles are explored as a major tool for implementing theory in the domain of applications to the disabled.

  17. Virtual Reality: A Dream Come True or a Nightmare.

    ERIC Educational Resources Information Center

    Cornell, Richard; Bailey, Dan

    Virtual Reality (VR) is a new medium which allows total stimulation of one's senses through human/computer interfaces. VR has applications in training simulators, nano-science, medicine, entertainment, electronic technology, and manufacturing. This paper focuses on some current and potential problems of virtual reality and virtual environments…

  18. Visual landmarks facilitate rodent spatial navigation in virtual reality environments

    PubMed Central

    Youngstrom, Isaac A.; Strowbridge, Ben W.

    2012-01-01

    Because many different sensory modalities contribute to spatial learning in rodents, it has been difficult to determine whether spatial navigation can be guided solely by visual cues. Rodents moving within physical environments with visual cues engage a variety of nonvisual sensory systems that cannot be easily inhibited without lesioning brain areas. Virtual reality offers a unique approach to ask whether visual landmark cues alone are sufficient to improve performance in a spatial task. We found that mice could learn to navigate between two water reward locations along a virtual bidirectional linear track using a spherical treadmill. Mice exposed to a virtual environment with vivid visual cues rendered on a single monitor increased their performance over a 3-d training regimen. Training significantly increased the percentage of time avatars controlled by the mice spent near reward locations in probe trials without water rewards. Neither improvement during training nor spatial learning of reward locations occurred with mice operating a virtual environment without vivid landmarks or with mice deprived of all visual feedback. Mice operating the vivid environment developed stereotyped avatar turning behaviors when alternating between reward zones that were positively correlated with their performance on the probe trial. These results suggest that mice are able to learn to navigate to specific locations using only visual cues presented within a virtual environment rendered on a single computer monitor. PMID:22345484

  19. Effects of 3D Virtual Simulators in the Introductory Wind Energy Course: A Tool for Teaching Engineering Concepts

    SciTech Connect

    Do, Phuong T.; Moreland, John R.; Delgado, Catherine; Wilson, Kristina; Wang, Xiuling; Zhou, Chenn; Ice, Phil

    2013-01-01

    Our research provides an innovative solution for optimizing learning effectiveness and improving postsecondary education through the development of virtual simulators that can be easily used and integrated into existing wind energy curriculum. Two 3D virtual simulators are developed in our laboratory for use in an immersive 3D virtual reality (VR) system or for 3D display on a 2D screen. Our goal is to apply these prototypical simulators to train postsecondary students and professionals in wind energy education; and to offer experiential learning opportunities in 3D modeling, simulation, and visualization. The issue of transferring learned concepts to practical applications is a widespread problem in postsecondary education. Related to this issue is a critical demand to educate and train a generation of professionals for the wind energy industry. With initiatives such as the U.S. Department of Energy's “20% Wind Energy by 2030” outlining an exponential increase of wind energy capacity over the coming years, revolutionary educational reform is needed to meet the demand for education in the field of wind energy. These developments and implementation of Virtual Simulators and accompanying curriculum will propel national reforms, meeting the needs of the wind energy industrial movement and addressing broader educational issues that affect a number of disciplines.

  20. Effects of 3D Virtual Simulators in the Introductory Wind Energy Course: A Tool for Teaching Engineering Concepts

    DOE PAGES

    Do, Phuong T.; Moreland, John R.; Delgado, Catherine; ...

    2013-01-01

    Our research provides an innovative solution for optimizing learning effectiveness and improving postsecondary education through the development of virtual simulators that can be easily used and integrated into existing wind energy curriculum. Two 3D virtual simulators are developed in our laboratory for use in an immersive 3D virtual reality (VR) system or for 3D display on a 2D screen. Our goal is to apply these prototypical simulators to train postsecondary students and professionals in wind energy education; and to offer experiential learning opportunities in 3D modeling, simulation, and visualization. The issue of transferring learned concepts to practical applications is a widespread problem in postsecondary education. Related to this issue is a critical demand to educate and train a generation of professionals for the wind energy industry. With initiatives such as the U.S. Department of Energy's “20% Wind Energy by 2030” outlining an exponential increase of wind energy capacity over the coming years, revolutionary educational reform is needed to meet the demand for education in the field of wind energy. These developments and implementation of Virtual Simulators and accompanying curriculum will propel national reforms, meeting the needs of the wind energy industrial movement and addressing broader educational issues that affect a number of disciplines.

  1. The Virtual Radiopharmacy Laboratory: A 3-D Simulation for Distance Learning

    ERIC Educational Resources Information Center

    Alexiou, Antonios; Bouras, Christos; Giannaka, Eri; Kapoulas, Vaggelis; Nani, Maria; Tsiatsos, Thrasivoulos

    2004-01-01

    This article presents Virtual Radiopharmacy Laboratory (VR LAB), a virtual laboratory accessible through the Internet. VR LAB is designed and implemented in the framework of the VirRAD European project. This laboratory represents a 3D simulation of a radio-pharmacy laboratory, where learners, represented by 3D avatars, can experiment on…

  2. 3D Inhabited Virtual Worlds: Interactivity and Interaction between Avatars, Autonomous Agents, and Users.

    ERIC Educational Resources Information Center

    Jensen, Jens F.

    This paper addresses some of the central questions currently related to 3-Dimensional Inhabited Virtual Worlds (3D-IVWs), their virtual interactions, and communication, drawing from the theory and methodology of sociology, interaction analysis, interpersonal communication, semiotics, cultural studies, and media studies. First, 3D-IVWs--seen as a…

  3. Virtual reality and consciousness inference in dreaming

    PubMed Central

    Hobson, J. Allan; Hong, Charles C.-H.; Friston, Karl J.

    2014-01-01

    This article explores the notion that the brain is genetically endowed with an innate virtual reality generator that – through experience-dependent plasticity – becomes a generative or predictive model of the world. This model, which is most clearly revealed in rapid eye movement (REM) sleep dreaming, may provide the theater for conscious experience. Functional neuroimaging evidence for brain activations that are time-locked to rapid eye movements (REMs) endorses the view that waking consciousness emerges from REM sleep – and dreaming lays the foundations for waking perception. In this view, the brain is equipped with a virtual model of the world that generates predictions of its sensations. This model is continually updated and entrained by sensory prediction errors in wakefulness to ensure veridical perception, but not in dreaming. In contrast, dreaming plays an essential role in maintaining and enhancing the capacity to model the world by minimizing model complexity and thereby maximizing both statistical and thermodynamic efficiency. This perspective suggests that consciousness corresponds to the embodied process of inference, realized through the generation of virtual realities (in both sleep and wakefulness). In short, our premise or hypothesis is that the waking brain engages with the world to predict the causes of sensations, while in sleep the brain’s generative model is actively refined so that it generates more efficient predictions during waking. We review the evidence in support of this hypothesis – evidence that grounds consciousness in biophysical computations whose neuronal and neurochemical infrastructure has been disclosed by sleep research. PMID:25346710

  4. Virtual reality simulation mechanism on WWW

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Feng, Yuncheng; Wei, Youshuang

    2000-06-01

    This paper addresses a fundamental, simple but powerful mechanism for a Virtual Reality simulation system on the World Wide Web. The basic idea is to use the Virtual Reality Modeling Language (VRML) to build the virtual world and to design a specific simulator that performs the common simulation work and drives the VR animation. According to the achievable mathematics and animation functions, two types of this VR simulation system are defined. The first uses VRMLScript or JavaScript to code the simulator. This mechanism can be realized, but the mathematical operations and the scale of the simulation model are limited. The other applies Java to code the simulator and then uses HTML to combine the VR scene and the simulator in the same web page, which synchronizes the VR animation with the simulation logic. Because Java has full mathematical functionality and Java code modules are entirely reusable, this VR simulation system, which is the one mainly recommended, can be easily realized on a desktop PC and meets the basic interactive requirements of VR technology without any extra hardware. A VR M/M/1/k queuing simulation system is given to explain the mechanism. Finally, the overall Integrated Development Environment for this VR simulation system is also discussed.
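
    The M/M/1/k example mentioned above is a single-server queue with Poisson arrivals, exponential service times, and room for at most k customers; the simulator's job is to advance that logic and let the VR scene animate it. A minimal event-driven sketch, written in Python purely for illustration (the paper's implementation uses Java/VRMLScript):

      # Sketch: discrete-event core of an M/M/1/k queue; estimates the blocking probability.
      import random

      def mm1k_blocking_probability(lam, mu, k, n_arrivals=200_000, seed=1):
          random.seed(seed)
          t = 0.0                         # simulation clock
          departure = float("inf")        # time of the next service completion
          queue = 0                       # customers in the system (including the one in service)
          blocked = 0
          for _ in range(n_arrivals):
              t += random.expovariate(lam)              # next arrival time
              while departure <= t:                     # process departures that happened before it
                  queue -= 1
                  departure = departure + random.expovariate(mu) if queue else float("inf")
              if queue == k:
                  blocked += 1                          # arrival lost: system full
              else:
                  queue += 1
                  if queue == 1:                        # server was idle, start service now
                      departure = t + random.expovariate(mu)
          return blocked / n_arrivals

      print(f"P(block) ~ {mm1k_blocking_probability(lam=0.9, mu=1.0, k=5):.3f}")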

  5. Sensorimotor Training in Virtual Reality: A Review

    PubMed Central

    Adamovich, Sergei V.; Fluet, Gerard G.; Tunik, Eugene; Merians, Alma S.

    2010-01-01

    Recent experimental evidence suggests that rapid advancement of virtual reality (VR) technologies has great potential for the development of novel strategies for sensorimotor training in neurorehabilitation. We discuss what the adaptive and engaging virtual environments can provide for massive and intensive sensorimotor stimulation needed to induce brain reorganization. Second, discrepancies between the veridical and virtual feedback can be introduced in VR to facilitate activation of targeted brain networks, which in turn can potentially speed up the recovery process. Here we review the existing experimental evidence regarding the beneficial effects of training in virtual environments on the recovery of function in the areas of gait, upper extremity function and balance, in various patient populations. We also discuss possible mechanisms underlying these effects. We feel that future research in the area of virtual rehabilitation should follow several important paths. Imaging studies to evaluate the effects of sensory manipulation on brain activation patterns and the effect of various training parameters on long term changes in brain function are needed to guide future clinical inquiry. Larger clinical studies are also needed to establish the efficacy of sensorimotor rehabilitation using VR approaches in various clinical populations and most importantly, to identify VR training parameters that are associated with optimal transfer into real-world functional improvements. PMID:19713617

  6. The Engelbourg's ruins: from 3D TLS point cloud acquisition to 3D virtual and historic models

    NASA Astrophysics Data System (ADS)

    Koehl, Mathieu; Berger, Solveig; Nobile, Sylvain

    2014-05-01

    The Castle of Engelbourg was built at the beginning of the 13th century at the top of the Schlossberg. It is situated in the municipality of Thann (France), at the crossroads of Alsace and Lorraine, and dominates the outlet of the Thur valley. Its strategic position was one of the causes of its repeated destruction during the 17th century, and Louis XIV sealed its fate by ordering its demolition in 1673. Today only a few vestiges remain, among them a section of the main tower, about 7 m in diameter and 4 m wide, lying on its side, a unique feature in the regional castle landscape. Visible from the valley, it was named "the Eye of the Witch" and became a key attraction of the region. The site, which extends over approximately one hectare, has been the object of numerous archaeological studies for several years and is today at the heart of a project to enhance the vestiges. Among the numerous planned works, a key objective was to produce a 3D model of the site in its current state, in other words a virtual model "as captured", usable from a cultural and tourist point of view as well as by scientists and in archaeological research. The team of the ICube/INSA lab was responsible for producing this model, from data acquisition to delivery of the virtual model, using 3D TLS and topographic surveying methods. It was also planned to integrate into this 3D model 2D archive data stemming from a series of former excavations. The objectives of this project were the following: • acquisition of 3D digital data of the site and 3D modelling • digitization of the 2D archaeological data and integration into the 3D model • implementation of a database connected to the 3D model • virtual visit of the site. The results obtained allowed us to visualize every 3D object individually, in several forms (point clouds, 3D meshed objects and models, etc.) and at several levels of detail

  7. Interaction Design and Usability of Learning Spaces in 3D Multi-user Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Minocha, Shailey; Reeves, Ahmad John

    Three-dimensional virtual worlds are multimedia, simulated environments, often managed over the Web, which users can 'inhabit' and interact with via their own graphical self-representations known as 'avatars'. 3D virtual worlds are being used in many applications: education/training, gaming, social networking, marketing and commerce. Second Life is the most widely used 3D virtual world in education. However, problems associated with usability, navigation and way finding in 3D virtual worlds may impact on student learning and engagement. Based on empirical investigations of learning spaces in Second Life, this paper presents design guidelines to improve the usability and ease of navigation in 3D spaces. Methods of data collection include semi-structured interviews with Second Life students, educators and designers. The findings have revealed that design principles from the fields of urban planning, Human-Computer Interaction, Web usability, geography and psychology can influence the design of spaces in 3D multi-user virtual environments.

  8. Effectiveness of Collaborative Learning with 3D Virtual Worlds

    ERIC Educational Resources Information Center

    Cho, Young Hoan; Lim, Kenneth Y. T.

    2017-01-01

    Virtual worlds have affordances to enhance collaborative learning in authentic contexts. Despite the potential of collaborative learning with a virtual world, few studies investigated whether it is more effective in student achievements than teacher-directed instruction. This study investigated the effectiveness of collaborative problem solving…

  9. Contextual EFL Learning in a 3D Virtual Environment

    ERIC Educational Resources Information Center

    Lan, Yu-Ju

    2015-01-01

    The purposes of the current study are to develop virtually immersive EFL learning contexts for EFL learners in Taiwan to preview and review English materials beyond the regular English class schedule. A two-iteration action research study lasting one semester was conducted to evaluate the effects of virtual contexts on learners' EFL learning. 132…

  10. Virtual and Printed 3D Models for Teaching Crystal Symmetry and Point Groups

    ERIC Educational Resources Information Center

    Casas, Lluís; Estop, Eugènia

    2015-01-01

    Both virtual and printed 3D crystal models can help students and teachers deal with chemical education topics such as symmetry and point groups. In the present paper, two freely downloadable tools (interactive PDF files and a mobile app) are presented as examples of the application of 3D design to study point-symmetry. The use of 3D printing to…

  11. A rapid algorithm for realistic human reaching and its use in a virtual reality system

    NASA Technical Reports Server (NTRS)

    Aldridge, Ann; Pandya, Abhilash; Goldsby, Michael; Maida, James

    1994-01-01

    The Graphics Analysis Facility (GRAF) at JSC has developed a rapid algorithm for computing realistic human reaching. The algorithm was applied to GRAF's anthropometrically correct human model and used in a 3D computer graphics system and a virtual reality system. The nature of the algorithm and its uses are discussed.
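
    Reaching algorithms of the kind described above ultimately solve an inverse-kinematics problem for the arm's joint angles. The sketch below is a generic closed-form two-link (planar) solution given for illustration only; it is not GRAF's algorithm, which the abstract does not detail.

      # Sketch: closed-form planar two-link IK (upper arm + forearm) for a wrist target.
      import math

      def two_link_ik(x, y, l1, l2):
          """Shoulder and elbow angles (radians) that place the wrist at (x, y)."""
          d2 = x * x + y * y
          c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
          if not -1.0 <= c2 <= 1.0:
              raise ValueError("target out of reach")
          elbow = math.acos(c2)                                  # elbow-down solution
          shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                                   l1 + l2 * math.cos(elbow))
          return shoulder, elbow

      # Reach a point 0.5 m away with a 0.30 m upper arm and 0.25 m forearm:
      print([round(math.degrees(a), 1) for a in two_link_ik(0.4, 0.3, l1=0.3, l2=0.25)])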

  12. Photographer: Digital Telepresence: Dr Murial Ross's Virtual Reality Application for Neuroscience

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Photographer: Digital Telepresence: Dr Murial Ross's Virtual Reality Application for Neuroscience Research, Biocomputation, used to study human disorders of balance and space motion sickness. Shown here is a 3D reconstruction of a nerve ending in the inner ear, nature's wiring of the balance organs.

  13. Linking Audio and Visual Information while Navigating in a Virtual Reality Kiosk Display

    ERIC Educational Resources Information Center

    Sullivan, Briana; Ware, Colin; Plumlee, Matthew

    2006-01-01

    3D interactive virtual reality museum exhibits should be easy to use, entertaining, and informative. If the interface is intuitive, it will allow the user more time to learn the educational content of the exhibit. This research deals with interface issues concerning activating audio descriptions of images in such exhibits while the user is…

  14. Virtual reality for automotive design evaluation

    NASA Technical Reports Server (NTRS)

    Dodd, George G.

    1995-01-01

    A general description of Virtual Reality technology and possible applications was given from publicly available material. A video tape was shown demonstrating the use of multiple large-screen stereoscopic displays, configured in a 10' x 10' x 10' room, to allow a person to evaluate and interact with a vehicle which exists only as mathematical data, and is made only of light. The correct viewpoint of the vehicle is maintained by tracking special glasses worn by the subject. Interior illumination was changed by moving a virtual light around by hand; interior colors are changed by pointing at a color on a color palette, then pointing at the desired surface to change. We concluded by discussing research needed to move this technology forward.

  15. Virtual reality for health care: a survey.

    PubMed

    Moline, J

    1997-01-01

    This report surveys the state of the art in applications of virtual environments and related technologies for health care. Applications of these technologies are being developed for health care in the following areas: surgical procedures (remote surgery or telepresence, augmented or enhanced surgery, and planning and simulation of procedures before surgery); medical therapy; preventive medicine and patient education; medical education and training; visualization of massive medical databases; skill enhancement and rehabilitation; and architectural design for health-care facilities. To date, such applications have improved the quality of health care, and in the future they will result in substantial cost savings. Tools that respond to the needs of present virtual environment systems are being refined or developed. However, additional large-scale research is necessary in the following areas: user studies, use of robots for telepresence procedures, enhanced system reality, and improved system functionality.

  16. Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine.

    PubMed

    Lee, S; Lee, J; Lee, A; Park, N; Lee, S; Song, S; Seo, A; Lee, H; Kim, J-I; Eom, K

    2013-05-01

    Augmented reality (AR) is a technology which enables users to see the real world, with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were scanned using a 64-channel multidetector. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group) and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire which was administered after using the simulator. The group that was trained using an AR simulator were more proficient at IV injection technique using real dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education.
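
    The pipeline described above (CT scan, image segmentation, conversion to a stereolithography surface for the 3D models) can be illustrated with generic open-source tools. The snippet below is a minimal sketch under stated assumptions, not the authors' implementation: it assumes a pre-segmented binary CT volume saved as a NumPy array (the file name, voxel spacing and output path are hypothetical) and uses scikit-image's marching cubes together with trimesh to export an STL surface.

    ```python
    # Minimal sketch: segmented CT volume -> STL surface mesh (not the authors' code).
    import numpy as np
    from skimage import measure   # marching cubes
    import trimesh                # mesh handling / STL export

    # Hypothetical file: 1 = segmented structure, 0 = background.
    seg = np.load("segmentation.npy").astype(np.uint8)
    spacing = (0.625, 0.5, 0.5)   # voxel size in mm (assumed)

    # Extract the iso-surface separating foreground from background voxels.
    verts, faces, normals, values = measure.marching_cubes(seg, level=0.5, spacing=spacing)

    # Build a triangle mesh and write it out in the STL format mentioned in the abstract.
    surface = trimesh.Trimesh(vertices=verts, faces=faces)
    surface.export("vein_model.stl")   # hypothetical output path
    print(f"exported {len(verts)} vertices and {len(faces)} faces")
    ```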

  17. Reaching to virtual targets: The oblique effect reloaded in 3-D.

    PubMed

    Kaspiris-Rousellis, Christos; Siettos, Constantinos I; Evdokimidis, Ioannis; Smyrnis, Nikolaos

    2017-02-20

    Perceiving and reproducing direction of visual stimuli in 2-D space produces the visual oblique effect, which manifests as increased precision in the reproduction of cardinal compared to oblique directions. A second cognitive oblique effect emerges when stimulus information is degraded (such as when reproducing stimuli from memory) and manifests as a systematic distortion where reproduced directions close to the cardinal axes deviate toward the oblique, leading to space expansion at cardinal and contraction at oblique axes. We studied the oblique effect in 3-D using a virtual reality system to present a large number of stimuli, covering the surface of an imaginary half sphere, to which subjects had to reach. We used two conditions, one with no delay (no-memory condition) and one where a three-second delay intervened between stimulus presentation and movement initiation (memory condition). A visual oblique effect was observed for the reproduction of cardinal directions compared to oblique, which did not differ with memory condition. A cognitive oblique effect also emerged, which was significantly larger in the memory compared to the no-memory condition, leading to distortion of directional space with expansion near the cardinal axes and compression near the oblique axes on the hemispherical surface. This effect provides evidence that existing models of 2-D directional space categorization could be extended in the natural 3-D space.

  18. Evaluation of Home Delivery of Lectures Utilizing 3D Virtual Space Infrastructure

    ERIC Educational Resources Information Center

    Nishide, Ryo; Shima, Ryoichi; Araie, Hiromu; Ueshima, Shinichi

    2007-01-01

    Evaluation experiments have been essential in exploring home delivery of lectures for which users can experience campus lifestyle and distant learning through 3D virtual space. This paper discusses the necessity of virtual space for distant learners by examining the effects of virtual space. The authors have pursued the possibility of…

  19. Virtual reality applications in robotic simulations

    NASA Technical Reports Server (NTRS)

    Homan, David J.; Gott, Charles J.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR) provides a means to practice integrated extravehicular activities (EVA)/remote manipulator system (RMS) operations in the on-orbit configuration with no discomfort or risk to crewmembers. VR afforded the STS-61 crew the luxury of practicing the integrated EVA/RMS operations in an on-orbit configuration prior to the actual flight. The VR simulation was developed by the Automation and Robotics Division's Telepresence/Virtual Reality Lab and Integrated Graphics, Operations, and Analysis Lab (IGOAL) at JSC. The RMS Part Task Trainer (PTT) was developed by the IGOAL for RMS training in 1988 as a fully functional, kinematic simulation of the shuttle RMS and served as the RMS portion of the integrated VR simulation. Because the EVA crewmember could get a realistic view of the shuttle and payload bay in the VR simulation, he/she could explore different positions and views to determine the best method for performing a specific task, thus greatly increasing the efficiency of use of the neutral buoyancy facilities.

  20. Virtual Application of Darul Arif Palace from Serdang Sultanate using Virtual Reality

    NASA Astrophysics Data System (ADS)

    Syahputra, M. F.; Annisa, T.; Rahmat, R. F.; Muchtar, M. A.

    2017-01-01

    Serdang Sultanate is one of the Malay sultanates in Sumatera Utara. In the 18th century, many Malay aristocracies developed in Sumatera Utara. A social revolution took place in 1946: many sultanates were overthrown and members of the PKI (Communist Party of Indonesia) carried out mass killings of members of the sultanate families. As a result of this incident, much cultural and historical heritage was destroyed. The integration of heritage preservation and digital technology has become a recent trend. Digital technology is not only able to record and preserve detailed documents and heritage information completely, but also to effectively add value. In this research, polygonal modelling techniques from 3D modelling technology are used to reconstruct the Darul Arif Palace of the Serdang Sultanate. After the palace is modelled, it is combined with virtual reality technology to allow the user to explore the palace and the environment around it. Virtual reality technology is the simulation of real objects in a virtual world. The result of this research is a virtual reality application that runs using a head-mounted display.

  1. Virtual Reality Exposure Therapy Using a Virtual Iraq: Case Report

    PubMed Central

    Gerardi, Maryrose; Rothbaum, Barbara Olasov; Ressler, Kerry; Heekin, Mary; Rizzo, Albert

    2013-01-01

    Posttraumatic stress disorder (PTSD) has been estimated to affect up to 18% of returning Operation Iraqi Freedom (OIF) veterans. Soldiers need to maintain constant vigilance to deal with unpredictable threats, and an unprecedented number of soldiers are surviving serious wounds. These risk factors are significant for development of PTSD; therefore, early and efficient intervention options must be identified and presented in a form acceptable to military personnel. This case report presents the results of treatment utilizing virtual reality exposure (VRE) therapy (virtual Iraq) to treat an OIF veteran with PTSD. Following brief VRE treatment, the veteran demonstrated improvement in PTSD symptoms as indicated by clinically and statistically significant changes in scores on the Clinician Administered PTSD Scale (CAPS; Blake et al., 1990) and the PTSD Symptom Scale Self-Report (PSS-SR; Foa, Riggs, Dancu, & Rothbaum, 1993). These results indicate preliminary promise for this treatment. PMID:18404648

  2. Handling Motion-Blur in 3D Tracking and Rendering for Augmented Reality.

    PubMed

    Park, Youngmin; Lepetit, Vincent; Woo, Woontack

    2012-09-01

    The contribution of this paper is two-fold. First, we show how to extend the ESM algorithm to handle motion blur in 3D object tracking. ESM is a powerful algorithm for template-matching-based tracking, but it can fail under motion blur. We introduce an image formation model that explicitly considers the possibility of blur, which results in a generalization of the original ESM algorithm. This allows the tracker to converge faster, more accurately and more robustly, even under large amounts of blur. Our second contribution is an efficient method for rendering virtual objects under the estimated motion blur. It renders two images of the object under 3D perspective and warps them to create many intermediate images. By fusing these images we obtain a final image of the virtual objects blurred consistently with the captured image. Because warping is much faster than 3D rendering, we can create realistically blurred images at a very low computational cost.
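
    As a rough illustration of the second contribution (fusing many warped copies of a rendered frame to approximate motion blur), the sketch below blends warps of a single frame between two homographies. It is only a simplified stand-in: the paper renders two perspective images and uses the blur estimated by its generalized ESM tracker, whereas here the start/end homographies and the naive linear interpolation between them are assumptions made for illustration.

    ```python
    # Simplified sketch of blur-by-warping (not the paper's ESM-based pipeline).
    import numpy as np
    import cv2

    def motion_blurred(render, H_start, H_end, n_steps=16):
        """Approximate a motion-blurred image by averaging warps of one rendered
        frame between two 3x3 homographies (linear interpolation of the matrices
        is a crude assumption, used only for illustration)."""
        h, w = render.shape[:2]
        acc = np.zeros_like(render, dtype=np.float64)
        for t in np.linspace(0.0, 1.0, n_steps):
            H_t = (1.0 - t) * H_start + t * H_end          # interpolated warp
            acc += cv2.warpPerspective(render, H_t, (w, h))
        return (acc / n_steps).astype(render.dtype)

    # Usage with a dummy rendered frame and a small horizontal shift as the "motion".
    frame = np.zeros((240, 320, 3), np.uint8)
    cv2.rectangle(frame, (140, 100), (180, 140), (0, 255, 0), -1)
    H0 = np.eye(3)
    H1 = np.array([[1, 0, 12], [0, 1, 0], [0, 0, 1]], dtype=np.float64)
    blurred = motion_blurred(frame, H0, H1)
    ```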

  3. Virtual reality and the unfolding of higher dimensions

    NASA Astrophysics Data System (ADS)

    Aguilera, Julieta C.

    2006-02-01

    As virtual/augmented reality evolves, the need for spaces that are responsive to structures independent of three-dimensional spatial constraints becomes apparent. The visual medium of computer graphics may also challenge these self-imposed constraints. If one can get used to how projections affect 3D objects in two dimensions, it may also be possible to compose a situation in which to get used to the variations that occur while moving through higher dimensions. The presented application is an enveloping landscape of concave and convex forms, which are determined by the orientation and displacement of the user in relation to a grid made of tesseracts (cubes in four dimensions). The interface accepts input from three-dimensional and four-dimensional transformations and smoothly displays such interactions in real time. The motion of the user becomes the graphic element, whereas the higher-dimensional grid references his/her position relative to it. The user learns how motion inputs affect the grid, recognizing a correlation between the input and the transformations. Mapping information to complex grids in virtual reality is valuable for engineers, artists and users in general, because navigation can be internalized like a dance pattern and further engages us to maneuver space in order to know and experience it.
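
    To make the notion of moving through a grid of tesseracts slightly more concrete, here is a small self-contained sketch (not the installation's actual code): it rotates the 16 vertices of a tesseract in the x-w plane and projects them into 3D by a simple perspective division on w. The rotation angle and viewer distance are arbitrary assumptions.

    ```python
    # Illustrative 4D rotation + projection of a tesseract (not the artwork's implementation).
    import numpy as np
    from itertools import product

    # 16 vertices of a tesseract centred at the origin, coordinates in {-1, +1}^4.
    verts4d = np.array(list(product([-1.0, 1.0], repeat=4)))   # shape (16, 4): x, y, z, w

    def rotate_xw(points, angle):
        """Rotate 4D points in the x-w plane by `angle` radians."""
        c, s = np.cos(angle), np.sin(angle)
        rot = np.eye(4)
        rot[0, 0], rot[0, 3] = c, -s
        rot[3, 0], rot[3, 3] = s, c
        return points @ rot.T

    def project_to_3d(points, viewer_w=3.0):
        """Perspective projection along w: scale x, y, z by 1 / (viewer_w - w)."""
        scale = 1.0 / (viewer_w - points[:, 3])
        return points[:, :3] * scale[:, None]

    rotated = rotate_xw(verts4d, np.deg2rad(30))
    verts3d = project_to_3d(rotated)
    print(verts3d.round(3))
    ```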

  4. Suitability of digital camcorders for virtual reality image data capture

    NASA Astrophysics Data System (ADS)

    D'Apuzzo, Nicola; Maas, Hans-Gerd

    1998-12-01

    Today's consumer-market digital camcorders offer features which make them appear to be quite interesting devices for virtual reality data capture. The paper compares a digital camcorder with an analogue camcorder and a machine-vision-type CCD camera and discusses the suitability of these three cameras for virtual reality applications. Besides a discussion of the technical features of the cameras, this includes a detailed accuracy test in order to define the range of applications. In combination with the cameras, three different framegrabbers are tested. The geometric accuracy potential of all three cameras turned out to be surprisingly large, and no problems were noticed in the radiometric performance. On the other hand, some disadvantages have to be reported: from the photogrammetrist's point of view, the major disadvantage of most camcorders is the lack of a way to synchronize multiple devices, limiting their suitability for 3-D motion data capture. Moreover, the standard video format contains interlacing, which is also undesirable for all applications dealing with moving objects or moving cameras. Further disadvantages are computer interfaces whose functionality is still suboptimal. While custom-made solutions to these problems are probably rather expensive (and will make potential users turn back to machine-vision-like equipment), this functionality could probably be included by the manufacturers at almost zero cost.

  5. Orchestrating learning during implementation of a 3D virtual world

    NASA Astrophysics Data System (ADS)

    Karakus, Turkan; Baydas, Ozlem; Gunay, Fatma; Coban, Murat; Goktas, Yuksel

    2016-10-01

    There are many issues to be considered when designing virtual worlds for educational purposes. In this study, the term orchestration has acquired a new definition as the moderation of problems encountered during the activity of turning a virtual world into an educational setting for winter sports. A development case showed that community plays a key role in both the emergence of challenges and in the determination of their solutions. The implications of this study showed that activity theory was a useful tool for understanding contextual issues. Therefore, instructional designers first developed relevant tools and community-based solutions. This study attempts to use activity theory in a prescriptive way, though it is known as a descriptive theory. Finally, since virtual world projects have many aspects, the variety of challenges and practical solutions presented in this study will provide practitioners with suggestions on how to overcome problems in future.

  6. Virtual Reality and Multiple Intelligences: Potentials for Higher Education.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    1994-01-01

    Discussion of the use of virtual reality in higher education looks at how this emerging computer-based technology can promote learning that engages all seven forms of intelligence proposed in H. Gardner's theory of multiple intelligences. Technical and conceptual issues in implementation of virtual reality in education are also examined.…

  7. A Desktop Virtual Reality Earth Motion System in Astronomy Education

    ERIC Educational Resources Information Center

    Chen, Chih Hung; Yang, Jie Chi; Shen, Sarah; Jeng, Ming Chang

    2007-01-01

    In this study, a desktop virtual reality earth motion system (DVREMS) is designed and developed to be applied in the classroom. The system is implemented to assist elementary school students to clarify earth motion concepts using virtual reality principles. A study was conducted to observe the influences of the proposed system in learning.…

  8. Treatment of Complicated Grief Using Virtual Reality: A Case Report

    ERIC Educational Resources Information Center

    Botella, C.; Osma, J.; Palacios, A. Garcia; Guillen, V.; Banos, R.

    2008-01-01

    This is the first work exploring the application of new technologies, concretely virtual reality, to facilitate emotional processing in the treatment of Complicated Grief. Our research team has designed a virtual reality environment (EMMA's World) to foster the expression and processing of emotions. In this study the authors present a description…

  9. Designing a Virtual-Reality-Based, Gamelike Math Learning Environment

    ERIC Educational Resources Information Center

    Xu, Xinhao; Ke, Fengfeng

    2016-01-01

    This exploratory study examined the design issues related to a virtual-reality-based, gamelike learning environment (VRGLE) developed via OpenSimulator, an open-source virtual reality server. The researchers collected qualitative data to examine the VRGLE's usability, playability, and content integration for math learning. They found it important…

  10. Virtual reality, augmented reality…I call it i-Reality

    PubMed Central

    2015-01-01

    The new term improved reality (i-Reality) is suggested to include virtual reality (VR) and augmented reality (AR). It refers to a real world that includes improved, enhanced and digitally created features that would offer an advantage on a particular occasion (i.e., a medical act). I-Reality may help us bridge the gap between the high demand for medical providers and the low supply of them by improving the interaction between providers and patients. PMID:28293571

  11. Virtual reality, augmented reality…I call it i-Reality.

    PubMed

    Grossmann, Rafael J

    2015-01-01

    The new term improved reality (i-Reality) is suggested to include virtual reality (VR) and augmented reality (AR). It refers to a real world that includes improved, enhanced and digitally created features that would offer an advantage on a particular occasion (i.e., a medical act). I-Reality may help us bridge the gap between the high demand for medical providers and the low supply of them by improving the interaction between providers and patients.

  12. Role of virtual reality simulation in endoscopy training.

    PubMed

    Harpham-Lockyer, Louis; Laskaratos, Faidon-Marios; Berlingieri, Pasquale; Epstein, Owen

    2015-12-10

    Recent advancements in virtual reality graphics and models have allowed virtual reality simulators to be incorporated into a variety of endoscopic training programmes. Use of virtual reality simulators in training programmes is thought to improve skill acquisition amongst trainees which is reflected in improved patient comfort and safety. Several studies have already been carried out to ascertain the impact that usage of virtual reality simulators may have upon trainee learning curves and how this may translate to patient comfort. This article reviews the available literature in this area of medical education which is particularly relevant to all parties involved in endoscopy training and curriculum development. Assessment of the available evidence for an optimal exposure time with virtual reality simulators and the long-term benefits of their use are also discussed.

  13. 3D Object Recognition: Symmetry and Virtual Views

    DTIC Science & Technology

    1992-12-01

    Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, and Center for Biological and Computational Learning: A.I. Memo No. 1409 / C.B.C.L. Paper No. 76, December 1992, "3D Object Recognition: Symmetry and Virtual Views". The research was done within the Center for Biological and Computational Learning in the Department of Brain and Cognitive Sciences and at the Artificial Intelligence Laboratory.

  14. 3D Virtual Images and Forensic Identification Training

    DTIC Science & Technology

    2010-08-04

    Protocol number: FKE20080002E. Title: "3D Virtual Images and Forensic Identification Training". Principal Investigator (PI): Stephanie A. Stouder, Lt Col. As a minimum, all Air Force Dental Residency programs (13 sites) require a course in forensic dentistry so that personnel are properly trained for these duties; 47XX identifies the requirement for initial and annual training in forensic identification for all AF dentists.

  15. Virtual Presence: One Step Beyond Reality

    NASA Technical Reports Server (NTRS)

    Budden, Nancy Ann

    1997-01-01

    Our primary objective was to team up a group of scientists and engineers from two different NASA cultures and simulate an interactive teleoperated robot conducting geologic field work on the Moon or Mars. The information derived from the experiment will benefit both the robotics team and the planetary exploration team in the areas of robot design and development, and mission planning and analysis. The Earth Sciences and Space and Life Sciences Division combines the past with the future, contributing experience from Apollo crews exploring the lunar surface, knowledge of reduced-gravity environments, the performance limits of EVA suits, and future goals for human exploration beyond low Earth orbit. The Automation, Robotics, and Simulation Division brings to the table the technical expertise of robotic systems and the future goals of highly interactive robotic capabilities, treading on the edge of technology by joining, for the first time, a unique combination of telepresence with virtual reality.

  16. Applied virtual reality in aerospace design

    NASA Astrophysics Data System (ADS)

    Hale, Joseph P.

    1995-09-01

    A virtual reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. The objectives of the MSFC VR Applications Program are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training and science training. Before VR can be used with confidence in a particular application, VR must be validated for that class of applications. For that reason, specific validation studies for selected classes of applications have been proposed and are currently underway. These include macro-ergonomic 'control room class' design analysis, Spacelab stowage reconfiguration training, a full-body microgravity functional reach simulator, a gross anatomy teaching simulator, and micro-ergonomic design analysis. This paper describes the MSFC VR Applications Program and the validation studies.

  17. Applied virtual reality in aerospace design

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    1995-01-01

    A virtual reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. The objectives of the MSFC VR Applications Program are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training and science training. Before VR can be used with confidence in a particular application, VR must be validated for that class of applications. For that reason, specific validation studies for selected classes of applications have been proposed and are currently underway. These include macro-ergonomic 'control room class' design analysis, Spacelab stowage reconfiguration training, a full-body microgravity functional reach simulator, a gross anatomy teaching simulator, and micro-ergonomic design analysis. This paper describes the MSFC VR Applications Program and the validation studies.

  18. Dissociation in virtual reality: depersonalization and derealization

    NASA Astrophysics Data System (ADS)

    Garvey, Gregory P.

    2010-01-01

    This paper looks at virtual worlds such as Second Life (SL) as possible incubators of dissociation disorders as classified by the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (also known as the DSM-IV). Depersonalization is where "a person feels that he or she has changed in some way or is somehow unreal"; derealization is when "the same beliefs are held about one's surroundings." Dissociative Identity Disorder (DID), previously known as multiple personality disorder, fits users of Second Life who adopt "in-world" avatars and, in effect, enact multiple distinct identities or personalities (known as alter egos or alters). Select questions from the Structured Clinical Interview for Depersonalization (SCI-DER) will be discussed as they might apply to the user's experience in Second Life. Finally, I would like to consider the hypothesis that, rather than a pathological disorder, dissociation is a normal response to the "artificial reality" of Second Life.

  19. EEG correlates of virtual reality hypnosis.

    PubMed

    White, David; Ciorciari, Joseph; Carbis, Colin; Liley, David

    2009-01-01

    The study investigated hypnosis-related electroencephalographic (EEG) coherence and power spectra changes in high and low hypnotizables (Stanford Hypnotic Clinical Scale) induced by a virtual reality hypnosis (VRH) induction system. In this study, the EEG from 17 participants (Mean age = 21.35, SD = 1.58) were compared based on their hypnotizability score. The EEG recording associated with a 2-minute, eyes-closed baseline state was compared to the EEG during a hypnosis-related state. This novel induction system was able to produce EEG findings consistent with previous hypnosis literature. Interactions of significance were found with EEG beta coherence. The high susceptibility group (n = 7) showed decreased coherence, while the low susceptibility group (n = 10) demonstrated an increase in coherence between medial frontal and lateral left prefrontal sites. Methodological and efficacy issues are discussed.
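
    For readers unfamiliar with the coherence measure discussed above, the sketch below shows one generic way of estimating beta-band (roughly 13-30 Hz) coherence between two EEG channels with SciPy. The sampling rate, channel labels and synthetic signals are assumptions; this is not the study's analysis pipeline.

    ```python
    # Generic beta-band EEG coherence estimate (illustrative; not the study's pipeline).
    import numpy as np
    from scipy.signal import coherence

    fs = 256                        # sampling rate in Hz (assumed)
    t = np.arange(0, 120, 1 / fs)   # two minutes of data, matching the baseline duration

    # Synthetic stand-ins for a medial frontal and a lateral prefrontal channel.
    rng = np.random.default_rng(0)
    shared = np.sin(2 * np.pi * 20 * t)                    # shared 20 Hz (beta) component
    frontal = shared + 0.5 * rng.standard_normal(t.size)
    prefrontal = 0.8 * shared + 0.5 * rng.standard_normal(t.size)

    f, cxy = coherence(frontal, prefrontal, fs=fs, nperseg=fs * 2)   # Welch-based coherence
    beta = (f >= 13) & (f <= 30)
    print(f"mean beta coherence: {cxy[beta].mean():.2f}")
    ```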

  20. Simulation and virtual reality in surgical education: real or unreal?

    PubMed

    Gorman, P J; Meier, A H; Krummel, T M

    1999-11-01

    Rapid change is under way on several fronts in medicine and surgery. Advances in computing power have enabled continued growth in virtual reality, visualization, and simulation technologies. The ideal learning opportunities afforded by simulated and virtual environments have prompted their exploration as learning modalities for surgical education and training. Ongoing improvements in this technology suggest an important future role for virtual reality and simulation in surgical education and training.

  1. An Onboard ISS Virtual Reality Trainer

    NASA Technical Reports Server (NTRS)

    Miralles, Evelyn

    2013-01-01

    Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the station to perform these repairs. After the retirement of the shuttle, this is no longer an available option. As such, the need for the ISS crew members to review scenarios while on flight, either for tasks they already trained or for contingency operations has become a very critical subject. In many situations, the time between the last session of Neutral Buoyancy Laboratory (NBL) training and an Extravehicular Activity (EVA) task might be 6 to 8 months. In order to help with training for contingency repairs and to maintain EVA proficiency while on flight, the Johnson Space Center Virtual Reality Lab (VRLab) designed an onboard immersive ISS Virtual Reality Trainer (VRT), incorporating a unique optical system and making use of the already successful Dynamic Onboard Ubiquitous Graphical (DOUG) graphics software, to assist crew members with current procedures and contingency EVAs while on flight. The VRT provides an immersive environment similar to the one experienced at the VRLab crew training facility at NASA Johnson Space Center. EVA tasks are critical for a mission since as time passes the crew members may lose proficiency on previously trained tasks. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the ISS ages. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not been trained and potentially have never seen before.

  2. Serious games for screening pre-dementia conditions: from virtuality to reality? A pilot project.

    PubMed

    Zucchella, Chiara; Sinforiani, Elena; Tassorelli, Cristina; Cavallini, Elena; Tost-Pardell, Daniela; Grau, Sergi; Pazzi, Stefania; Puricelli, Stefano; Bernini, Sara; Bottiroli, Sara; Vecchi, Tomaso; Sandrini, Giorgio; Nappi, Giuseppe

    2014-01-01

    Conventional cognitive assessment is based on a pencil-and-paper neuropsychological evaluation, which is time consuming, expensive and requires the involvement of several professionals. Information and communication technology could be exploited to allow the development of tools that are easy to use, reduce the amount of data processing, and provide controllable test conditions. Serious games (SGs) have the potential to be new and effective tools in the management and treatment of cognitive impairments in the elderly. Moreover, by adopting SGs in 3D virtual reality settings, cognitive functions might be evaluated using tasks that simulate daily activities, increasing the "ecological validity" of the assessment. In this commentary we report our experience in the creation of the Smart Aging platform, a 3D SG- and virtual-environment-based platform for the early identification and characterization of mild cognitive impairment.

  3. Spilling the beans on java 3D: a tool for the virtual anatomist.

    PubMed

    Guttmann, G D

    1999-04-15

    The computing world has just provided the anatomist with another tool: Java 3D, within the Java 2 platform. On December 9, 1998, Sun Microsystems released Java 2. Java 3D classes are now included in the jar (Java Archive) archives of the extensions directory of Java 2. Java 3D is also part of the Java Media Suite of APIs (Application Programming Interfaces). But what is Java? How does Java 3D work? How do you view Java 3D objects? A brief introduction to the concepts of Java and object-oriented programming is provided, along with a short description of the tools of Java 3D and of the Java 3D viewer. Thus, the virtual anatomist has another set of computer tools to use for modeling various aspects of anatomy, such as embryological development. The virtual anatomist will also be able to assist the surgeon with virtual surgery using the tools found in Java 3D. Java 3D will be able to fill gaps currently existing in many anatomical computer-aided learning programs, such as the lack of platform independence, interactivity, and manipulability of 3D images.

  4. A virtual reality scenario for all seasons: the virtual classroom.

    PubMed

    Rizzo, Albert A; Bowerly, Todd; Buckwalter, J Galen; Klimchuk, Dean; Mitura, Roman; Parsons, Thomas D

    2006-01-01

    Treatment and rehabilitation of the cognitive, psychological, and motor sequelae of central nervous system dysfunction often relies on assessment instruments to inform diagnosis and to track changes in clinical status. Typically, these assessments employ paper-and-pencil psychometrics, hands-on analog/computer tests, and rating of behavior within the context of real-world functional environments. Virtual reality offers the option to produce and distribute identical "standard" simulation environments in which performance can be measured and rehabilitated. Within such digital scenarios, normative data can be accumulated for performance comparisons needed for assessment/diagnosis and for treatment/rehabilitation purposes. In this manner, reusable archetypic virtual environments constructed for one purpose can also be applied for applications addressing other clinical targets. This article will provide a review of such a retooling approach using a virtual classroom simulation that was originally developed as a controlled stimulus environment in which attention processes could be systematically assessed in children with attention-deficit/hyperactivity disorder. This system is now being applied to other clinical targets including the development of tests that address other cognitive functions, eye movement under distraction conditions, social anxiety disorder, and the creation of an earthquake safety training application for children with developmental and learning disabilities.

  5. Combination of Virtual Tours, 3d Model and Digital Data in a 3d Archaeological Knowledge and Information System

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Brigand, N.

    2012-08-01

    The site of the Engelbourg ruined castle in Thann, Alsace, France, has for some years been the object of all the attention of the city, which is the owner, and also of partners such as historians and archaeologists who are in charge of its study. The valorization of the site is one of the main objectives, along with its conservation and study. The aim of this project is to use the environment of the virtual-tour viewer as a new base for an Archaeological Knowledge and Information System (AKIS). With available development tools we add functionalities, in particular through diverse scripts that convert the viewer into a real 3D interface. Beginning with a first virtual tour that contains about fifteen panoramic images, the site of about 150 by 150 meters can be completely documented, offering the user real interactivity and making visualization very concrete, almost lively. After the choice of pertinent points of view, panoramic images were produced. For the documentation, other sets of images were acquired at various seasons and under various climate conditions, which allows the site to be documented in different environments and states of vegetation. The final virtual tour was derived from them. The initial 3D model of the castle, which is virtual too, was also included in the form of panoramic images to complete the understanding of the site. A variety of types of hotspots were used to connect the whole digital documentation to the site, including videos (reports during the acquisition phases, during the restoration works, during the excavations, etc.) and digital georeferenced documents (archaeological reports on the various constituent elements of the castle, interpretation of the excavations and the searches, description of the sets of collected objects, etc.). The completely personalized interface of the system allows the user either to switch from one panoramic image to another, which is the classic case of virtual tours, or to go from a panoramic photographic image

  6. Virtual reality, augmented reality, and robotics applied to digestive operative procedures: from in vivo animal preclinical studies to clinical use

    NASA Astrophysics Data System (ADS)

    Soler, Luc; Marescaux, Jacques

    2006-04-01

    Technological innovations of the 20th century provided medicine and surgery with new tools, among which virtual reality and robotics belong to the most revolutionary ones. Our work aims at setting up new techniques for detection, 3D delineation and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems making tumor resection or treatment easier with the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time long-distance connection between practitioners so that they can share the same 3D reconstructed patient and interact on the same patient, virtually before the intervention and in reality during the surgical procedure thanks to a telesurgical robot. In preclinical studies, our first results obtained from a micro-CT scanner show that these technologies provide efficient and precise 3D modeling of the anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility of improving the therapeutic choice thanks to better detection and representation of the patient before performing the surgical gesture. They also show the efficiency of augmented reality, which provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and check on the virtual patient clone an optimal procedure without errors, which will be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.

  7. A Parameterizable Framework for Replicated Experiments in Virtual 3D Environments

    NASA Astrophysics Data System (ADS)

    Biella, Daniel; Luther, Wolfram

    This paper reports on a parameterizable 3D framework that provides 3D content developers with an initial spatial starting configuration, metaphorical connectors for accessing exhibits or interactive 3D learning objects or experiments, and other optional 3D extensions, such as a multimedia room, a gallery, username identification tools and an avatar selection room. The framework is implemented in X3D and uses a Web-based content management system. It has been successfully used for an interactive virtual museum for key historical experiments and in two additional interactive e-learning implementations: an African arts museum and a virtual science centre. It can be shown that, by reusing the framework, the production costs for the latter two implementations can be significantly reduced and content designers can focus on developing educational content instead of producing cost-intensive out-of-focus 3D objects.

  8. Using voice input and audio feedback to enhance the reality of a virtual experience

    SciTech Connect

    Miner, N.E.

    1994-04-01

    Virtual Reality (VR) is a rapidly emerging technology which allows participants to experience a virtual environment through stimulation of the participant's senses. Intuitive and natural interactions with the virtual world help to create a realistic experience. Typically, a participant is immersed in a virtual environment through the use of a 3-D viewer. Realistic, computer-generated environment models and accurate tracking of a participant's view are important factors for adding realism to a virtual experience. Stimulating a participant's sense of sound and providing a natural form of communication for interacting with the virtual world are equally important. This paper discusses the advantages and importance of incorporating voice recognition and audio feedback capabilities into a virtual world experience. Various approaches and levels of complexity are discussed. Examples of the use of voice and sound are presented through the description of a research application developed in the VR laboratory at Sandia National Laboratories.

  9. Employing Virtual Humans for Education and Training in X3D/VRML Worlds

    ERIC Educational Resources Information Center

    Ieronutti, Lucio; Chittaro, Luca

    2007-01-01

    Web-based education and training provides a new paradigm for imparting knowledge; students can access the learning material anytime by operating remotely from any location. Web3D open standards, such as X3D and VRML, support Web-based delivery of Educational Virtual Environments (EVEs). EVEs have a great potential for learning and training…

  10. Virtual reality in medical education and assessment

    NASA Technical Reports Server (NTRS)

    Sprague, Laurie A.; Bell, Brad; Sullivan, Tim; Voss, Mark; Payer, Andrew F.; Goza, Stewart Michael

    1994-01-01

    The NASA Johnson Space Center (JSC)/LinCom Corporation, the University of Texas Medical Branch at Galveston (UTMB), and the Galveston Independent School District (GISD) have teamed up to develop a virtual visual environment display (VIVED) that provides a unique educational experience using virtual reality (VR) technologies. The VIVED end product will be a self-contained educational experience allowing students a new method of learning as they interact with the subject matter through VR. This type of interface is intuitive and utilizes spatial and psychomotor abilities which are currently constrained or reduced by two-dimensional terminals and keyboards. The perpetual challenge to educators remains the identification and development of methodologies which conform to learners' abilities and preferences. The unique aspects of VR provide an opportunity to explore a new educational experience. Endowing medical students with an understanding of the human body poses some difficult challenges. One of the most difficult is to convey the three-dimensional nature of anatomical structures. The ideal environment for addressing this problem would be one that allows students to become small enough to enter the body and travel through it - much like a person walks through a building. By using VR technology, this effect can be achieved; when VR is combined with multimedia technologies, the effect can be spectacular.

  11. Using Virtual Reality to Rehabilitate Neglect

    PubMed Central

    Sedda, A.; Borghese, N. A.; Ronchetti, M.; Mainetti, R.; Pasotti, F.; Beretta, G.; Bottini, G.

    2013-01-01

    Purpose: Virtual Reality (VR) platforms have gained a lot of attention in the rehabilitation field due to their ability to engage patients and the opportunity they offer to use real-world scenarios. As neglect is characterized by an impairment in exploring space that greatly affects daily living, VR could be a powerful tool compared to classical paper-and-pencil tasks and computer training. Nevertheless, available platforms are costly and obstructive. Here we describe a low-cost platform for neglect rehabilitation that, using consumer equipment, allows the patient to train at home in an intensive fashion. Method: We tested the platform on IB, a chronic neglect patient who did not benefit from classical rehabilitation. Results: Our results show that IB improved both in terms of neglect and attention. Importantly, these improvements lasted at a follow-up evaluation 5 months after the last treatment session and generalized to everyday-life activities. Conclusions: VR platforms built using consumer equipment and following theoretical principles of brain functioning may induce greater improvements in visuo-spatial deficits than classical paradigms, possibly thanks to the real-world scenarios in association with the "visual feedback" of the patient's own body operating in the virtual environment. PMID:22713415

  12. Implementing virtual reality interfaces for the geosciences

    SciTech Connect

    Bethel, W.; Jacobsen, J.; Austin, A.; Lederer, M.; Little, T.

    1996-06-01

    For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum, running the gamut from command-line interfaces to completely immersive environments. In the former, one uses the keyboard to enter three- or six-dimensional parameters; rich, extensible and often complex languages are the vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is the obstacle, in that typing is cumbersome, error-prone and typically slow. In the latter, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g. attached to a head-mounted display, and the user can interact with these parameters by means of highly developed motor skills. Two specific geoscience application areas will be highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.
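
    The "rake" interaction described above amounts to a line of seed points handed to a streamline integrator. The fragment below is a minimal stand-in for that idea rather than the laboratory's software: it assumes a hypothetical analytic velocity field, defines the rake by its two endpoints, and integrates each streamline with fixed-step Euler steps.

    ```python
    # Minimal "virtual rake" -> streamlines sketch (illustrative only).
    import numpy as np

    def velocity(p):
        """Hypothetical steady 3D velocity field (a swirl around the z-axis plus uplift)."""
        x, y, z = p
        return np.array([-y, x, 0.2])

    def rake_seeds(p0, p1, n=10):
        """Seed points evenly spaced along the rake from endpoint p0 to endpoint p1."""
        return [p0 + t * (p1 - p0) for t in np.linspace(0.0, 1.0, n)]

    def streamline(seed, step=0.05, n_steps=400):
        """Fixed-step Euler integration of one streamline from a seed point."""
        pts = [np.asarray(seed, dtype=float)]
        for _ in range(n_steps):
            pts.append(pts[-1] + step * velocity(pts[-1]))
        return np.array(pts)

    # Dragging the rake means changing its endpoints; streamlines are recomputed from the new seeds.
    seeds = rake_seeds(np.array([1.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0]))
    lines = [streamline(s) for s in seeds]
    print(len(lines), "streamlines,", lines[0].shape[0], "points each")
    ```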

  13. Image guidance of breast cancer surgery using 3-D ultrasound images and augmented reality visualization.

    PubMed

    Sato, Y; Nakamoto, M; Tamaki, Y; Sasama, T; Sakita, I; Nakajima, Y; Monden, M; Tamura, S

    1998-10-01

    This paper describes augmented reality visualization for the guidance of breast-conservative cancer surgery using ultrasonic images acquired in the operating room just before surgical resection. By combining an optical three-dimensional (3-D) position sensor, the position and orientation of each ultrasonic cross section are precisely measured to reconstruct geometrically accurate 3-D tumor models from the acquired ultrasonic images. Similarly, the 3-D position and orientation of a video camera are obtained to integrate video and ultrasonic images in a geometrically accurate manner. Superimposing the 3-D tumor models onto live video images of the patient's breast enables the surgeon to perceive the exact 3-D position of the tumor, including irregular cancer invasions which cannot be perceived by touch, as if it were visible through the breast skin. Using the resultant visualization, the surgeon can determine the region for surgical resection in a more objective and accurate manner, thereby minimizing the risk of a relapse and maximizing breast conservation. The system was shown to be effective in experiments using phantom and clinical data.
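
    The core geometric step in such an overlay, projecting the tracked 3-D tumor model into the calibrated video image, can be summarized by the standard rigid-transform-plus-pinhole projection sketched below. The intrinsic parameters, pose and sample points are assumed values for illustration, not the authors' system.

    ```python
    # Generic rigid transform + pinhole projection of model points into a video frame.
    import numpy as np

    K = np.array([[800.0,   0.0, 320.0],     # assumed camera intrinsics (fx, fy, cx, cy)
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    def project(points_model, R, t, K):
        """Map Nx3 model points into pixel coordinates.
        R (3x3) and t (3,) give the model-to-camera pose, e.g. from a 3-D position sensor."""
        cam = points_model @ R.T + t            # into the camera frame
        uvw = cam @ K.T                         # apply the intrinsics
        return uvw[:, :2] / uvw[:, 2:3]         # perspective division -> pixel coordinates

    # Toy example: a few model vertices half a metre in front of the camera.
    tumour = np.array([[0.00, 0.00, 0.50],
                       [0.01, 0.00, 0.50],
                       [0.00, 0.01, 0.52]])
    pixels = project(tumour, np.eye(3), np.zeros(3), K)
    print(pixels)
    ```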

  14. The Virtual-casing Principle For 3D Toroidal Systems

    SciTech Connect

    Lazerson, Samuel A.

    2014-02-24

    The capability to calculate the magnetic field due to the plasma currents in a toroidally confined magnetic fusion equilibrium is of manifest relevance to equilibrium reconstruction and stellarator divertor design. Two methodologies arise for calculating such quantities. The first is a volume integral over the plasma current density for a given equilibrium; such an integral is computationally expensive. The second is a surface integral over a surface current on the equilibrium boundary. This method is computationally desirable because its cost does not grow with the radial resolution of the volume integral. This surface-integral method has come to be known as the "virtual-casing principle". In this paper, a full derivation of this method is presented along with a discussion regarding its optimal application.
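
    For reference, the two formulations mentioned above can be written schematically as follows. The volume form is the standard Biot-Savart integral over the plasma current density; the surface form is quoted only for the common special case in which the boundary is a flux surface of the total field (B·n = 0) and the evaluation point lies outside the plasma, and the sign of the surface current depends on the orientation chosen for the normal. This is a hedged summary, not a reproduction of the paper's derivation.

    \[
    \mathbf{B}_{\mathrm{plasma}}(\mathbf{x})
      = \frac{\mu_0}{4\pi}\int_{V_p}\frac{\mathbf{J}(\mathbf{x}')\times(\mathbf{x}-\mathbf{x}')}{|\mathbf{x}-\mathbf{x}'|^{3}}\,d^3x'
      = \frac{\mu_0}{4\pi}\oint_{\partial V_p}\frac{\mathbf{K}(\mathbf{x}')\times(\mathbf{x}-\mathbf{x}')}{|\mathbf{x}-\mathbf{x}'|^{3}}\,dS',
    \qquad
    \mathbf{K} = \pm\frac{1}{\mu_0}\,\hat{\mathbf{n}}\times\mathbf{B}.
    \]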

  15. Computer-assisted three-dimensional surgical planning: 3D virtual articulator: technical note.

    PubMed

    Ghanai, S; Marmulla, R; Wiechnik, J; Mühling, J; Kotrikova, B

    2010-01-01

    This study presents a computer-assisted planning system for dysgnathia treatment. It describes the process of information gathering using a virtual articulator and how the splints are constructed for orthognathic surgery. The deviation of the virtually planned splints is shown in six cases on the basis of conventionally planned cases. In all cases the plaster models were prepared and scanned using a 3D laser scanner. Successive lateral and posterior-anterior cephalometric images were used for reconstruction before surgery. By identifying specific points on the X-rays and marking them on the virtual models, it was possible to enhance the 2D images to create a realistic 3D environment and to perform virtual repositioning of the jaw. A hexapod was used to transfer the virtual planning to the real splints. Preliminary results showed that conventional repositioning could be replicated using the virtual articulator.

  16. Applications of Panoramic Images: from 720° Panorama to Interior 3d Models of Augmented Reality

    NASA Astrophysics Data System (ADS)

    Lee, I.-C.; Tsai, F.

    2015-05-01

    A series of panoramic images is usually used to generate a 720° panorama image. Although panoramic images are typically used for establishing tour-guiding systems, in this research we demonstrate the potential of using panoramic images acquired from multiple sites to create not only a 720° panorama, but also three-dimensional (3D) point clouds and 3D indoor models. Since 3D modeling is one of the goals of this research, the locations of the panoramic sites needed to be carefully planned in order to maintain a robust result for close-range photogrammetry. After the images are acquired, they are processed into 720° panoramas, and these panoramas can be used directly in panorama guiding systems or other applications. In addition to these straightforward applications, interior orientation parameters can also be estimated while generating the 720° panorama. These parameters are the focal length, principal point, and lens radial distortion. The panoramic images can then be processed with close-range photogrammetry procedures to extract the exterior orientation parameters and generate 3D point clouds. In this research, VisualSFM, a structure-from-motion software package, is used to estimate the exterior orientation, and the CMVS toolkit is used to generate 3D point clouds. Next, the 3D point clouds are used as references to create building interior models. In this research, Trimble SketchUp was used to build the model, and the 3D point cloud was used to determine the locations of building objects via a plane-finding procedure. In the texturing process, the panorama images are used as the data source for creating model textures. This 3D indoor model was used as an Augmented Reality model replacing a guide map or a floor plan commonly used in an on-line touring guide system. The 3D indoor model generating procedure has been utilized in two research projects: a cultural heritage site at Kinmen, and the Taipei Main Station pedestrian zone guidance and navigation system. The
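
    The interior orientation parameters listed above (focal length, principal point and lens radial distortion) enter the projection of a 3D point into image coordinates roughly as in the sketch below. The two-term polynomial distortion model and all numeric values are assumptions for illustration, not the calibration actually estimated in this work.

    ```python
    # Pinhole projection with a simple two-term radial distortion model (illustrative).
    import numpy as np

    def project_with_distortion(point_cam, f, cx, cy, k1, k2):
        """point_cam: 3D point already in the camera frame (exterior orientation applied)."""
        x, y = point_cam[0] / point_cam[2], point_cam[1] / point_cam[2]   # normalized coords
        r2 = x * x + y * y
        d = 1.0 + k1 * r2 + k2 * r2 * r2        # radial distortion factor
        u = f * (x * d) + cx                    # focal length f, principal point (cx, cy)
        v = f * (y * d) + cy
        return u, v

    # Assumed interior orientation of one panoramic-camera view (hypothetical values).
    u, v = project_with_distortion(np.array([0.2, -0.1, 2.0]),
                                   f=1500.0, cx=960.0, cy=540.0,
                                   k1=-0.12, k2=0.03)
    print(round(u, 1), round(v, 1))
    ```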

  17. A computer-based training system combining virtual reality and multimedia

    NASA Technical Reports Server (NTRS)

    Stansfield, Sharon A.

    1993-01-01

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment. The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.

  18. A computer-based training system combining virtual reality and multimedia

    SciTech Connect

    Stansfield, S.A.

    1993-04-28

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment: The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.

  19. Understanding Human Perception of Building Categories in Virtual 3d Cities - a User Study

    NASA Astrophysics Data System (ADS)

    Tutzauer, P.; Becker, S.; Niese, T.; Deussen, O.; Fritsch, D.

    2016-06-01

    Virtual 3D cities are becoming increasingly important as a means of visually communicating diverse urban-related information. To get a deeper understanding of a human's cognitive experience of virtual 3D cities, this paper presents a user study on the human ability to perceive building categories (e.g. residential home, office building, building with shops etc.) from geometric 3D building representations. The study reveals various dependencies between geometric properties of the 3D representations and the perceptibility of the building categories. Knowledge about which geometries are relevant, helpful or obstructive for perceiving a specific building category is derived. The importance and usability of such knowledge is demonstrated based on a perception-guided 3D building abstraction process.

  20. Use of Virtual Reality for Space Flight

    NASA Technical Reports Server (NTRS)

    Harm, Deborah; Taylor, L. C.; Reschke, M. F.

    2011-01-01

    Virtual environments offer unique training opportunities, particularly for training astronauts and preadapting them to the novel sensory conditions of microgravity. Two unresolved human factors issues in virtual reality (VR) systems are: 1) potential "cybersickness", and 2) maladaptive sensorimotor performance following exposure to VR systems. Interestingly, these aftereffects are often quite similar to the adaptive sensorimotor responses observed in astronauts during and/or following space flight. Active exploratory behavior in a new environment, with the resulting feedback and the formation of new associations between sensory inputs and response outputs, promotes appropriate perception and motor control in the new environment. Thus, people adapt to consistent, sustained alterations of sensory input such as those produced by microgravity. Our research examining the effects of repeated exposures to a full-field-of-view dome VR system showed that motion sickness and initial decrements in eye movement and postural control were greatly diminished after three exposures. These results suggest that repeated transitions between VR and the normal environment preflight might be a useful countermeasure for the neurosensory and sensorimotor effects of space flight. The range of VR applications is enormous, extending from ground-based VR training for extravehicular activities at NASA to medical and educational uses. It seems reasonable to suggest that other space-related uses of VR should be investigated, for example: 1) use of head-mounted VR on orbit to rehearse/practice upcoming operational activities, and 2) ground-based VR training for emergency egress procedures. We propose that combining VR designed for operational activities preflight with an appropriate schedule to facilitate sensorimotor adaptation and improve spatial orientation would potentially accomplish two important goals for astronauts and cosmonauts: preflight sensorimotor adaptation and enhanced operational

  1. The VRFurnace: A Virtual Reality Application for Energy System Data Analysis

    SciTech Connect

    Johnson, Peter Eric

    2001-01-01

    This paper presents the Virtual Reality Furnace (VRFurnace) application, an interactive 3-D visualization platform for pulverized coal furnace analysis. The VRFurnace is a versatile toolkit where a variety of different CFD data sets related to pulverized coal furnaces can be studied interactively. The toolkit combines standard CFD analysis techniques with tools that more effectively utilize the 3-D capabilities of a virtual environment. Interaction with data is achieved through a dynamic instructional menu system. The application has been designed for use in a projection-based system, which allows engineers, management, and operators to see and interact with the data at the same time. Future developments are discussed and will include the ability to combine multiple power plant components into a single application, allow remote collaboration between different virtual environments, and allow users to make changes to a flow field and see the results of these changes as they are made, creating a complete virtual power plant.

  2. Virtual Reality in the Rehabilitation of Patients with Neurological Disorders.

    PubMed

    Kefaliakos, Antonios; Pliakos, Ioannis; Kiekkas, Panagiotis; Charalampidou, Martha; Diomidous, Marianna

    2016-01-01

    Neurological disorders affect the lifestyle and the living conditions of a patient. Virtual Reality is a technology that may be used to simulate various types of tasks in a computerized environment, guiding the patient and helping with rehabilitation. This review tries to answer how Virtual Reality technologies can affect patients' rehabilitation results. Treatments that involve Virtual Reality applications offer new ways to make patients more committed to their program and keep them motivated. Another characteristic of a Virtual Reality treatment is that both patients and therapists can observe the mistakes made during a physiotherapy session. The insertion of VR sessions into the traditional rehabilitation therapy of patients with neurological disorders has produced positive results.

  3. The 2016 VGTC Virtual Reality: Best Dissertation Award.

    PubMed

    Mehra, Ravish

    2017-04-01

    The 2016 IEEE VGTC Virtual Reality Best Dissertation Award goes to Ravish Mehra, a 2014 graduate from the University of North Carolina at Chapel Hill, for his dissertation entitled: "Efficient Techniques for Wave-Based Sound Propagation in Interactive Applications".

  4. Integration of virtual and real scenes within an integral 3D imaging environment

    NASA Astrophysics Data System (ADS)

    Ren, Jinsong; Aggoun, Amar; McCormick, Malcolm

    2002-11-01

    The Imaging Technologies group at De Montfort University has developed an integral 3D imaging system, which is seen as the most likely vehicle for 3D television avoiding psychological effects. To create truly fascinating three-dimensional television programs, a virtual studio that performs the task of generating, editing and integrating the 3D contents involving virtual and real scenes is required. The paper presents, for the first time, the procedures, factors and methods of integrating computer-generated virtual scenes with real objects captured using the 3D integral imaging camera system. The method of computer generation of 3D integral images, in which the lens array is modelled instead of the physical camera, is described. In the model, each micro-lens that captures different elemental images of the virtual scene is treated as an extended pinhole camera. An integration process named integrated rendering is illustrated. Detailed discussion and deep investigation are focused on depth extraction from captured integral 3D images. The depth calculation method from the disparity and the multiple baseline method that is used to improve the precision of depth estimation are also presented. The concept of colour SSD and its further improvement in precision are proposed and verified.
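
    As a rough illustration of the disparity-based depth extraction described above, the Python sketch below computes a block-matching disparity map by minimising a windowed sum of squared differences (SSD) over a rectified two-view grayscale pair, then converts disparity to depth with the standard pinhole relation. The paper's colour SSD and multiple-baseline refinement across many elemental images are not reproduced here; the function names, window size and disparity range are illustrative assumptions.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def ssd_disparity(left, right, max_disp=32, window=5):
        """Estimate a disparity map for a rectified stereo pair (2-D float
        arrays of equal shape) by minimising the windowed SSD at each pixel."""
        h, w = left.shape
        best_cost = np.full((h, w), np.inf)
        disparity = np.zeros((h, w), dtype=np.int32)
        for d in range(max_disp):
            # Shift the right image by d pixels so candidate matches align.
            shifted = np.zeros_like(right)
            shifted[:, d:] = right[:, :w - d]
            # Aggregate squared differences over the matching window (mean SSD).
            cost = uniform_filter((left - shifted) ** 2, size=window)
            # Keep the disparity with the lowest aggregated cost at each pixel.
            better = cost < best_cost
            disparity[better] = d
            best_cost[better] = cost[better]
        return disparity

    def depth_from_disparity(disparity, baseline, focal_length):
        """Convert disparity (pixels) to depth via depth = baseline * f / disparity."""
        with np.errstate(divide="ignore"):
            return np.where(disparity > 0, baseline * focal_length / disparity, np.inf)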

  5. A new approach towards image based virtual 3D city modeling by using close range photogrammetry

    NASA Astrophysics Data System (ADS)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2014-05-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to an urban area. The demand for 3D city modeling is increasing day by day for various engineering and non-engineering applications. Generally, three main image-based approaches are used for virtual 3D city model generation: sketch-based modeling, procedural grammar-based modeling, and close range photogrammetry-based modeling. A literature study shows that, to date, there is no complete solution available to create a complete 3D city model using images, and these image-based methods also have limitations. This paper gives a new approach towards image-based virtual 3D city modeling using close range photogrammetry. The approach is divided into three sections: data acquisition, 3D data processing, and data combination. In the data acquisition process, a multi-camera setup was developed and used for video recording of an area; image frames were created from the video data, and the minimum required and suitable video image frames were selected for 3D processing. In the second section, a 3D model of the area was created based on close range photogrammetric principles and computer vision techniques. In the third section, this 3D model was exported for adding and merging of other pieces of the larger area, and scaling and alignment of the 3D model were done. After applying texturing and rendering to this model, a final photo-realistic textured 3D model was created, which can be transferred into a walk-through model or movie form. Most of the processing steps are automatic, so this method is cost effective and less laborious, and the accuracy of the model is good. For this research work, the study area is the campus of the Department of Civil Engineering, Indian Institute of Technology, Roorkee, which acts as a prototype for a city. Aerial photography is restricted in many countries
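
    To illustrate the frame-selection step in the data acquisition stage, the sketch below extracts every N-th frame from a recorded video and keeps only reasonably sharp frames, using the variance of the Laplacian as a simple blur measure. This is a generic pre-selection heuristic with illustrative threshold values and function names, not the authors' actual selection criterion.

    import cv2

    def extract_sharp_frames(video_path, step=15, blur_threshold=100.0):
        """Sample every `step`-th frame and keep those whose Laplacian variance
        (a crude sharpness measure) exceeds `blur_threshold`."""
        capture = cv2.VideoCapture(video_path)
        kept, index = [], 0
        while True:
            ok, frame = capture.read()
            if not ok:
                break  # end of video (or read error)
            if index % step == 0:
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
                if sharpness >= blur_threshold:
                    kept.append(frame)
            index += 1
        capture.release()
        return kept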

  6. Measuring performance in virtual reality phacoemulsification surgery

    NASA Astrophysics Data System (ADS)

    Söderberg, Per; Laurell, Carl-Gustaf; Simawi, Wamidh; Skarman, Eva; Nordh, Leif; Nordqvist, Per

    2008-02-01

    We have developed a virtual reality (VR) simulator for phacoemulsification surgery. The current work aimed at developing a relative performance index that characterizes the performance of an individual trainee. We recorded measurements of 28 response variables during three iterated surgical sessions in 9 experienced cataract surgeons, separately for the sculpting phase and the evacuation phase of phacoemulsification surgery, and compared their outcomes to those of a reference group of naive trainees. We defined an individual overall performance index, an individual class specific performance index and an individual variable specific performance index. We found that, on average, the experienced surgeons performed at a lower level than the reference group of naive trainees, but that this was largely attributable to a few surgeons. When their overall performance index was further analyzed as class specific and variable specific performance indices, it was found that the low-level performance was attributable to behavior that is acceptable for an experienced surgeon but not for a naive trainee. It was concluded that relative performance indices should use a reference group that corresponds to the measured individual, since the definition of optimal surgery may vary among trainee groups depending on their level of experience.
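
    The abstract does not give the exact formula for the relative performance index, so the Python sketch below shows one plausible construction under stated assumptions: each response variable is expressed as a z-score against the reference group's distribution, and the variable-level scores are averaged into class-specific and overall indices. All names, the sign convention and the aggregation by simple averaging are illustrative, not the simulator's published definition.

    import numpy as np

    def relative_performance_indices(trainee, reference, classes):
        """Illustrative relative performance indices for one trainee.

        trainee   -- dict: variable name -> measured value for the trainee
        reference -- dict: variable name -> array of reference-group values
        classes   -- dict: class name -> list of variable names in that class

        Returns (variable_index, class_index, overall_index); each variable
        index is a z-score of the trainee against the reference distribution
        (how "better" or "worse" maps to sign depends on the variable).
        """
        variable_index = {}
        for name, value in trainee.items():
            ref = np.asarray(reference[name], dtype=float)
            spread = ref.std() or 1.0  # guard against zero variance
            variable_index[name] = float((value - ref.mean()) / spread)

        class_index = {
            cls: float(np.mean([variable_index[v] for v in members]))
            for cls, members in classes.items()
        }
        overall_index = float(np.mean(list(variable_index.values())))
        return variable_index, class_index, overall_index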

  7. Performance index for virtual reality phacoemulsification surgery

    NASA Astrophysics Data System (ADS)

    Söderberg, Per; Laurell, Carl-Gustaf; Simawi, Wamidh; Skarman, Eva; Nordqvist, Per; Nordh, Leif

    2007-02-01

    We have developed a virtual reality (VR) simulator for phacoemulsification (phaco) surgery. The current work aimed at developing a performance index that characterizes the performance of an individual trainee. We recorded measurements of 28 response variables during three iterated surgical sessions in 9 subjects naive to cataract surgery and 6 experienced cataract surgeons, separately for the sculpting phase and the evacuation phase of phacoemulsification surgery. We further defined a specific performance index for a specific measurement variable and a total performance index for a specific trainee. The total performance index was relatively evenly distributed for both the sculpting and the evacuation phase, indicating that parametric statistics can be used for comparison of total average performance indices for different groups in the future. The current total performance index for an individual considers all included measurement variables with the same weight. It is possible that a future development of the system will indicate that a better characterization of a trainee can be obtained if the various measurement variables are given specific weights. The currently developed total performance index for a trainee is statistically an independent observation of that particular trainee.

  8. STS-118 Astronaut Dave Williams Trains Using Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2007-01-01

    STS-118 astronaut and mission specialist Dafydd R. 'Dave' Williams, representing the Canadian Space Agency, uses Virtual Reality Hardware in the Space Vehicle Mock Up Facility at the Johnson Space Center to rehearse some of his duties for the upcoming mission. This type of virtual reality training allows the astronauts to wear special gloves and other gear while looking at a computer display simulating actual movements around the various locations on the station hardware with which they will be working.

  9. A virtual reality environment for patient data visualization and endoscopic surgical planning.

    PubMed

    Foo, Jung-Leng; Lobe, Thom; Winer, Eliot

    2009-04-01

    Visualizing patient data in a three-dimensional (3D) representation can be an effective surgical planning tool. As medical imaging technologies improve with faster and higher resolution scans, the use of virtual reality for interacting with medical images adds another level of realism to a 3D representation. The software framework presented in this paper is designed to load and display any DICOM/PACS-compatible 3D image data for visualization and interaction in an immersive virtual environment. In "examiner" mode, the surgeon can interact with a 3D virtual model of the patient by using an intuitive set of controls designed to allow slicing, coloring, and windowing of the image to show different tissue densities and enhance important structures. In the simulated "endoscopic camera" mode, the surgeon can see through the point of view of a virtual endoscopic camera to navigate inside the patient. These tools allow the surgeon to perform virtual endoscopy on any suitable structure. The software is highly scalable, as it can run on anything from a single desktop computer to a cluster of computers in an immersive multiprojection virtual environment. By wearing a pair of stereo glasses, a surgeon becomes immersed within the model itself, thus providing a sense of realism, as if the surgeon is "inside" the patient.
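
    The windowing mentioned above maps a chosen range of tissue densities to the displayable intensity range. A minimal sketch for a single DICOM slice, assuming the pydicom library is available, is shown below; the soft-tissue window defaults and the function name are illustrative and not part of the framework described in the paper.

    import numpy as np
    import pydicom

    def load_windowed_slice(path, window_center=40.0, window_width=400.0):
        """Load one DICOM slice and apply window/level, mapping the selected
        density range to [0, 1] for display."""
        ds = pydicom.dcmread(path)
        # Convert stored values to modality units (e.g. Hounsfield units for CT).
        slope = float(getattr(ds, "RescaleSlope", 1.0))
        intercept = float(getattr(ds, "RescaleIntercept", 0.0))
        image = ds.pixel_array.astype(np.float32) * slope + intercept
        # Clamp to [center - width/2, center + width/2] and normalise.
        low = window_center - window_width / 2.0
        high = window_center + window_width / 2.0
        return np.clip((image - low) / (high - low), 0.0, 1.0)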

  10. Role of virtual reality for cerebral palsy management.

    PubMed

    Weiss, Patrice L Tamar; Tirosh, Emanuel; Fehlings, Darcy

    2014-08-01

    Virtual reality is the use of interactive simulations to present users with opportunities to perform in virtual environments that appear, sound, and less frequently, feel similar to real-world objects and events. Interactive computer play refers to the use of a game where a child interacts and plays with virtual objects in a computer-generated environment. Because of their distinctive attributes that provide ecologically realistic and motivating opportunities for active learning, these technologies have been used in pediatric rehabilitation over the past 15 years. The ability of virtual reality to create opportunities for active repetitive motor/sensory practice adds to their potential for neuroplasticity and learning in individuals with neurologic disorders. The objectives of this article are to provide an overview of how virtual reality and gaming are used clinically, to present the results of several example studies that demonstrate their use in research, and to briefly remark on future developments.

  11. The Usability of Online Geographic Virtual Reality for Urban Planning

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Moore, A. B.

    2013-08-01

    Virtual reality (VR) technology is starting to become widely and freely available (for example the online OpenSimulator tool), with potential for use in 3D urban planning and design tasks but still needing rigorous assessment to establish this. A previous study consulted with a small group of urban professionals, who concluded in a satisfaction usability test that online VR had potential value as a usable 3D communication and remote marketing tool but acknowledged that visual quality and geographic accuracy were obstacles to overcome. This research takes the investigation a significant step further to also examine the usability aspects of efficiency (how quickly tasks are completed) and effectiveness (how successfully tasks are completed), relating to OpenSimulator in an urban planning situation. The comparative study pits a three-dimensional VR model (with increased graphic fidelity and geographic content to address the feedback of the previous study) of a subdivision design (in a Dunedin suburb) against 3D models built with GIS (ArcGIS) and CAD (BricsCAD) tools, two types of software environment well established in urban professional practice. Urban professionals participated in the study by attempting to perform timed tasks correctly in each of the environments before being asked questions about the technologies involved and their perceived importance to their professional work. The results reinforce the positive feedback for VR of the previous study, with the graphical and geographic data issues being somewhat addressed (though participants stressed the need for accurate and precise object and terrain modification capabilities in VR). Ease-of-use and associated fastest task completion speed were significant positive outcomes to emerge from the comparison with GIS and CAD, pointing to a strong future for VR in an urban planning context.

  12. Applying a 3D Situational Virtual Learning Environment to the Real World Business--An Extended Research in Marketing

    ERIC Educational Resources Information Center

    Wang, Shwu-huey

    2012-01-01

    In order to understand (1) what kind of students can be facilitated through the help of three-dimensional virtual learning environment (3D VLE), and (2) the relationship between a conventional test (ie, paper and pencil test) and the 3D VLE used in this study, the study designs a 3D virtual supermarket (3DVS) to help students transform their role…

  13. Virtual Reality as a Tool in the Education

    ERIC Educational Resources Information Center

    Piovesan, Sandra Dutra; Passerino, Liliana Maria; Pereira, Adriana Soares

    2012-01-01

    Virtual reality is being used more and more in education, enabling the student to discover, explore and build his own knowledge. This paper presents educational software for in-person or distance education, for Formal Language subjects, in which the student can virtually manipulate the target that must be explored, analyzed and…

  14. Using Immersive Virtual Reality for Electrical Substation Training

    ERIC Educational Resources Information Center

    Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana

    2015-01-01

    Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…

  15. Virtual reality and telepresence for military medicine.

    PubMed

    Satava, R M

    1997-01-01

    For decades, warfighters have been putting in place a sophisticated "digital battlefield", an electronic communication and information system to support advanced technology. Medicine is now in a position to leverage these technologies to produce a fundamental revolution, and the keystone is the digital physician. Today nearly all information about a patient can be acquired electronically, and with the new technologies of teleoperation and telesurgery we can provide remote treatment and even surgery through telemedicine. The following framework for military medicine will leverage upon the current electronic battlefield. A personnel status monitor (PSM) will have a global positioning locator to tell the position of each soldier and a suite of vital signs sensors. When a soldier is wounded, the medic will instantly know the location of the soldier, and how serious the casualty is. This will permit the medic to locate the most critically wounded soldier. Once stabilised, he will be placed in a critical care pod, a fully automated intensive care unit in a stretcher, which will monitor his vital signs, administer fluids and medications and provide environmental protection. If immediate surgery is needed, a remote telepresence surgery vehicle will come to the wounded soldier, the medic will place him in the vehicle, and a surgeon will operate remotely using telepresence surgery from a distant Mobile Advance Surgical Hospital (MASH) to the combat zone. Also, the expertise from any specialist will be available from the rear echelons as far back as the home country. For education and training in combat casualty care, virtual reality simulators are being implemented. This same scenario can be utilised in civilian health care, especially in providing care to patients in remote areas who do not currently have access to simple, let alone sophisticated, health care.

  16. Soldier evaluation of the virtual reality Iraq.

    PubMed

    Reger, Greg M; Gahm, Gregory A; Rizzo, Albert A; Swanson, Robert; Duma, Susan

    2009-01-01

    Repeated combat deployments to Iraq and Afghanistan are resulting in increased rates of posttraumatic stress disorder (PTSD) among military personnel. Although exposure therapy is an effective treatment for this disorder, some personnel do not significantly respond to treatment, possibly due to poor activation of the trauma memory or a lack of emotional engagement during therapy. In addition, some service members do not seek mental healthcare due to treatment stigma. Researchers recently developed a virtual reality (VR) Iraq to attempt to improve activation of the traumatic memory during exposure therapy and to provide a treatment approach that may be more appealing to some service members, relative to traditional face-to-face talk therapy. Initial validation of the application requires an assessment of how well it represents the experiences of previously deployed service members. This study evaluated the realism of the VR Iraq application according to the subjective evaluation of 93 U.S. Army soldiers who returned from Iraq in the last year. Those screening negative for PTSD used and evaluated a VR tactical convoy and a VR dismounted patrol in a simulated Middle Eastern city. Results indicated that 86% of soldiers rated the overall realism of the VR convoy as ranging from adequate to excellent. Eighty-two percent of soldiers reported adequate-to-excellent overall realism of the city environment. Results provide evidence that the VR Iraq presents a realistic context in which VR exposure therapy can be conducted. However, clinical trials are needed to assess the efficacy of VR exposure therapy for Iraq veterans with PTSD.

  17. Virtual surgical planning and 3D printing in repeat calvarial vault reconstruction for craniosynostosis: technical note.

    PubMed

    LoPresti, Melissa; Daniels, Bradley; Buchanan, Edward P; Monson, Laura; Lam, Sandi

    2017-02-03

    Repeat surgery for restenosis after initial nonsyndromic craniosynostosis intervention is sometimes needed. Calvarial vault reconstruction through a healed surgical bed adds a level of intraoperative complexity and may benefit from preoperative and intraoperative definitions of biometric and aesthetic norms. Computer-assisted design and manufacturing using 3D imaging allows the precise formulation of operative plans in anticipation of surgical intervention. 3D printing turns virtual plans into anatomical replicas, templates, or customized implants by using a variety of materials. The authors present a technical note illustrating the use of this technology: a repeat calvarial vault reconstruction that was planned and executed using computer-assisted design and 3D printed intraoperative guides.

  18. Evaluating Virtual Reality and Augmented Reality Training for Industrial Maintenance and Assembly Tasks

    ERIC Educational Resources Information Center

    Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco

    2015-01-01

    The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…

  19. Two Innovative Steps for Training on Maintenance: 'VIRMAN' Spanish Project based on Virtual Reality 'STARMATE' European Project based on Augmented Reality

    SciTech Connect

    Gonzalez Anez, Francisco

    2002-07-01

    This paper presents two development projects (STARMATE and VIRMAN) focused on supporting training on maintenance. Both projects aim at specifying, designing, developing, and demonstrating prototypes allowing computer-guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. The objective is to create a computer tool for maintenance training course elaboration and training delivery based on 3D virtual reality models of complex components. The training delivery includes 3D recorded displays of maintenance procedures with all complementary information needed to understand the intervention. Users are requested to perform the maintenance intervention while trying to follow the procedure, and can be evaluated on the level of knowledge achieved. Instructors can check the evaluation records left during the training sessions. VIRMAN is simple software supported by a regular computer and can be used in an Internet framework. STARMATE is a step forward in the area of virtual reality. STARMATE is a European Commission project in the frame of 'Information Societies Technologies'. A consortium of five companies and one research institute shares their expertise in this new technology. STARMATE provides two main functionalities: (1) user assistance for achieving assembly/de-assembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, a growing area in Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene, generated by a computer, augmenting the reality with additional information. The user interface consists of see-through goggles, headphones, a microphone and an optical tracking system. All these devices are integrated in a helmet connected with two regular computers. The user has his hands free for performing the maintenance intervention and he can navigate in the virtual

  20. A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills

    NASA Astrophysics Data System (ADS)

    Choi, Kup-Sze

    This paper presents a virtual reality based simulator prototype for learning phacoemulsification in cataract surgery, with focus on the skills required for making a cross-shape trench in the cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools. A haptic device is also used as the 3D user interface. Phaco-sculpting is simulated by interactively deleting the constituent tetrahedra of the lens model. Collisions between the virtual probe and the lens are effectively identified by partitioning the space containing the lens hierarchically with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing a trainee's performance in an objective manner. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
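
    As a simplified illustration of the octree-based collision culling mentioned above, the sketch below partitions tetrahedron centroids hierarchically and returns the candidate tetrahedra near a spherical probe tip. Exact probe-tetrahedron intersection tests, and the simulator's actual data structures, are outside this sketch; all names and parameters are illustrative.

    import numpy as np

    class CentroidOctree:
        """Minimal octree over tetrahedron centroids, used here only to cull
        collision candidates near the probe tip (a sketch, not the simulator's
        implementation)."""

        def __init__(self, centroids, indices=None, depth=0, max_items=8, max_depth=8):
            self.points = np.asarray(centroids, dtype=float)
            self.indices = np.arange(len(self.points)) if indices is None else indices
            self.lo = self.points[self.indices].min(axis=0)
            self.hi = self.points[self.indices].max(axis=0)
            self.children = []
            if len(self.indices) > max_items and depth < max_depth:
                center = (self.lo + self.hi) / 2.0
                coords = self.points[self.indices]
                # Split the items into up to eight octants around the cell center.
                for octant in range(8):
                    mask = np.ones(len(self.indices), dtype=bool)
                    for axis in range(3):
                        if (octant >> axis) & 1:
                            mask &= coords[:, axis] >= center[axis]
                        else:
                            mask &= coords[:, axis] < center[axis]
                    if mask.any():
                        self.children.append(CentroidOctree(
                            self.points, self.indices[mask], depth + 1, max_items, max_depth))

        def query_sphere(self, center, radius):
            """Return indices of tetrahedra whose centroids lie within `radius`
            of the probe tip; these are candidates for exact collision tests."""
            center = np.asarray(center, dtype=float)
            # Prune this whole cell if its bounding box is farther than `radius`.
            nearest = np.clip(center, self.lo, self.hi)
            if np.linalg.norm(nearest - center) > radius:
                return np.empty(0, dtype=np.int64)
            if not self.children:  # leaf: test the stored centroids directly
                pts = self.points[self.indices]
                hits = np.linalg.norm(pts - center, axis=1) <= radius
                return self.indices[hits]
            return np.concatenate([c.query_sphere(center, radius) for c in self.children])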

  1. Surgical Navigation Technology Based on Augmented Reality and Integrated 3D Intraoperative Imaging

    PubMed Central

    Elmi-Terander, Adrian; Skulason, Halldor; Söderman, Michael; Racadio, John; Homan, Robert; Babic, Drazenko; van der Vaart, Nijs; Nachabe, Rami

    2016-01-01

    Study Design. A cadaveric laboratory study. Objective. The aim of this study was to assess the feasibility and accuracy of thoracic pedicle screw placement using augmented reality surgical navigation (ARSN). Summary of Background Data. Recent advances in spinal navigation have shown improved accuracy in lumbosacral pedicle screw placement but limited benefits in the thoracic spine. 3D intraoperative imaging and instrument navigation may allow improved accuracy in pedicle screw placement, without the use of x-ray fluoroscopy, and thus opens the route to image-guided minimally invasive therapy in the thoracic spine. Methods. ARSN encompasses a surgical table, a motorized flat detector C-arm with intraoperative 2D/3D capabilities, integrated optical cameras for augmented reality navigation, and noninvasive patient motion tracking. Two neurosurgeons placed 94 pedicle screws in the thoracic spine of four cadavers using ARSN on one side of the spine (47 screws) and free-hand technique on the contralateral side. X-ray fluoroscopy was not used for either technique. Four independent reviewers assessed the postoperative scans, using the Gertzbein grading. Morphometric measurements of the pedicles axial and sagittal widths and angles, as well as the vertebrae axial and sagittal rotations were performed to identify risk factors for breaches. Results. ARSN was feasible and superior to free-hand technique with respect to overall accuracy (85% vs. 64%, P < 0.05), specifically significant increases of perfectly placed screws (51% vs. 30%, P < 0.05) and reductions in breaches beyond 4 mm (2% vs. 25%, P < 0.05). All morphometric dimensions, except for vertebral body axial rotation, were risk factors for larger breaches when performed with the free-hand method. Conclusion. ARSN without fluoroscopy was feasible and demonstrated higher accuracy than free-hand technique for thoracic pedicle screw placement. Level of Evidence: N/A PMID:27513166

  2. Supporting Distributed Team Working in 3D Virtual Worlds: A Case Study in Second Life

    ERIC Educational Resources Information Center

    Minocha, Shailey; Morse, David R.

    2010-01-01

    Purpose: The purpose of this paper is to report on a study into how a three-dimensional (3D) virtual world (Second Life) can facilitate socialisation and team working among students working on a team project at a distance. This models the situation in many commercial sectors where work is increasingly being conducted across time zones and between…

  3. Teaching Digital Natives: 3-D Virtual Science Lab in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Franklin, Teresa J.

    2008-01-01

    This paper presents the development of a 3-D virtual environment in Second Life for the delivery of standards-based science content for middle school students in the rural Appalachian region of Southeast Ohio. A mixed method approach in which quantitative results of improved student learning and qualitative observations of implementation within…

  4. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars.

    PubMed

    Patil, Shashidhar; Chintalapalli, Harinadha Reddy; Kim, Dubeom; Chai, Youngho

    2015-06-18

    In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database using hand gesture sequences from a single inertial motion sensor.
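
    The quaternion-based complementary filter mentioned above can be sketched as follows: the gyroscope is integrated for a fast short-term orientation estimate, and the accelerometer's gravity direction is blended in with a small gain to correct drift in roll and pitch. This is a generic, simplified formulation assuming a body-to-world quaternion convention; the magnetometer (yaw) correction is omitted, and the gain and function names are illustrative rather than the authors' implementation.

    import numpy as np

    def quat_multiply(q, r):
        """Hamilton product of quaternions given as [w, x, y, z]."""
        w1, x1, y1, z1 = q
        w2, x2, y2, z2 = r
        return np.array([
            w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
        ])

    def quat_conjugate(q):
        return np.array([q[0], -q[1], -q[2], -q[3]])

    def rotate_by_quat(q, v):
        """Rotate 3-vector v by quaternion q."""
        qv = np.array([0.0, *v])
        return quat_multiply(quat_multiply(q, qv), quat_conjugate(q))[1:]

    def complementary_filter_step(q, gyro, accel, dt, alpha=0.05):
        """One step of a simplified quaternion complementary filter.

        q     -- body-to-world orientation quaternion [w, x, y, z]
        gyro  -- angular rate (rad/s) in the body frame
        accel -- accelerometer reading (m/s^2) in the body frame
        alpha -- small gain blending the accelerometer tilt correction
        """
        # 1) Integrate the gyroscope: q_dot = 0.5 * q * [0, gyro].
        q = q + 0.5 * dt * quat_multiply(q, np.array([0.0, *gyro]))
        q = q / np.linalg.norm(q)

        # 2) Predict gravity ("up") in the body frame; normalise the measurement.
        g_pred = rotate_by_quat(quat_conjugate(q), np.array([0.0, 0.0, 1.0]))
        g_meas = accel / (np.linalg.norm(accel) + 1e-9)

        # 3) Small-angle correction rotating the prediction toward the measurement.
        correction = alpha * np.cross(g_meas, g_pred)
        dq = np.array([1.0, *(0.5 * correction)])
        q = quat_multiply(q, dq)
        return q / np.linalg.norm(q)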

  5. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars

    PubMed Central

    Patil, Shashidhar; Chintalapalli, Harinadha Reddy; Kim, Dubeom; Chai, Youngho

    2015-01-01

    In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database using hand gesture sequences from a single inertial motion sensor. PMID:26094629

  6. Socialisation for Learning at a Distance in a 3-D Multi-User Virtual Environment

    ERIC Educational Resources Information Center

    Edirisingha, Palitha; Nie, Ming; Pluciennik, Mark; Young, Ruth

    2009-01-01

    This paper reports findings of a pilot study that examined the pedagogical potential of "Second Life" (SL), a popular three-dimensional multi-user virtual environment (3-D MUVE) developed by the Linden Lab. The study is part of a 1-year research and development project titled "Modelling of Secondlife Environments"…

  7. The Cognitive Apprenticeship Theory for the Teaching of Mathematics in an Online 3D Virtual Environment

    ERIC Educational Resources Information Center

    Bouta, Hara; Paraskeva, Fotini

    2013-01-01

    Research spanning two decades shows that there is a continuing development of 3D virtual worlds and investment in such environments for educational purposes. Research stresses the need for these environments to be well-designed and for suitable pedagogies to be implemented in the teaching practice in order for these worlds to be fully effective.…

  8. Virtual Reality and Learning: Where Is the Pedagogy?

    ERIC Educational Resources Information Center

    Fowler, Chris

    2015-01-01

    The aim of this paper was to build upon Dalgarno and Lee's model or framework of learning in three-dimensional (3-D) virtual learning environments (VLEs) and to extend their road map for further research in this area. The enhanced model shares the common goal with Dalgarno and Lee of identifying the learning benefits from using 3-D VLEs. The…

  9. EXPLORING ENVIRONMENTAL DATA IN A HIGHLY IMMERSIVE VIRTUAL REALITY ENVIRONMENT

    EPA Science Inventory

    Geography inherently fills a 3D space and yet we struggle with displaying geography using, primarily, 2D display devices. Virtual environments offer a more realistically-dimensioned display space and this is being realized in the expanding area of research on 3D Geographic Infor...

  10. GEARS a 3D Virtual Learning Environment and Virtual Social and Educational World Used in Online Secondary Schools

    ERIC Educational Resources Information Center

    Barkand, Jonathan; Kush, Joseph

    2009-01-01

    Virtual Learning Environments (VLEs) are becoming increasingly popular in online education environments and have multiple pedagogical advantages over more traditional approaches to education. VLEs include 3D worlds where students can engage in simulated learning activities such as Second Life. According to Claudia L'Amoreaux at Linden Lab, "at…

  11. Using virtual 3D audio in multispeech channel and multimedia environments

    NASA Astrophysics Data System (ADS)

    Orosz, Michael D.; Karplus, Walter J.; Balakrishnan, Jerry D.

    2000-08-01

    The advantages and disadvantages of using virtual 3-D audio in mission-critical, multimedia display interfaces were evaluated. The 3D audio platform seems to be an especially promising candidate for aircraft cockpits, flight control rooms, and other command and control environments in which operators must make mission-critical decisions while handling demanding and routine tasks. Virtual audio signal processing creates the illusion for a listener wearing conventional earphones that each of a multiplicity of simultaneous speech or audio channels is originating from a different, program-specified location in virtual space. To explore the possible uses of this new, readily available technology, a test bed simulating some of the conditions experienced by the chief flight test coordinator at NASA's Dryden Flight Research Center was designed and implemented. Thirty test subjects simultaneously performed routine tasks requiring constant hand-eye coordination, while monitoring four speech channels, each generating continuous speech signals, for the occurrence of pre-specified keywords. Performance measures included accuracy in identifying the keywords, accuracy in identifying the speaker of the keyword, and response time. We found substantial improvements on all of these measures when comparing virtual audio with conventional, monaural transmissions. We also explored the effect on operator performance of different spatial configurations of the audio sources in 3-D space, of simulated movement (dither) in the source locations, and of providing graphical redundancy. Some of these manipulations were less effective and may even decrease performance efficiency, even though they improve some aspects of the virtual space simulation.
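
    As a crude illustration of how a speech channel can be given a program-specified location over conventional earphones, the sketch below places a mono signal at a chosen azimuth using only interaural time and level differences. Real virtual 3D audio systems rely on HRTF filtering, so this is only a rough approximation with illustrative parameter values, not the system evaluated in the study.

    import numpy as np

    def spatialize_mono(signal, azimuth_deg, sample_rate=44100,
                        head_radius=0.0875, speed_of_sound=343.0):
        """Place a mono signal at a given azimuth using only interaural time
        and level differences (0 deg = straight ahead, positive = listener's
        right). Returns an (n_samples, 2) stereo array."""
        signal = np.asarray(signal, dtype=float)
        az = np.radians(azimuth_deg)

        # Interaural time difference (Woodworth approximation), applied as a
        # delay to the ear farther from the source.
        itd = head_radius / speed_of_sound * (abs(az) + np.sin(abs(az)))
        delay_samples = int(round(itd * sample_rate))

        # Interaural level difference via a simple constant-power pan law.
        near_gain = np.sqrt(0.5 * (1.0 + np.sin(abs(az))))
        far_gain = np.sqrt(0.5 * (1.0 - np.sin(abs(az))))

        near = near_gain * signal
        far = far_gain * np.concatenate([np.zeros(delay_samples), signal])[:len(signal)]

        # Positive azimuth: the right ear is near; the left ear is far and delayed.
        left, right = (far, near) if azimuth_deg >= 0 else (near, far)
        return np.stack([left, right], axis=1)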

  12. Alleviating travel anxiety through virtual reality and narrated video technology.

    PubMed

    Ahn, J C; Lee, O

    2013-01-01

    This study presents empirical evidence of the benefit of narrated video clips in hotels' embedded virtual reality websites for relieving travel anxiety. Even though it has been shown that virtual reality functions provide some relief of travel anxiety, a stronger virtual reality website can be built when it includes video clips with narration about important aspects of the hotel. We posit that these important aspects are (1) the escape route and (2) surrounding neighborhood information, which are derived from the existing research on anxiety disorders as well as travel anxiety. Thus we created one video clip that showed and narrated the escape route from the hotel room, and another video clip that showed and narrated the surrounding neighborhood. We then conducted experiments with this enhanced virtual reality website of a hotel by having human subjects explore the website and fill out a questionnaire. The result confirms our hypothesis that there is a statistically significant relationship between the degree of travel anxiety and the psychological relief caused by the use of embedded virtual reality functions with narrated video clips on a hotel website (Tab. 2, Fig. 3, Ref. 26).

  13. Valorisation of Cultural Heritage Through Virtual Visit and Augmented Reality: the Case of the Abbey of Epau (france)

    NASA Astrophysics Data System (ADS)

    Simonetto, E.; Froment, C.; Labergerie, E.; Ferré, G.; Séchet, B.; Chédorge, H.; Cali, J.; Polidori, L.

    2013-07-01

    Terrestrial Laser Scanning (TLS), 3-D modeling and its Web visualization are the three key steps needed to store and grant free and wide access to cultural heritage, as highlighted in many recent examples. The goal of this study is to set up 3-D Web resources for "virtually" visiting the exterior of the Abbaye de l'Epau, an old French abbey which has both a rich history and delicate architecture. The virtuality is considered in two ways: the flowing navigation in a virtual reality environment around the abbey and a game activity using augmented reality. First of all, the data acquisition consists of a GPS and tacheometry survey, terrestrial laser scanning and photograph acquisition. After data pre-processing, the meshed and textured 3-D model is generated using 3-D Reshaper commercial software. The virtual reality visit and augmented reality animation are then created using Unity software. This work shows the interest of such tools in bringing out the regional cultural heritage and making it attractive to the public.

  14. Implementing Virtual Reality Technology as an Effective WEB Based KIOSK: Darulaman's Teacher Training College Tour (IPDA VR Tour)

    ERIC Educational Resources Information Center

    Azman, Fadzil

    2004-01-01

    At present, the development of Virtual Reality (VR) technology is expanding due to the importance of and the need to use 3D elements and 360-degree panoramas in expressing a clearer picture to consumers in various fields such as education, military, medicine, entertainment and so on. In line with this development, the web based VR kiosk project in…

  15. Implementing Virtual Reality Technology as an Effective Web Based Kiosk: Darulaman's Teacher Training College Tour (Ipda Vr Tour)

    ERIC Educational Resources Information Center

    Fadzil, Azman

    2006-01-01

    At present, the development of Virtual Reality (VR) technology is expanding due to the importance of and the need to use 3D elements and 360-degree panoramas in expressing a clearer picture to consumers in various fields such as education, military, medicine, entertainment and so on. The web based VR kiosk project in Darulaman's Teacher Training…

  16. Interactive graphical model building using telepresence and virtual reality

    SciTech Connect

    Cooke, C.; Stansfield, S.

    1993-10-01

    This paper presents a prototype system developed at Sandia National Laboratories to create and verify computer-generated graphical models of remote physical environments. The goal of the system is to create an interface between an operator and a computer vision system so that graphical models can be created interactively. Virtual reality and telepresence are used to allow interaction between the operator, computer, and remote environment. A stereo view of the remote environment is produced by two CCD cameras. The cameras are mounted on a three degree-of-freedom platform which is slaved to a mechanically-tracked, stereoscopic viewing device. This gives the operator a sense of immersion in the physical environment. The stereo video is enhanced by overlaying the graphical model onto it. Overlay of the graphical model onto the stereo video allows visual verification of graphical models. Creation of a graphical model is accomplished by allowing the operator to assist the computer in modeling. The operator controls a 3-D cursor to mark objects to be modeled. The computer then automatically extracts positional and geometric information about the object and creates the graphical model.

  17. An Interactive 3D Virtual Anatomy Puzzle for Learning and Simulation - Initial Demonstration and Evaluation.

    PubMed

    Messier, Erik; Wilcox, Jascha; Dawson-Elli, Alexander; Diaz, Gabriel; Linte, Cristian A

    2016-01-01

    To inspire young students (grades 6-12) to become medical practitioners and biomedical engineers, it is necessary to expose them to key concepts of the field in a way that is both exciting and informative. Recent advances in medical image acquisition, manipulation, processing, visualization, and display have revolutionized the approach in which the human body and internal anatomy can be seen and studied. It is now possible to collect 3D, 4D, and 5D medical images of patient specific data, and display that data to the end user using consumer level 3D stereoscopic display technology. Despite such advancements, traditional 2D modes of content presentation such as textbooks and slides are still the standard didactic equipment used to teach young students anatomy. More sophisticated methods of display can help to elucidate the complex 3D relationships between structures that are so often missed when viewing only 2D media, and can instill in students an appreciation for the interconnection between medicine and technology. Here we describe the design, implementation, and preliminary evaluation of a 3D virtual anatomy puzzle dedicated to helping users learn the anatomy of various organs and systems by manipulating 3D virtual data. The puzzle currently comprises several components of the human anatomy and can be easily extended to include additional organs and systems. The 3D virtual anatomy puzzle game was implemented and piloted using three display paradigms - a traditional 2D monitor, a 3D TV with active shutter glasses, and the DK2 version of the Oculus Rift - as well as two different user interaction devices - a space mouse and traditional keyboard controls.

  18. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments

    NASA Astrophysics Data System (ADS)

    Portalés, Cristina; Lerma, José Luis; Navarro, Santiago

    2010-01-01

    Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In very recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to boost human interactions and real-life navigations. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction far beyond the traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application to integrate buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated in real (physical) urban worlds. The augmented environment that is presented herein requires for visualization a see-through video head mounted display (HMD), whereas user's movement navigation is achieved in the real world with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper will deal with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. There are, however, some software and complex issues, which are discussed in the paper.

  19. Modulation of thermal pain-related brain activity with virtual reality: evidence from fMRI.

    PubMed

    Hoffman, Hunter G; Richards, Todd L; Coda, Barbara; Bills, Aric R; Blough, David; Richards, Anne L; Sharar, Sam R

    2004-06-07

    This study investigated the neural correlates of virtual reality analgesia. Virtual reality significantly reduced subjective pain ratings (i.e. analgesia). Using fMRI, pain-related brain activity was measured for each participant during conditions of no virtual reality and during virtual reality (order randomized). As predicted, virtual reality significantly reduced pain-related brain activity in all five regions of interest; the anterior cingulate cortex, primary and secondary somatosensory cortex, insula, and thalamus (p<0.002, corrected). Results showed direct modulation of human brain pain responses by virtual reality distraction.

  20. Comparative analysis of video processing and 3D rendering for cloud video games using different virtualization technologies

    NASA Astrophysics Data System (ADS)

    Bada, Adedayo; Alcaraz-Calero, Jose M.; Wang, Qi; Grecos, Christos

    2014-05-01

    This paper describes a comprehensive empirical performance evaluation of 3D video processing employing the physical/virtual architecture implemented in a cloud environment. Different virtualization technologies, virtual video cards and various 3D benchmark tools have been utilized in order to analyse the optimal performance in the context of 3D online gaming applications. This study highlights 3D video rendering performance under each type of hypervisor, and other factors including network I/O, disk I/O and memory usage. Comparisons of these factors under well-known virtual display technologies such as VNC, Spice and Virtual 3D adaptors reveal the strengths and weaknesses of the various hypervisors with respect to 3D video rendering and streaming.

  1. Capturing differences in dental training using a virtual reality simulator.

    PubMed

    Mirghani, I; Mushtaq, F; Allsop, M J; Al-Saud, L M; Tickhill, N; Potter, C; Keeling, A; Mon-Williams, M A; Manogue, M

    2016-11-19

    Virtual reality simulators are becoming increasingly popular in dental schools across the world. But to what extent do these systems reflect actual dental ability? Addressing this question of construct validity is a fundamental step that is necessary before these systems can be fully integrated into a dental school's curriculum. In this study, we examined the sensitivity of the Simodont (a haptic virtual reality dental simulator) to differences in dental training experience. Two hundred and eighty-nine participants, with 1 (n = 92), 3 (n = 79), 4 (n = 57) and 5 (n = 61) years of dental training, performed a series of tasks upon their first exposure to the simulator. We found statistically significant differences between novice (Year 1) and experienced dental trainees (operationalised as 3 or more years of training), but no differences between performance of experienced trainees with varying levels of experience. This work represents a crucial first step in understanding the value of haptic virtual reality simulators in dental education.

  2. Could virtual reality be effective in treating children with phobias?

    PubMed

    Bouchard, Stéphane

    2011-02-01

    The use of virtual reality to treat anxiety disorders in adults is gaining popularity and its efficacy is supported by numerous outcome studies. Similar research for children is lagging behind. The outcome studies on the use of virtual reality to treat anxiety disorders in children currently address only specific phobias, and all of the available trials are reviewed in this article. Despite the limited number of studies, results are very encouraging for the treatment of school and spider phobias. A study with adolescents suggests that, at least for social anxiety, exposure stimuli would be more effective if they were developed specifically for younger populations. Virtual reality may not increase children's motivation towards therapy unless their fearful apprehension is addressed before initiating the treatment.

  3. vPresent: A cloud based 3D virtual presentation environment for interactive product customization

    NASA Astrophysics Data System (ADS)

    Nan, Xiaoming; Guo, Fei; He, Yifeng; Guan, Ling

    2013-09-01

    In modern society, many companies offer product customization services to their customers. There are two major issues in providing customized products. First, product manufacturers need to effectively present their products to customers who may be located in any geographical area. Second, customers need to be able to provide their feedback on the product in real time. However, the traditional presentation approaches cannot effectively convey sufficient information about the product or efficiently adjust product design according to customers' real-time feedback. In order to address these issues, we propose vPresent, a cloud based 3D virtual presentation environment, in this paper. In vPresent, the product expert can show the 3D virtual product to the remote customers and dynamically customize the product based on customers' feedback, while customers can provide their opinions in real time when they are viewing a vivid 3D visualization of the product. Since the proposed vPresent is a cloud based system, the customers are able to access the customized virtual products from anywhere at any time, via desktop, laptop, or even smart phone. The proposed vPresent is expected to effectively deliver 3D visual information to customers and provide an interactive design platform for the development of customized products.

  4. Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.

    1999-01-01

    We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated in Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information in a way that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to interactively perform measurements from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.

  5. Future directions for the development of virtual reality within an automotive manufacturer.

    PubMed

    Lawson, Glyn; Salanitri, Davide; Waterfield, Brian

    2016-03-01

    Virtual Reality (VR) can reduce time and costs, and lead to increases in quality, in the development of a product. Given the pressure on car companies to reduce time-to-market and to continually improve quality, the automotive industry has championed the use of VR across a number of applications, including design, manufacturing, and training. This paper describes interviews with 11 engineers and employees of allied disciplines from an automotive manufacturer about their current physical and virtual properties and processes. The results guided a review of research findings and scientific advances from the academic literature, which formed the basis of recommendations for future developments of VR technologies and applications. These include: develop a greater range of virtual contexts; use multi-sensory simulation; address perceived differences between virtual and real cars; improve motion capture capabilities; implement networked 3D technology; and use VR for market research.

  6. 3D-e-Chem-VM: Structural Cheminformatics Research Infrastructure in a Freely Available Virtual Machine.

    PubMed

    McGuire, Ross; Verhoeven, Stefan; Vass, Márton; Vriend, Gerrit; de Esch, Iwan J P; Lusher, Scott J; Leurs, Rob; Ridder, Lars; Kooistra, Albert J; Ritschel, Tina; de Graaf, Chris

    2017-02-27

    3D-e-Chem-VM is an open source, freely available Virtual Machine ( http://3d-e-chem.github.io/3D-e-Chem-VM/ ) that integrates cheminformatics and bioinformatics tools for the analysis of protein-ligand interaction data. 3D-e-Chem-VM consists of software libraries, and database and workflow tools that can analyze and combine small molecule and protein structural information in a graphical programming environment. New chemical and biological data analytics tools and workflows have been developed for the efficient exploitation of structural and pharmacological protein-ligand interaction data from proteomewide databases (e.g., ChEMBLdb and PDB), as well as customized information systems focused on, e.g., G protein-coupled receptors (GPCRdb) and protein kinases (KLIFS). The integrated structural cheminformatics research infrastructure compiled in the 3D-e-Chem-VM enables the design of new approaches in virtual ligand screening (Chemdb4VS), ligand-based metabolism prediction (SyGMa), and structure-based protein binding site comparison and bioisosteric replacement for ligand design (KRIPOdb).

  7. 3D-e-Chem-VM: Structural Cheminformatics Research Infrastructure in a Freely Available Virtual Machine

    PubMed Central

    2017-01-01

    3D-e-Chem-VM is an open source, freely available Virtual Machine (http://3d-e-chem.github.io/3D-e-Chem-VM/) that integrates cheminformatics and bioinformatics tools for the analysis of protein–ligand interaction data. 3D-e-Chem-VM consists of software libraries, and database and workflow tools that can analyze and combine small molecule and protein structural information in a graphical programming environment. New chemical and biological data analytics tools and workflows have been developed for the efficient exploitation of structural and pharmacological protein–ligand interaction data from proteomewide databases (e.g., ChEMBLdb and PDB), as well as customized information systems focused on, e.g., G protein-coupled receptors (GPCRdb) and protein kinases (KLIFS). The integrated structural cheminformatics research infrastructure compiled in the 3D-e-Chem-VM enables the design of new approaches in virtual ligand screening (Chemdb4VS), ligand-based metabolism prediction (SyGMa), and structure-based protein binding site comparison and bioisosteric replacement for ligand design (KRIPOdb). PMID:28125221

  8. Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters

    NASA Astrophysics Data System (ADS)

    Apostolellis, Panagiotis; Daradoumis, Thanasis

    Informal learning in public spaces like museums, science centers and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like Virtual Reality systems. By conducting an extensive literature review we have come to the conclusion that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments presents a promising approach to support effective learning in groups of school-aged children.

  9. Virtual Charter Schools: Realities and Unknowns

    ERIC Educational Resources Information Center

    Torre, Daniela

    2013-01-01

    Virtual charter schools have emerged over the last decade as an increasingly popular alternative to traditional public schooling. Unlike their face-to-face counterparts, virtual charter schools educate students through blended or entirely online curricula. They present a host of new policy issues that should be scrutinized in order to ensure that…

  10. fVisiOn: glasses-free tabletop 3D display to provide virtual 3D media naturally alongside real media

    NASA Astrophysics Data System (ADS)

    Yoshida, Shunsuke

    2012-06-01

    A novel glasses-free tabletop 3D display, named fVisiOn, floats virtual 3D objects on an empty, flat tabletop surface and enables multiple viewers to observe raised 3D images from any angle around the full 360°. Our glasses-free 3D image reproduction method employs a combination of an optical device and an array of projectors and produces continuous horizontal parallax along a circular path located above the table. The optical device is shaped as a hollow cone and works as an anisotropic diffuser. The circularly arranged projectors cast numerous rays into the optical device. Each ray corresponds to one that passes through a particular point on a virtual object's surface and is oriented toward a viewing area around the table. At any viewpoint on the ring-shaped viewing area, the two eyes collect fractional images from different projectors, and all viewers around the table can perceive the scene as 3D from their own perspectives because the images include binocular disparity. The entire mechanism is installed beneath the table, so the tabletop area remains clear and no ordinary tabletop activities are disturbed. Many people can naturally share the 3D images displayed together with real objects on the table. In our latest prototype, we employed a handmade optical device and an array of over 100 tiny projectors. This configuration reproduces static and animated 3D scenes for a 130° viewing area and allows 5-cm-tall virtual characters to play soccer and dance on the table.
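
    To make the ray-assignment idea above concrete, here is a toy numerical sketch (not the authors' optical model): it assumes N projectors evenly spaced on a circle and maps each eye position on the ring-shaped viewing area to its nearest projector azimuth, so the two eyes of one viewer receive images from different projectors and binocular disparity falls out of the discrete projector array. The projector count, ring radius and interpupillary distance used here are illustrative assumptions.

```python
import numpy as np

# Toy sketch of ray assignment in a circular projector array (illustrative only;
# the projector count and geometry below are assumptions, not the paper's values).
N_PROJECTORS = 103
proj_az = np.linspace(0.0, 2 * np.pi, N_PROJECTORS, endpoint=False)

def nearest_projector(eye_azimuth_rad):
    """Index of the projector whose azimuth is closest to the given eye azimuth."""
    diff = np.angle(np.exp(1j * (proj_az - eye_azimuth_rad)))  # wrapped angle difference
    return int(np.argmin(np.abs(diff)))

# One viewer on the ring-shaped viewing area: eyes ~6.5 cm apart on a 0.5 m radius ring.
viewer_az = np.deg2rad(40.0)
half_ipd_angle = (0.065 / 2) / 0.5          # small-angle approximation, radians
left = nearest_projector(viewer_az + half_ipd_angle)
right = nearest_projector(viewer_az - half_ipd_angle)
print(left, right)  # different projector indices -> each eye sees a different image
```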

  11. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve

    2011-03-01

    Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat-screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it were a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine, as well as other tasks that require hand-eye coordination. One of the most distinctive characteristics of HUVR is that a user can place their hands inside the virtual environment without occluding the 3D image. Built using open-source software and consumer-level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.

  12. Thermal feedback in virtual reality and telerobotic systems

    NASA Technical Reports Server (NTRS)

    Zerkus, Mike; Becker, Bill; Ward, Jon; Halvorsen, Lars

    1994-01-01

    A new concept has been developed that allows temperature to be part of the virtual world. The Displaced Temperature Sensing System (DTSS) can 'display' temperature in a virtual reality system. The DTSS can also serve as a feedback device for telerobotics. For virtual reality applications, the virtual world software would be required to have a temperature map of its world. By whatever means (magnetic tracker, ultrasound tracker, etc.), the hand and fingers, which have been instrumented with thermodes, would be tracked. The temperature associated with the current position would be transmitted to the DTSS via a serial data link, and the DTSS would provide that temperature to the fingers. For telerobotic operation, the function of the DTSS is to transmit a temperature from a remote location to the fingers, where the temperature can be felt.
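
    The abstract leaves the serial protocol unspecified, but the idea of streaming a position-dependent temperature setpoint to a thermode controller over a serial link can be sketched as follows. This is a minimal illustration using the pySerial package; the port name, baud rate, message framing and temperature map are assumptions, not the DTSS's actual interface.

```python
import serial  # pySerial

# Hypothetical temperature map of the virtual world: region name -> degrees C.
TEMPERATURE_MAP = {"ice_block": 2.0, "ambient": 22.0, "engine_housing": 48.0}

def send_setpoint(port, finger_id, temp_c):
    """Send one temperature setpoint for one thermode (assumed ASCII framing)."""
    msg = f"T,{finger_id},{temp_c:.1f}\r\n".encode("ascii")
    port.write(msg)

if __name__ == "__main__":
    # Port name and baud rate are placeholders for whatever the hardware uses.
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
        region = "ice_block"               # region the tracked fingertip is touching
        for finger in range(5):
            send_setpoint(port, finger, TEMPERATURE_MAP[region])
```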

  13. Efficacy of virtual reality in pedestrian safety research.

    PubMed

    Deb, Shuchisnigdha; Carruth, Daniel W; Sween, Richard; Strawderman, Lesley; Garrison, Teena M

    2017-03-16

    Advances in virtual reality technology present new opportunities for human factors research in areas that are dangerous, difficult, or expensive to study in the real world. The authors developed a new pedestrian simulator using the HTC Vive head mounted display and Unity software. Pedestrian head position and orientation were tracked as participants attempted to safely cross a virtual signalized intersection (5.5 m). In 10% of 60 trials, a vehicle violated the traffic signal and in 10.84% of these trials, a collision between the vehicle and the pedestrian was observed. Approximately 11% of the participants experienced simulator sickness and withdrew from the study. Objective measures, including the average walking speed, indicate that participant behavior in VR matches published real world norms. Subjective responses indicate that the virtual environment was realistic and engaging. Overall, the study results confirm the effectiveness of the new virtual reality technology for research on full motion tasks.
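
    The objective walking-speed measure mentioned above can be computed directly from the tracked head positions. The sketch below (not the authors' code) estimates average walking speed from timestamped horizontal head positions such as those logged by an HMD tracker.

```python
import math

def average_walking_speed(samples):
    """samples: list of (t_seconds, x_m, z_m) head positions on the ground plane.
    Returns mean speed in m/s over the crossing (path length / elapsed time)."""
    if len(samples) < 2:
        return 0.0
    path = 0.0
    for (t0, x0, z0), (t1, x1, z1) in zip(samples, samples[1:]):
        path += math.hypot(x1 - x0, z1 - z0)
    elapsed = samples[-1][0] - samples[0][0]
    return path / elapsed if elapsed > 0 else 0.0

# Hypothetical 5.5 m crossing logged at three instants:
print(average_walking_speed([(0.0, 0.0, 0.0), (2.0, 0.0, 2.8), (4.0, 0.0, 5.5)]))  # ~1.4 m/s
```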

  14. Image-based 3D reconstruction and virtual environmental walk-through

    NASA Astrophysics Data System (ADS)

    Sun, Jifeng; Fang, Lixiong; Luo, Ying

    2001-09-01

    We present a 3D reconstruction method that combines geometry-based modeling, image-based modeling, and rendering techniques. The first component is an interactive geometric modeling method that recovers the basic geometry of the photographed scene. The second component is a model-based stereo algorithm. We discuss the image-processing problems and algorithms involved in walking through virtual space, then design and implement a high-performance multi-threaded wandering algorithm. The applications range from architectural planning and archaeological reconstruction to virtual environments and cinematic special effects.
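
    As a rough illustration of the image-based component, the sketch below recovers the relative pose of two photographs with OpenCV and triangulates sparse 3D points. This is a generic two-view pipeline offered for orientation only, not the authors' geometry-based modeling or model-based stereo algorithm; the intrinsic matrix K is a placeholder.

```python
import cv2
import numpy as np

def two_view_points(img1_path, img2_path, K):
    """Sparse two-view reconstruction: match features, estimate the essential
    matrix, recover relative pose, and triangulate 3D points (intrinsics K)."""
    im1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
    im2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(im1, None)
    k2, d2 = sift.detectAndCompute(im2, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(d1, d2)
    p1 = np.float32([k1[m.queryIdx].pt for m in matches])
    p2 = np.float32([k2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, p1, p2, K, mask=mask)
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4 = cv2.triangulatePoints(P1, P2, p1.T, p2.T)
    return (pts4[:3] / pts4[3]).T          # N x 3 scene points, up to scale

# Placeholder intrinsics (focal length and principal point are assumptions):
K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])
```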

  15. Future Cyborgs: Human-Machine Interface for Virtual Reality Applications

    DTIC Science & Technology

    2007-04-01

    to enhance the immersive quality of an environment. Walt Disney World uses the sense of smell during their virtual reality ride “Soaring” to... application. It is the interface that allows the man to become immersed in an artificially created world. It is the interface that allows him to interact... natural and realistic interactions. These revolutionary interfaces should be able to overcome the limitations of the current generation of virtual

  16. Virtual Reality as a Tool in Early Interventions

    DTIC Science & Technology

    2006-04-01

    the Telemedicine and Advanced Technology Research Center (TATRC), studying virtual reality therapy as an early intervention tool for PTSD. 1.0... and symptoms of PTSD, we discuss treatment options, focusing on VR therapy. We then describe the research that we are conducting at the Virtual... Center (TATRC), studying VR therapy as an early intervention tool for war-related PTSD. 2.0 POST-TRAUMATIC STRESS DISORDER PTSD affects an estimated

  17. Versatile, Immersive, Creative and Dynamic Virtual 3-D Healthcare Learning Environments: A Review of the Literature

    PubMed Central

    2008-01-01

    The author provides a critical overview of three-dimensional (3-D) virtual worlds and “serious gaming” that are currently being developed and used in healthcare professional education and medicine. The relevance of this e-learning innovation for teaching students and professionals is debatable and variables influencing adoption, such as increased knowledge, self-directed learning, and peer collaboration, by academics, healthcare professionals, and business executives are examined while looking at various Web 2.0/3.0 applications. There is a need for more empirical research in order to unearth the pedagogical outcomes and advantages associated with this e-learning technology. A brief description of Roger’s Diffusion of Innovations Theory and Siemens’ Connectivism Theory for today’s learners is presented as potential underlying pedagogical tenets to support the use of virtual 3-D learning environments in higher education and healthcare. PMID:18762473

  18. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    PubMed Central

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable a fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl’s law for symmetric multicore chips showed the potential of high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available. PMID:24910506
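
    Amdahl's law, on which the scalability analysis above rests, states that with a parallelizable fraction f of the work and n cores the speedup is S(n) = 1 / ((1 - f) + f/n). The short sketch below (illustrative only, not the authors' analysis code) shows how close to fully parallel a workload must be to approach the reported 12-fold gain on 12 cores.

```python
def amdahl_speedup(f, n):
    """Amdahl's law: speedup on n cores when a fraction f of the work parallelizes."""
    return 1.0 / ((1.0 - f) + f / n)

for f in (0.90, 0.95, 0.99, 1.00):
    print(f"parallel fraction {f:.2f}: speedup on 12 cores = {amdahl_speedup(f, 12):.1f}x")
# A 12-fold speedup on 12 cores implies an essentially fully parallelizable workload;
# even a 5% serial fraction caps the 12-core speedup at about 7.7x.
```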

  19. 3D augmented reality for improving social acceptance and public participation in wind farms planning

    NASA Astrophysics Data System (ADS)

    Grassi, S.; Klein, T. M.

    2016-09-01

    Wind energy is one of the most important sources of renewable energy, characterized by significant growth in recent decades and making an increasingly relevant contribution to the energy supply. One of the main obstacles to faster integration of wind energy into the energy mix is the visual impact of wind turbines on the landscape. In addition, the siting of new massive infrastructure can threaten a community's well-being if new projects are perceived as unfair. The public perception of the impact of wind turbines on the landscape is also crucial for their acceptance. The implementation of wind energy projects is often hampered by a lack of planning or communication tools enabling a more transparent and efficient interaction between all stakeholders involved in the projects (i.e., developers, local communities and administrations, NGOs, etc.). Concerning the visual assessment of wind farms, a critical gap lies in effective visualization tools to improve the public perception of alternative wind turbine layouts. In this paper, we describe the advantages of a 3D dynamic and interactive visualization platform for augmented reality to support wind energy planners and to enhance the social acceptance of new wind energy projects.

  20. Virtual Reality Simulation of the International Space Welding Experiment

    NASA Technical Reports Server (NTRS)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer-simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of virtual reality simulators are the flight simulators used by all branches of the military to train pilots for combat in high-performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment (ISWE). My role in the development of the ISWE trainer consisted of the following: (1) creating texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developing C code for control panel button selection and rotation of the sample drum; (3) in collaboration with Tim Clark (Antares Virtual Reality Systems), developing a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) in collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), establishing the interference characteristics of the VIM 1000 head-mounted display and testing software filters to correct the problem; (5) in collaboration with Peter Wang and Mark Blasingame, establishing software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR workstation, described below.

  1. The cognitive apprenticeship theory for the teaching of mathematics in an online 3D virtual environment

    NASA Astrophysics Data System (ADS)

    Bouta, Hara; Paraskeva, Fotini

    2013-03-01

    Research spanning two decades shows that there is a continuing development of 3D virtual worlds and investment in such environments for educational purposes. Research stresses the need for these environments to be well-designed and for suitable pedagogies to be implemented in the teaching practice in order for these worlds to be fully effective. To this end, we propose a pedagogical framework based on the cognitive apprenticeship for deriving principles and guidelines to inform the design, development and use of a 3D virtual environment. This study examines how the use of a 3D virtual world facilitates the teaching of mathematics in primary education by combining design principles and guidelines based on the Cognitive Apprenticeship Theory and the teaching methods that this theory introduces. We focus specifically on 5th and 6th grade students' engagement (behavioral, affective and cognitive) while learning fractional concepts over a period of two class sessions. Quantitative and qualitative analyses indicate considerable improvement in the engagement of the students who participated in the experiment. This paper presents the findings regarding students' cognitive engagement in the process of comprehending basic fractional concepts - notoriously hard for students to master. The findings are encouraging and suggestions are made for further research.

  2. Using virtual reality environment to improve joint attention associated with pervasive developmental disorder.

    PubMed

    Cheng, Yufang; Huang, Ruowen

    2012-01-01

    The focus of this study is using a data glove to practice joint attention skills in a virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe setting for people with PDD: when they make errors during practice, there are no painful or dangerous consequences to deal with. Joint attention is a critical skill deficit in the disorder characteristics of children with PDD, and its absence frequently affects their social relationships in daily life. Therefore, this study designed the Joint Attention Skills Learning (JASL) system with a data glove tool to help children with PDD practice joint attention behavior skills. The JASL focuses specifically on the skills of pointing, showing, sharing things, and behavioral interaction with other children with PDD. The system is designed as a playroom scene and presented from a first-person perspective for users. Its functions include pointing and showing, moving virtual objects, 3D animation, text, speech sounds, and feedback. The study employed a single-subject multiple-probe design across subjects with visual inspection analysis, and the experimental phase took 3 months to complete. The results reveal that the participants continued to improve their joint attention skills in daily life after using the JASL system. The significant potential of this particular treatment of joint attention for each participant is discussed in detail in this paper.

  3. Air Force Medical Modeling and Simulation: Bringing Virtual Reality to Reality

    DTIC Science & Technology

    2011-01-26

    Briefing by Colonel Deborah N. Burgess, MD, FACP, Medical Modernization Division, presented at the 2011 Military Health System Conference (26 January 2011): "Air Force Medical Modeling and Simulation: Bringing Virtual Reality to Reality."

  4. Architecture of web services in the enhancement of real-time 3D video virtualization in cloud environment

    NASA Astrophysics Data System (ADS)

    Bada, Adedayo; Wang, Qi; Alcaraz-Calero, Jose M.; Grecos, Christos

    2016-04-01

    This paper proposes a new approach to improving the application of 3D video rendering and streaming by jointly exploring and optimizing both cloud-based virtualization and web-based delivery. The proposed web service architecture firstly establishes a software virtualization layer based on QEMU (Quick Emulator), an open-source virtualization software that has been able to virtualize system components except for 3D rendering, which is still in its infancy. The architecture then explores the cloud environment to boost the speed of the rendering at the QEMU software virtualization layer. The capabilities and inherent limitations of Virgil 3D, which is one of the most advanced 3D virtual Graphics Processing Unit (GPU) available, are analyzed through benchmarking experiments and integrated into the architecture to further speed up the rendering. Experimental results are reported and analyzed to demonstrate the benefits of the proposed approach.

  5. Virtual 3D bladder reconstruction for augmented medical records from white light cystoscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Lurie, Kristen L.; Zlatev, Dimitar V.; Angst, Roland; Liao, Joseph C.; Ellerbee, Audrey K.

    2016-02-01

    Bladder cancer has a high recurrence rate that necessitates lifelong surveillance to detect mucosal lesions. Examination with white light cystoscopy (WLC), the standard of care, is inherently subjective and data storage limited to clinical notes, diagrams, and still images. A visual history of the bladder wall can enhance clinical and surgical management. To address this clinical need, we developed a tool to transform in vivo WLC videos into virtual 3-dimensional (3D) bladder models using advanced computer vision techniques. WLC videos from rigid cystoscopies (1280 x 720 pixels) were recorded at 30 Hz followed by immediate camera calibration to control for image distortions. Video data were fed into an automated structure-from-motion algorithm that generated a 3D point cloud followed by a 3D mesh to approximate the bladder surface. The highest quality cystoscopic images were projected onto the approximated bladder surface to generate a virtual 3D bladder reconstruction. In intraoperative WLC videos from 36 patients undergoing transurethral resection of suspected bladder tumors, optimal reconstruction was achieved from frames depicting well-focused vasculature, when the bladder was maintained at constant volume with minimal debris, and when regions of the bladder wall were imaged multiple times. A significant innovation of this work is the ability to perform the reconstruction using video from a clinical procedure collected with standard equipment, thereby facilitating rapid clinical translation, application to other forms of endoscopy and new opportunities for longitudinal studies of cancer recurrence.
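
    One practical step implied by the abstract, selecting the best-focused cystoscopy frames for texture projection, can be sketched with a standard sharpness measure (variance of the Laplacian). This is a generic illustration under our own assumptions, not the authors' published pipeline.

```python
import cv2

def focus_score(frame_bgr):
    """Variance of the Laplacian: higher values indicate sharper (better focused) frames."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def best_focused_frames(video_path, keep=50):
    """Rank frames of a WLC video by sharpness and return indices of the sharpest ones."""
    cap = cv2.VideoCapture(video_path)
    scores = []
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        scores.append((focus_score(frame), idx))
        idx += 1
    cap.release()
    return [i for _, i in sorted(scores, reverse=True)[:keep]]
```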

  6. 3D Audio System

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Ames Research Center research into virtual reality led to the development of the Convolvotron, a high speed digital audio processing system that delivers three-dimensional sound over headphones. It consists of a two-card set designed for use with a personal computer. The Convolvotron's primary application is presentation of 3D audio signals over headphones. Four independent sound sources are filtered with large time-varying filters that compensate for motion. The perceived location of the sound remains constant. Possible applications are in air traffic control towers or airplane cockpits, hearing and perception research and virtual reality development.
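
    The core operation the Convolvotron performs, filtering a source with direction-dependent head-related impulse responses so that it appears to come from a fixed location over headphones, can be sketched in a few lines. This toy example uses synthetic static impulse responses; the real system uses measured HRTFs and large time-varying filters that track head motion.

```python
import numpy as np
from scipy.signal import fftconvolve

def binauralize(mono, hrir_left, hrir_right):
    """Convolve a mono source with left/right head-related impulse responses.
    Returns an (N, 2) stereo array for headphone playback."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    out = np.stack([left, right], axis=1)
    return out / (np.max(np.abs(out)) + 1e-12)   # normalize to avoid clipping

# Toy impulse responses standing in for measured HRTFs: the right ear receives the
# sound slightly later and quieter, which is heard as a source off to the left.
fs = 44100
mono = np.random.randn(fs)                       # 1 s of noise as a test source
hrir_l = np.zeros(256); hrir_l[0] = 1.0
hrir_r = np.zeros(256); hrir_r[30] = 0.6         # ~0.7 ms interaural delay
stereo = binauralize(mono, hrir_l, hrir_r)
```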

  7. 3D Virtual Worlds as Art Media and Exhibition Arenas: Students' Responses and Challenges in Contemporary Art Education

    ERIC Educational Resources Information Center

    Lu, Lilly

    2013-01-01

    3D virtual worlds (3D VWs) are considered one of the emerging learning spaces of the 21st century; however, few empirical studies have investigated educational applications and student learning aspects in art education. This study focused on students' responses to and challenges with 3D VWs in both aspects. The findings show that most participants…

  8. Virtual Reality for Life Skills Education: Program Evaluation

    ERIC Educational Resources Information Center

    Vogel, Jennifer; Bowers, Clint; Meehan, Cricket; Hoeft, Raegan; Bradley, Kristy

    2004-01-01

    A program evaluation was completed for a Virtual Reality (VR) pilot project intended to aid deaf children in learning various life skills which they may be at risk of not adequately learning. Such skills include crossing the street safely, exiting a building during a fire drill, and avoiding situations in which strangers may harm them. The VR was…

  9. Virtual Reality: Teaching Tool of the Twenty-First Century?

    ERIC Educational Resources Information Center

    Hoffman, Helene; Vu, Dzung

    1997-01-01

    Virtual reality-based procedural and surgical simulations promise to revolutionize medical training. A wide range of simulations representing diverse content areas and varied implementation strategies are under development or in early use. The new systems will make broad-based training experiences available for students at all levels without risks…

  10. Virtual Reality in Psychological, Medical and Pedagogical Applications

    ERIC Educational Resources Information Center

    Eichenberg, Christiane, Ed.

    2012-01-01

    This book has an aim to present latest applications, trends and developments of virtual reality technologies in three humanities disciplines: in medicine, psychology and pedagogy. Studies show that people in both educational as well as in the medical therapeutic range expect more and more that modern media are included in the corresponding demand…

  11. Role-Playing a Legend in Virtual Reality.

    ERIC Educational Resources Information Center

    Ge, Xun; Lee, Jack; Yamashiro, Kelly A.

    2003-01-01

    Reports a case study of thirteen college students engaging in a role-play activity of a Maui legend in a virtual reality environment. Immersed in the authentic cultural environment, the students not only interacted with the environment and each other, but recreated the legend based on their interpretation of the culture. (CAK)

  12. Are Spatial Visualization Abilities Relevant to Virtual Reality?

    ERIC Educational Resources Information Center

    Chen, Chwen Jen

    2006-01-01

    This study aims to investigate the effects of virtual reality (VR)-based learning environment on learners of different spatial visualization abilities. The findings of the aptitude-by-treatment interaction study have shown that learners benefit most from the Guided VR mode, irrespective of their spatial visualization abilities. This indicates that…

  13. QuickTime Virtual Reality for Web Delivery.

    ERIC Educational Resources Information Center

    Hodges, Charles

    Virtual reality (VR) can create a unique and interesting environment in which students at a distance can explore and investigate objects or scenes via the World Wide Web. Creating these VR components is a process that is much more simple than many believe. This paper outlines when using VR may be appropriate in instructional settings and describes…

  14. Virtual Reality Hypermedia Design Frameworks for Science Instruction.

    ERIC Educational Resources Information Center

    Maule, R. William; Oh, Byron; Check, Rosa

    This paper reports on a study that conceptualizes a research framework to aid software design and development for virtual reality (VR) computer applications for instruction in the sciences. The framework provides methodologies for the processing, collection, examination, classification, and presentation of multimedia information within hyperlinked…

  15. Language Policies as Virtual Reality: Two Australian Examples.

    ERIC Educational Resources Information Center

    Moore, Helen

    1996-01-01

    Explores the background of Australia's shift from its National Policy on Languages (NPL) to the Australian Language and Literacy Policy (ALLP). Findings are that the ALLP's virtual reality serves the ideologies of a power elite; and the NPL's understandings are necessary to avoid the consequences of alienation and racism. (50 references) (CK)

  16. Advanced Virtual Reality Simulations in Aerospace Education and Research

    NASA Astrophysics Data System (ADS)

    Plotnikova, L.; Trivailo, P.

    2002-01-01

    Recent research developments at Aerospace Engineering, RMIT University have demonstrated great potential for using Virtual Reality simulations as a very effective tool in advanced structures and dynamics applications. They have also been extremely successful in the teaching of various undergraduate and postgraduate courses, for presenting complex concepts in structural and dynamic design. Characteristic examples relate to classical orbital mechanics, spacecraft attitude dynamics and structural dynamics. Advanced simulations, reflecting current research by the authors, are mainly related to the implementation of various non-linear dynamic techniques, including the use of Kane's equations to study the dynamics of space tethered satellite systems and the co-rotational finite element method to study reconfigurable robotic systems undergoing large rotations and large translations. The current article will describe the numerical implementation of these modern methods of dynamics and will concentrate on the post-processing stage of the dynamic simulations. Numerous examples of building Virtual Reality stand-alone animations, designed by the authors, will be discussed in detail. The striking feature of the developed technology is the use of standard mathematical packages, like MATLAB, as a post-processing tool to generate Virtual Reality Modelling Language files with brilliant interactive graphics and audio effects. These stand-alone demonstration files can be run under Netscape or Microsoft Explorer and do not require MATLAB. Use of this technology enables scientists to easily share their results with colleagues using the Internet, contributing to flexible learning development at schools and universities.

  17. How to Create a Low-Cost Virtual Reality Network.

    ERIC Educational Resources Information Center

    Moore, Noel

    1993-01-01

    Describes a project which developed a shared electronic environment of virtual reality using satellite telecommunications technologies to create desktop multimedia networking. The origins of the concept of shared electronic space are explained, and the importance for human communication of sharing both audio and visual space simultaneously is…

  18. The Future of Virtual Reality in the Classroom

    ERIC Educational Resources Information Center

    Vance, Amelia

    2016-01-01

    As state boards of education and other state policymakers consider the future of schools, sorting fad technology from technology that accelerates learning is key. Virtual reality (VR) is one such technology with promise that seems unlikely to fizzle. Hailed as potentially transformative for education and still in the early stages of application,…

  19. Language Learning in Virtual Reality Environments: Past, Present, and Future

    ERIC Educational Resources Information Center

    Lin, Tsun-Ju; Lan, Yu-Ju

    2015-01-01

    This study investigated the research trends in language learning in a virtual reality environment by conducting a content analysis of findings published in the literature from 2004 to 2013 in four top ranked computer-assisted language learning journals: "Language Learning & Technology," "CALICO Journal," "Computer…

  20. A Virtual Reality Dance Training System Using Motion Capture Technology

    ERIC Educational Resources Information Center

    Chan, J. C. P.; Leung, H.; Tang, J. K. T.; Komura, T.

    2011-01-01

    In this paper, a new dance training system based on the motion capture and virtual reality (VR) technologies is proposed. Our system is inspired by the traditional way to learn new movements-imitating the teacher's movements and listening to the teacher's feedback. A prototype of our proposed system is implemented, in which a student can imitate…

  1. Improving Weight Maintenance Using Virtual Reality (Second Life)

    ERIC Educational Resources Information Center

    Sullivan, Debra K.; Goetz, Jeannine R.; Gibson, Cheryl A.; Washburn, Richard A.; Smith, Bryan K.; Lee, Jaehoon; Gerald, Stephanie; Fincham, Tennille; Donnelly, Joseph E.

    2013-01-01

    Objective: Compare weight loss and maintenance between a face-to-face (FTF) weight management clinic and a clinic delivered via virtual reality (VR). Methods: Participants were randomized to 3 months of weight loss with a weekly clinic delivered via FTF or VR and then 6 months' weight maintenance delivered with VR. Data were collected at baseline…

  2. Teaching Marketing through a Micro-Economy in Virtual Reality

    ERIC Educational Resources Information Center

    Drake-Bridges, Erin; Strelzoff, Andrew; Sulbaran, Tulio

    2011-01-01

    Teaching retailing principles to students is a challenge because although real-world wholesale and retail decision making very heavily depends on dynamic conditions, classroom exercises are limited to abstract discussions and role-playing. This article describes two interlocking class projects taught using the virtual reality of secondlife.com,…

  3. Reduced Mimicry to Virtual Reality Avatars in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Forbes, Paul A. G.; Pan, Xueni; de C. Hamilton, Antonia F.

    2016-01-01

    Mimicry involves unconsciously copying the actions of others. Increasing evidence suggests that autistic people can copy the goal of an observed action but show differences in their mimicry. We investigated mimicry in autism spectrum disorder (ASD) within a two-dimensional virtual reality environment. Participants played an imitation game with a…

  4. Issues Surrounding the Use of Virtual Reality in Geographic Education

    ERIC Educational Resources Information Center

    Lisichenko, Richard

    2015-01-01

    As with all classroom innovations intended to improve geographic education, the adoption of virtual reality (VR) poses issues for consideration prior to endorsing its use. Of these, effectiveness, implementation, and safe use need to be addressed. Traditionally, sense of place, geographic knowledge, and firsthand experiences provided by field…

  5. A Constructivist Approach to Virtual Reality for Experiential Learning

    ERIC Educational Resources Information Center

    Aiello, P.; D'Elia, F.; Di Tore, S.; Sibilio, M.

    2012-01-01

    Consideration of a possible use of virtual reality technologies in school contexts requires gathering together the suggestions of many scientific domains aimed at "understanding" the features of these same tools that let them offer valid support to the teaching-learning processes in educational settings. Specifically, the present study is aimed at…

  6. Feasibility of Virtual Reality Environments for Adolescent Social Anxiety Disorder

    ERIC Educational Resources Information Center

    Parrish, Danielle E.; Oxhandler, Holly K.; Duron, Jacuelynn F.; Swank, Paul; Bordnick, Patrick

    2016-01-01

    Purpose: This study assessed the feasibility of virtual reality (VR) exposure as an assessment and treatment modality for youth with social anxiety disorder (SAD). Methods: Forty-one adolescents, 20 of which were identified as having SAD, were recruited from a community sample. Youth with and without SAD were exposed to two social virtual…

  7. Virtual Reality: An Experiential Tool for Clinical Psychology

    ERIC Educational Resources Information Center

    Riva, Giuseppe

    2009-01-01

    Several Virtual Reality (VR) applications for the understanding, assessment and treatment of mental health problems have been developed in the last 15 years. Typically, in VR the patient learns to manipulate problematic situations related to his/her problem. In fact, VR can be described as an advanced form of human-computer interface that is able…

  8. Exploration through Virtual Reality: Encounters with the Target Culture

    ERIC Educational Resources Information Center

    O'Brien, Mary Grantham; Levy, Richard M.

    2008-01-01

    This paper presents the results of a study on the use of a virtual reality (VR) world in a German language classroom. After participating in a lesson on the use of commands, students experienced the language and culture through navigation in a VR world. It is argued that this new medium allows for students to be immersed in the target culture and…

  9. Virtual Reality as a Distraction Technique in Chronic Pain Patients

    PubMed Central

    Gao, Kenneth; Sulea, Camelia; Wiederhold, Mark D.

    2014-01-01

    We explored the use of virtual reality distraction techniques for use as adjunctive therapy to treat chronic pain. Virtual environments were specifically created to provide pleasant and engaging experiences where patients navigated on their own through rich and varied simulated worlds. Real-time physiological monitoring was used as a guide to determine the effectiveness and sustainability of this intervention. Human factors studies showed that virtual navigation is a safe and effective method for use with chronic pain patients. Chronic pain patients demonstrated significant relief in subjective ratings of pain that corresponded to objective measurements in peripheral, noninvasive physiological measures. PMID:24892196

  10. Learning Science in a Virtual Reality Application: The Impacts of Animated-Virtual Actors' Visual Complexity

    ERIC Educational Resources Information Center

    Kartiko, Iwan; Kavakli, Manolya; Cheng, Ken

    2010-01-01

    As the technology in computer graphics advances, Animated-Virtual Actors (AVAs) in Virtual Reality (VR) applications become increasingly rich and complex. Cognitive Theory of Multimedia Learning (CTML) suggests that complex visual materials could hinder novice learners from attending to the lesson properly. On the other hand, previous studies have…

  11. Virtual Boutique: a 3D modeling and content-based management approach to e-commerce

    NASA Astrophysics Data System (ADS)

    Paquet, Eric; El-Hakim, Sabry F.

    2000-12-01

    The Virtual Boutique is made up of three modules: the decor, the market and the search engine. The decor is the physical space occupied by the Virtual Boutique. It can reproduce any existing boutique; for this purpose, photogrammetry is used. A set of pictures of a real boutique or space is taken, and a virtual 3D representation of this space is calculated from them using software developed at NRC. This representation consists of meshes and texture maps; the camera used in the acquisition process determines the resolution of the texture maps. Decorative elements are added, such as paintings, computer-generated objects and scanned objects. The objects are scanned with a laser scanner developed at NRC, which allows simultaneous acquisition of range and color information based on white-laser-beam triangulation. The second module, the market, is made up of all the merchandise and the manipulators, which are used to manipulate and compare the objects. The third module, the search engine, can search the inventory based on an object shown by the customer in order to retrieve similar objects based on shape and color. The items of interest are displayed in the boutique by reconfiguring the market space, which means that the boutique can be continuously customized according to the customer's needs. The Virtual Boutique is entirely written in Java 3D, can run in mono and stereo mode, and has been optimized to allow high-quality rendering.
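
    The colour part of the search engine can be approximated with a simple content-based retrieval sketch: describe each item by a colour histogram and rank the inventory by histogram similarity to the query object. This is a generic illustration built on OpenCV histograms and correlation comparison, not NRC's actual shape-and-colour descriptors.

```python
import cv2

def color_descriptor(image_path):
    """HSV colour histogram (8x8x8 bins), normalized, as a simple appearance descriptor."""
    img = cv2.imread(image_path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1, 2], None, [8, 8, 8], [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def rank_inventory(query_path, inventory_paths):
    """Return inventory image paths sorted by colour similarity to the query object."""
    q = color_descriptor(query_path)
    scored = [(cv2.compareHist(q, color_descriptor(p), cv2.HISTCMP_CORREL), p)
              for p in inventory_paths]
    return [p for _, p in sorted(scored, reverse=True)]
```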

  12. Improving the Management of an Air Campaign with Virtual Reality

    DTIC Science & Technology

    1995-06-01

    autostereoscopic, and volumetric systems make up this technology area, as well as particular projection technologies like cathode ray tubes (CRTs... They replace the user's field of view with a virtual recreation. Autostereoscopic systems achieve 3-D in the same fashion as stereoscopic, except... Volumetric systems are fundamentally different from stereo- and autostereoscopic. Whereas stereoscopic systems use two perspective views to mimic a 3-D

  13. Education about Hallucinations Using an Internet Virtual Reality System: A Qualitative Survey

    ERIC Educational Resources Information Center

    Yellowlees, Peter M.; Cook, James N.

    2006-01-01

    Objective: The authors evaluate an Internet virtual reality technology as an education tool about the hallucinations of psychosis. Method: This is a pilot project using Second Life, an Internet-based virtual reality system, in which a virtual reality environment was constructed to simulate the auditory and visual hallucinations of two patients…

  14. Applications of virtual reality to nuclear safeguards and non-proliferation

    SciTech Connect

    Stansfield, S.

    1996-12-31

    This paper presents several applications of virtual reality relevant to the areas of nuclear safeguards and non-proliferation. Each of these applications was developed to the prototype stage at Sandia National Laboratories' Virtual Reality and Intelligent Simulation laboratory. These applications include the use of virtual reality for facility visualization, training of inspection personnel, and security and monitoring of nuclear facilities.

  15. The Potential of Using Virtual Reality Technology in Physical Activity Settings

    ERIC Educational Resources Information Center

    Pasco, Denis

    2013-01-01

    In recent years, virtual reality technology has been successfully used for learning purposes. The purposes of the article are to examine current research on the role of virtual reality in physical activity settings and discuss potential application of using virtual reality technology to enhance learning in physical education. The article starts…

  16. Using Virtual Reality Environment to Improve Joint Attention Associated with Pervasive Developmental Disorder

    ERIC Educational Resources Information Center

    Cheng, Yufang; Huang, Ruowen

    2012-01-01

    The focus of this study is using data glove to practice Joint attention skill in virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe environment for PDD people. Especially, when they made errors during practice in virtual reality environment, there is no suffering or…

  17. Turning virtual reality into reality: a checklist to ensure virtual reality studies of eating behavior and physical activity parallel the real world.

    PubMed

    Tal, Aner; Wansink, Brian

    2011-03-01

    Virtual reality (VR) provides a potentially powerful tool for researchers seeking to investigate eating and physical activity. Some unique conditions are necessary to ensure that the psychological processes that influence real eating behavior also influence behavior in VR environments. Accounting for these conditions is critical if VR-assisted research is to accurately reflect real-world situations. The current work discusses key considerations VR researchers must take into account to ensure similar psychological functioning in virtual and actual reality and does so by focusing on the process of spontaneous mental simulation. Spontaneous mental simulation is prevalent under real-world conditions but may be absent under VR conditions, potentially leading to differences in judgment and behavior between virtual and actual reality. For simulation to occur, the virtual environment must be perceived as being available for action. A useful chart is supplied as a reference to help researchers to investigate eating and physical activity more effectively.

  18. Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation.

    PubMed

    Wang, Junchen; Suenaga, Hideyuki; Liao, Hongen; Hoshi, Kazuto; Yang, Liangjing; Kobayashi, Etsuko; Sakuma, Ichiro

    2015-03-01

    Autostereoscopic 3D image overlay for augmented reality (AR) based surgical navigation has been studied and reported many times. For the purpose of surgical overlay, the 3D image is expected to have the same geometric shape as the original organ, and can be transformed to a specified location for image overlay. However, how to generate a 3D image with high geometric fidelity and quantitative evaluation of 3D image's geometric accuracy have not been addressed. This paper proposes a graphics processing unit (GPU) based computer-generated integral imaging pipeline for real-time autostereoscopic 3D display, and an automatic closed-loop 3D image calibration paradigm for displaying undistorted 3D images. Based on the proposed methods, a novel AR device for 3D image surgical overlay is presented, which mainly consists of a 3D display, an AR window, a stereo camera for 3D measurement, and a workstation for information processing. The evaluation on the 3D image rendering performance with 2560×1600 elemental image resolution shows the rendering speeds of 50-60 frames per second (fps) for surface models, and 5-8 fps for large medical volumes. The evaluation of the undistorted 3D image after the calibration yields sub-millimeter geometric accuracy. A phantom experiment simulating oral and maxillofacial surgery was also performed to evaluate the proposed AR overlay device in terms of the image registration accuracy, 3D image overlay accuracy, and the visual effects of the overlay. The experimental results show satisfactory image registration and image overlay accuracy, and confirm the system usability.
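
    Computer-generated integral imaging ultimately reduces to deciding, for every display pixel, which lenslet it sits behind and therefore which ray direction it reproduces. The sketch below computes that per-pixel lenslet index and ray direction under a pinhole-array approximation; the panel and lens parameters are toy assumptions, and the paper's GPU pipeline and closed-loop calibration are far more involved.

```python
import numpy as np

def pixel_rays(width, height, pitch_px, gap_mm, pixel_mm):
    """For a lens-array display: per-pixel lenslet index and outgoing ray direction.
    pitch_px: lenslet pitch in pixels; gap_mm: display-to-lens gap; pixel_mm: pixel size."""
    xs, ys = np.meshgrid(np.arange(width), np.arange(height))
    lens_ix = xs // pitch_px                      # which lenslet column each pixel is under
    lens_iy = ys // pitch_px                      # which lenslet row
    # Offset of the pixel from its lenslet centre, in millimetres.
    dx = (xs - (lens_ix * pitch_px + pitch_px / 2)) * pixel_mm
    dy = (ys - (lens_iy * pitch_px + pitch_px / 2)) * pixel_mm
    # Pinhole model: the ray leaves the lenslet centre opposite to the pixel offset.
    dirs = np.stack([-dx, -dy, np.full_like(dx, gap_mm)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    return lens_ix, lens_iy, dirs

# Small toy panel with a 16-pixel lenslet pitch (assumed values).
ix, iy, d = pixel_rays(640, 400, 16, gap_mm=3.0, pixel_mm=0.1)
```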

  19. The role of presence in virtual reality exposure therapy.

    PubMed

    Price, Matthew; Anderson, Page

    2007-01-01

    A growing body of literature suggests that virtual reality is a successful tool for exposure therapy in the treatment of anxiety disorders. Virtual reality (VR) researchers posit the construct of presence, defined as the interpretation of an artificial stimulus as if it were real, to be a presumed factor that enables anxiety to be felt during virtual reality exposure therapy (VRE). However, a handful of empirical studies on the relation between presence and anxiety in VRE have yielded mixed findings. The current study tested the following hypotheses about the relation between presence and anxiety in VRE with a clinical sample of fearful flyers: (1) presence is related to in-session anxiety; (2) presence mediates the extent that pre-existing (pre-treatment) anxiety is experienced during exposure with VR; (3) presence is positively related to the amount of phobic elements included within the virtual environment; (4) presence is related to treatment outcome. Results supported presence as a factor that contributes to the experience of anxiety in the virtual environment as well as a relation between presence and the phobic elements, but did not support a relation between presence and treatment outcome. The study suggests that presence may be a necessary but insufficient requirement for successful VRE.

  20. The Impact of Virtual Reality on Chronic Pain.

    PubMed

    Jones, Ted; Moore, Todd; Choo, James

    2016-01-01

    The treatment of chronic pain could benefit from additional non-opioid interventions. Virtual reality (VR) has been shown to be effective in decreasing pain for procedural or acute pain but to date there have been few studies on its use in chronic pain. The present study was an investigation of the impact of a virtual reality application for chronic pain. Thirty (30) participants with various chronic pain conditions were offered a five-minute session using a virtual reality application called Cool! Participants were asked about their pain using a 0-10 visual analog scale rating before the VR session, during the session and immediately after the session. They were also asked about immersion into the VR world and about possible side effects. Pain was reduced from pre-session to post-session by 33%. Pain was reduced from pre-session during the VR session by 60%. These changes were both statistically significant at the p < .001 level. Three participants (10%) reported no change between pre and post pain ratings. Ten participants (33%) reported complete pain relief while doing the virtual reality session. All participants (100%) reported a decrease in pain to some degree between pre-session pain and during-session pain. The virtual reality experience was found here to provide a significant amount of pain relief. A head mounted display (HMD) was used with all subjects and no discomfort was experienced. Only one participant noted any side effects. VR seems to have promise as a non-opioid treatment for chronic pain and further investigation is warranted.

  1. The Impact of Virtual Reality on Chronic Pain

    PubMed Central

    Jones, Ted; Moore, Todd; Choo, James

    2016-01-01

    The treatment of chronic pain could benefit from additional non-opioid interventions. Virtual reality (VR) has been shown to be effective in decreasing pain for procedural or acute pain but to date there have been few studies on its use in chronic pain. The present study was an investigation of the impact of a virtual reality application for chronic pain. Thirty (30) participants with various chronic pain conditions were offered a five-minute session using a virtual reality application called Cool! Participants were asked about their pain using a 0–10 visual analog scale rating before the VR session, during the session and immediately after the session. They were also asked about immersion into the VR world and about possible side effects. Pain was reduced from pre-session to post-session by 33%. Pain was reduced from pre-session during the VR session by 60%. These changes were both statistically significant at the p < .001 level. Three participants (10%) reported no change between pre and post pain ratings. Ten participants (33%) reported complete pain relief while doing the virtual reality session. All participants (100%) reported a decrease in pain to some degree between pre-session pain and during-session pain. The virtual reality experience was found here to provide a significant amount of pain relief. A head mounted display (HMD) was used with all subjects and no discomfort was experienced. Only one participant noted any side effects. VR seems to have promise as a non-opioid treatment for chronic pain and further investigation is warranted. PMID:27997539

  2. The use of virtual reality in acrophobia research and treatment.

    PubMed

    Coelho, Carlos M; Waters, Allison M; Hine, Trevor J; Wallis, Guy

    2009-06-01

    Acrophobia, or fear of heights, is a widespread and debilitating anxiety disorder affecting perhaps 1 in 20 adults. Virtual reality (VR) technology has been used in the psychological treatment of acrophobia since 1995, and has come to dominate the treatment of numerous anxiety disorders. It is now known that virtual reality exposure therapy (VRET) regimens are highly effective for acrophobia treatment. This paper reviews current theoretical understanding of acrophobia as well as the evolution of its common treatments from the traditional exposure therapies to the most recent virtually guided ones. In particular, the review focuses on recent innovations in the use of VR technology and discusses the benefits it may offer for examining the underlying causes of the disorder, allowing for the systematic assessment of interrelated factors such as the visual, vestibular and postural control systems.

  3. The assessment of virtual reality for human anatomy instruction

    NASA Technical Reports Server (NTRS)

    Benn, Karen P.

    1994-01-01

    This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the two-dimensional depictions found in textbooks and on conventional computer screens. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three-dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.

  4. Blood Pool Segmentation Results in Superior Virtual Cardiac Models than Myocardial Segmentation for 3D Printing.

    PubMed

    Farooqi, Kanwal M; Lengua, Carlos Gonzalez; Weinberg, Alan D; Nielsen, James C; Sanz, Javier

    2016-08-01

    The method of cardiac magnetic resonance (CMR) three-dimensional (3D) image acquisition and post-processing which should be used to create optimal virtual models for 3D printing has not been studied systematically. Patients (n = 19) who had undergone CMR including both 3D balanced steady-state free precession (bSSFP) imaging and contrast-enhanced magnetic resonance angiography (MRA) were retrospectively identified. Post-processing for the creation of virtual 3D models involved using both myocardial (MS) and blood pool (BP) segmentation, resulting in four groups: Group 1-bSSFP/MS, Group 2-bSSFP/BP, Group 3-MRA/MS and Group 4-MRA/BP. The models created were assessed by two raters for overall quality (1-poor; 2-good; 3-excellent) and ability to identify predefined vessels (1-5: superior vena cava, inferior vena cava, main pulmonary artery, ascending aorta and at least one pulmonary vein). A total of 76 virtual models were created from 19 patient CMR datasets. The mean overall quality scores for Raters 1/2 were 1.63 ± 0.50/1.26 ± 0.45 for Group 1, 2.12 ± 0.50/2.26 ± 0.73 for Group 2, 1.74 ± 0.56/1.53 ± 0.61 for Group 3 and 2.26 ± 0.65/2.68 ± 0.48 for Group 4. The numbers of identified vessels for Raters 1/2 were 4.11 ± 1.32/4.05 ± 1.31 for Group 1, 4.90 ± 0.46/4.95 ± 0.23 for Group 2, 4.32 ± 1.00/4.47 ± 0.84 for Group 3 and 4.74 ± 0.56/4.63 ± 0.49 for Group 4. Models created using BP segmentation (Groups 2 and 4) received significantly higher ratings than those created using MS for both overall quality and number of vessels visualized (p < 0.05), regardless of the acquisition technique. There were no significant differences between Groups 1 and 3. The ratings for Raters 1 and 2 had good correlation for overall quality (ICC = 0.63) and excellent correlation for the total number of vessels visualized (ICC = 0.77). The intra-rater reliability was good for Rater A (ICC = 0.65). Three models were successfully printed

  5. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable a fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential of high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.

  6. Web GIS in practice VII: stereoscopic 3-D solutions for online maps and virtual globes

    PubMed Central

    Boulos, Maged N Kamel; Robinson, Larry R

    2009-01-01

    Because our pupils are about 6.5 cm apart, each eye views a scene from a different angle and sends a unique image to the visual cortex, which then merges the images from both eyes into a single picture. The slight difference between the right and left images allows the brain to properly perceive the 'third dimension' or depth in a scene (stereopsis). However, when a person views a conventional 2-D (two-dimensional) image representation of a 3-D (three-dimensional) scene on a conventional computer screen, each eye receives essentially the same information. Depth in such cases can only be approximately inferred from visual clues in the image, such as perspective, as only one image is offered to both eyes. The goal of stereoscopic 3-D displays is to project a slightly different image into each eye to achieve a much truer and realistic perception of depth, of different scene planes, and of object relief. This paper presents a brief review of a number of stereoscopic 3-D hardware and software solutions for creating and displaying online maps and virtual globes (such as Google Earth) in "true 3D", with costs ranging from almost free to multi-thousand pounds sterling. A practical account is also given of the experience of the USGS BRD UMESC (United States Geological Survey's Biological Resources Division, Upper Midwest Environmental Sciences Center) in setting up a low-cost, full-colour stereoscopic 3-D system. PMID:19849837
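
    The geometry described here (two eyes about 6.5 cm apart viewing a scene from slightly different angles) is exactly what stereoscopic renderers exploit: for interocular separation b, screen distance d and a point to be perceived at distance Z, the required on-screen parallax is p = b(Z - d)/Z. A small worked example with the 6.5 cm figure from the text and an assumed viewing distance follows.

```python
def screen_parallax(eye_sep_m, screen_dist_m, object_dist_m):
    """Horizontal on-screen parallax (m) needed so a rendered point is perceived
    at object_dist_m from the viewer. Positive = uncrossed (behind the screen),
    negative = crossed (in front of the screen), zero = at the screen plane."""
    return eye_sep_m * (object_dist_m - screen_dist_m) / object_dist_m

b, d = 0.065, 0.6                     # 6.5 cm interpupillary distance, 60 cm to the monitor
for z in (0.4, 0.6, 1.2, 1e9):        # in front of, at, behind, and "at infinity"
    print(f"object at {z:>12.1f} m -> parallax {screen_parallax(b, d, z) * 1000:+.1f} mm")
```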

  7. A Voice and Mouse Input Interface for 3D Virtual Environments

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Bryson, Steve T.

    2003-01-01

    There have been many success stories about how 3D input devices can be fully integrated into an immersive virtual environment. Electromagnetic trackers, optical trackers, gloves, and flying mice are just some of these input devices. Though we can use existing 3D input devices that are commonly used for VR applications, there are several factors that prevent us from choosing these input devices for our applications. One main factor is that most of these tracking devices are not suitable for prolonged use due to the human fatigue associated with using them. A second factor is that many of them would occupy additional office space. Another factor is that many of the 3D input devices are expensive due to the unusual hardware that is required. For our VR applications, we want a user interface that works naturally with standard equipment. In this paper, we demonstrate applications of our proposed multimodal interface using a 3D dome display. We also show that effective data analysis can be achieved while scientists view their data rendered inside the dome display and perform user interactions simply using mouse and voice input. Though the spherical coordinate grid seems ideal for interaction with a 3D dome display, other non-spherical grids can be used as well.

  8. Web GIS in practice VII: stereoscopic 3-D solutions for online maps and virtual globes

    USGS Publications Warehouse

    Boulos, Maged N.K.; Robinson, Larry R.

    2009-01-01

    Because our pupils are about 6.5 cm apart, each eye views a scene from a different angle and sends a unique image to the visual cortex, which then merges the images from both eyes into a single picture. The slight difference between the right and left images allows the brain to properly perceive the 'third dimension' or depth in a scene (stereopsis). However, when a person views a conventional 2-D (two-dimensional) image representation of a 3-D (three-dimensional) scene on a conventional computer screen, each eye receives essentially the same information. Depth in such cases can only be approximately inferred from visual clues in the image, such as perspective, as only one image is offered to both eyes. The goal of stereoscopic 3-D displays is to project a slightly different image into each eye to achieve a much truer and realistic perception of depth, of different scene planes, and of object relief. This paper presents a brief review of a number of stereoscopic 3-D hardware and software solutions for creating and displaying online maps and virtual globes (such as Google Earth) in "true 3D", with costs ranging from almost free to multi-thousand pounds sterling. A practical account is also given of the experience of the USGS BRD UMESC (United States Geological Survey's Biological Resources Division, Upper Midwest Environmental Sciences Center) in setting up a low-cost, full-colour stereoscopic 3-D system.

  9. Web GIS in practice VII: stereoscopic 3-D solutions for online maps and virtual globes.

    PubMed

    Boulos, Maged N Kamel; Robinson, Larry R

    2009-10-22

    Because our pupils are about 6.5 cm apart, each eye views a scene from a different angle and sends a unique image to the visual cortex, which then merges the images from both eyes into a single picture. The slight difference between the right and left images allows the brain to properly perceive the 'third dimension' or depth in a scene (stereopsis). However, when a person views a conventional 2-D (two-dimensional) image representation of a 3-D (three-dimensional) scene on a conventional computer screen, each eye receives essentially the same information. Depth in such cases can only be approximately inferred from visual clues in the image, such as perspective, as only one image is offered to both eyes. The goal of stereoscopic 3-D displays is to project a slightly different image into each eye to achieve a much truer and realistic perception of depth, of different scene planes, and of object relief. This paper presents a brief review of a number of stereoscopic 3-D hardware and software solutions for creating and displaying online maps and virtual globes (such as Google Earth) in "true 3D", with costs ranging from almost free to multi-thousand pounds sterling. A practical account is also given of the experience of the USGS BRD UMESC (United States Geological Survey's Biological Resources Division, Upper Midwest Environmental Sciences Center) in setting up a low-cost, full-colour stereoscopic 3-D system.

  10. Dynamic WIFI-Based Indoor Positioning in 3D Virtual World

    NASA Astrophysics Data System (ADS)

    Chan, S.; Sohn, G.; Wang, L.; Lee, W.

    2013-11-01

    A web-based system based on the 3DTown project was proposed using the Google Earth plug-in that brings information from indoor positioning devices and real-time sensors into an integrated 3D indoor and outdoor virtual world to visualize the dynamics of urban life within the 3D context of a city. We addressed a limitation of the 3DTown project, with particular emphasis on the video surveillance cameras used for indoor tracking purposes. The proposed solution was to utilize wireless local area network (WLAN) WiFi as a replacement technology for localizing objects of interest, due to the widespread availability and large coverage area of WiFi in indoor building spaces. Indoor positioning was performed using WiFi without modifying existing building infrastructure or introducing additional access points (APs). A hybrid probabilistic approach was used for indoor positioning based on a previously recorded WiFi fingerprint database in the Petrie Science and Engineering building at York University. In addition, we have developed a 3D building modeling module that allows for efficient reconstruction of outdoor building models to be integrated with indoor building models; a sensor module for receiving, distributing, and visualizing real-time sensor data; and a web-based visualization module for users to explore the dynamic urban life in a virtual world. In order to solve the problems in the implementation of the proposed system, we introduce approaches for the integration of indoor building models with indoor positioning data, as well as real-time sensor information and visualization on the web-based system. In this paper we report the preliminary results of our prototype system, demonstrating the system's capability for implementing a dynamic 3D indoor and outdoor virtual world that is composed of discrete modules connected through pre-determined communication protocols.
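
    A probabilistic fingerprinting step of the general kind described can be sketched in a few lines: each reference point stores per-access-point RSSI statistics, and a new scan is assigned to the reference point with the highest likelihood. The database entries, AP identifiers and independent-Gaussian model below are assumptions for illustration and do not reproduce the paper's hybrid approach.

    ```python
    import math

    # Hypothetical fingerprint database: reference point -> {AP id: (mean RSSI dBm, std)}
    FINGERPRINTS = {
        "corridor_A": {"ap:01": (-48.0, 4.0), "ap:02": (-63.0, 5.0), "ap:03": (-71.0, 6.0)},
        "lab_102":    {"ap:01": (-60.0, 5.0), "ap:02": (-52.0, 4.0), "ap:03": (-66.0, 5.0)},
        "lobby":      {"ap:01": (-70.0, 6.0), "ap:02": (-68.0, 5.0), "ap:03": (-50.0, 4.0)},
    }

    def log_likelihood(observation, fingerprint):
        """Log-likelihood of an RSSI scan under independent Gaussians per AP."""
        ll = 0.0
        for ap, rssi in observation.items():
            if ap in fingerprint:
                mu, sigma = fingerprint[ap]
                ll += -0.5 * ((rssi - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
        return ll

    def locate(observation):
        """Return the reference point whose fingerprint best explains the scan."""
        return max(FINGERPRINTS, key=lambda rp: log_likelihood(observation, FINGERPRINTS[rp]))

    print(locate({"ap:01": -50.0, "ap:02": -61.0, "ap:03": -73.0}))  # -> corridor_A
    ```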

  11. Virtual reality simulators for gastrointestinal endoscopy training

    PubMed Central

    Triantafyllou, Konstantinos; Lazaridis, Lazaros Dimitrios; Dimitriadis, George D

    2014-01-01

    The use of simulators as educational tools for medical procedures is spreading rapidly and many efforts have been made for their implementation in gastrointestinal endoscopy training. Endoscopy simulation training has been suggested for ensuring patient safety while positively influencing the trainees' learning curve. Virtual simulators are the most promising tool among all available types of simulators. These integrated modalities offer a human-like endoscopy experience by combining virtual images of the gastrointestinal tract and haptic realism using a customized endoscope. From their first steps in the 1980s until today, research involving virtual endoscopic simulators can be divided into two categories: investigation of the impact of virtual simulator training on acquiring endoscopy skills and measuring competence. Emphasis should also be given to the financial impact of their implementation in endoscopy, including the cost of these state-of-the-art simulators and the potential economic benefits from their usage. Advances in technology will contribute to the upgrade of existing models and the development of new ones, while further research should be carried out to discover new fields of application. PMID:24527175

  12. Virtual Dreams Give Way to Digital Reality.

    ERIC Educational Resources Information Center

    LaGuardia, Cheryl

    1995-01-01

    Examines the shift from the vision of the virtual library to the digital library concept. Discusses attitudes toward electronic resources, CD-ROM technology, the appropriate use of electronic formats, differences in information needs, balance between print and electronic media in libraries, and collaborative resource development. A sidebar…

  13. The Realities of K-12 Virtual Education

    ERIC Educational Resources Information Center

    Glass, Gene V.

    2009-01-01

    In a decade, virtual education in its contemporary form of asynchronous, computer-mediated interaction between a teacher and students over the Internet has grown from a novelty to an established mode of education that may provide all or part of formal schooling for nearly one in every 50 students in the US. In a non-random 2007 survey of school…

  14. Presence and rehabilitation: toward second-generation virtual reality applications in neuropsychology

    PubMed Central

    Riva, Giuseppe; Mantovani, Fabrizia; Gaggioli, Andrea

    2004-01-01

    Virtual Reality (VR) offers a blend of attractive attributes for rehabilitation. The most exploited is its ability to create a 3D simulation of reality that can be explored by patients under the supervision of a therapist. In fact, VR can be defined as an advanced communication interface based on interactive 3D visualization, able to collect and integrate different inputs and data sets in a single real-like experience. However, "treatment is not just fixing what is broken; it is nurturing what is best" (Seligman & Csikszentmihalyi). For rehabilitators, this statement supports the growing interest in the influence of positive psychological states on objective health care outcomes. This paper introduces a bio-cultural theory of presence linking the state of optimal experience defined as "flow" to the virtual reality experience. This suggests the possibility of using VR for a new breed of rehabilitative applications focused on a strategy defined as the transformation of flow. In this view, VR can be used to trigger a broad empowerment process within the flow experience induced by a high sense of presence. The link between its experiential and simulative capabilities may transform VR into the ultimate rehabilitative device. Nevertheless, further research is required to explore in more depth the link between cognitive processes, motor activities, presence and flow. PMID:15679950

  15. Computer Based Training: Field Deployable Trainer and Shared Virtual Reality

    NASA Technical Reports Server (NTRS)

    Mullen, Terence J.

    1997-01-01

    Astronaut training has traditionally been conducted at specific sites with specialized facilities. Because of its size and nature the training equipment is generally not portable. Efforts are now under way to develop training tools that can be taken to remote locations, including into orbit. Two of these efforts are the Field Deployable Trainer and Shared Virtual Reality projects. Field Deployable Trainer NASA has used the recent shuttle mission by astronaut Shannon Lucid to the Russian space station, Mir, as an opportunity to develop and test a prototype of an on-orbit computer training system. A laptop computer with a customized user interface, a set of specially prepared CD's, and video tapes were taken to the Mir by Ms. Lucid. Based upon the feedback following the launch of the Lucid flight, our team prepared materials for the next Mir visitor. Astronaut John Blaha will fly on NASA/MIR Long Duration Mission 3, set to launch in mid September. He will take with him a customized hard disk drive and a package of compact disks containing training videos, references and maps. The FDT team continues to explore and develop new and innovative ways to conduct offsite astronaut training using personal computers. Shared Virtual Reality Training NASA's Space Flight Training Division has been investigating the use of virtual reality environments for astronaut training. Recent efforts have focused on activities requiring interaction by two or more people, called shared VR. Dr. Bowen Loftin, from the University of Houston, directs a virtual reality laboratory that conducts much of the NASA sponsored research. I worked on a project involving the development of a virtual environment that can be used to train astronauts and others to operate a science unit called a Biological Technology Facility (BTF). Facilities like this will be used to house and control microgravity experiments on the space station. It is hoped that astronauts and instructors will ultimately be able to share

  16. A new 3-D diagnosis strategy for duodenal malignant lesions using multidetector row CT, CT virtual duodenoscopy, duodenography, and 3-D multicholangiography.

    PubMed

    Sata, N; Endo, K; Shimura, K; Koizumi, M; Nagai, H

    2007-01-01

    Recent advances in multidetector row computed tomography (MD-CT) technology provide new opportunities for clinical diagnoses of various diseases. Here we assessed CT virtual duodenoscopy, duodenography, and three-dimensional (3-D) multicholangiography created by MD-CT for the clinical diagnosis of duodenal malignant lesions. The study involved seven cases of periduodenal carcinoma (four ampullary carcinomas, two duodenal carcinomas, one pancreatic carcinoma). Biliary contrast medium was administered intravenously, followed by intravenous administration of an anticholinergic agent and oral administration of effervescent granules for expanding the upper gastrointestinal tract. Following intravenous administration of a nonionic contrast medium, an upper abdominal MD-CT scan was performed in the left lateral position. Scan data were processed on a workstation to create CT virtual duodenoscopy, duodenography, 3-D multicholangiography, and various postprocessing images, which were then evaluated for their effectiveness as preoperative diagnostic tools. Carcinoma location and extent were clearly demonstrated as defects or colored low-density areas in 3-D multicholangiography images and as protruding lesions in virtual duodenography and duodenoscopy images. These findings were confirmed using multiplanar or curved planar reformation images. In conclusion, CT virtual duodenoscopy, duodenography, 3-D multicholangiography, and various images created by MD-CT alone provided the necessary and adequate preoperative diagnostic information.

  17. The use of strain gauge platform and virtual reality tool for patient stability examination

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Wysk, Lukasz; Skoczylas, Marcin

    2016-09-01

    Virtual reality is one of the fastest growing information technologies. This paper is only a prelude to a larger study on the use of virtual reality tools in analysing the bony labyrinth and the sense of balance. Problems with the functioning of these areas of the body are a controversial topic of debate among specialists. The result of still unresolved treatments for imbalance is a constant number of people reporting this type of ailment. Considering the above, the authors created a system and application that contains a model of a virtual environment and a tool for modifying obstacles in 3D space. Preliminary studies of patients from a test group aged 22-49 years were also carried out, in which behaviour and sense of balance in relation to the horizontal curvature of the virtual world around the patient were analysed. Experiments carried out on the test group showed that the shape of the curvature of the virtual world space and the age of the patient have a major impact on the sense of balance. The data obtained can be linked with actual disorders of the bony labyrinth and human behaviour at the time of their occurrence. Another important achievement, which will be the subject of further work, is the possible use of a modified version of the software for rehabilitation purposes.

  18. Design Virtual Reality Scene Roam for Tour Animations Base on VRML and Java

    NASA Astrophysics Data System (ADS)

    Cao, Zaihui; hu, Zhongyan

    Virtual reality has been involved in a wide range of academic and commercial applications. It can give users a natural feeling of the environment by creating realistic virtual worlds. Implementing a virtual tour through a model of a tourist area on the web has become fashionable. In this paper, we present a web-based application that allows a user to walk through, see, and interact with a fully three-dimensional model of the tourist area. Issues regarding navigation and disorientation are addressed and we suggest a combination of the metro map and an intuitive navigation system. Finally we present a prototype which implements our ideas. The application of VR techniques integrates the visualization and animation of three-dimensional modelling into landscape analysis. The use of the VRML format makes it possible to obtain various views of the 3D model and to explore it in real time. This is an important goal for the spatial information sciences.

  19. Putting 3D modelling and 3D printing into practice: virtual surgery and preoperative planning to reconstruct complex post-traumatic skeletal deformities and defects

    PubMed Central

    Tetsworth, Kevin; Block, Steve; Glatt, Vaida

    2017-01-01

    3D printing technology has revolutionized and gradually transformed manufacturing across a broad spectrum of industries, including healthcare. Nowhere is this more apparent than in orthopaedics with many surgeons already incorporating aspects of 3D modelling and virtual procedures into their routine clinical practice. As a more extreme application, patient-specific 3D printed titanium truss cages represent a novel approach for managing the challenge of segmental bone defects. This review illustrates the potential indications of this innovative technique using 3D printed titanium truss cages in conjunction with the Masquelet technique. These implants are custom designed during a virtual surgical planning session with the combined input of an orthopaedic surgeon, an orthopaedic engineering professional and a biomedical design engineer. The ability to 3D model an identical replica of the original intact bone in a virtual procedure is of vital importance when attempting to precisely reconstruct normal anatomy during the actual procedure. Additionally, other important factors must be considered during the planning procedure, such as the three-dimensional configuration of the implant. Meticulous design is necessary to allow for successful implantation through the planned surgical exposure, while being aware of the constraints imposed by local anatomy and prior implants. This review will attempt to synthesize the current state of the art as well as discuss our personal experience using this promising technique. It will address implant design considerations including the mechanical, anatomical and functional aspects unique to each case. PMID:28220752

  20. Putting 3D modelling and 3D printing into practice: virtual surgery and preoperative planning to reconstruct complex post-traumatic skeletal deformities and defects.

    PubMed

    Tetsworth, Kevin; Block, Steve; Glatt, Vaida

    2017-01-01

    3D printing technology has revolutionized and gradually transformed manufacturing across a broad spectrum of industries, including healthcare. Nowhere is this more apparent than in orthopaedics with many surgeons already incorporating aspects of 3D modelling and virtual procedures into their routine clinical practice. As a more extreme application, patient-specific 3D printed titanium truss cages represent a novel approach for managing the challenge of segmental bone defects. This review illustrates the potential indications of this innovative technique using 3D printed titanium truss cages in conjunction with the Masquelet technique. These implants are custom designed during a virtual surgical planning session with the combined input of an orthopaedic surgeon, an orthopaedic engineering professional and a biomedical design engineer. The ability to 3D model an identical replica of the original intact bone in a virtual procedure is of vital importance when attempting to precisely reconstruct normal anatomy during the actual procedure. Additionally, other important factors must be considered during the planning procedure, such as the three-dimensional configuration of the implant. Meticulous design is necessary to allow for successful implantation through the planned surgical exposure, while being aware of the constraints imposed by local anatomy and prior implants. This review will attempt to synthesize the current state of the art as well as discuss our personal experience using this promising technique. It will address implant design considerations including the mechanical, anatomical and functional aspects unique to each case.

  1. Real-time markerless tracking for augmented reality: the virtual visual servoing framework.

    PubMed

    Comport, Andrew I; Marchand, Eric; Pressigout, Muriel; Chaumette, François

    2006-01-01

    Tracking is a very important research subject in a real-time augmented reality context. The main requirements for trackers are high accuracy and low latency at a reasonable cost. In order to address these issues, a real-time, robust, and efficient 3D model-based tracking algorithm is proposed for a "video see through" monocular vision system. The tracking of objects in the scene amounts to calculating the pose between the camera and the objects. Virtual objects can then be projected into the scene using the pose. Here, nonlinear pose estimation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different 3D geometrical primitives including straight lines, circles, cylinders, and spheres. A local moving edges tracker is used in order to provide real-time tracking of points normal to the object contours. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively reweighted least squares implementation. This approach is then extended to address the 3D model-free augmented reality problem. The method presented in this paper has been validated on several complex image sequences including outdoor environments. Results show the method to be robust to occlusion, changes in illumination, and mistracking.
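
    The robustification step described, an M-estimator applied through iteratively reweighted least squares, can be illustrated with a generic solver. The sketch below fits a linear model with Tukey biweights on synthetic data containing gross outliers; it is not the authors' visual-servoing implementation.

    ```python
    import numpy as np

    def tukey_weights(residuals, c=4.685):
        """Tukey biweight: residuals large relative to a robust scale get zero weight."""
        scale = 1.4826 * np.median(np.abs(residuals)) + 1e-12   # MAD-based sigma estimate
        u = residuals / (c * scale)
        return np.where(np.abs(u) < 1.0, (1.0 - u ** 2) ** 2, 0.0)

    def irls(A, b, n_iter=20):
        """Solve A x ~= b robustly by iteratively reweighted least squares."""
        x = np.linalg.lstsq(A, b, rcond=None)[0]
        for _ in range(n_iter):
            r = b - A @ x
            sw = np.sqrt(tukey_weights(r))                      # sqrt-weights for weighted LS
            x = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)[0]
        return x

    # Fit a line through noisy points containing a few gross outliers (e.g. mistracked edges).
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 50)
    y = 2.0 * t + 1.0 + 0.05 * rng.standard_normal(50)
    y[::10] += 3.0                                              # five large outliers
    A = np.column_stack([t, np.ones_like(t)])
    print(irls(A, y))                                           # approximately [2.0, 1.0]
    ```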

  2. NeuroVR: an open source virtual reality platform for clinical psychology and behavioral neurosciences.

    PubMed

    Riva, Giuseppe; Gaggioli, Andrea; Villani, Daniela; Preziosa, Alessandra; Morganti, Francesca; Corsi, Riccardo; Faletti, Gianluca; Vezzadini, Luca

    2007-01-01

    In the past decade, the use of virtual reality for clinical and research applications has become more widespread. However, the diffusion of this approach is still limited by three main issues: poor usability, lack of technical expertise among clinical professionals, and high costs. To address these challenges, we introduce NeuroVR (http://www.neurovr.org--http://www.neurotiv.org), a cost-free virtual reality platform based on open-source software, that allows non-expert users to adapt the content of a pre-designed virtual environment to meet the specific needs of the clinical or experimental setting. Using the NeuroVR Editor, the user can choose the appropriate psychological stimuli/stressors from a database of objects (both 2D and 3D) and videos, and easily place them into the virtual environment. The edited scene can then be visualized in the NeuroVR Player using either immersive or non-immersive displays. Currently, the NeuroVR library includes different virtual scenes (apartment, office, square, supermarket, park, classroom, etc.), covering two of the most studied clinical applications of VR: specific phobias and eating disorders. The NeuroVR Editor is based on Blender (http://www.blender.org), the open source, cross-platform suite of tools for 3D creation, and is available as a completely free resource. An interesting feature of the NeuroVR Editor is the possibility to add new objects to the database. This feature allows the therapist to enhance the patient's feeling of familiarity and intimacy with the virtual scene, i.e., by using photos or movies of objects/people that are part of the patient's daily life, thereby improving the efficacy of the exposure. The NeuroVR platform runs on standard personal computers with Microsoft Windows; the only requirement for the hardware is related to the graphics card, which must support OpenGL.

  3. Exploring conformational search protocols for ligand-based virtual screening and 3-D QSAR modeling

    NASA Astrophysics Data System (ADS)

    Cappel, Daniel; Dixon, Steven L.; Sherman, Woody; Duan, Jianxin

    2015-02-01

    3-D ligand conformations are required for most ligand-based drug design methods, such as pharmacophore modeling, shape-based screening, and 3-D QSAR model building. Many studies of conformational search methods have focused on the reproduction of crystal structures (i.e. bioactive conformations); however, for ligand-based modeling the key question is how to generate a ligand alignment that produces the best results for a given query molecule. In this work, we study different conformation generation modes of ConfGen and the impact on virtual screening (Shape Screening and e-Pharmacophore) and QSAR predictions (atom-based and field-based). In addition, we develop a new search method, called common scaffold alignment, that automatically detects the maximum common scaffold between each screening molecule and the query to ensure identical coordinates of the common core, thereby minimizing the noise introduced by analogous parts of the molecules. In general, we find that virtual screening results are relatively insensitive to the conformational search protocol; hence, a conformational search method that generates fewer conformations could be considered "better" because it is more computationally efficient for screening. However, for 3-D QSAR modeling we find that more thorough conformational sampling tends to produce better QSAR predictions. In addition, significant improvements in QSAR predictions are obtained with the common scaffold alignment protocol developed in this work, which focuses conformational sampling on parts of the molecules that are not part of the common scaffold.

  4. Virtual reality as a mechanism for exposure therapy.

    PubMed

    de Carvalho, Marcele Regine; Freire, Rafael C; Nardi, Antonio Egidio

    2010-03-01

    Virtual reality (VR) is as effective in inducing emotional responses as reality and its application is extremely valuable in exposure treatment. In virtual environments, patients experience physiological symptoms and fear similar to those they experience in real life situations, thereby facilitating the habituation process. Our goal is to offer an overview of the current panorama of VR and psychotherapy, underlining the (virtual) exposure technique and the studies that focus on panic disorder treatment through the use of VR. The literature was reviewed by consulting the ISI and PubMed databases. Virtual exposure treatment offers good results and great patient acceptability. However, despite the importance of these data for the evaluation of treatment efficacy, only a few studies measure physiological responses during exposure. A lack of controlled studies and standardized treatment protocols was observed. Despite the great advance of VR use in psychotherapy, a great deal of its potential is still unknown, therefore requiring the creation of new virtual environments so that controlled studies regarding its clinical application can be conducted. Throughout the process of elaboration and investigation, clinical experiences in virtual environments must be related to real experiences in a flexible context that combines relevant cultural, physical and cognitive aspects.

  5. EEG-based cognitive load of processing events in 3D virtual worlds is lower than processing events in 2D displays.

    PubMed

    Dan, Alex; Reiner, Miriam

    2016-08-31

    Interacting with 2D displays, such as computer screens, smartphones, and TV, is currently a part of our daily routine; however, our visual system is built for processing 3D worlds. We examined the cognitive load associated with a simple and a complex task of learning paper-folding (origami) by observing 2D or stereoscopic 3D displays. While connected to an electroencephalogram (EEG) system, participants watched a 2D video of an instructor demonstrating the paper-folding tasks, followed by a stereoscopic 3D projection of the same instructor (a digital avatar) illustrating identical tasks. We recorded the power of alpha and theta oscillations and calculated the cognitive load index (CLI) as the ratio of the average power of frontal theta (Fz) and parietal alpha (Pz). The results showed a significantly higher cognitive load index associated with processing the 2D projection as compared to the 3D projection; additionally, changes in the average theta Fz power were larger for the 2D conditions as compared to the 3D conditions, while alpha average Pz power values were similar for 2D and 3D conditions for the less complex task and higher in the 3D state for the more complex task. The cognitive load index was lower for the easier task and higher for the more complex task in 2D and 3D. In addition, participants with lower spatial abilities benefited more from the 3D compared to the 2D display. These findings have implications for understanding cognitive processing associated with 2D and 3D worlds and for employing stereoscopic 3D technology over 2D displays in designing emerging virtual and augmented reality applications.
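
    The cognitive load index defined above, the ratio of frontal theta power at Fz to parietal alpha power at Pz, can be estimated from two raw channels with a Welch periodogram. The sketch below uses SciPy; the sampling rate, frequency bands and synthetic signals are placeholder assumptions rather than the study's recording parameters.

    ```python
    import numpy as np
    from scipy.signal import welch

    def band_power(signal, fs, fmin, fmax):
        """Average power spectral density of a signal within [fmin, fmax] Hz."""
        freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
        band = (freqs >= fmin) & (freqs <= fmax)
        return np.mean(psd[band])

    def cognitive_load_index(fz, pz, fs=256):
        """CLI = frontal theta (4-8 Hz at Fz) / parietal alpha (8-13 Hz at Pz)."""
        theta_fz = band_power(fz, fs, 4.0, 8.0)
        alpha_pz = band_power(pz, fs, 8.0, 13.0)
        return theta_fz / alpha_pz

    # Synthetic 10 s recordings standing in for the Fz and Pz channels.
    fs = 256
    t = np.arange(0, 10, 1 / fs)
    fz = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)   # theta-dominant
    pz = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # alpha-dominant
    print(round(cognitive_load_index(fz, pz, fs), 2))
    ```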

  6. Lead-oriented synthesis: Investigation of organolithium-mediated routes to 3-D scaffolds and 3-D shape analysis of a virtual lead-like library.

    PubMed

    Lüthy, Monique; Wheldon, Mary C; Haji-Cheteh, Chehasnah; Atobe, Masakazu; Bond, Paul S; O'Brien, Peter; Hubbard, Roderick E; Fairlamb, Ian J S

    2015-06-01

    Synthetic routes to six 3-D scaffolds containing piperazine, pyrrolidine and piperidine cores have been developed. The synthetic methodology focused on the use of N-Boc α-lithiation-trapping chemistry. Notably, suitably protected and/or functionalised medicinal chemistry building blocks were synthesised via concise, connective methodology. This represents a rare example of lead-oriented synthesis. A virtual library of 190 compounds was then enumerated from the six scaffolds. Of these, 92 compounds (48%) fit the lead-like criteria of: (i) -1 ⩽ AlogP ⩽ 3; (ii) 14 ⩽ number of heavy atoms ⩽ 26; (iii) total polar surface area ⩾ 50 Å². The 3-D shapes of the 190 compounds were analysed using a triangular plot of normalised principal moments of inertia (PMI). From this, 46 compounds were identified which had lead-like properties and possessed 3-D shapes in under-represented areas of pharmaceutical space. Thus, the PMI analysis of the 190-member virtual library showed that, whilst the scaffolds may appear on paper to be 3-D in shape, only 24% of the compounds actually had 3-D structures in the more interesting areas of 3-D drug space.
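
    The normalised PMI ratios behind such a triangular plot are straightforward to compute: build the inertia tensor from 3-D coordinates, take its sorted eigenvalues I1 <= I2 <= I3 and form NPR1 = I1/I3 and NPR2 = I2/I3, so that rods fall near (0, 1), discs near (0.5, 0.5) and spheres near (1, 1). A minimal numpy sketch with unit atomic masses (an assumption made purely for brevity) follows.

    ```python
    import numpy as np

    def npr_ratios(coords, masses=None):
        """Normalised principal moments of inertia (NPR1, NPR2) of a conformer."""
        xyz = np.asarray(coords, dtype=float)
        m = np.ones(len(xyz)) if masses is None else np.asarray(masses, dtype=float)
        xyz = xyz - np.average(xyz, axis=0, weights=m)      # centre of mass at origin

        x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
        inertia = np.array([
            [np.sum(m * (y**2 + z**2)), -np.sum(m * x * y),         -np.sum(m * x * z)],
            [-np.sum(m * x * y),         np.sum(m * (x**2 + z**2)), -np.sum(m * y * z)],
            [-np.sum(m * x * z),        -np.sum(m * y * z),          np.sum(m * (x**2 + y**2))],
        ])
        i1, i2, i3 = np.sort(np.linalg.eigvalsh(inertia))
        return i1 / i3, i2 / i3

    # A perfectly linear 'molecule' should sit at the rod corner (NPR1 ~ 0, NPR2 ~ 1).
    rod = [(float(i), 0.0, 0.0) for i in range(8)]
    print(tuple(round(v, 2) for v in npr_ratios(rod)))      # ~ (0.0, 1.0)
    ```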

  7. A virtual interface for interactions with 3D models of the human body.

    PubMed

    De Paolis, Lucio T; Pulimeno, Marco; Aloisio, Giovanni

    2009-01-01

    The developed system is the first prototype of a virtual interface designed to avoid contact with the computer, so that the surgeon is able to visualize 3D models of the patient's organs more effectively during a surgical procedure or to use them in pre-operative planning. The doctor will be able to rotate, translate and zoom in on 3D models of the patient's organs simply by moving a finger in free space; in addition, it is possible to choose to visualize all of the organs or only some of them. All of the interactions with the models happen in real time using the virtual interface, which appears as a touch-screen suspended in free space in a position chosen by the user when the application is started up. Finger movements are detected by means of an optical tracking system and are used to simulate touch with the interface and to interact by pressing the buttons present on the virtual screen.

  8. Visualization of reservoir simulation data with an immersive virtual reality system

    SciTech Connect

    Williams, B.K.

    1996-10-01

    This paper discusses an investigation into the use of an immersive virtual reality (VR) system to visualize reservoir simulation output data. The hardware and software configurations of the test-immersive VR system are described and compared to a nonimmersive VR system and to an existing workstation screen-based visualization system. The structure of 3D reservoir simulation data and the actions to be performed on the data within the VR system are discussed. The subjective results of the investigation are then presented, followed by a discussion of possible future work.

  9. Virtual reality simulation in neurosurgery: technologies and evolution.

    PubMed

    Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H

    2013-01-01

    Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.

  10. Astronauts Prepare for Mission With Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Astronauts John M. Grunsfeld (left), STS-109 payload commander, and Nancy J. Currie, mission specialist, use the virtual reality lab at Johnson Space Center to train for upcoming duties aboard the Space Shuttle Columbia. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team to perform its duties for the fourth Hubble Space Telescope Servicing mission. The most familiar form of virtual reality technology is some form of headpiece, which fits over your eyes and displays a three dimensional computerized image of another place. Turn your head left and right, and you see what would be to your sides; turn around, and you see what might be sneaking up on you. An important part of the technology is some type of data glove that you use to propel yourself through the virtual world. Currently, the medical community is using the new technologies in four major ways: To see parts of the body more accurately, for study, to make better diagnosis of disease and to plan surgery in more detail; to obtain a more accurate picture of a procedure during surgery; to perform more types of surgery with the most noninvasive, accurate methods possible; and to model interactions among molecules at a molecular level.

  11. Simulation Of Assembly Processes With Technical Of Virtual Reality

    NASA Astrophysics Data System (ADS)

    García García, Manuel; Arenas Reina, José Manuel; Lite, Alberto Sánchez; Sebastián Pérez, Miguel Ángel

    2009-11-01

    The use of virtual reality techniques in industrial processes provides a realistic approach to the product life cycle. For the manual assembly of components, the use of virtual environments facilitates simultaneous engineering in which variables such as human factors and productivity play a real role. On the other hand, the current phase of industrial competition requires rapid adjustment to client needs and to the market situation. In this work the assembly of the front components of a vehicle is analyzed using virtual reality tools, following a product-process design methodology that covers every life-cycle stage. The study is based on workstation design, taking productive and human factors into account from the ergonomic point of view by implementing a postural study of every assembly operation, leaving the remaining stages for a later study. Design is optimized by applying this methodology together with virtual reality tools. A 15% reduction in assembly time and a 90% reduction in musculoskeletal disorders across assembly operations were also achieved.

  12. Designing 3 Dimensional Virtual Reality Using Panoramic Image

    NASA Astrophysics Data System (ADS)

    Wan Abd Arif, Wan Norazlinawati; Wan Ahmad, Wan Fatimah; Nordin, Shahrina Md.; Abdullah, Azrai; Sivapalan, Subarna

    The high demand to improve the quality of presentation in the knowledge-sharing field comes from the need to keep pace with rapidly growing technology. The need to develop technology-based learning and training led to the idea of developing an Oil and Gas Plant Virtual Environment (OGPVE) for the benefit of our future. A panoramic virtual reality learning environment is essential to help educators overcome the limitations of traditional technical writing lessons. Virtual reality helps users understand better by providing simulations of real-world and hard-to-reach environments with a high degree of realism and interactivity. Thus, in order to create courseware that achieves this objective, accurate images of the intended scenarios must be acquired. The panorama shows the OGPVE and helps users generate ideas about what they have learnt. This paper discusses part of the development of the panoramic virtual reality. The important phases in developing a successful panoramic image are image acquisition and image stitching or mosaicing. The combination of wide field-of-view (FOV) and close-up images used in this panoramic development is also discussed.
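
    The stitching/mosaicing phase mentioned above can be prototyped with OpenCV's high-level Stitcher class rather than a hand-rolled feature-matching pipeline; the image file names below are placeholders, not assets from the OGPVE project.

    ```python
    import cv2

    # Placeholder file names for overlapping shots taken while rotating the camera.
    frames = [cv2.imread(name) for name in ("plant_01.jpg", "plant_02.jpg", "plant_03.jpg")]

    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)

    if status == cv2.Stitcher_OK:
        cv2.imwrite("ogpve_panorama.jpg", panorama)
    else:
        # Typical causes: too little overlap between shots or too few matched features.
        print(f"Stitching failed with status code {status}")
    ```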

  13. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting the teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. The simulation system provides a visually striking platform with realistic terrain and weather information, and water simulation. The web-based simulation system provides an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes, including virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.

  14. Visualization of large scale geologically related data in virtual 3D scenes with OpenGL

    NASA Astrophysics Data System (ADS)

    Seng, Dewen; Liang, Xi; Wang, Hongxia; Yue, Guoying

    2007-11-01

    This paper demonstrates a method for the three-dimensional (3D) reconstruction and visualization of large-scale multidimensional surficial, geological and mine planning data with the programmable visualization environment OpenGL. A simulation system developed by the authors is presented for importing, filtering and visualizing multidimensional geologically related data. The approach for the visual simulation of complicated mining engineering environments implemented in the system is described in detail. Aspects such as the presentation of multidimensional data with spatial dependence, navigation in the surficial and geological frames of reference and in time, and interaction techniques are presented. The system supports real 3D landscape representations. Furthermore, the system provides many visualization methods for rendering multidimensional data within virtual 3D scenes and combines them with several navigation techniques. Real data derived from an iron mine in Wuhan City, China, demonstrate the effectiveness and efficiency of the system. A case study with the results and benefits achieved by using the system's real 3D representations and navigation is given.

  15. 3D QSAR studies, pharmacophore modeling and virtual screening on a series of steroidal aromatase inhibitors.

    PubMed

    Xie, Huiding; Qiu, Kaixiong; Xie, Xiaoguang

    2014-11-14

    Aromatase inhibitors are among the most important agents in the treatment of estrogen-dependent cancers. In order to search for potent steroidal aromatase inhibitors (SAIs) with lower side effects and to overcome cellular resistance, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were performed on a series of SAIs to build 3D QSAR models. Reliable and predictive CoMFA and CoMSIA models were obtained with the following statistical results (CoMFA: q² = 0.636, r²(ncv) = 0.988, r²(pred) = 0.658; CoMSIA: q² = 0.843, r²(ncv) = 0.989, r²(pred) = 0.601). This 3D QSAR approach provides significant insights that can be used to develop novel and potent SAIs. In addition, a genetic algorithm with linear assignment of hypermolecular alignment of database (GALAHAD) was used to derive 3D pharmacophore models. The selected pharmacophore model contains two acceptor atoms and four hydrophobic centers, and was used as a 3D query for virtual screening against the NCI2000 database. Six hit compounds were obtained and their biological activities were further predicted by the CoMFA and CoMSIA models; these results are expected to aid the design of potent and novel SAIs.
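
    The cross-validated q² values quoted for the CoMFA and CoMSIA models follow the usual definition q² = 1 - PRESS/SS, with PRESS computed from leave-one-out predictions. The sketch below evaluates this for a generic PLS regressor on placeholder descriptor data using scikit-learn; it does not reproduce the published models or their descriptors.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    # Placeholder data: 20 compounds, 50 field descriptors, and measured activities.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((20, 50))
    y = X[:, :3] @ np.array([1.0, -0.5, 0.8]) + 0.1 * rng.standard_normal(20)

    model = PLSRegression(n_components=3)
    y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut()).ravel()

    press = np.sum((y - y_loo) ** 2)            # predictive residual sum of squares
    ss = np.sum((y - y.mean()) ** 2)            # total sum of squares about the mean
    q2 = 1.0 - press / ss
    print(round(q2, 3))
    ```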

  16. Virtual Sculpting and 3D Printing for Young People with Disabilities.

    PubMed

    Mcloughlin, Leigh; Fryazinov, Oleg; Moseley, Mark; Sanchez, Mathieu; Adzhiev, Valery; Comninos, Peter; Pasko, Alexander

    2016-01-01

    The SHIVA project was designed to provide virtual sculpting tools for young people with complex disabilities, allowing them to engage with artistic and creative activities that they might otherwise never be able to access. Their creations are then physically built using 3D printing. To achieve this, the authors built a generic, accessible GUI and a suitable geometric modeling system and used these to produce two prototype modeling exercises. These tools were deployed in a school for students with complex disabilities and are now being used for a variety of educational and developmental purposes. This article presents the project's motivations, approach, and implementation details together with initial results, including 3D printed objects designed by young people with disabilities.

  17. Going Virtual… or Not: Development and Testing of a 3D Virtual Astronomy Environment

    NASA Astrophysics Data System (ADS)

    Ruzhitskaya, L.; Speck, A.; Ding, N.; Baldridge, S.; Witzig, S.; Laffey, J.

    2013-04-01

    We present our preliminary results of a pilot study of students' knowledge transfer of an astronomy concept into a new environment. We also share our discoveries about which aspects of a 3D environment students consider motivating and which they find discouraging for their learning. This study was conducted among 64 non-science major students enrolled in an astronomy laboratory course. During the course, students learned the concept and applications of Kepler's laws using a 2D interactive environment. Later in the semester, the students were placed in a 3D environment in which they were asked to conduct observations and to answer a set of questions pertaining to Kepler's laws of planetary motion. In this study, we were interested in observing, scrutinizing, and assessing students' behavior: from the choices they made while creating their avatars (virtual representations), to the tools they chose to use, to their navigational patterns, to their levels of discourse in the environment. These observations helped us identify which features of the 3D environment our participants found helpful and interesting and which tools created unnecessary clutter and distraction. The students' social behavior patterns in the virtual environment, together with their answers to the questions, helped us determine how well they understood Kepler's laws, how well they could transfer the concepts to a new situation, and at what point a motivational tool such as a 3D environment becomes a disruption to constructive learning. Our findings confirmed that students construct deeper knowledge of a concept when they are fully immersed in the environment.

  18. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    PubMed

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  19. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    PubMed Central

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  20. Virtual reality and telerobotics applications of an Address Recalculation Pipeline

    NASA Technical Reports Server (NTRS)

    Regan, Matthew; Pose, Ronald

    1994-01-01

    The technology described in this paper was designed to reduce the latency of user interactions in immersive virtual reality environments. It is also ideally suited to telerobotic applications such as interaction with remote robotic manipulators in space or in deep-sea operations. In such circumstances, the significant latency in the observed response to user stimulus, which is due to communications delays, and the disturbing jerkiness due to low and unpredictable frame rates in compressed video feedback or computationally limited virtual worlds, can be masked by our techniques. The user is provided with highly responsive visual feedback independent of the communication or computational delays involved in providing physical video feedback or in rendering virtual world images. Virtual and physical environments can be combined seamlessly using these techniques.

  1. 3-D Imaging In Virtual Environment: A Scientific Clinical and Teaching Tool

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; DeVincenzi, Donald L. (Technical Monitor)

    1996-01-01

    The advent of powerful graphics workstations and computers has led to the advancement of scientific knowledge through the three-dimensional (3-D) reconstruction and imaging of biological cells and tissues. The Biocomputation Center at NASA Ames Research Center pioneered the effort to produce an entirely computerized method for the reconstruction of objects from serial sections studied in a transmission electron microscope (TEM). The software developed, ROSS (Reconstruction of Serial Sections), is now being distributed to users across the United States through Space Act Agreements. The software is used in widely disparate fields such as geology, botany, biology and medicine. In the Biocomputation Center, ROSS serves as the basis for the development of virtual environment technologies for scientific and medical use. This report will describe the Virtual Surgery Workstation Project that is ongoing with clinicians at Stanford University Medical Center, and the role of the Visible Human data in the project.

  2. Integration of the virtual model of a Stewart platform with the avatar of a vehicle in a virtual reality

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2016-08-01

    The development of methods of computer-aided design and engineering allows conducting virtual tests, among others concerning the motion simulation of technical means. The paper presents a method of integrating an object in the form of a virtual model of a Stewart platform with an avatar of a vehicle moving in a virtual environment. The problem area includes issues related to the fidelity with which the work of the analyzed technical means is mapped. The main object of investigation is a 3D model of a Stewart platform, which is a subsystem of a simulator designed for driving instruction for disabled persons. The analyzed model of the platform, prepared for motion simulation, was created in the "Motion Simulation" module of the CAD/CAE class system Siemens PLM NX, whereas the virtual environment in which the avatar of the passenger car moves was elaborated in the VR class system EON Studio. The element integrating both of the mentioned software environments is a developed application that reads information from the virtual reality (VR) concerning the current position of the car avatar. Then, based on the accepted algorithm, it sends control signals to the respective joints of the model of the Stewart platform (CAD).

  3. Fast generation of virtual X-ray images for reconstruction of 3D anatomy.

    PubMed

    Ehlke, Moritz; Ramm, Heiko; Lamecker, Hans; Hege, Hans-Christian; Zachow, Stefan

    2013-12-01

    We propose a novel GPU-based approach to render virtual X-ray projections of deformable tetrahedral meshes. These meshes represent the shape and the internal density distribution of a particular anatomical structure and are derived from statistical shape and intensity models (SSIMs). We apply our method to improve the geometric reconstruction of 3D anatomy (e.g. pelvic bone) from 2D X-ray images. For that purpose, shape and density of a tetrahedral mesh are varied and virtual X-ray projections are generated within an optimization process until the similarity between the computed virtual X-ray and the respective anatomy depicted in a given clinical X-ray is maximized. The OpenGL implementation presented in this work deforms and projects tetrahedral meshes of high resolution (200,000+ tetrahedra) at interactive rates. It generates virtual X-rays that accurately depict the density distribution of an anatomy of interest. Compared to existing methods that accumulate X-ray attenuation in deformable meshes, our novel approach significantly boosts the deformation/projection performance. The proposed projection algorithm scales better with respect to mesh resolution and complexity of the density distribution, and the combined deformation and projection on the GPU scales better with respect to the number of deformation parameters. The gain in performance allows for a larger number of cycles in the optimization process. Consequently, it reduces the risk of being stuck in a local optimum. We believe that our approach will improve treatments in orthopedics, where 3D anatomical information is essential.
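
    The projection step can be illustrated conceptually with a tiny CPU digital radiograph: march parallel rays through a voxelised density volume and accumulate attenuation along each ray. The authors' method operates on deformable tetrahedral meshes on the GPU, so the sketch below is only a stand-in for the idea of a virtual X-ray; the toy volume and attenuation values are invented.

    ```python
    import numpy as np

    def virtual_xray(volume, voxel_size_mm=1.0):
        """Parallel-beam virtual X-ray: line integral of density along the z axis.

        volume: 3-D array of attenuation-like densities, indexed (z, y, x).
        Returns a 2-D image following I = I0 * exp(-sum(mu * dz)), with I0 = 1,
        so brighter pixels correspond to less attenuating material.
        """
        path_integral = volume.sum(axis=0) * voxel_size_mm
        return np.exp(-path_integral)

    # Toy 'anatomy': a dense sphere embedded in a low-density background.
    z, y, x = np.mgrid[-32:32, -32:32, -32:32]
    volume = np.where(x**2 + y**2 + z**2 < 20**2, 0.05, 0.002)

    image = virtual_xray(volume)
    print(image.shape, float(image.min()), float(image.max()))
    ```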

  4. Surgical approaches to complex vascular lesions: the use of virtual reality and stereoscopic analysis as a tool for resident and student education.

    PubMed

    Agarwal, Nitin; Schmitt, Paul J; Sukul, Vishad; Prestigiacomo, Charles J

    2012-08-01

    Virtual reality training for complex tasks has been shown to be of benefit in fields involving highly technical and demanding skill sets. The use of a stereoscopic three-dimensional (3D) virtual reality environment to teach a patient-specific analysis of the microsurgical treatment modalities of a complex basilar aneurysm is presented. Three different surgical approaches were evaluated in a virtual environment and then compared to elucidate the best surgical approach. These approaches were assessed with regard to the line-of-sight, skull base anatomy and visualisation of the relevant anatomy at the level of the basilar artery and surrounding structures. Overall, the stereoscopic 3D virtual reality environment with fusion of multimodality imaging affords an excellent teaching tool for residents and medical students to learn surgical approaches to vascular lesions. Future studies will assess the educational benefits of this modality and develop a series of metrics for student assessments.

  5. Virtual reality technologies for research and education in obesity and diabetes: research needs and opportunities.

    PubMed

    Ershow, Abby G; Peterson, Charles M; Riley, William T; Rizzo, Albert Skip; Wansink, Brian

    2011-03-01

    The rising rates, high prevalence, and adverse consequences of obesity and diabetes call for new approaches to the complex behaviors needed to prevent and manage these conditions. Virtual reality (VR) technologies, which provide controllable, multisensory, interactive three-dimensional (3D) stimulus environments, are a potentially valuable means of engaging patients in interventions that foster more healthful eating and physical activity patterns. Furthermore, the capacity of VR technologies to motivate, record, and measure human performance represents a novel and useful modality for conducting research. This article summarizes background information and discussions for a joint July 2010 National Institutes of Health - Department of Defense workshop entitled Virtual Reality Technologies for Research and Education in Obesity and Diabetes. The workshop explored the research potential of VR technologies as tools for behavioral and neuroscience studies in diabetes and obesity, and the practical potential of VR in fostering more effective utilization of diabetes- and obesity-related nutrition and lifestyle information. Virtual reality technologies were considered especially relevant for fostering desirable health-related behaviors through motivational reinforcement, personalized teaching approaches, and social networking. Virtual reality might also be a means of extending the availability and capacity of health care providers. Progress in the field will be enhanced by further developing available platforms and taking advantage of VR's capabilities as a research tool for well-designed hypothesis-testing behavioral science. Multidisciplinary collaborations are needed between the technology industry and academia, and among researchers in biomedical, behavioral, pedagogical, and computer science disciplines. Research priorities and funding opportunities for use of VR to improve prevention and management of obesity and diabetes can be found at agency websites (National

  6. Immersive virtual reality and environmental noise assessment: An innovative audio–visual approach

    SciTech Connect

    Ruotolo, Francesco; Maffei, Luigi; Di Gabriele, Maria; Iachini, Tina; Masullo, Massimiliano; Ruggiero, Gennaro; Senese, Vincenzo Paolo

    2013-07-15

    Several international studies have shown that traffic noise has a negative impact on people's health and that people's annoyance does not depend only on noise energy levels, but rather on multi-perceptual factors. The combination of virtual reality technology and audio rendering techniques allows us to experiment with a new approach for environmental noise assessment that can help to investigate in advance the potential negative effects of noise associated with a specific project and that in turn can help designers to make educated decisions. In the present study, the audio–visual impact of a new motorway project on people has been assessed by means of immersive virtual reality technology. In particular, participants were exposed to 3D reconstructions of an actual landscape without the projected motorway (ante operam condition), and of the same landscape with the projected motorway (post operam condition). Furthermore, individuals' reactions to noise were assessed by means of objective cognitive measures (short term verbal memory and executive functions) and subjective evaluations (noise and visual annoyance). Overall, the results showed that the introduction of a projected motorway in the environment can have immediate detrimental effects on people's well-being depending on the distance from the noise source. In particular, noise due to the new infrastructure seems to exert a negative influence on short term verbal memory and to increase both visual and noise annoyance. The theoretical and practical implications of these findings are discussed. -- Highlights: ► Impact of traffic noise on people's well-being depends on multi-perceptual factors. ► A multisensory virtual reality technology is used to simulate a projected motorway. ► Effects on short-term memory and auditory and visual subjective annoyance were found. ► The closer the distance from the motorway the stronger was the effect. ► Multisensory virtual reality methodologies can be used to study

  7. Virtual Reality Technologies for Research and Education in Obesity and Diabetes: Research Needs and Opportunities

    PubMed Central

    Ershow, Abby G; Peterson, Charles M; Riley, William T; Rizzo, Albert “Skip”; Wansink, Brian

    2011-01-01

    The rising rates, high prevalence, and adverse consequences of obesity and diabetes call for new approaches to the complex behaviors needed to prevent and manage these conditions. Virtual reality (VR) technologies, which provide controllable, multisensory, interactive three-dimensional (3D) stimulus environments, are a potentially valuable means of engaging patients in interventions that foster more healthful eating and physical activity patterns. Furthermore, the capacity of VR technologies to motivate, record, and measure human performance represents a novel and useful modality for conducting research. This article summarizes background information and discussions for a joint July 2010 National Institutes of Health – Department of Defense workshop entitled Virtual Reality Technologies for Research and Education in Obesity and Diabetes. The workshop explored the research potential of VR technologies as tools for behavioral and neuroscience studies in diabetes and obesity, and the practical potential of VR in fostering more effective utilization of diabetes- and obesity-related nutrition and lifestyle information. Virtual reality technologies were considered especially relevant for fostering desirable health-related behaviors through motivational reinforcement, personalized teaching approaches, and social networking. Virtual reality might also be a means of extending the availability and capacity of health care providers. Progress in the field will be enhanced by further developing available platforms and taking advantage of VR’s capabilities as a research tool for well-designed hypothesis-testing behavioral science. Multidisciplinary collaborations are needed between the technology industry and academia, and among researchers in biomedical, behavioral, pedagogical, and computer science disciplines. Research priorities and funding opportunities for use of VR to improve prevention and management of obesity and diabetes can be found at agency websites (National

  8. Reduced Mimicry to Virtual Reality Avatars in Autism Spectrum Disorder.

    PubMed

    Forbes, Paul A G; Pan, Xueni; de C Hamilton, Antonia F

    2016-12-01

    Mimicry involves unconsciously copying the actions of others. Increasing evidence suggests that autistic people can copy the goal of an observed action but show differences in their mimicry. We investigated mimicry in autism spectrum disorder (ASD) within a two-dimensional virtual reality environment. Participants played an imitation game with a socially engaged avatar and socially disengaged avatar. Despite being told only to copy the goal of the observed action, autistic participants and matched neurotypical participants mimicked the kinematics of the avatars' movements. However, autistic participants mimicked less. Social engagement did not modulate mimicry in either group. The results demonstrate the feasibility of using virtual reality to induce mimicry and suggest mimicry differences in ASD may also occur when interacting with avatars.

  9. Application of Virtual, Augmented, and Mixed Reality to Urology

    PubMed Central

    2016-01-01

    Recent developments in virtual, augmented, and mixed reality have introduced a considerable number of new devices into the consumer market. This momentum is also affecting the medical and health care sector. Although many of the theoretical and practical foundations of virtual reality (VR) were already researched and experienced in the 1980s, the vastly improved features of displays, sensors, interactivity, and computing power currently available in devices offer a new field of applications to the medical sector and also to urology in particular. The purpose of this review article is to review the extent to which VR technology has already influenced certain aspects of medicine, the applications that are currently in use in urology, and the future development trends that could be expected. PMID:27706017

  10. Application of Virtual, Augmented, and Mixed Reality to Urology.

    PubMed

    Hamacher, Alaric; Kim, Su Jin; Cho, Sung Tae; Pardeshi, Sunil; Lee, Seung Hyun; Eun, Sung-Jong; Whangbo, Taeg Keun

    2016-09-01

    Recent developments in virtual, augmented, and mixed reality have introduced a considerable number of new devices into the consumer market. This momentum is also affecting the medical and health care sector. Although many of the theoretical and practical foundations of virtual reality (VR) were already researched and experienced in the 1980s, the vastly improved features of displays, sensors, interactivity, and computing power currently available in devices offer a new field of applications to the medical sector and also to urology in particular. The purpose of this review article is to review the extent to which VR technology has already influenced certain aspects of medicine, the applications that are currently in use in urology, and the future development trends that could be expected.

  11. Virtual Reality Simulation of Gynecologic Laparoscopy

    PubMed

    Bernstein

    1996-08-01

    Realistic virtual simulation of gynecologic laparoscopy would permit the surgeon to practice any procedure, with any degree of pathology, at any time and as many times as necessary to achieve proficiency before attempting it in the operating room. Effective computer simulation requires accurate anatomy, realistic three-dimensional computer graphics, the ability to cut and deform tissue in response to instruments, and an appropriate hardware interface. The Visible Human Project from the National Library of Medicine has made available extremely accurate, three-dimensional, digital data that computer animation companies have begun to transform to three-dimensional graphic images. The problem of tissue deformation and movement is approached by a software package called TELEOS. Hardware consisting of two scissor-grip laparoscopic handles mounted on a sensor can interface with any simulation program to simulate a multiplicity of laparoscopic instruments. The next step will be to combine TELEOS with the three-dimensional anatomy data and configure it for gynecologic surgery.

  12. Human Factors in Virtual Reality Development

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Proffitt, Dennis R.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    This half-day tutorial will provide an overview of basic perceptual functioning as it relates to the design of virtual environment systems. The tutorial consists of three parts. First, basic issues in visual perception will be presented, including discussions of the visual sensations of brightness and color, and the visual perception of depth relationships in three-dimensional space (with a special emphasis on motion-specified depth). The second section will discuss the importance of conducting human-factors user studies and evaluations. Examples and suggestions on how best to get help with user studies will be provided. Finally, we will discuss how, by drawing on their complementary competencies, perceptual psychologists and computer engineers can work as a team to develop optimal VR systems, technologies, and techniques.

  13. Towards a Transcription System of Sign Language for 3D Virtual Agents

    NASA Astrophysics Data System (ADS)

    Do Amaral, Wanessa Machado; de Martino, José Mario

    Accessibility is a growing concern in computer science. Since virtual information is mostly presented visually, it may seem that access for deaf people is not an issue. However, for prelingually deaf individuals, those who became deaf before acquiring and formally learning a language, written information is often less accessible than information presented in signing. Further, for this community, signing is the language of choice, and reading text in a spoken language is akin to using a foreign language. Sign language uses gestures and facial expressions and is widely used by deaf communities. To enable efficient production of signed content in virtual environments, it is necessary to make written records of signs. Transcription systems have been developed to describe sign languages in written form, but these systems have limitations. Since they were not originally designed with computer animation in mind, the recognition and reproduction of signs in these systems is, in general, an easy task only for those who know the system deeply. The aim of this work is to develop a transcription system to provide signed content in virtual environments. To animate a virtual avatar, a transcription system must be explicit enough, specifying movement speed, sign concatenation, the sequence of each hold and movement, and facial expressions, so that the articulation is close to reality. Although many important studies of sign languages have been published, the transcription problem remains a challenge. Thus, a notation to describe, store and play signed content in virtual environments offers a multidisciplinary study and research tool, which may help linguistic studies to understand sign language structure and grammar.
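
    As a purely illustrative aside, the "explicit enough" information the authors call for (movement speed, sign concatenation, hold-and-movement sequences, facial expressions) is the kind of data an animation engine could consume as a structured record. The field names below are assumptions for illustration, not the authors' notation.

    ```python
    # Hypothetical machine-readable sign record in the spirit of the abstract.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Segment:
        kind: str            # "hold" or "movement"
        hand_shape: str      # handshape identifier
        location: str        # articulation point relative to the body
        speed: float = 1.0   # relative movement speed for the animator

    @dataclass
    class SignEntry:
        gloss: str                             # written label of the sign
        segments: List[Segment] = field(default_factory=list)
        facial_expression: str = "neutral"     # non-manual marker

    greeting = SignEntry(
        gloss="HELLO",
        segments=[Segment("movement", "B-flat", "forehead", speed=1.2),
                  Segment("hold", "B-flat", "away-from-body")],
        facial_expression="smile",
    )
    ```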

  14. [3D-TECHNOLOGIES AS A CORE ELEMENT OF PLANNING AND IMPLEMENTATION OF VIRTUAL AND ACTUAL RENAL SURGERY].

    PubMed

    Glybochko, P V; Aljaev, Ju G; Bezrukov, E A; Sirota, E S; Proskura, A V

    2015-01-01

    The purpose of this article is to demonstrate the role of modern computer technologies in performing virtual and actual renal tumor surgery. Currently 3D modeling makes it possible to clearly define strategy and tactics of an individual patient treatment.

  15. Collaborative virtual reality environments for computational science and design.

    SciTech Connect

    Papka, M. E.

    1998-02-17

    The authors are developing a networked, multi-user, virtual-reality-based collaborative environment coupled to one or more petaFLOPs computers, enabling the interactive simulation of 10^9-atom systems. The purpose of this work is to explore the requirements for this coupling. Through the design, development, and testing of such systems, they hope to gain knowledge that allows computational scientists to discover and analyze their results more quickly and in a more intuitive manner.

  16. Fostering Learning Through Interprofessional Virtual Reality Simulation Development.

    PubMed

    Nicely, Stephanie; Farra, Sharon

    2015-01-01

    This article presents a unique strategy for improving didactic learning and clinical skill while simultaneously fostering interprofessional collaboration and communication. Senior-level nursing students collaborated with students enrolled in the Department of Interactive Media Studies to design a virtual reality simulation based upon disaster management and triage techniques. Collaborative creation of the simulation proved to be a strategy for enhancing students' knowledge of and skill in disaster management and triage while impacting attitudes about interprofessional communication and teamwork.

  17. Assessing suturing techniques using a virtual reality surgical simulator.

    PubMed

    Kazemi, Hamed; Rappel, James K; Poston, Timothy; Hai Lim, Beng; Burdet, Etienne; Leong Teo, Chee

    2010-09-01

    The advantages of virtual-reality simulators for surgical skill assessment and training include more training time, no risk to the patient, repeatable difficulty levels, and reliable feedback, without the resource demands and ethical issues of animal-based training. We tested this for a key subtask and showed a strong link between skill in the simulator and in reality. Suturing performance was assessed for four groups of participants, including experienced surgeons and naive subjects, on a custom-made virtual-reality simulator. Each subject repeated the experiment 30 times, using five different types of needles to perform a standardized suture placement task. Traditional metrics of performance as well as new metrics enabled by our system were proposed, and the data indicate differences between trained and untrained performance. In all traditional parameters, such as time, number of attempts, and motion quantity, the medical surgeons outperformed the other three groups, though the differences were not significant. However, motion smoothness, penetration and exit angles, tear size areas, and orientation change differed significantly between the trained group and the untrained groups. This suggests that these parameters can be used in virtual microsurgery training.
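
    The paper does not restate its exact formulation of motion smoothness, but metrics of this kind are often derived from the jerk (third derivative) of the tool-tip trajectory. The following is a hedged sketch of one such metric, given only to make the idea concrete.

    ```python
    # Mean squared jerk of a sampled tool-tip path (one possible smoothness metric;
    # the study's actual definition may differ).
    import numpy as np

    def mean_squared_jerk(positions, dt):
        """positions: (N, 3) array of tool-tip samples taken at a fixed time step dt (s)."""
        velocity = np.gradient(positions, dt, axis=0)
        acceleration = np.gradient(velocity, dt, axis=0)
        jerk = np.gradient(acceleration, dt, axis=0)
        return float(np.mean(np.sum(jerk ** 2, axis=1)))
    ```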

  18. Virtual reality therapy: an effective treatment for phobias.

    PubMed

    North, M M; North, S M; Coble, J R

    1998-01-01

    Behavioral therapy techniques for treating phobias often include graded exposure of the patient to anxiety-producing stimuli (systematic desensitization). However, research reviews demonstrate that many patients undergoing systematic desensitization appear to have difficulty applying imaginative techniques. This chapter describes Virtual Reality Therapy (VRT), a new therapeutic approach that can be used to overcome some of the difficulties inherent in the traditional treatment of phobias. VRT, like current imaginal and in vivo modalities, can generate stimuli that could be utilized in desensitization therapy. Like systematic desensitization therapy, VRT can provide stimuli for patients who have difficulty imagining scenes and/or are too phobic to experience real situations. As far as we know, the idea of using virtual reality technology to combat psychological disorders was first conceived within the Human-Computer Interaction Group at Clark Atlanta University in November 1992. Since then, we have successfully conducted the first known pilot experiments in the use of virtual reality technologies in the treatment of specific phobias: fear of flying, fear of heights, fear of being in certain situations (such as a dark barn, an enclosed bridge over a river, and the presence of an animal [a black cat] in a dark room), and fear of public speaking. The results of these experiments are described.

  19. Mixed reality virtual pets to reduce childhood obesity.

    PubMed

    Johnsen, Kyle; Ahn, Sun Joo; Moore, James; Brown, Scott; Robertson, Thomas P; Marable, Amanda; Basu, Aryabrata

    2014-04-01

    Novel approaches are needed to reduce the high rates of childhood obesity in the developed world. While multifactorial in cause, a major factor is an increasingly sedentary lifestyle of children. Our research shows that a mixed reality system that is of interest to children can be a powerful motivator of healthy activity. We designed and constructed a mixed reality system that allowed children to exercise, play with, and train a virtual pet using their own physical activity as input. The health, happiness, and intelligence of each virtual pet grew as its associated child owner exercised more, reached goals, and interacted with their pet. We report results of a research study involving 61 children from a local summer camp that shows a large increase in recorded and observed activity, alongside observational evidence that the virtual pet was responsible for that change. These results, and the ease at which the system integrated into the camp environment, demonstrate the practical potential to impact the exercise behaviors of children with mixed reality.

  20. Dynamic 3-D virtual fixtures for minimally invasive beating heart procedures.

    PubMed

    Ren, Jing; Patel, Rajni V; McIsaac, Kenneth A; Guiraudon, Gerard; Peters, Terry M

    2008-08-01

    Two-dimensional or 3-D visual guidance is often used for minimally invasive cardiac surgery and diagnosis. This visual guidance suffers from several drawbacks, such as a limited field of view, loss of signal from time to time, and, in some cases, difficulty of interpretation. These limitations become more evident in beating-heart procedures, when the surgeon has to perform a surgical procedure in the presence of heart motion. In this paper, we propose dynamic 3-D virtual fixtures (DVFs) to augment the visual guidance system with haptic feedback, to provide the surgeon with more helpful guidance by constraining the surgeon's hand motions, thereby protecting sensitive structures. DVFs can be generated from preoperative dynamic magnetic resonance (MR) or computed tomography (CT) images and then mapped to the patient during surgery. We have validated the feasibility of the proposed method on several simulated surgical tasks using a volunteer's cardiac image dataset. Validation results show that the integration of visual and haptic guidance can permit a user to perform surgical tasks more easily and with a reduced error rate. We believe this is the first work presented in the field of virtual fixtures that explicitly considers heart motion.
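
    One common way to realize such haptic constraints is a forbidden-region virtual fixture: a spring-like force pushes the tool tip back once it crosses a safety margin around a sensitive structure. The sketch below illustrates that general idea only; it is not the authors' DVF formulation, and the margin and stiffness values are arbitrary.

    ```python
    # Forbidden-region virtual fixture sketch (illustrative, not the paper's method).
    import numpy as np

    def fixture_force(tip, nearest_surface_point, margin=0.005, stiffness=400.0):
        """Return a 3D repulsive force (N) for a tool tip near a protected surface (m)."""
        offset = tip - nearest_surface_point
        distance = np.linalg.norm(offset)
        if distance >= margin or distance == 0.0:
            return np.zeros(3)                      # outside the safety margin: no force
        direction = offset / distance
        return stiffness * (margin - distance) * direction
    ```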

  1. Virtual reality hardware and graphic display options for brain-machine interfaces.

    PubMed

    Marathe, Amar R; Carey, Holle L; Taylor, Dawn M

    2008-01-15

    Virtual reality hardware and graphic displays are reviewed here as a development environment for brain-machine interfaces (BMIs). Two desktop stereoscopic monitors and one 2D monitor were compared in a visual depth discrimination task and in a 3D target-matching task where able-bodied individuals used actual hand movements to match a virtual hand to different target hands. Three graphic representations of the hand were compared: a plain sphere, a sphere attached to the fingertip of a realistic hand and arm, and a stylized pacman-like hand. Several subjects had great difficulty using either stereo monitor for depth perception when perspective size cues were removed. A mismatch in stereo and size cues generated inappropriate depth illusions. This phenomenon has implications for choosing target and virtual hand sizes in BMI experiments. Target-matching accuracy was about as good with the 2D monitor as with either 3D monitor. However, users achieved this accuracy by exploring the boundaries of the hand in the target with carefully controlled movements. This method of determining relative depth may not be possible in BMI experiments if movement control is more limited. Intuitive depth cues, such as including a virtual arm, can significantly improve depth perception accuracy with or without stereo viewing.

  2. 3D virtual human atria: A computational platform for studying clinical atrial fibrillation.

    PubMed

    Aslanidi, Oleg V; Colman, Michael A; Stott, Jonathan; Dobrzynski, Halina; Boyett, Mark R; Holden, Arun V; Zhang, Henggui

    2011-10-01

    Despite a vast amount of experimental and clinical data on the underlying ionic, cellular and tissue substrates, the mechanisms of common atrial arrhythmias (such as atrial fibrillation, AF) arising from the functional interactions at the whole atria level remain unclear. Computational modelling provides a quantitative framework for integrating such multi-scale data and understanding the arrhythmogenic behaviour that emerges from the collective spatio-temporal dynamics in all parts of the heart. In this study, we have developed a multi-scale hierarchy of biophysically detailed computational models for the human atria--the 3D virtual human atria. Primarily, diffusion tensor MRI reconstruction of the tissue geometry and fibre orientation in the human sinoatrial node (SAN) and surrounding atrial muscle was integrated into the 3D model of the whole atria dissected from the Visible Human dataset. The anatomical models were combined with the heterogeneous atrial action potential (AP) models, and used to simulate the AP conduction in the human atria under various conditions: SAN pacemaking and atrial activation in the normal rhythm, break-down of regular AP wave-fronts during rapid atrial pacing, and the genesis of multiple re-entrant wavelets characteristic of AF. Contributions of different properties of the tissue to mechanisms of the normal rhythm and arrhythmogenesis were investigated. Primarily, the simulations showed that tissue heterogeneity caused the break-down of the normal AP wave-fronts at rapid pacing rates, which initiated a pair of re-entrant spiral waves; and tissue anisotropy resulted in a further break-down of the spiral waves into multiple meandering wavelets characteristic of AF. The 3D virtual atria model itself was incorporated into the torso model to simulate the body surface ECG patterns in the normal and arrhythmic conditions. Therefore, a state-of-the-art computational platform has been developed, which can be used for studying multi
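
    Models of this kind typically rest on a reaction-diffusion (monodomain) description of action-potential propagation. A generic form is shown below purely for orientation; it is not necessarily the exact formulation used in this study.

    ```latex
    \frac{\partial V}{\partial t} \;=\; \nabla \cdot \bigl(\mathbf{D}\,\nabla V\bigr) \;-\; \frac{I_{\mathrm{ion}}(V,\mathbf{u})}{C_m}
    ```

    Here V is the membrane potential, D the anisotropic conductivity tensor derived from fibre orientation, I_ion the ionic current of the cellular action-potential model with gating variables u, and C_m the membrane capacitance.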

  3. Scalable, high-performance 3D imaging software platform: system architecture and application to virtual colonoscopy.

    PubMed

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2012-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time that is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, clusters, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10-fold performance improvement on an 8-core workstation over the original sequential implementation of the system.
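
    The "distributable block volume" idea can be pictured as partitioning the 3D image into fixed-size sub-volumes and dispatching them to workers. The sketch below is an illustration under simple assumptions (a per-block filter with no halo exchange between neighbouring blocks); it is not the platform's actual API.

    ```python
    # Illustrative block-wise parallel processing of a 3D volume.
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor
    from scipy.ndimage import gaussian_filter

    def blocks(volume, size=64):
        """Yield (slices, block) pairs that tile the whole volume."""
        for z in range(0, volume.shape[0], size):
            for y in range(0, volume.shape[1], size):
                for x in range(0, volume.shape[2], size):
                    sl = (slice(z, z + size), slice(y, y + size), slice(x, x + size))
                    yield sl, volume[sl]

    def process_block(args):
        sl, block = args
        return sl, gaussian_filter(block, sigma=1.0)   # stand-in for any 3D filter

    def process_volume(volume):
        # Run this under `if __name__ == "__main__":` on platforms that spawn workers.
        out = np.empty_like(volume)
        with ProcessPoolExecutor() as pool:
            for sl, result in pool.map(process_block, blocks(volume)):
                out[sl] = result
        return out
    ```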

  4. The cranial nerve skywalk: A 3D tutorial of cranial nerves in a virtual platform.

    PubMed

    Richardson-Hatcher, April; Hazzard, Matthew; Ramirez-Yanez, German

    2014-01-01

    Visualization of the complex courses of the cranial nerves by students in the health-related professions is challenging through either diagrams in books or plastic models in the gross laboratory. Furthermore, dissection of the cranial nerves in the gross laboratory is an extremely meticulous task. Teaching and learning the cranial nerve pathways is difficult using two-dimensional (2D) illustrations alone. Three-dimensional (3D) models aid the teacher in describing intricate and complex anatomical structures and help students visualize them. The study of the cranial nerves can be supplemented with 3D, which permits the students to fully visualize their distribution within the craniofacial complex. This article describes the construction and usage of a virtual anatomy platform in Second Life™, which contains 3D models of the cranial nerves III, V, VII, and IX. The Cranial Nerve Skywalk features select cranial nerves and the associated autonomic pathways in an immersive online environment. This teaching supplement was introduced to groups of pre-healthcare professional students in gross anatomy courses at both institutions and student feedback is included.

  5. Laboratory-based x-ray phase-contrast tomography enables 3D virtual histology

    NASA Astrophysics Data System (ADS)

    Töpperwien, Mareike; Krenkel, Martin; Quade, Felix; Salditt, Tim

    2016-09-01

    Due to the large penetration depth and small wavelength hard x-rays offer a unique potential for 3D biomedical and biological imaging, combining capabilities of high resolution and large sample volume. However, in classical absorption-based computed tomography, soft tissue only shows a weak contrast, limiting the actual resolution. With the advent of phase-contrast methods, the much stronger phase shift induced by the sample can now be exploited. For high resolution, free space propagation behind the sample is particularly well suited to make the phase shift visible. Contrast formation is based on the self-interference of the transmitted beam, resulting in object-induced intensity modulations in the detector plane. As this method requires a sufficiently high degree of spatial coherence, it was since long perceived as a synchrotron-based imaging technique. In this contribution we show that by combination of high brightness liquid-metal jet microfocus sources and suitable sample preparation techniques, as well as optimized geometry, detection and phase retrieval, excellent three-dimensional image quality can be obtained, revealing the anatomy of a cobweb spider in high detail. This opens up new opportunities for 3D virtual histology of small organisms. Importantly, the image quality is finally augmented to a level accessible to automatic 3D segmentation.

  6. The use of virtual reality exposure in the treatment of anxiety disorders.

    PubMed

    Rothbaum, B O; Hodges, L F

    1999-10-01

    One possible alternative to standard in vivo exposure may be virtual reality exposure. Virtual reality integrates real-time computer graphics, body tracking devices, visual displays, and other sensory input devices to immerse a participant in a computer-generated virtual environment. Virtual reality exposure (VRE) is potentially an efficient and cost-effective treatment of anxiety disorders. VRE therapy has been successful in reducing the fear of heights in the first known controlled study of virtual reality in the treatment of a psychological disorder. Outcome was assessed on measures of anxiety, avoidance, attitudes, and distress. Significant group differences were found on all measures such that the VRE group was significantly improved at posttreatment but the control group was unchanged. The efficacy of virtual reality exposure therapy was also supported for the fear of flying in a case study. The potential for virtual reality exposure treatment for these and other disorders is explored.

  7. Dynamic concision for three-dimensional reconstruction of human organ built with virtual reality modelling language (VRML).

    PubMed

    Yu, Zheng-yang; Zheng, Shu-sen; Chen, Lei-ting; He, Xiao-qian; Wang, Jian-jun

    2005-07-01

    This research studies the process of 3D reconstruction and dynamic concision based on 2D medical digital images using virtual reality modelling language (VRML) and JavaScript language, with a focus on how to realize the dynamic concision of 3D medical model with script node and sensor node in VRML. The 3D reconstruction and concision of body internal organs can be built with such high quality that they are better than those obtained from the traditional methods. With the function of dynamic concision, the VRML browser can offer better windows for man-computer interaction in real-time environment than ever before. 3D reconstruction and dynamic concision with VRML can be used to meet the requirement for the medical observation of 3D reconstruction and have a promising prospect in the fields of medical imaging.

  8. An integrated pipeline to create and experience compelling scenarios in virtual reality

    NASA Astrophysics Data System (ADS)

    Springer, Jan P.; Neumann, Carsten; Reiners, Dirk; Cruz-Neira, Carolina

    2011-03-01

    One of the main barriers to create and use compelling scenarios in virtual reality is the complexity and time-consuming efforts for modeling, element integration, and the software development to properly display and interact with the content in the available systems. Still today, most virtual reality applications are tedious to create and they are hard-wired to the specific display and interaction system available to the developers when creating the application. Furthermore, it is not possible to alter the content or the dynamics of the content once the application has been created. We present our research on designing a software pipeline that enables the creation of compelling scenarios with a fair degree of visual and interaction complexity in a semi-automated way. Specifically, we are targeting drivable urban scenarios, ranging from large cities to sparsely populated rural areas that incorporate both static components (e. g., houses, trees) and dynamic components (e. g., people, vehicles) as well as events, such as explosions or ambient noise. Our pipeline has four basic components. First, an environment designer, where users sketch the overall layout of the scenario, and an automated method constructs the 3D environment from the information in the sketch. Second, a scenario editor used for authoring the complete scenario, incorporate the dynamic elements and events, fine tune the automatically generated environment, define the execution conditions of the scenario, and set up any data gathering that may be necessary during the execution of the scenario. Third, a run-time environment for different virtual-reality systems provides users with the interactive experience as designed with the designer and the editor. And fourth, a bi-directional monitoring system that allows for capturing and modification of information from the virtual environment. One of the interesting capabilities of our pipeline is that scenarios can be built and modified on-the-fly as they are

  9. 3D workflow for HDR image capture of projection systems and objects for CAVE virtual environments authoring with wireless touch-sensitive devices

    NASA Astrophysics Data System (ADS)

    Prusten, Mark J.; McIntyre, Michelle; Landis, Marvin

    2006-02-01

    A 3D workflow pipeline is presented for High Dynamic Range (HDR) image capture of projected scenes or objects for presentation in CAVE virtual environments. The methods of HDR digital photography of environments vs. objects are reviewed. Samples of both types of virtual authoring being the actual CAVE environment and a sculpture are shown. A series of software tools are incorporated into a pipeline called CAVEPIPE, allowing for high-resolution objects and scenes to be composited together in natural illumination environments [1] and presented in our CAVE virtual reality environment. We also present a way to enhance the user interface for CAVE environments. The traditional methods of controlling the navigation through virtual environments include: glove, HUD's and 3D mouse devices. By integrating a wireless network that includes both WiFi (IEEE 802.11b/g) and Bluetooth (IEEE 802.15.1) protocols the non-graphical input control device can be eliminated. Therefore wireless devices can be added that would include: PDA's, Smart Phones, TabletPC's, Portable Gaming consoles, and PocketPC's.

  10. An experiment on fear of public speaking in virtual reality.

    PubMed

    Pertaub, D P; Slater, M; Barker, C

    2001-01-01

    Can virtual reality exposure therapy be used to treat people with social phobia? To answer this question it is vital to know whether people will respond to virtual humans (avatars) in a virtual social setting in the same way they would to real humans. If someone is extremely anxious with real people, will they also be anxious when faced with simulated people, despite knowing that the avatars are computer generated? In [17] we described a small pilot study that placed 10 people before a virtual audience. The purpose was to assess the extent to which social anxiety, specifically fear of public speaking, was induced by the virtual audience and the extent of influence of the degree of immersion (head-mounted display or desktop monitor). The current paper describes a follow-up study conducted with 40 subjects, and the results clearly show not only that social anxiety is induced by the audience, but also that the degree of anxiety experienced is directly related to the type of virtual audience feedback the speaker receives. In particular, a hostile negative audience scenario was found to generate strong affect in speakers, regardless of whether or not they normally suffered from fear of public speaking.

  11. Virtual Reality-based Telesurgery via Teleprogramming Scheme Combined with Semi-autonomous Control.

    PubMed

    Zhijiang, Du; Zhiheng, Jia; Minxiu, Kong

    2005-01-01

    Telesurgery systems have long suffered from variable and unpredictable Internet communication time delays, operation fatigue, and other drawbacks. Based on virtual reality technology, a teleprogramming scheme combined with semi-autonomous control is introduced to guarantee the robustness and efficiency of teleoperation of HIT-RAOS, a robot-assisted orthopedic surgery system. In this system, without considering time delay, the operator can simply interact with a virtual environment that provides real-time 3D vision, stereophonic sound, and tactile and force feedback imitated by a parallel master manipulator. Several tasks can be managed simultaneously via semi-autonomous control. Finally, the method is experimentally demonstrated in an experiment on the locking of intramedullary nails and is shown to effectively provide stability and performance.

  12. The force pyramid: a spatial analysis of force application during virtual reality brain tumor resection.

    PubMed

    Azarnoush, Hamed; Siar, Samaneh; Sawaya, Robin; Zhrani, Gmaan Al; Winkler-Schwartz, Alexander; Alotaibi, Fahad Eid; Bugdadi, Abdulgadir; Bajunaid, Khalid; Marwa, Ibrahim; Sabbagh, Abdulrahman Jafar; Del Maestro, Rolando F

    2016-09-30

    OBJECTIVE Virtual reality simulators allow development of novel methods to analyze neurosurgical performance. The concept of a force pyramid is introduced as a Tier 3 metric with the ability to provide visual and spatial analysis of 3D force application by any instrument used during simulated tumor resection. This study was designed to answer 3 questions: 1) Do study groups have distinct force pyramids? 2) Do handedness and ergonomics influence force pyramid structure? 3) Are force pyramids dependent on the visual and haptic characteristics of simulated tumors? METHODS Using a virtual reality simulator, NeuroVR (formerly NeuroTouch), ultrasonic aspirator force application was continually assessed during resection of simulated brain tumors by neurosurgeons, residents, and medical students. The participants performed simulated resections of 18 simulated brain tumors with different visual and haptic characteristics. The raw data, namely, coordinates of the instrument tip as well as contact force values, were collected by the simulator. To provide a visual and qualitative spatial analysis of forces, the authors created a graph, called a force pyramid, representing force sum along the z-coordinate for different xy coordinates of the tool tip. RESULTS Sixteen neurosurgeons, 15 residents, and 84 medical students participated in the study. Neurosurgeon, resident and medical student groups displayed easily distinguishable 3D "force pyramid fingerprints." Neurosurgeons had the lowest force pyramids, indicating application of the lowest forces, followed by resident and medical student groups. Handedness, ergonomics, and visual and haptic tumor characteristics resulted in distinct well-defined 3D force pyramid patterns. CONCLUSIONS Force pyramid fingerprints provide 3D spatial assessment displays of instrument force application during simulated tumor resection. Neurosurgeon force utilization and ergonomic data form a basis for understanding and modulating resident force
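
    As described, the force pyramid is essentially a 2D spatial accumulation of force: the tool-tip xy positions are binned and the contact forces falling into each bin are summed. A minimal sketch of that computation is given below; bin size and any normalization are assumptions rather than the study's exact definition.

    ```python
    # Force-pyramid sketch: sum contact forces over an xy grid of tool-tip positions.
    import numpy as np

    def force_pyramid(xy, forces, bins=50):
        """xy: (N, 2) tool-tip coordinates; forces: (N,) contact force magnitudes."""
        pyramid, xedges, yedges = np.histogram2d(
            xy[:, 0], xy[:, 1], bins=bins, weights=forces)
        return pyramid, xedges, yedges
    ```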

  13. Enhancing the Induction Skill of Deaf and Hard-of-Hearing Children with Virtual Reality Technology.

    PubMed

    Passig, D; Eden, S

    2000-01-01

    Many researchers have found that for reasoning and reaching a reasoned conclusion, particularly when the process of induction is required, deaf and hard-of-hearing children have unusual difficulty. The purpose of this study was to investigate whether the practice of rotating virtual reality (VR) three-dimensional (3D) objects will have a positive effect on the ability of deaf and hard-of-hearing children to use inductive processes when dealing with shapes. Three groups were involved in the study: (1) experimental group, which included 21 deaf and hard-of-hearing children, who played a VR 3D game; (2) control group I, which included 23 deaf and hard-of-hearing children, who played a similar two-dimensional (2D) game (not VR game); and (3) control group II of 16 hearing children for whom no intervention was introduced. The results clearly indicate that practicing with VR 3D spatial rotations significantly improved inductive thinking used by the experimental group for shapes as compared with the first control group, who did not significantly improve their performance. Also, prior to the VR 3D experience, the deaf and hard-of-hearing children attained lower scores in inductive abilities than the children with normal hearing, (control group II). The results for the experimental group, after the VR 3D experience, improved to the extent that there was no noticeable difference between them and the children with normal hearing.

  14. The 3D visualization technology research of submarine pipeline based Horde3D GameEngine

    NASA Astrophysics Data System (ADS)

    Yao, Guanghui; Ma, Xiushui; Chen, Genlang; Ye, Lingjian

    2013-10-01

    With the development of 3D display and virtual reality technology, their applications are becoming more and more widespread. This paper applies 3D display technology to the monitoring of submarine pipelines. We reconstruct a submarine pipeline and its surrounding seabed terrain in the computer using the Horde3D graphics rendering engine, built on the foundation database "submarine pipeline and relative landforms landscape synthesis database", in order to display a virtual-reality scene of the submarine pipeline and to show the relevant data collected from its monitoring.

  15. Virtual Reality for the Psychophysiological Assessment of Phobic Fear: Responses during Virtual Tunnel Driving

    ERIC Educational Resources Information Center

    Muhlberger, Andreas; Bulthoff, Heinrich H.; Wiedemann, Georg; Pauli, Paul

    2007-01-01

    An overall assessment of phobic fear requires not only a verbal self-report of fear but also an assessment of behavioral and physiological responses. Virtual reality can be used to simulate realistic (phobic) situations and therefore should be useful for inducing emotions in a controlled, standardized way. Verbal and physiological fear reactions…

  16. Building a 3D Virtual Liver: Methods for Simulating Blood Flow and Hepatic Clearance on 3D Structures

    PubMed Central

    Rezania, Vahid; Tuszynski, Jack

    2016-01-01

    In this paper, we develop a spatio-temporal modeling approach to describe blood and drug flow, as well as drug uptake and elimination, on an approximation of the liver. Extending on previously developed computational approaches, we generate an approximation of a liver, which consists of a portal and hepatic vein vasculature structure, embedded in the surrounding liver tissue. The vasculature is generated via constrained constructive optimization, and then converted to a spatial grid of a selected grid size. Estimates for surrounding upscaled lobule tissue properties are then presented appropriate to the same grid size. Simulation of fluid flow and drug metabolism (hepatic clearance) are completed using discretized forms of the relevant convective-diffusive-reactive partial differential equations for these processes. This results in a single stage, uniformly consistent method to simulate equations for blood and drug flow, as well as drug metabolism, on a 3D structure representative of a liver. PMID:27649537
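
    For orientation, the convective-diffusive-reactive transport the authors refer to can be written generically as below, with a first-order clearance term standing in for hepatic drug metabolism. The symbols and the single-compartment form are illustrative assumptions; the paper's actual equations may couple several such equations for blood and tissue.

    ```latex
    \frac{\partial C}{\partial t} \;+\; \mathbf{v}\cdot\nabla C \;=\; D\,\nabla^{2} C \;-\; k\,C
    ```

    Here C is the drug concentration, v the blood velocity field obtained from the flow simulation, D an effective diffusion coefficient, and k a first-order clearance rate.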

  17. A real-time 3D end-to-end augmented reality system (and its representation transformations)

    NASA Astrophysics Data System (ADS)

    Tytgat, Donny; Aerts, Maarten; De Busser, Jeroen; Lievens, Sammy; Rondao Alface, Patrice; Macq, Jean-Francois

    2016-09-01

    The new generation of HMDs coming to the market is expected to enable many new applications that allow free viewpoint experiences with captured video objects. Current applications usually rely on 3D content that is manually created or captured in an offline manner. In contrast, this paper focuses on augmented reality applications that use live captured 3D objects while maintaining free viewpoint interaction. We present a system that allows live dynamic 3D objects (e.g. a person who is talking) to be captured in real-time. Real-time performance is achieved by traversing a number of representation formats and exploiting their specific benefits. For instance, depth images are maintained for fast neighborhood retrieval and occlusion determination, while implicit surfaces are used to facilitate multi-source aggregation for both geometry and texture. The result is a 3D reconstruction system that outputs multi-textured triangle meshes at real-time rates. An end-to-end system is presented that captures and reconstructs live 3D data and allows for this data to be used on a networked (AR) device. For allocating the different functional blocks onto the available physical devices, a number of alternatives are proposed considering the available computational power and bandwidth for each of the components. As we will show, the representation format can play an important role in this functional allocation and allows for a flexible system that can support a highly heterogeneous infrastructure.

  18. [Image fusion, virtual reality, robotics and navigation. Effects on surgical practice].

    PubMed

    Maresceaux, J; Soler, L; Ceulemans, R; Garcia, A; Henri, M; Dutson, E

    2002-05-01

    In the new minimally invasive surgical era, virtual reality, robotics, and image merging have become topics on their own, offering the potential to revolutionize current surgical treatment and assessment. Improved patient care in the digital age seems to be the primary impetus for continued efforts in the field of telesurgery. The progress in endoscopic surgery with regard to telesurgery is manifested by digitization of the pre-, intra-, and postoperative interaction with the patients' surgical disease via computer system integration: so-called Computer Assisted Surgery (CAS). The preoperative assessment can be improved by 3D organ reconstruction, as in virtual colonoscopy or cholangiography, and by planning and practicing surgery using virtual or simulated organs. When integrating all of the data recorded during this preoperative stage, an enhanced reality can be made possible to improve intra-operative patient interactions. CAS allows for increased three-dimensional accuracy, improved precision and the reproducibility of procedures. The ability to store the actions of the surgeon as digitized information also allows for universal, rapid distribution: i.e., the surgeon's activity can be transmitted to the other side of the operating room or to a remote site via high-speed communications links, as was recently demonstrated by our own team during the Lindbergh operation. Furthermore, the surgeon will be able to share his expertise and skill through teleconsultation and telemanipulation, bringing the patient closer to the expert surgical team through electronic means and opening the way to advanced and continuous surgical learning. Finally, for postoperative interaction, virtual reality and simulation can provide us with 4 dimensional images, time being the fourth dimension. This should allow physicians to have a better idea of the disease process in evolution, and treatment modifications based on this view can be anticipated. We are presently determining the

  19. Training software using virtual-reality technology and pre-calculated effective dose data.

    PubMed

    Ding, Aiping; Zhang, Di; Xu, X George

    2009-05-01

    This paper describes the development of a software package, called VR Dose Simulator, which aims to provide interactive radiation safety and ALARA training to radiation workers using virtual-reality (VR) simulations. Combined with a pre-calculated effective dose equivalent (EDE) database, a virtual radiation environment was constructed in VR authoring software, EON Studio, using 3-D models of a real nuclear power plant building. Models of avatars representing two workers were adopted with arms and legs of the avatar being controlled in the software to simulate walking and other postures. Collision detection algorithms were developed for various parts of the 3-D power plant building and avatars to confine the avatars to certain regions of the virtual environment. Ten different camera viewpoints were assigned to conveniently cover the entire virtual scenery in different viewing angles. A user can control the avatar to carry out radiological engineering tasks using two modes of avatar navigation. A user can also specify two types of radiation source: Cs and Co. The location of the avatar inside the virtual environment during the course of the avatar's movement is linked to the EDE database. The accumulative dose is calculated and displayed on the screen in real-time. Based on the final accumulated dose and the completion status of all virtual tasks, a score is given to evaluate the performance of the user. The paper concludes that VR-based simulation technologies are interactive and engaging, thus potentially useful in improving the quality of radiation safety training. The paper also summarizes several challenges: more streamlined data conversion, realistic avatar movement and posture, more intuitive implementation of the data communication between EON Studio and VB.NET, and more versatile utilization of EDE data such as a source near the body, etc., all of which needs to be addressed in future efforts to develop this type of software.
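
    The core mechanic — mapping the avatar's position to a pre-calculated dose-rate entry and integrating over time — is simple to state in code. The sketch below uses an invented region-keyed table and invented dose-rate values purely for illustration; the actual software links avatar coordinates to a pre-computed EDE database.

    ```python
    # Illustrative real-time dose accumulation from a pre-calculated table.
    PRECALC_DOSE_RATE_MSV_PER_H = {      # hypothetical region -> dose rate values
        "reactor_hall": 0.50,
        "corridor": 0.05,
        "control_room": 0.001,
    }

    def accumulate_dose(visited_regions, dt_hours, table=PRECALC_DOSE_RATE_MSV_PER_H):
        """visited_regions: region id occupied at each time step of length dt_hours."""
        total = 0.0
        for region in visited_regions:
            total += table.get(region, 0.0) * dt_hours
        return total                      # accumulated effective dose in mSv

    # Example: three 90-second steps along a short path.
    dose = accumulate_dose(["corridor", "reactor_hall", "corridor"], dt_hours=90 / 3600)
    ```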

  20. Accuracy and reproducibility of virtual cutting guides and 3D-navigation for osteotomies of the mandible and maxilla

    PubMed Central

    Bernstein, Jonathan M.; Daly, Michael J.; Chan, Harley; Qiu, Jimmy; Goldstein, David; Muhanna, Nidal; de Almeida, John R.; Irish, Jonathan C.

    2017-01-01

    Background We set out to determine the accuracy of 3D-navigated mandibular and maxillary osteotomies with the ultimate aim to integrate virtual cutting guides and 3D-navigation into ablative and reconstructive head and neck surgery. Methods Four surgeons (two attending, two clinical fellows) completed 224 unnavigated and 224 3D-navigated osteotomies on anatomical models according to preoperative 3D plans. The osteotomized bones were scanned and analyzed. Results Median distance from the virtual plan was 2.1 mm unnavigated (IQR 2.6 mm, ≥3 mm in 33%) and 1.2 mm 3D-navigated (IQR 1.1 mm, ≥3 mm in 6%) (P<0.0001); median pitch was 4.5° unnavigated (IQR 7.1°) and 3.5° 3D-navigated (IQR 4.0°) (P<0.0001); median roll was 7.4° unnavigated (IQR 8.5°) and 2.6° 3D-navigated (IQR 3.8°) (P<0.0001). Conclusion 3D-rendering enables osteotomy navigation. 3 mm is an appropriate planning distance. The next steps are translating virtual cutting guides to free bone flap reconstruction and clinical use. PMID:28249001

  1. 3D Printed Models and Navigation for Skull Base Surgery: Case Report and Virtual Validation.

    PubMed

    Ritacco, Lucas E; Di Lella, Federico; Mancino, Axel; Gonzalez Bernaldo de Quiros, Fernan; Boccio, Carlos; Milano, Federico E

    2015-01-01

    In recent years, computer-assisted surgery tools have become more versatile. Having access to a 3D printed model expands the possibility for surgeons to practice on the particular anatomy of a patient before surgery and improve their skills. Optical navigation is capable of guiding a surgeon according to a previously defined plan. These methods improve accuracy and safety when executing the operation. We intend to carry out a validation process for computer-assisted tools. The aim of this project is to propose a comparative validation method that enables physicians to evaluate differences between the virtually planned approach trajectory and the actually executed course. In summary, this project is focused on decoding data in order to obtain numerical values that establish the quality of surgical procedures.

  2. Sensor Spatial Distortion, Visual Latency, and Update Rate Effects on 3D Tracking in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Ellis, S. R.; Adelstein, B. D.; Baumeler, S.; Jense, G. J.; Jacoby, R. H.; Trejo, Leonard (Technical Monitor)

    1998-01-01

    Several common defects that we have sought to minimize in immersive virtual environments are static sensor spatial distortion, visual latency, and low update rates. Human performance within our environments during large-amplitude 3D tracking was assessed by objective and subjective methods in the presence and absence of these defects. Results show that 1) removal of our relatively small spatial sensor distortion had minor effects on the tracking activity, 2) an Adapted Cooper-Harper controllability scale proved the most sensitive subjective indicator of the degradation of dynamic fidelity caused by increasing latency and decreasing frame rates, and 3) performance, as measured by normalized RMS tracking error or subjective impressions, was more markedly influenced by changing visual latency than by update rate.

  3. Predicting LER and LWR in SAQP with 3D virtual fabrication

    NASA Astrophysics Data System (ADS)

    Gu, Jiangjiang (Jimmy); Zhao, Dalong; Allampalli, Vasanth; Faken, Daniel; Greiner, Ken; Fried, David M.

    2016-03-01

    For the first time, process impact on line-edge roughness (LER) and line-width roughness (LWR) in a back-end-of-line (BEOL) self-aligned quadruple patterning (SAQP) flow has been systematically investigated through predictive 3D virtual fabrication. This frequency dependent LER study shows that both deposition and etching effectively reduce high frequency LER, while deposition is much more effective in reducing low frequency LER. Spacer-assisted patterning technology reduces LWR significantly by creating correlated edges, and further LWR improvement can be achieved by optimizing individual process effects on LER. Our study provides a guideline for the understanding and optimization of LER and LWR in advanced technology nodes.

  4. Astronaut Prepares for Mission With Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Astronaut John M. Grunsfeld, STS-109 payload commander, uses virtual reality hardware at Johnson Space Center to rehearse some of his duties prior to the STS-109 mission. The most familiar form of virtual reality technology is some form of headpiece, which fits over your eyes and displays a three dimensional computerized image of another place. Turn your head left and right, and you see what would be to your sides; turn around, and you see what might be sneaking up on you. An important part of the technology is some type of data glove that you use to propel yourself through the virtual world. This technology allows NASA astronauts to practice International Space Station work missions in advance. Currently, the medical community is using the new technologies in four major ways: To see parts of the body more accurately, for study, to make better diagnosis of disease and to plan surgery in more detail; to obtain a more accurate picture of a procedure during surgery; to perform more types of surgery with the most noninvasive, accurate methods possible; and to model interactions among molecules at a molecular level.

  5. Fire training in a virtual-reality environment

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Jurgen; Bucken, Arno

    2005-03-01

    Although fire is very common in our daily environment - as a source of energy at home or as a tool in industry - most people cannot estimate the danger of a conflagration. Therefore it is important to train people in combating fire. Besides training with propane simulators or real fires and real extinguishers, fire training can be performed in virtual reality, which offers a pollution-free and fast way of training. In this paper we describe how to enhance a virtual-reality environment with real-time fire simulation and visualisation in order to establish a realistic emergency-training system. The presented approach supports extinguishing the virtual fire and includes recordable performance data, as needed in teletraining environments. We will show how to create realistic impressions of fire using advanced particle simulation and how to use the advantages of particles to trigger states in a modified cellular automaton used for the simulation of fire behaviour. Using particle systems that interact with cellular automata, it is possible to simulate a developing, spreading fire and its reaction to different extinguishing agents such as water, CO2 or oxygen. The methods proposed in this paper have been implemented and successfully tested on Cosimir, a commercial robot and VR simulation system.
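
    To make the cellular-automaton side of this concrete, a toy fire-spread automaton can be written in a few lines: each cell is unburnt, burning or burnt, and unburnt cells ignite with some probability per burning neighbour. This is only an illustration of the general idea, not the Cosimir implementation, and the ignition probability and wrap-around boundaries are arbitrary choices.

    ```python
    # Toy cellular automaton for fire spread (illustrative only).
    import numpy as np

    UNBURNT, BURNING, BURNT = 0, 1, 2

    def step(grid, p_ignite=0.3, rng=None):
        rng = rng or np.random.default_rng()
        new = grid.copy()
        burning = (grid == BURNING)
        # count burning 4-neighbours of every cell (edges wrap around)
        neighbours = (np.roll(burning, 1, 0) + np.roll(burning, -1, 0) +
                      np.roll(burning, 1, 1) + np.roll(burning, -1, 1))
        ignite_prob = 1.0 - (1.0 - p_ignite) ** neighbours
        ignite = (grid == UNBURNT) & (rng.random(grid.shape) < ignite_prob)
        new[ignite] = BURNING
        new[burning] = BURNT              # a burning cell burns out after one step
        return new

    # Example: start a fire in the middle of a 64 x 64 grid and advance it.
    grid = np.zeros((64, 64), dtype=int)
    grid[32, 32] = BURNING
    for _ in range(10):
        grid = step(grid)
    ```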

  6. Human Factors Issues in the Use of Virtual and Augmented Reality for Military Purposes - USA

    DTIC Science & Technology

    2005-12-01

    Human Factors Issues in the Use of Virtual and Augmented Reality for Military Purposes – USA. Stephen Golberg, US Army Research Institute.

  7. Enhanced Visual-Attention Model for Perceptually Improved 3D Object Modeling in Virtual Environments

    NASA Astrophysics Data System (ADS)

    Chagnon-Forget, Maude; Rouhafzay, Ghazal; Cretu, Ana-Maria; Bouchard, Stéphane

    2016-12-01

    Three-dimensional object modeling and interactive virtual environment applications require accurate, but compact object models that ensure real-time rendering capabilities. In this context, the paper proposes a 3D modeling framework employing visual attention characteristics in order to obtain compact models that are more adapted to human visual capabilities. An enhanced computational visual attention model with additional saliency channels, such as curvature, symmetry, contrast and entropy, is initially employed to detect points of interest over the surface of a 3D object. The impact of the use of these supplementary channels is experimentally evaluated. The regions identified as salient by the visual attention model are preserved in a selectively-simplified model obtained using an adapted version of the QSlim algorithm. The resulting model is characterized by a higher density of points in the salient regions, therefore ensuring a higher perceived quality, while at the same time ensuring a less complex and more compact representation for the object. The quality of the resulting models is compared with the performance of other interest point detectors incorporated in a similar manner in the simplification algorithm. The proposed solution results overall in higher quality models, especially at lower resolutions. As an example of application, the selectively-densified models are included in a continuous multiple level of detail (LOD) modeling framework, in which an original neural-network solution selects the appropriate size and resolution of an object.
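
    A simple way to picture the selective simplification is to combine the per-vertex saliency channels into one score and keep only the highest-scoring vertices at full resolution. The sketch below shows that combination step under assumed equal weights and min-max normalization; it is not the paper's exact model and omits the mesh-simplification (QSlim) stage.

    ```python
    # Combine per-vertex saliency channels and pick the most salient vertices.
    import numpy as np

    def combine_saliency(channels, weights=None):
        """channels: dict name -> (N,) per-vertex scores. Returns a combined (N,) score."""
        names = sorted(channels)
        weights = weights or {name: 1.0 for name in names}
        score = np.zeros(len(next(iter(channels.values()))), dtype=float)
        for name in names:
            c = channels[name].astype(float)
            c = (c - c.min()) / (c.max() - c.min() + 1e-9)   # min-max normalize
            score += weights[name] * c
        return score

    def salient_vertices(channels, keep_fraction=0.2):
        score = combine_saliency(channels)
        k = max(1, int(keep_fraction * score.size))
        return np.argsort(score)[-k:]        # indices of the most salient vertices
    ```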

  8. Combinatorial Pharmacophore-Based 3D-QSAR Analysis and Virtual Screening of FGFR1 Inhibitors

    PubMed Central

    Zhou, Nannan; Xu, Yuan; Liu, Xian; Wang, Yulan; Peng, Jianlong; Luo, Xiaomin; Zheng, Mingyue; Chen, Kaixian; Jiang, Hualiang

    2015-01-01

    The fibroblast growth factor/fibroblast growth factor receptor (FGF/FGFR) signaling pathway plays crucial roles in cell proliferation, angiogenesis, migration, and survival. Aberration in FGFRs correlates with several malignancies and disorders. FGFRs have proved to be attractive targets for therapeutic intervention in cancer, and it is of high interest to find FGFR inhibitors with novel scaffolds. In this study, a combinatorial three-dimensional quantitative structure-activity relationship (3D-QSAR) model was developed based on previously reported FGFR1 inhibitors with diverse structural skeletons. This model was evaluated for its prediction performance on a diverse test set containing 232 FGFR inhibitors, and it yielded an SD value of 0.75 pIC50 units against the measured inhibition affinities and a Pearson correlation coefficient R2 of 0.53. This result suggests that the combinatorial 3D-QSAR model could be used to search for new FGFR1 hit structures and predict their potential activity. To further evaluate the performance of the model, a decoy-set validation was used to measure the efficiency of the model by calculating the enrichment factor (EF). Based on the combinatorial pharmacophore model, a virtual screening against the SPECS database was performed. Nineteen novel active compounds were successfully identified, which provide new chemical starting points for further structural optimization of FGFR1 inhibitors. PMID:26110383
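
    The enrichment factor used in the decoy-set validation above is a standard ranking metric; a minimal sketch of its computation (with invented toy data, not the study's compounds) is:

      # Illustrative sketch: enrichment factor (EF) for a ranked virtual screen.
      # EF at x% = (actives recovered in top x% / compounds in top x%)
      #            / (total actives / total compounds)

      def enrichment_factor(scores, labels, fraction=0.01):
          """scores: higher = predicted more active; labels: 1 for active, 0 for decoy."""
          ranked = sorted(zip(scores, labels), key=lambda pair: pair[0], reverse=True)
          n_top = max(1, int(len(ranked) * fraction))
          actives_top = sum(label for _, label in ranked[:n_top])
          total_actives = sum(labels)
          hit_rate_top = actives_top / n_top
          hit_rate_all = total_actives / len(labels)
          return hit_rate_top / hit_rate_all if hit_rate_all else 0.0

      if __name__ == "__main__":
          # Toy example: 3 known actives hidden among 97 decoys.
          import random
          random.seed(0)
          labels = [1] * 3 + [0] * 97
          scores = [0.9, 0.85, 0.4] + [random.random() * 0.8 for _ in range(97)]
          print("EF@5%:", round(enrichment_factor(scores, labels, 0.05), 2))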

  9. 3D modeling of the Strasbourg's Cathedral basements for interdisciplinary research and virtual visits

    NASA Astrophysics Data System (ADS)

    Landes, T.; Kuhnle, G.; Bruna, R.

    2015-08-01

    On the occasion of the millennium celebration of Strasbourg Cathedral, a transdisciplinary research group composed of archaeologists, surveyors, architects, art historians and a stonemason re-examined the 1966-1972 excavations under the Cathedral's St. Lawrence's Chapel, which contain remains of Roman and medieval masonry. The 3D modeling of the Chapel was realized by combining conventional surveying techniques for network creation, laser scanning for model creation, and photogrammetric techniques for texturing a few parts. The level of detail and level of accuracy were adapted and assessed for every floor according to the requirements and the end users of the model. The basement was acquired and modeled with more detail and higher accuracy than the other parts. Thanks to this modeling work, archaeologists can confront their assumptions with those of other disciplines by simulating the construction of other worship edifices on the massive stones composing the basement. The virtual reconstructions provided evidence in support of these assumptions and served for communication via virtual visits.

  10. Analytical 3D views and virtual globes — scientific results in a familiar spatial context

    NASA Astrophysics Data System (ADS)

    Tiede, Dirk; Lang, Stefan

    In this paper we introduce analytical three-dimensional (3D) views as a means for effective and comprehensible information delivery, using virtual globes and the third dimension as an additional information carrier. Four case studies are presented, in which information extraction results from very high spatial resolution (VHSR) satellite images were conditioned and aggregated or disaggregated to regular spatial units. The case studies were embedded in the context of: (1) urban life quality assessment (Salzburg/Austria); (2) post-disaster assessment (Harare/Zimbabwe); (3) emergency response (Lukole/Tanzania); and (4) contingency planning (faked crisis scenario/Germany). The results are made available in different virtual globe environments, using the implemented contextual data (such as satellite imagery, aerial photographs, and auxiliary geodata) as valuable additional context information. Both day-to-day users and high-level decision makers are addressees of this tailored information product. The degree of abstraction required for understanding a complex analytical content is balanced with the ease and appeal by which the context is conveyed.

  11. One's Colonies: a virtual reality environment of oriental residences

    NASA Astrophysics Data System (ADS)

    Chi, Catherine

    2013-03-01

    This paper is a statement about my virtual reality environment project, One's Colonies, and a description of the project's creative process. I was inspired by the buildings in my hometown, Taiwan, whose architectural style is very different from that of the United States. By analyzing the unique style of dwellings in Taiwan, I want to demonstrate how differences in geography, weather and culture change the appearance of living space. Through this project I want to express the relationship between architectural style and cultural difference, and how the emotional condition and characteristics of residents are affected by their residences.

  12. A virtual reality browser for Space Station models

    NASA Technical Reports Server (NTRS)

    Goldsby, Michael; Pandya, Abhilash; Aldridge, Ann; Maida, James

    1993-01-01

    The Graphics Analysis Facility at NASA/JSC has created a visualization and learning tool by merging its database of detailed geometric models with a virtual reality system. The system allows an interactive walk-through of models of the Space Station and other structures, providing detailed realistic stereo images. The user can activate audio messages describing the function and connectivity of selected components within his field of view. This paper presents the issues and trade-offs involved in the implementation of the VR system and discusses its suitability for its intended purposes.

  13. Virtual Reality environment assisting post stroke hand rehabilitation: case report.

    PubMed

    Tsoupikova, Daria; Stoykov, Nikolay; Kamper, Derek; Vick, Randy

    2013-01-01

    We describe a novel art-empowered Virtual Reality (VR) system designed for hand rehabilitation therapy following stroke. The system was developed by an interdisciplinary team of engineers, art therapists, occupational therapists, and a VR artist to improve patients' motivation and engagement. We describe system design, development, and user testing for efficiency, subjects' satisfaction, and clinical feasibility. We report initial results from the first four subjects of the ongoing clinical efficacy trials, as measured by standard clinical tests of upper extremity function. These cases demonstrate that the system is operational and can facilitate therapy for post-stroke patients with upper extremity impairment.

  14. Research on distributed virtual reality system in electronic commerce

    NASA Astrophysics Data System (ADS)

    Xue, Qiang; Wang, Jiening; Sun, Jizhou

    2004-03-01

    In this paper, Distributed Virtual Reality (DVR) technology applied to Electronic Commerce (EC) is discussed. DVR provides a new means for human beings to recognize, analyze and resolve large-scale, complex problems, which has made it develop quickly in EC applications. The technologies of CSCW (Computer Supported Cooperative Work) and middleware are introduced into the development of the EC-DVR system to meet the need for a platform that provides the necessary cooperation and communication services without repeatedly re-implementing basic modules. Finally, the paper presents a platform structure for the EC-DVR system.

  15. Three-Dimensional User Interfaces for Immersive Virtual Reality

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1997-01-01

    The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.

  16. Creating a semantic-web interface with virtual reality

    NASA Astrophysics Data System (ADS)

    Cleary, David C.; O'Donoghue, Diarmuid

    2001-07-01

    Novel initiatives in the Internet community, such as Internet2 and Qbone, are based on the use of high bandwidth and powerful computers. However, the experience of the majority of Internet users is light-years from these emerging technologies. We describe the construction of a distributed high-performance search engine utilising advanced threading techniques on a diskless Linux cluster. The resulting Virtual Reality scene is passed to the client machine for viewing. This search engine bridges the gap between the Internet of today and the Internet of the future.

  17. Virtual reality applications to automated rendezvous and capture

    NASA Technical Reports Server (NTRS)

    Hale, Joseph; Oneil, Daniel

    1991-01-01

    Virtual Reality (VR) is a rapidly developing Human/Computer Interface (HCI) technology. The evolution of high-speed graphics processors and the development of specialized anthropomorphic user interface devices that more fully involve the human senses have enabled VR technology. Recently, the maturity of this technology has reached a level where it can be used as a tool in a variety of applications. This paper provides an overview of VR technology, VR activities at Marshall Space Flight Center (MSFC), and applications of VR to Automated Rendezvous and Capture (AR&C), and identifies areas of VR technology that require further development.

  18. Virtual reality applications for motor rehabilitation after stroke.

    PubMed

    Sisto, Sue Ann; Forrest, Gail F; Glendinning, Diana

    2002-01-01

    Hemiparesis is the primary physical impairment underlying functional disability after stroke. A goal of rehabilitation is to enhance motor skill acquisition, which is a direct result of practice. However, frequency and duration of practice are limited in rehabilitation. Virtual reality (VR) is a computer technology that simulates real-life learning while providing augmented feedback and increased frequency, duration, and intensity of practiced tasks. The rate and extent of relearning of motor tasks could affect the duration, effectiveness, and cost of patient care. The purpose of this article is to review the use of VR training for motor rehabilitation after stroke.

  19. Transforming Experience: The Potential of Augmented Reality and Virtual Reality for Enhancing Personal and Clinical Change

    PubMed Central

    Riva, Giuseppe; Baños, Rosa M.; Botella, Cristina; Mantovani, Fabrizia; Gaggioli, Andrea

    2016-01-01

    During life, many personal changes occur. These include changing house, school, work, and even friends and partners. However, the daily experience shows clearly that, in some situations, subjects are unable to change even if they want to. The recent advances in psychology and neuroscience are now providing a better view of personal change, the change affecting our assumptive world: (a) the focus of personal change is reducing the distance between self and reality (conflict); (b) this reduction is achieved through (1) an intense focus on the particular experience creating the conflict or (2) an internal or external reorganization of this experience; (c) personal change requires a progression through a series of different stages that however happen in discontinuous and non-linear ways; and (d) clinical psychology is often used to facilitate personal change when subjects are unable to move forward. Starting from these premises, the aim of this paper is to review the potential of virtuality for enhancing the processes of personal and clinical change. First, the paper focuses on the two leading virtual technologies – augmented reality (AR) and virtual reality (VR) – exploring their current uses in behavioral health and the outcomes of the 28 available systematic reviews and meta-analyses. Then the paper discusses the added value provided by VR and AR in transforming our external experience by focusing on the high level of personal efficacy and self-reflectiveness generated by their sense of presence and emotional engagement. Finally, it outlines the potential future use of virtuality for transforming our inner experience by structuring, altering, and/or replacing our bodily self-consciousness. The final outcome may be a new generation of transformative experiences that provide knowledge that is epistemically inaccessible to the individual until he or she has that experience, while at the same time transforming the individual’s worldview. PMID:27746747

  1. A Geometric Processing Workflow for Transforming Reality-Based 3D Models in Volumetric Meshes Suitable for FEA

    NASA Astrophysics Data System (ADS)

    Gonizzi Barsanti, S.; Guidi, G.

    2017-02-01

    Conservation of Cultural Heritage is a key issue, and structural changes and damage can influence the mechanical behaviour of artefacts and buildings. Finite Element Methods (FEM) are widely used for mechanical analysis and the modelling of stress behaviour. The typical workflow involves CAD 3D models made of Non-Uniform Rational B-spline (NURBS) surfaces, representing the ideal shape of the object to be simulated. Nowadays, 3D documentation of Cultural Heritage has been widely developed through reality-based approaches, but the resulting models are not suitable for direct use in FEA: the mesh in fact has to be converted to a volumetric one, and its density has to be reduced, since the computational complexity of a FEA grows exponentially with the number of nodes. The focus of this paper is to present a new method for generating the most accurate 3D representation of a real artefact from highly accurate 3D digital models derived from reality-based techniques, maintaining the accuracy of the high-resolution polygonal models in the solid ones. The proposed approach is based on a judicious use of retopology procedures and a transformation of the model into a mathematical one made of NURBS surfaces, suitable for processing by the volumetric meshers typically embedded in standard FEM packages. The strong simplification with little loss of consistency made possible by the retopology step maintains as much coherence as possible between the original acquired mesh and the simplified model, while creating a topology that is more favourable for automatic NURBS conversion.
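
    As a rough sketch of the density-reduction idea discussed above, the block below uses quadric edge-collapse decimation in place of the authors' retopology step (it is not their pipeline; it assumes the open-source Open3D library and hypothetical file names):

      # Sketch only (assumes the Open3D library; not the authors' implementation):
      # reduce the density of a reality-based mesh before volumetric meshing for FEA.
      import open3d as o3d

      def decimate(input_path, output_path, target_triangles=50_000):
          mesh = o3d.io.read_triangle_mesh(input_path)
          mesh.remove_duplicated_vertices()
          mesh.remove_degenerate_triangles()
          # Quadric edge-collapse decimation keeps the overall shape while cutting node count.
          simplified = mesh.simplify_quadric_decimation(
              target_number_of_triangles=target_triangles)
          simplified.compute_vertex_normals()
          o3d.io.write_triangle_mesh(output_path, simplified)
          print(len(mesh.triangles), "->", len(simplified.triangles), "triangles")

      if __name__ == "__main__":
          decimate("artefact.ply", "artefact_simplified.ply")  # hypothetical file names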

  2. Virtual reality robotic telesurgery simulations using MEMICA haptic system

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Mavroidis, Constantinos; Bouzit, Mourad; Dolgin, Benjamin; Harm, Deborah L.; Kopchok, George E.; White, Rodney

    2001-01-01

    The authors conceived a haptic mechanism called MEMICA (Remote Mechanical Mirroring using Controlled stiffness and Actuators) that enables the design of high-dexterity, rapid-response, large-workspace haptic systems. The development of novel MEMICA gloves and virtual reality models is being explored to allow simulation of telesurgery and other applications. The MEMICA gloves are being designed to provide intuitive mirroring of the conditions at a virtual site where a robot simulates the presence of a human operator. The key components of MEMICA are miniature electrically controlled stiffness (ECS) elements and electrically controlled force and stiffness (ECFS) actuators that are based on the use of Electro-Rheological Fluids (ERF). In this paper the design of the MEMICA system and initial experimental results are presented.

  3. Chavir: Virtual reality simulation for interventions in nuclear installations

    SciTech Connect

    Thevenon, J. B.; Tirel, O.; Lopez, L.; Chodorge, L.; Desbats, P.

    2006-07-01

    Companies involved in the nuclear industry have to prepare for interventions by precisely analyzing the radiological risks and rapidly evaluating the consequences of their operational choices. They also need to consolidate the experiences gained in the field with greater responsiveness and lower costs. This paper brings out the advantages of using virtual reality technology to meet the demands in the industry. The CHAVIR software allows the operators to prepare (and repeat) all the operations they would have to do in a safe virtual world, before performing the actual work inside the facilities. Since the decommissioning or maintenance work is carried out in an environment where there is radiation, the amount of radiation that the operator would be exposed to is calculated and integrated into the simulator. (authors)

  4. Dots and dashes: art, virtual reality, and the telegraph

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia; Chang, Ben

    2009-02-01

    Dots and Dashes is a virtual reality artwork that explores online romance over the telegraph, based on Ella Cheever Thayer's novel Wired Love - a Romance in Dots and Dashes (an Old Story Told in a New Way)1. The uncanny similarities between this story and the world of today's virtual environments provides the springboard for an exploration of a wealth of anxieties and dreams, including the construction of identities in an electronically mediated environment, the shifting boundaries between the natural and machine worlds, and the spiritual dimensions of science and technology. In this paper we examine the parallels between the telegraph networks and our current conceptions of cyberspace, as well as unique social and cultural impacts specific to the telegraph. These include the new opportunities and roles available to women in the telegraph industry and the connection between the telegraph and the Spiritualist movement. We discuss the development of the artwork, its structure and aesthetics, and the technical development of the work.

  5. 3D-ANTLERS: Virtual Reconstruction and Three-Dimensional Measurement

    NASA Astrophysics Data System (ADS)

    Barba, S.; Fiorillo, F.; De Feo, E.

    2013-02-01

    … In the ARTEC digital mock-up, for example, individual frames, already polygonal and geo-referenced at the time of capture, can be selected; however, automated texturization is not possible, unlike in the low-cost environment, which produces good graphic definition. Once the final 3D models were obtained, we carried out a geometric and graphic comparison of the results. To provide an accuracy requirement and an assessment of the 3D reconstruction, we considered the following benchmarks: cost, captured points, noise (local and global), shadows and holes, operability, degree of definition, quality and accuracy. Following these empirical studies on the virtual reconstructions, a 3D documentation procedure was codified, endorsing the use of terrestrial sensors for the documentation of antlers. The results were compared with the standards set by the current provisions (see the "Manual de medición" of the Government of Andalusia, Spain); to date, identification is based on data such as length, volume, colour, texture, openness, tips, structure, etc. Such data, currently captured only with traditional instruments such as a tape measure, would be well represented by a process of virtual reconstruction and cataloguing.

  6. The VEPSY updated project: virtual reality in clinical psychology.

    PubMed

    Riva, G; Alcañiz, M; Anolli, L; Bacchetta, M; Baños, R; Beltrame, F; Botella, C; Galimberti, C; Gamberini, L; Gaggioli, A; Molinari, E; Mantovani, G; Nugues, P; Optale, G; Orsi, G; Perpina, C; Troiañi, R

    2001-08-01

    Many of us grew up with the naive assumption that couches are the best used therapeutic tools in psychotherapy. But tools for psychotherapy are evolving in a much more complex environment than a designer's chaise lounge. In particular, virtual reality (VR) devices have the potential to appear soon in many consulting rooms. The use of VR in medicine is not a novelty. Applications of virtual environments for health care have been developed in the following areas: surgical procedures (remote surgery or telepresence, augmented or enhanced surgery, and planning and simulation of procedures before surgery); preventive medicine and patient education; medical education and training; visualization of massive medical databases; and architectural design for health care facilities. However, there is a growing recognition that VR can play an important role in clinical psychology, too. To exploit and understand this potential is the main goal of the Telemedicine and Portable Virtual Environment in Clinical Psychology (VEPSY Updated) project, a European Community-funded research project (IST-2000-25323, http://www.vepsy.com). The project will provide innovative telemedicine and portable tools for the treatment of patients, clinical trials to verify their viability, and action plans for disseminating its results to an extended audience of potential users and influential groups. The project will also develop different personal computer (PC)-based virtual reality modules to be used in clinical assessment and treatment. In particular, the developed modules will address the following pathologies: anxiety disorders; male impotence and premature ejaculation; and obesity, bulimia, and binge-eating disorders.

  7. Virtual Reality: Developing a VR space for Academic activities

    NASA Astrophysics Data System (ADS)

    Kaimaris, D.; Stylianidis, E.; Karanikolas, N.

    2014-05-01

    Virtual reality (VR) is extensively used in industry, academia and business, and is becoming more and more affordable for end users from a financial point of view. At the same time, more and more applications are being developed in academia and higher education, for example in medicine and engineering, and students expect to be well prepared for their professional life after their educational life cycle. Moreover, VR offers the possibility not only to improve skills but also to understand space. This paper presents the methodology used during a course, namely "Geoinformatics applications" at the School of Spatial Planning and Development (Eng.), Aristotle University of Thessaloniki, to create a virtual School space. The course design focuses on the methods and techniques used to develop the virtual environment. In addition, the project aspires to become more and more useful for the students and to provide a virtual environment with useful information not only for the students but also for any citizen interested in academic life at the School.

  8. Finite element visualization in the cave virtual reality environment

    SciTech Connect

    Plaskacz, E.J.; Kuhn, M.A.

    1996-03-01

    Through the use of the post-processing software, Virtual Reality visualization (VRviz), and the Cave Automatic Virtual Environment (CAVE), finite element representations can be viewed as they would be in real life. VRviz is a program written in ANSI C to translate the mathematical results generated by finite element analysis programs into a virtual representation. This virtual representation is projected into the CAVE environment and the results are animated. The animation is fully controllable. A user is able to translate the image, rotate about any axis and scale the image at any time. The user is also able to freeze the animation at any time step and control the image update rate. This allows the user to navigate around, or even inside, the image in order to effectively analyze possible failure points and redesign as necessary. Through the use of the CAVE and the real life image that is being produced by VRviz, engineers are able to save considerable time, money, and effort in the design process.
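
    The translate / rotate-about-any-axis / scale controls described above amount to composing homogeneous transforms; a minimal sketch of that idea (illustrative only, in Python rather than VRviz's ANSI C) is:

      # Illustrative sketch (not VRviz source): composing translate, rotate and scale
      # operations on a model as 4x4 homogeneous transforms, as a CAVE viewer might.
      import numpy as np

      def translation(tx, ty, tz):
          m = np.eye(4)
          m[:3, 3] = (tx, ty, tz)
          return m

      def scaling(s):
          m = np.eye(4)
          m[:3, :3] *= s
          return m

      def rotation(axis, angle_rad):
          """Rotation about an arbitrary axis through the origin (Rodrigues' formula)."""
          a = np.asarray(axis, dtype=float)
          a = a / np.linalg.norm(a)
          k = np.array([[0.0, -a[2], a[1]],
                        [a[2], 0.0, -a[0]],
                        [-a[1], a[0], 0.0]])
          r = np.eye(3) + np.sin(angle_rad) * k + (1.0 - np.cos(angle_rad)) * (k @ k)
          m = np.eye(4)
          m[:3, :3] = r
          return m

      if __name__ == "__main__":
          # Rightmost transform applies first: scale, then rotate, then translate.
          model = translation(0.0, 0.0, -1.0) @ rotation((1.0, 1.0, 1.0), np.pi / 4) @ scaling(2.0)
          vertex = np.array([1.0, 0.0, 0.0, 1.0])   # homogeneous model-space point
          print((model @ vertex)[:3])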

  9. Therapists' perception of benefits and costs of using virtual reality treatments.

    PubMed

    Segal, Robert; Bhatia, Maneet; Drapeau, Martin

    2011-01-01

    Research indicates that virtual reality is effective in the treatment of many psychological difficulties and is being used more frequently. However, little is known about therapists' perception of the benefits and costs related to the use of virtual therapy in treatment delivery. In the present study, 271 therapists completed an online questionnaire that assessed their perceptions about the potential benefits and costs of using virtual reality in psychotherapy. Results indicated that therapists perceived the potential benefits as outweighing the potential costs. Therapists' self-reported knowledge of virtual reality, theoretical orientation, and interest in using virtual reality were found to be associated with perceptual measures. These findings contribute to the current knowledge of the perception of virtual reality amongst psychotherapists.

  10. Game controller modification for fMRI hyperscanning experiments in a cooperative virtual reality environment

    PubMed Central

    Trees, Jason; Snider, Joseph; Falahpour, Maryam; Guo, Nick; Lu, Kun; Johnson, Douglas C.; Poizner, Howard; Liu, Thomas T.

    2014-01-01

    Hyperscanning, an emerging technique in which data from multiple interacting subjects' brains are simultaneously recorded, has become an increasingly popular way to address complex topics, such as "theory of mind." However, most previous fMRI hyperscanning experiments have been limited to abstract social interactions (e.g. phone conversations). Our new method utilizes a virtual reality (VR) environment used for military training, Virtual Battlespace 2 (VBS2), to create realistic avatar-avatar interactions and cooperative tasks. To control the virtual avatar, subjects use an MRI-compatible PlayStation 3 game controller, modified by removing all extraneous metal components and replacing any necessary ones with 3D-printed plastic models. Control of both scanners' operation is initiated by a VBS2 plugin to sync scanner time to the known time within the VR environment. Our modifications include:
    • Modification of the game controller to be MRI compatible.
    • Design of a VBS2 virtual environment for cooperative interactions.
    • Syncing the two MRI machines for simultaneous recording.
    PMID:26150964
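
    The scanner/VR time synchronization mentioned above can, in principle, be reduced to broadcasting the simulation time and logging the offset at each scanner; the following sketch is purely hypothetical (the port number, message format and transport are assumptions, not the published plugin):

      # Hypothetical illustration of the sync idea (not the published VBS2 plugin):
      # a sender inside the VR environment broadcasts simulation time over UDP,
      # and a listener beside each MRI scanner records the offset to its own clock.
      import json
      import socket
      import time

      SYNC_PORT = 5005  # assumed port

      def send_sync(sim_time_s, host="255.255.255.255"):
          """Broadcast the current VR simulation time (seconds)."""
          msg = json.dumps({"sim_time": sim_time_s, "sent_at": time.time()}).encode()
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
              sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
              sock.sendto(msg, (host, SYNC_PORT))

      def listen_once(timeout_s=10.0):
          """Wait for one sync packet and return (sim_time, local_receive_time)."""
          with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
              sock.bind(("", SYNC_PORT))
              sock.settimeout(timeout_s)
              data, _addr = sock.recvfrom(1024)
          packet = json.loads(data.decode())
          return packet["sim_time"], time.time()

      if __name__ == "__main__":
          # On each scanner-side machine one would log sim_time vs. local time,
          # so both scanners' recordings can later be aligned to VR time.
          sim_time, local_time = listen_once()
          print(f"VR time {sim_time:.3f}s received at local clock {local_time:.3f}s")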

  11. Effect of viewing mode on pathfinding in immersive Virtual Reality.

    PubMed

    White, Paul J; Byagowi, Ahmad; Moussavi, Zahra

    2015-08-01

    The use of Head Mounted Displays (HMDs) to view Virtual Reality Environments (VREs) has received much attention recently. This paper reports on the difference between humans' navigation in a VRE viewed through an HMD and their navigation in the same VRE viewed on a laptop PC display. A novel Virtual Reality (VR) navigation input device (VRNChair), designed by our team, was paired with an Oculus Rift DK2 HMD. People used the VRNChair to navigate a VRE, and we analyzed their navigational trajectories with and without the HMD to investigate plausible differences in performance due to the display device. It was found that people's navigational trajectories were more accurate while wearing the HMD than while viewing an LCD monitor; however, the duration to complete a navigation task remained the same. This implies that increased immersion in VR results in an improvement in pathfinding. In addition, motion sickness caused by using an HMD can be reduced if one uses an input device such as our VRNChair. The VRNChair paired with an HMD provides vestibular stimulation as one moves in the VRE, because movements in the VRE are synchronized with movements in the real environment.

  12. Validation of a Novel Virtual Reality Simulator for Robotic Surgery

    PubMed Central

    Schreuder, Henk W. R.; Persson, Jan E. U.; Wolswijk, Richard G. H.; Ihse, Ingmar; Schijven, Marlies P.; Verheijen, René H. M.

    2014-01-01

    Objective. With the increase in robot-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for use in training for robot-assisted surgery. Methods. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Results. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were "time to complete" and "economy of motion" (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, the visual graphics, the movements of instruments, the interaction with objects, and the depth perception were all rated as realistic. The simulator is considered to be a very useful training tool for residents and medical specialists starting with robotic surgery. Conclusions. Face and construct validity of the dV-Trainer could be established. The virtual reality simulator is a useful tool for training in robotic surgery. PMID:24600328
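
    "Economy of motion" is commonly computed as the total path length travelled by the instrument tips during a trial; a minimal sketch of that computation (with invented sample trajectories, not the dV-Trainer's internal definition) is:

      # Minimal sketch: "economy of motion" as total instrument-tip path length
      # over a trial, plus "time to complete". Sample data are invented.
      import numpy as np

      def path_length(positions):
          """positions: (N, 3) array of tip positions sampled over the trial."""
          p = np.asarray(positions, dtype=float)
          return float(np.linalg.norm(np.diff(p, axis=0), axis=1).sum())

      def trial_metrics(timestamps, left_tip, right_tip):
          return {
              "time_to_complete_s": float(timestamps[-1] - timestamps[0]),
              "economy_of_motion_mm": path_length(left_tip) + path_length(right_tip),
          }

      if __name__ == "__main__":
          t = np.linspace(0.0, 12.0, 600)                        # 50 Hz for 12 s
          left = np.c_[np.sin(t), np.cos(t), 0.1 * t] * 10.0     # toy trajectories (mm)
          right = np.c_[np.cos(t), np.sin(t), 0.05 * t] * 10.0
          print(trial_metrics(t, left, right))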

  13. The challenge of using virtual reality in telerehabilitation.

    PubMed

    Rizzo, Albert A; Strickland, Dorothy; Bouchard, Stéphane

    2004-01-01

    Continuing advances in virtual reality (VR) technology along with concomitant system cost reductions have supported the development of more useful and accessible VR systems that can uniquely target a wide range of physical, psychological, and cognitive rehabilitation concerns and research questions. VR offers the potential to deliver systematic human testing, training, and treatment environments that allow for the precise control of complex dynamic three-dimensional stimulus presentations, within which sophisticated interaction, behavioral tracking, and performance recording is possible. The next step in this evolution will allow for Internet accessibility to libraries of VR scenarios as a likely form of distribution and use. VR applications that are Internet deliverable could open up new possibilities for home-based therapy and rehabilitation. If executed thoughtfully, they could increase client involvement, enhance outcomes and reduce costs. However, before this vision can be achieved, a number of significant challenges will need to be addressed and solved. This article will first present three fictional case vignettes that illustrate the ways that VR telerehabilitation might be implemented with varying degrees of success in the future. We then describe a system that is currently being used to deliver virtual worlds over the Internet for training safety skills to children with learning disabilities. From these illustrative fictional and reality-based applications, we will then briefly discuss the technical, practical, and user-based challenges for implementing VR telerehabilitation, along with views regarding the future of this emerging clinical application.

  14. Brief virtual reality therapy for public speaking anxiety.

    PubMed

    Harris, Sandra R; Kemmerling, Robert L; North, Max M

    2002-12-01

    The primary goal of this research program was to investigate the effectiveness of virtual reality therapy (VRT) in reducing public speaking anxiety of university students. The prevalence and impact of public speaking anxiety as a type of Social Phobia are discussed. Studies of VRT as an emerging treatment for psychological problems are reviewed. In the present study, eight students completed VRT individual treatment and post-testing, and six students in a Wait-List control group completed post-testing. Assessment measures included four self-report inventories, self-report of Subjective Units of Discomfort during exposure to VRT and physiological measurements of heart rate during speaking tasks. Four weekly individual exposure treatment sessions of approximately 15 min each were conducted by the author serving as therapist. Results on self-report and physiological measures appear to indicate that four virtual reality treatment sessions were effective in reducing public speaking anxiety in university students, corroborating earlier studies of VRT's effectiveness as a psychotherapeutic modality. Future research directions are discussed, primarily the need for research on younger populations, to assess the effectiveness of VRT for earlier intervention with public speaking anxiety.

  15. Low-Cost, Portable, Multi-Wall Virtual Reality

    NASA Technical Reports Server (NTRS)

    Miller, Samuel A.; Misch, Noah J.; Dalton, Aaron J.

    2005-01-01

    Virtual reality systems make compelling outreach displays, but some such systems, like the CAVE, have design features that make their use for that purpose inconvenient. In the case of the CAVE, the equipment is difficult to disassemble, transport, and reassemble, and typically CAVEs can only be afforded by large-budget research facilities. We implemented a system like the CAVE that costs less than $30,000, weighs about 500 pounds, and fits into a fifteen-passenger van. A team of six people have unpacked, assembled, and calibrated the system in less than two hours. This cost reduction versus similar virtual-reality systems stems from the unique approach we took to stereoscopic projection. We used an assembly of optical chopper wheels and commodity LCD projectors to create true active stereo at less than a fifth of the cost of comparable active-stereo technologies. The screen and frame design also optimized portability; the frame assembles in minutes with only two fasteners, and both it and the screen pack into small bundles for easy and secure shipment.

  16. Virtual Superheroes: Using Superpowers in Virtual Reality to Encourage Prosocial Behavior

    PubMed Central

    Rosenberg, Robin S.; Baughman, Shawnee L.; Bailenson, Jeremy N.

    2013-01-01

    Background Recent studies have shown that playing prosocial video games leads to greater subsequent prosocial behavior in the real world. However, immersive virtual reality allows people to occupy avatars that are different from them in a perceptually realistic manner. We examine how occupying an avatar with the superhero ability to fly increases helping behavior. Principal Findings Using a two-by-two design, participants were either given the power of flight (their arm movements were tracked to control their flight akin to Superman’s flying ability) or rode as a passenger in a helicopter, and were assigned one of two tasks, either to help find a missing diabetic child in need of insulin or to tour a virtual city. Participants in the “super-flight” conditions helped the experimenter pick up spilled pens after their virtual experience significantly more than those who were virtual passengers in a helicopter. Conclusion The results indicate that having the “superpower” of flight leads to greater helping behavior in the real world, regardless of how participants used that power. A possible mechanism for this result is that having the power of flight primed concepts and prototypes associated with superheroes (e.g., Superman). This research illustrates the potential of using experiences in virtual reality technology to increase prosocial behavior in the physical world. PMID:23383029

  17. Virtual reality anatomy: is it comparable with traditional methods in the teaching of human forearm musculoskeletal anatomy?

    PubMed

    Codd, Anthony M; Choudhury, Bipasha

    2011-01-01

    The use of cadavers to teach anatomy is well established, but limitations with this approach have led to the introduction of alternative teaching methods. One such method is the use of three-dimensional virtual reality computer models. An interactive, three-dimensional computer model of human forearm anterior compartment musculoskeletal anatomy was produced using the open source 3D imaging program "Blender." The aim was to evaluate the use of 3D virtual reality when compared with traditional anatomy teaching methods. Three groups were identified from the University of Manchester second-year Human Anatomy Research Skills Module class: a "control" group (no prior knowledge of forearm anatomy), a "traditional methods" group (taught using dissection and textbooks), and a "model" group (taught solely using the e-resource). The groups were assessed on anatomy of the forearm by a ten-question practical examination. ANOVA analysis showed the model group mean test score to be significantly higher than that of the control group (mean 7.25 vs. 1.46, P < 0.001) and not significantly different from that of the traditional methods group (mean 6.87, P > 0.5). Feedback from all users of the e-resource was positive. Virtual reality anatomy learning can be used to complement traditional teaching methods effectively.

  18. VEVI: A Virtual Reality Tool For Robotic Planetary Explorations

    NASA Technical Reports Server (NTRS)

    Piguet, Laurent; Fong, Terry; Hine, Butler; Hontalas, Phil; Nygren, Erik

    1994-01-01

    The Virtual Environment Vehicle Interface (VEVI), developed by the NASA Ames Research Center's Intelligent Mechanisms Group, is a modular operator interface for direct teleoperation and supervisory control of robotic vehicles. Virtual environments enable the efficient display and visualization of complex data. This characteristic allows operators to perceive and control complex systems in a natural fashion, utilizing the highly-evolved human sensory system. VEVI utilizes real-time, interactive, 3D graphics and position / orientation sensors to produce a range of interface modalities from the flat panel (windowed or stereoscopic) screen displays to head mounted/head-tracking stereo displays. The interface provides generic video control capability and has been used to control wheeled, legged, air bearing, and underwater vehicles in a variety of different environments. VEVI was designed and implemented to be modular, distributed and easily operated through long-distance communication links, using a communication paradigm called SYNERGY.

  19. Virtual Reality Astronomy Education Using AAS WorldWide Telescope and Oculus Rift

    NASA Astrophysics Data System (ADS)

    Weigel, A. David; Moraitis, Christina D.

    2017-01-01

    The Boyd E. Christenberry Planetarium at Samford University (Birmingham, AL) offers family-friendly, live, and interactive planetarium presentations that educate the public on topics from astronomy basics to current cutting-edge astronomical discoveries. With limited funding, it is not possible to provide state-of-the-art planetarium hardware for these community audiences. In a society in which many people, even young children, have access to high-resolution smartphones and highly realistic video games, it is important to leverage cutting-edge technology to intrigue young and old minds alike. We use an Oculus Rift virtual reality headset running AAS WorldWide Telescope software to visualize 3D data in a fully immersive environment. We create interactive experiences and videos to highlight astronomical concepts and to communicate the beauty of our universe. The ease of portability enables us to set up a Virtual Reality (VR) experience at various events and festivals, and even in classrooms, providing a community outreach that a fixed planetarium cannot. This VR experience adds the "wow" factor that encourages children and adults to engage in our various planetarium events, to learn more about astronomy, and to continue to explore the final frontier of space. These VR experiences also encourage our college students to participate in our astronomy education, resulting in increased interest in STEM fields, particularly physics and math.

  20. The development, assessment and validation of virtual reality for human anatomy instruction

    NASA Technical Reports Server (NTRS)

    Marshall, Karen Benn

    1996-01-01

    This research project seeks to meet the objective of science training by developing, assessing, validating and utilizing VR as a human anatomy training medium. Current anatomy instruction is primarily in the form of lectures and usage of textbooks. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the one-dimensional depiction found in textbooks and the two-dimensional depiction found on the computer. Virtual reality allows one to step through the computer screen into a 3-D artificial world. The primary objective of this project is to produce a virtual reality application of the abdominopelvic region of a human cadaver that can be taken back to the classroom. The hypothesis is that an immersive learning environment affords quicker anatomic recognition and orientation and a greater level of retention in human anatomy instruction. The goal is to augment not replace traditional modes of instruction.

  1. Virtual reality applied to hepatic surgery simulation: the next revolution.

    PubMed Central

    Marescaux, J; Clément, J M; Tassetti, V; Koehl, C; Cotin, S; Russier, Y; Mutter, D; Delingette, H; Ayache, N

    1998-01-01

    OBJECTIVE: This article describes a preliminary work on virtual reality applied to liver surgery and discusses the repercussions of assisted surgical strategy and surgical simulation on tomorrow's surgery. SUMMARY BACKGROUND DATA: Liver surgery is considered difficult because of the complexity and variability of the organ. Common generic tools for presurgical medical image visualization do not fulfill the requirements for the liver, restricting comprehension of a patient's specific liver anatomy. METHODS: Using data from the National Library of Medicine, a realistic three-dimensional image was created, including the envelope and the four internal arborescences. A computer interface was developed to manipulate the organ and to define surgical resection planes according to internal anatomy. The first step of surgical simulation was implemented, providing the organ with real-time deformation computation. RESULTS: The three-dimensional anatomy of the liver could be clearly visualized. The virtual organ could be manipulated and a resection defined depending on the anatomic relations between the arborescences, the tumor, and the external envelope. The resulting parts could also be visualized and manipulated. The simulation allowed the deformation of a liver model in real time by means of a realistic laparoscopic tool. CONCLUSIONS: Three-dimensional visualization of the organ in relation to the pathology is of great help to appreciate the complex anatomy of the liver. Using virtual reality concepts (navigation, interaction, and immersion), surgical planning, training, and teaching for this complex surgical procedure may be possible. The ability to practice a given gesture repeatedly will revolutionize surgical training, and the combination of surgical planning and simulation will improve the efficiency of intervention, leading to optimal care delivery. PMID:9833800

  2. From Vesalius to virtual reality: How embodied cognition facilitates the visualization of anatomy

    NASA Astrophysics Data System (ADS)

    Jang, Susan

    This study examines the facilitative effects of embodiment of a complex internal anatomical structure through three-dimensional ("3-D") interactivity in a virtual reality ("VR") program. Since Shepard and Metzler's influential 1971 study, it has been known that 3-D objects (e.g., multiple-armed cube or external body parts) are visually and motorically embodied in our minds. For example, people take longer to rotate mentally an image of their hand not only when there is a greater degree of rotation, but also when the images are presented in a manner incompatible with their natural body movement (Parsons, 1987a, 1994; Cooper & Shepard, 1975; Sekiyama, 1983). Such findings confirm the notion that our mental images and rotations of those images are in fact confined by the laws of physics and biomechanics, because we perceive, think and reason in an embodied fashion. With the advancement of new technologies, virtual reality programs for medical education now enable users to interact directly in a 3-D environment with internal anatomical structures. Given that such structures are not readily viewable to users and thus not previously susceptible to embodiment, coupled with the VR environment also affording all possible degrees of rotation, how people learn from these programs raises new questions. If we embody external anatomical parts we can see, such as our hands and feet, can we embody internal anatomical parts we cannot see? Does manipulating the anatomical part in virtual space facilitate the user's embodiment of that structure and therefore the ability to visualize the structure mentally? Medical students grouped in yoked-pairs were tasked with mastering the spatial configuration of an internal anatomical structure; only one group was allowed to manipulate the images of this anatomical structure in a 3-D VR environment, whereas the other group could only view the manipulation. The manipulation group outperformed the visual group, suggesting that the interactivity

  3. Developing a Novel Measure of Body Satisfaction Using Virtual Reality

    PubMed Central

    Purvis, Clare K.; Jones, Megan; Bailey, Jakki O.; Bailenson, Jeremy; Taylor, C. Barr

    2015-01-01

    Body image disturbance (BID), considered a key feature in eating disorders, is a pervasive issue among young women. Accurate assessment of BID is critical, but the field is currently limited to self-report assessment methods. In the present study, we build upon existing research, and explore the utility of virtual reality (VR) to elicit and detect changes in BID across various immersive virtual environments. College-aged women with elevated weight and shape concerns (n = 38) and a non-weight and shape concerned control group (n = 40) were randomly exposed to four distinct virtual environments with high or low levels of body salience and social presence (i.e., presence of virtual others). Participants interacted with avatars of thin, normal weight, and overweight body size (BMI of approximately 18, 22, and 27 respectively) in virtual social settings (i.e., beach, party). We measured state-level body satisfaction (state BD) immediately after exposure to each environment. In addition, we measured participants’ minimum interpersonal distance, visual attention, and approach preference toward avatars of each size. Women with higher baseline BID reported significantly higher state BD in all settings compared to controls. Both groups reported significantly higher state BD in a beach with avatars as compared to other environments. In addition, women with elevated BID approached closer to normal weight avatars and looked longer at thin avatars compared to women in the control group. Our findings indicate that VR may serve as a novel tool for measuring state-level BID, with applications for measuring treatment outcomes. Implications for future research and clinical interventions are discussed. PMID:26469860
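
    The minimum interpersonal distance measure used above can be computed as the smallest distance between the participant's tracked position and an avatar over a trial; a minimal sketch (array names and sample data are invented, not the study's logging format) is:

      # Illustrative sketch: minimum interpersonal distance between a tracked
      # participant and a virtual avatar over one trial. Data below are invented.
      import numpy as np

      def min_interpersonal_distance(participant_xy, avatar_xy):
          """Both inputs: (N, 2) arrays of horizontal positions sampled over time."""
          p = np.asarray(participant_xy, dtype=float)
          a = np.asarray(avatar_xy, dtype=float)
          return float(np.linalg.norm(p - a, axis=1).min())

      if __name__ == "__main__":
          rng = np.random.default_rng(2)
          steps = rng.normal(0.0, 0.05, size=(200, 2))
          participant = np.cumsum(steps, axis=0) + np.array([3.0, 0.0])  # random walk (m)
          avatar = np.zeros((200, 2))                                    # avatar stands still
          print(f"minimum distance: {min_interpersonal_distance(participant, avatar):.2f} m")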

  4. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and to interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of different users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities in various visualization and interaction modes.

  5. Heard on The Street: GIS-Guided Immersive 3D Models as an Augmented Reality for Team Collaboration

    NASA Astrophysics Data System (ADS)

    Quinn, B. B.

    2007-12-01

    Grid computing can be configured to run physics simulations for spatially contiguous virtual 3D model spaces. Each cell is run by a single processor core simulating 1/16 square kilometer of surface and can contain up to 15,000 objects. In this work, a model of one urban block was constructed in the commercial 3D online digital world Second Life http://secondlife.com to prove concept that GIS data can guide the build of an accurate in-world model. Second Life simulators support terrain modeling at two-meter grid intervals. Access to the Second Life grid is worldwide if connections to the US-based servers are possible. This immersive 3D model allows visitors to explore the space at will, with physics simulated for object collisions, gravity, and wind forces about 40 times per second. Visitors view this world as renderings by their 3-D display card of graphic objects and raster textures that are streamed from the simulator grid to the Second Life client, based on that client's instantaneous field of view. Visitors to immersive 3D models experience a virtual world that engages their innate abilities to relate to the real immersive 3D world in which humans have evolved. These abilities enable far more complex and dynamic 3D environments to be quickly and accurately comprehended by more visitors than most non-immersive 3D environments. Objects of interest at ground surface and below can be walked around, possibly entered, viewed at arm's length or flown over at 500 meters above. Videos of renderings have been recorded (as machinima) to share a visit as part of public presentations. Key to this experience is that dozens of simultaneous visitors can experience the model at the same time, each exploring it at will and seeing (if not colliding with) one another---like twenty geology students on a virtual outcrop, where each student might fly if they chose to. This work modeled the downtown Berkeley, CA, transit station in the Second Life region "Gualala" near [170, 35, 35

  6. Psychology Student Opinion of Virtual Reality as a Tool to Educate about Schizophrenia

    ERIC Educational Resources Information Center

    Tichon, Jennifer; Loh, Jennifer; King, Robert

    2004-01-01

    Virtual Reality (VR) techniques are increasingly being used in e-health education, training and in trial clinical programs in the treatment of certain types of mental illness. Undergraduate psychology student opinion of the use of Virtual Reality (VR) to teach them about schizophrenia at the University of Queensland, was determined with reference…

  7. Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis.

    PubMed

    Bergeron, Mathieu; Lortie, Catherine L; Guitton, Matthieu J

    2015-01-01

    Classical peripheral vestibular disorders rehabilitation is a long and costly process. While virtual reality settings have repeatedly been suggested as possible tools to help the rehabilitation process, no systematic study had been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is an important diversity of settings and protocols involving virtual reality settings for the treatment of this pathology. Evaluation of the symptoms is often not standardized. However, our results unveil a clear effect of virtual reality settings-based rehabilitation on the patients' symptoms, assessed by objective tools such as the DHI (mean decrease of 27 points), changing symptom handicap perception from a moderate to a mild impact on life. Furthermore, we detected a relationship between the duration of exposure to virtual reality environments and the magnitude of the therapeutic effects, suggesting that virtual reality treatments should last at least 150 minutes of cumulated exposure to ensure positive outcomes. Virtual reality offers a pleasant and safe environment for the patient. Future studies should standardize evaluation tools, document putative side effects further, compare virtual reality to conventional physical therapy, and evaluate the economic costs/benefits of such strategies.

  8. Virtual Reality as Treatment for Fear of Flying: A Review of Recent Research

    ERIC Educational Resources Information Center

    Price, Matthew; Anderson, Page; Rothbaum, Barbara O.

    2008-01-01

    Virtual reality exposure has recently emerged as an important tool for exposure therapy in the treatment of fear of flying. There have been numerous empirical studies that have evaluated the effectiveness of virtual reality exposure as compared to other treatments including in vivo exposure, progressive muscle relaxation, cognitive therapy,…

  9. The Impact of Virtual Reality Programs in Career and Technical Education

    ERIC Educational Resources Information Center

    Catterson, Anna J.

    2013-01-01

    Instructional technology has evolved from blackboards with chalk to in some cases three-dimensional virtual reality environments in which students are interacting and engaging with other students worldwide. The use of this new instructional methodology, known as "virtual reality," has experienced substantial growth in higher education…

  10. The Use of Virtual Reality Tools in the Reading-Language Arts Classroom

    ERIC Educational Resources Information Center

    Pilgrim, J. Michael; Pilgrim, Jodi

    2016-01-01

    This article presents virtual reality as a tool for classroom literacy instruction. Building on the traditional use of images as a way to scaffold prior knowledge, we extend this idea to share ways virtual reality enables experiential learning through field trip-like experiences. The use of technology tools such Google Street view, Google…

  11. Use of Virtual Reality Tools for Vestibular Disorders Rehabilitation: A Comprehensive Analysis

    PubMed Central

    Bergeron, Mathieu; Lortie, Catherine L.; Guitton, Matthieu J.

    2015-01-01

    Classical rehabilitation of peripheral vestibular disorders is a long and costly process. While virtual reality settings have repeatedly been suggested as possible tools to support the rehabilitation process, no systematic study had been conducted so far. We systematically reviewed the current literature to analyze the published protocols documenting the use of virtual reality settings for peripheral vestibular disorders rehabilitation. There is considerable diversity in the settings and protocols involving virtual reality for the treatment of this pathology, and evaluation of symptoms is often not standardized. However, our results unveil a clear effect of virtual reality-based rehabilitation on patients' symptoms, assessed by objective tools such as the DHI (mean decrease of 27 points), shifting perceived symptom handicap from a moderate to a mild impact on life. Furthermore, we detected a relationship between the duration of exposure to virtual reality environments and the magnitude of the therapeutic effects, suggesting that virtual reality treatments should comprise at least 150 minutes of cumulated exposure to ensure positive outcomes. Virtual reality offers a pleasant and safe environment for the patient. Future studies should standardize evaluation tools, further document putative side effects, compare virtual reality to conventional physical therapy, and evaluate the economic costs and benefits of such strategies. PMID:26556560

  12. CamMedNP: Building the Cameroonian 3D structural natural products database for virtual screening

    PubMed Central

    2013-01-01

    Background: Computer-aided drug design (CADD) often involves virtual screening (VS) of large compound datasets and the availability of such is vital for drug discovery protocols. We present CamMedNP, a new database beginning with more than 2,500 compounds of natural origin, along with some of their derivatives which were obtained through hemisynthesis. These are pure compounds which have been previously isolated and characterized using modern spectroscopic methods and published by several research teams spread across Cameroon. Description: In the present study, 224 distinct medicinal plant species belonging to 55 plant families from the Cameroonian flora have been considered. About 80% of these have been previously published and/or referenced in internationally recognized journals. For each compound, the optimized 3D structure, drug-like properties, plant source, collection site and currently known biological activities are given, as well as literature references. We have evaluated the “drug-likeness” of this database using Lipinski’s “Rule of Five”. A diversity analysis has been carried out in comparison with the ChemBridge diverse database. Conclusion: CamMedNP could be highly useful for database screening and natural product lead generation programs. PMID:23590173
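
    The drug-likeness screen mentioned above is Lipinski's Rule of Five, which flags a compound when it violates more than one of four descriptor thresholds. The sketch below applies the rule to precomputed descriptors; the field names are illustrative and do not reflect the CamMedNP schema.

```python
# Minimal Rule of Five check on precomputed molecular descriptors.
# A compound is conventionally considered drug-like when it violates
# no more than one of Lipinski's four criteria.

from dataclasses import dataclass

@dataclass
class Descriptors:
    mol_weight: float      # molecular weight (Da)
    logp: float            # octanol-water partition coefficient
    h_bond_donors: int     # hydrogen-bond donors
    h_bond_acceptors: int  # hydrogen-bond acceptors

def lipinski_violations(d: Descriptors) -> int:
    """Count how many of Lipinski's four thresholds the compound exceeds."""
    return sum([
        d.mol_weight > 500,
        d.logp > 5,
        d.h_bond_donors > 5,
        d.h_bond_acceptors > 10,
    ])

def is_drug_like(d: Descriptors) -> bool:
    return lipinski_violations(d) <= 1

if __name__ == "__main__":
    # Placeholder descriptor values for a hypothetical natural product.
    candidate = Descriptors(mol_weight=342.3, logp=2.1,
                            h_bond_donors=3, h_bond_acceptors=6)
    print(lipinski_violations(candidate), is_drug_like(candidate))
```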

  13. Assessing endocranial variations in great apes and humans using 3D data from virtual endocasts.

    PubMed

    Bienvenu, Thibaut; Guy, Franck; Coudyzer, Walter; Gilissen, Emmanuel; Roualdès, Georges; Vignaud, Patrick; Brunet, Michel

    2011-06-01

    Modern humans are characterized by their large, complex, and specialized brain. Human brain evolution can be addressed through direct evidence provided by fossil hominid endocasts (i.e. paleoneurology), or through indirect evidence of extant species comparative neurology. Here we use the second approach, providing an extant comparative framework for hominid paleoneurological studies. We explore endocranial size and shape differences among great apes and humans, as well as between sexes. We virtually extracted 72 endocasts, sampling all extant great ape species and modern humans, and digitized 37 landmarks on each for 3D generalized Procrustes analysis. All species can be differentiated by their endocranial shape. Among great apes, endocranial shapes vary from short (orangutans) to long (gorillas), perhaps in relation to different facial orientations. Endocranial shape differences among African apes are partly allometric. Major endocranial traits distinguishing humans from great apes are endocranial globularity, reflecting neurological reorganization, and features linked to structural responses to posture and bipedal locomotion. Human endocasts are also characterized by posterior location of foramina rotunda relative to optic canals, which could be correlated to lesser subnasal prognathism compared to living great apes. Species with larger brains (gorillas and humans) display greater sexual dimorphism in endocranial size, while sexual dimorphism in endocranial shape is restricted to gorillas, differences between males and females being at least partly due to allometry. Our study of endocranial variations in extant great apes and humans provides a new comparative dataset for studies of fossil hominid endocasts.
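
    The endocranial shape comparison above rests on 3D generalized Procrustes analysis of 37 digitized landmarks per specimen. The sketch below shows only the core superimposition step for one pair of landmark configurations (centering, unit scaling, and an SVD-based rotation); a full generalized Procrustes analysis, as used in the study, would iterate this alignment of every specimen against a consensus shape until convergence.

```python
# Ordinary Procrustes superimposition for one pair of 3D landmark
# configurations (k landmarks x 3 coordinates), as a building block of
# generalized Procrustes analysis.

import numpy as np

def procrustes_align(reference: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Translate, scale, and rotate `target` (k x 3) onto `reference` (k x 3)."""
    # Remove location: center both landmark configurations at the origin.
    ref = reference - reference.mean(axis=0)
    tgt = target - target.mean(axis=0)
    # Remove size: scale each configuration to unit centroid size.
    ref = ref / np.linalg.norm(ref)
    tgt = tgt / np.linalg.norm(tgt)
    # Remove orientation: optimal rotation from the SVD of the cross-covariance.
    u, _, vt = np.linalg.svd(tgt.T @ ref)
    if np.linalg.det(u @ vt) < 0:      # guard against an improper (reflected) fit
        u[:, -1] *= -1
    return tgt @ (u @ vt)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    landmarks = rng.normal(size=(37, 3))            # 37 landmarks, as in the study
    rot_z = np.array([[0.0, -1.0, 0.0],             # 90-degree rotation about z
                      [1.0,  0.0, 0.0],
                      [0.0,  0.0, 1.0]])
    copy = (landmarks @ rot_z) * 2.0 + 5.0          # rotated, scaled, translated copy
    aligned = procrustes_align(landmarks, copy)
    ref = landmarks - landmarks.mean(axis=0)
    ref = ref / np.linalg.norm(ref)
    print(np.linalg.norm(aligned - ref))            # Procrustes residual, close to 0
```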

  14. The effects of task difficulty on visual search strategy in virtual 3D displays

    PubMed Central

    Pomplun, Marc; Garaas, Tyler W.; Carrasco, Marisa

    2013-01-01

    Analyzing the factors that determine our choice of visual search strategy may shed light on visual behavior in everyday situations. Previous results suggest that increasing task difficulty leads to more systematic search paths. Here we analyze observers' eye movements in an “easy” conjunction search task and a “difficult” shape search task to study visual search strategies in stereoscopic search displays with virtual depth induced by binocular disparity. Standard eye-movement variables, such as fixation duration and initial saccade latency, as well as new measures proposed here, such as saccadic step size, relative saccadic selectivity, and x−y target distance, revealed systematic effects on search dynamics in the horizontal-vertical plane throughout the search process. We found that in the “easy” task, observers start with the processing of display items in the display center immediately after stimulus onset and subsequently move their gaze outwards, guided by extrafoveally perceived stimulus color. In contrast, the “difficult” task induced an initial gaze shift to the upper-left display corner, followed by a systematic left-right and top-down search process. The only consistent depth effect was a trend of initial saccades in the easy task with smallest displays to the items closest to the observer. The results demonstrate the utility of eye-movement analysis for understanding search strategies and provide a first step toward studying search strategies in actual 3D scenarios. PMID:23986539
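
    Two of the measures introduced above, saccadic step size and x-y target distance, reduce to simple geometry on fixation coordinates. The sketch below assumes fixations arrive as (x, y) positions in a common unit such as degrees of visual angle; the data layout and function names are assumptions, not the authors' analysis code.

```python
# Illustrative computation of saccadic step size (distance between successive
# fixations) and x-y target distance (distance of each fixation from the
# target location) from a scanpath of (x, y) fixation coordinates.

import math

def saccadic_step_sizes(fixations: list[tuple[float, float]]) -> list[float]:
    """Euclidean distance covered by each saccade between successive fixations."""
    return [math.dist(a, b) for a, b in zip(fixations, fixations[1:])]

def target_distances(fixations: list[tuple[float, float]],
                     target: tuple[float, float]) -> list[float]:
    """Distance of every fixation from the target position."""
    return [math.dist(f, target) for f in fixations]

if __name__ == "__main__":
    scanpath = [(0.0, 0.0), (3.0, 4.0), (6.0, 4.0), (6.5, 4.5)]
    print(saccadic_step_sizes(scanpath))           # [5.0, 3.0, ~0.71]
    print(target_distances(scanpath, (6.5, 4.5)))  # decreasing toward 0.0
```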

  15. Virtual-reality-based educational laboratories in fiber optic engineering

    NASA Astrophysics Data System (ADS)

    Hayes, Dana; Turczynski, Craig; Rice, Jonny; Kozhevnikov, Michael

    2014-07-01

    Researchers and educators have observed great potential in virtual reality (VR) technology as an educational tool due to its ability to engage and spark interest in students, thus providing them with a deeper form of knowledge about a subject. The focus of this project is to develop an interactive VR educational module, Laser Diode Characteristics and Coupling to Fibers, to integrate into a fiber optics laboratory course. The developed module features a virtual laboratory populated with realistic models of optical devices in which students can set up and perform an optical experiment dealing with laser diode characteristics and fiber coupling. The module contains three increasingly complex levels for students to navigate through, with a short built-in quiz after each level to measure the student's understanding of the subject. Seventeen undergraduate students learned fiber coupling concepts using the designed computer simulation in a non-immersive desktop virtual environment (VE) condition. The analysis of students' responses on the updated pre- and post-tests shows a statistically significant improvement in post-test scores compared to pre-test scores. In addition, the students' survey responses suggest that they found the module very useful and engaging. The study clearly demonstrated the feasibility of the proposed instructional technology for engineering education, in which the model of instruction and the enabling technology are equally important, for providing a better learning environment and improving students' conceptual understanding compared to other instructional approaches.
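
    The pre-/post-test comparison described above is typically evaluated with a paired test on per-student scores. The sketch below shows such a comparison with SciPy's paired t-test; the score values are placeholders, not the study's data.

```python
# Paired t-test on placeholder pre-/post-test scores for 17 students,
# mirroring the kind of comparison reported above.

from scipy import stats

pre_scores  = [52, 60, 45, 70, 58, 63, 49, 55, 61, 47, 66, 59, 50, 62, 57, 53, 64]
post_scores = [68, 72, 60, 78, 71, 70, 62, 66, 75, 59, 80, 73, 61, 77, 69, 65, 79]

t_statistic, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"t = {t_statistic:.2f}, p = {p_value:.4f}")
# A small p-value (conventionally p < 0.05) would indicate a statistically
# significant improvement from pre-test to post-test.
```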

  16. Event-Based Data Distribution for Mobile Augmented Reality and Virtual Environments

    DTIC Science & Technology

    2004-04-01

    ... demonstrated in the Battlefield Augmented Reality System (BARS) situation awareness system, composed of several mobile augmented reality systems, immersive ... connectivity and their bandwidth can be highly constrained. This paper presents a robust event-based data distribution mechanism for mobile augmented reality and virtual environments. It is based on replicated databases, pluggable networking protocols, and communication channels. The mechanism is ...

  17. An Event-Based Data Distribution Mechanism for Collaborative Mobile Augmented Reality and Virtual Environments

    DTIC Science & Technology

    2003-01-01

    ... mechanism in the Battlefield Augmented Reality System (BARS) situation awareness system, which is composed of several mobile augmented reality systems ... connectivity and their bandwidth can be highly constrained. In this paper we present a robust event-based data distribution mechanism for mobile augmented reality and virtual environments. It is based on replicated databases, pluggable networking protocols, and communication channels. We demonstrate the ...
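
    Both DTIC records above describe an event-based distribution mechanism built on replicated databases, pluggable networking protocols, and communication channels. The sketch below is a generic publish/subscribe channel in Python, offered only to illustrate the pattern; it is not the BARS implementation, and all class and field names are assumptions.

```python
# Generic publish/subscribe sketch of event-based data distribution:
# producers publish events to named channels, and every subscriber
# (e.g. a replica database on a mobile unit) applies them locally.

from collections import defaultdict
from typing import Any, Callable

class EventChannels:
    """Named communication channels with simple fan-out delivery."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict[str, Any]], None]]] = defaultdict(list)

    def subscribe(self, channel: str, handler: Callable[[dict[str, Any]], None]) -> None:
        self._subscribers[channel].append(handler)

    def publish(self, channel: str, event: dict[str, Any]) -> None:
        for handler in self._subscribers[channel]:
            handler(event)

class ReplicatedStore:
    """Toy replica that applies position updates received as events."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.positions: dict[str, tuple[float, float, float]] = {}

    def apply(self, event: dict[str, Any]) -> None:
        self.positions[event["entity"]] = event["position"]

if __name__ == "__main__":
    bus = EventChannels()
    mobile_unit, command_post = ReplicatedStore("mobile"), ReplicatedStore("base")
    bus.subscribe("tracks", mobile_unit.apply)
    bus.subscribe("tracks", command_post.apply)
    bus.publish("tracks", {"entity": "vehicle-1", "position": (12.0, 4.5, 0.0)})
    print(mobile_unit.positions == command_post.positions)  # True: replicas agree
```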

  18. Surviving sepsis--a 3D integrative educational simulator.

    PubMed

    Ježek, Filip; Tribula, Martin; Kulhánek, Tomáš; Mateják, Marek; Privitzer, Pavol; Šilar, Jan; Kofránek, Jiří; Lhotská, Lenka

    2015-08-01

    Computer technology offers greater educational possibilities, notably simulation and virtual reality. This paper presents a technology that integrates multiple modalities, namely 3D virtual reality, a node-based simulator, the Physiomodel explorer, and explanatory physiological simulators employing the Modelica language and the Unity3D platform. This emerging toolchain should allow the authors to concentrate more on educational content than on application development. The technology is demonstrated through the Surviving sepsis educational scenario, targeted at the Microsoft Windows Store platform.

  19. Analyzing industrial furnace efficiency using comparative visualization in a virtual reality environment.

    SciTech Connect

    Freitag, L.; Urness, T.

    1999-02-10

    We describe an interactive toolkit used to perform comparative analysis of two or more data sets arising from numerical simulations. Several techniques have been incorporated into this toolkit, including (1) successive visualization of individual data sets, (2) data comparison techniques such as computation and visualization of the differences between data sets, and (3) image comparison methods such as scalar field height profiles plotted in a common coordinate system. We describe each technique in detail and show example usage in an industrial application aimed at designing an efficient, low-NOx burner for industrial furnaces. Critical insights are obtained through interactively adjusted color maps, data culling, and data manipulation. New paradigms for scaling small values in the data comparison technique are described. The display device used for this application was the CAVE virtual reality theater, and we describe the user interface to the visualization toolkit and the benefits of immersive 3D visualization for comparative analysis.
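
    The data comparison technique above computes pointwise differences between simulation data sets and rescales small values so they remain visible. The sketch below illustrates one such approach, a signed-log transform over the difference field; this is an assumed example, not the toolkit's actual scaling paradigm.

```python
# Illustration of the data-comparison idea: compute the pointwise difference
# between two co-registered scalar fields and compress its dynamic range so
# small differences stay visible when mapped to color.

import numpy as np

def field_difference(field_a: np.ndarray, field_b: np.ndarray) -> np.ndarray:
    """Pointwise difference between two co-registered scalar fields."""
    return field_a - field_b

def signed_log_scale(diff: np.ndarray, floor: float = 1e-6) -> np.ndarray:
    """Compress the dynamic range while preserving the sign of each difference."""
    return np.sign(diff) * np.log10(1.0 + np.abs(diff) / floor)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    burner_a = rng.normal(1000.0, 50.0, size=(64, 64))            # e.g. a temperature field
    burner_b = burner_a + rng.normal(0.0, 0.01, size=(64, 64))    # nearly identical run
    diff = field_difference(burner_a, burner_b)
    print(diff.min(), diff.max())        # tiny raw differences
    print(signed_log_scale(diff).max())  # amplified for display
```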

  20. Virtual reality in planning and operations from research topic to practical issue

    SciTech Connect

    Rindahl, G.; Johnsen, T.; Mark, N. K. F.; Meyer, G.

    2006-07-01

    During the last decade of research and development on advanced visualization systems for the nuclear industry, the available technology has evolved significantly. In the same period, nuclear companies have entered a more competitive environment due to the increasingly open electricity market, resulting in strong demands for cost-effective operations. This paper reports on some of the 3D applications developed by the Institute for Energy Technology in this period, and on the emerging possibilities for practical applications of Virtual and Augmented Reality. Finally, the paper proposes that well-considered deployment of recent and ongoing technological advances in this field can contribute to improving economy and efficiency without compromising safety. (authors)