Science.gov

Sample records for 3-d virtual reality

  1. 3D Virtual Reality Check: Learner Engagement and Constructivist Theory

    ERIC Educational Resources Information Center

    Bair, Richard A.

    2013-01-01

The inclusion of three-dimensional (3D) virtual tools has created a need to communicate the engagement these tools afford and to specify the learning gains that educators, and the institutions funding the tools, can expect. A review of the literature demonstrates that specific models and theories for 3D Virtual Reality (VR) learning do not exist "per…

  2. 3D Virtual Reality for Teaching Astronomy

    NASA Astrophysics Data System (ADS)

    Speck, Angela; Ruzhitskaya, L.; Laffey, J.; Ding, N.

    2012-01-01

We are developing 3D virtual learning environments (VLEs) as learning materials for an undergraduate astronomy course, which will utilize advances both in available technologies and in our understanding of the social nature of learning. These learning materials will be used to test whether such VLEs can indeed augment science learning so that it is more engaging, active, visual and effective. Our project focuses on the challenges and requirements of introductory college astronomy classes. Here we present our virtual world of the Jupiter system and how we plan to implement it to allow students to learn course material - physical laws and concepts in astronomy - while engaging them in exploration of the Jupiter system, encouraging their imagination, curiosity, and motivation. The VLE allows students to work individually or collaboratively. The 3D world also provides an opportunity for research in astronomy education to investigate the impact of social interaction, gaming features, and the manipulatives offered by a learning tool on students' motivation and learning outcomes. Use of this VLE is also a valuable means of exploring how learners' spatial awareness can be enhanced by working in a 3D environment. We will present the Jupiter-system environment along with a preliminary study of the efficacy and usability of our Jupiter 3D VLE.

  3. Dynamic 3D echocardiography in virtual reality

    PubMed Central

    van den Bosch, Annemien E; Koning, Anton HJ; Meijboom, Folkert J; McGhie, Jackie S; Simoons, Maarten L; van der Spek, Peter J; Bogers, Ad JJC

    2005-01-01

Background This pilot study was performed to evaluate whether virtual reality is applicable to three-dimensional echocardiography and whether three-dimensional echocardiographic 'holograms' have the potential to become a clinically useful tool. Methods Three-dimensional echocardiographic data sets from 2 normal subjects and from 4 patients with a mitral valve pathological condition were included in the study. The three-dimensional data sets were acquired with the Philips Sonos 7500 echo system and transferred to the BARCO (Barco N.V., Kortrijk, Belgium) I-Space. Ten independent observers assessed the 6 three-dimensional data sets with and without mitral valve pathology. After 10 minutes' instruction in the I-Space, all of the observers could use the virtual pointer that is necessary to create cut planes in the hologram. Results The 10 independent observers correctly assessed the normal and pathological mitral valves in the holograms (analysis time approximately 10 minutes). Conclusion This report shows that dynamic holographic imaging of three-dimensional echocardiographic data is feasible. However, the applicability and usefulness of this technology in clinical practice are still limited. PMID:16375768

  4. Building intuitive 3D interfaces for virtual reality systems

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Seitel, Mathias; Mullick, Rakesh

    2007-03-01

An exploration of techniques for developing intuitive and efficient user interfaces for virtual reality systems. This work seeks to understand which paradigms from the better-understood world of 2D user interfaces remain viable within 3D environments. To establish this, a new user interface was created that applied various understood principles of interface design. A user study was then performed in which it was compared with an earlier interface on a series of medical visualization tasks.

  5. Organizational Learning Goes Virtual?: A Study of Employees' Learning Achievement in Stereoscopic 3D Virtual Reality

    ERIC Educational Resources Information Center

    Lau, Kung Wong

    2015-01-01

Purpose: This study aims to deepen understanding of the use of stereoscopic 3D technology (stereo3D) in facilitating organizational learning. The emergence of advanced virtual technologies, in particular stereo3D virtual reality, has fundamentally changed the ways in which organizations train their employees. However, in academic or…

  6. Virtual reality 3D headset based on DMD light modulators

    SciTech Connect

    Bernacki, Bruce E.; Evans, Allan; Tang, Edward

    2014-06-13

We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micromirror devices (DMDs). Our approach leverages silicon micromirrors offering 720p-resolution displays in a small form factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high resolution and low power consumption. Applications include night driving, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. We describe a design in which light from the DMD is imaged to infinity and the user's own eye lens forms a real image on the user's retina.

  7. The virtual reality 3D city of Ningbo

    NASA Astrophysics Data System (ADS)

    Chen, Weimin; Wu, Dun

    2009-09-01

In 2005, the Ningbo Design Research Institute of Mapping & Surveying began developing the concepts for, and an implementation of, the Virtual Reality Ningbo System (VRNS). VRNS is being developed within the digital-city technological framework and is well supported by computing advances, space technologies, and commercial innovations. It has become the best solution for integrating, managing, presenting, and distributing complex city information. VRNS is not only a 3D-GIS launch project but also a technology innovation. The traditional domain of surveying and mapping has changed greatly in Ningbo: geo-information systems are developing towards more realistic, three-dimensional, Service-Oriented-Architecture-based systems. VRNS uses technologies such as 3D modeling, user interface design, view-scene modeling, real-time rendering and interactive roaming in a virtual environment. Two applications of VRNS already in use are city planning and high-rise buildings' security management. The final goal is to develop VRNS into a powerful public information platform, enabling heterogeneous city information resources to be shared on a single platform.

  9. Virtual reality 3D headset based on DMD light modulators

    NASA Astrophysics Data System (ADS)

    Bernacki, Bruce E.; Evans, Allan; Tang, Edward

    2014-06-01

We present the design of an immersion-type 3D headset suitable for virtual reality applications based upon digital micromirror devices (DMDs). Current methods for presenting information for virtual reality focus either on polarization-based modulators such as liquid crystal on silicon (LCoS) devices, or on miniature LCD or LED displays, often using lenses to place the image at infinity. LCoS modulators are an area of active research and development, but reduce the amount of viewing light by 50% due to the use of polarization. Viewable LCD or LED screens may suffer low resolution, cause eye fatigue, and exhibit a "screen door" or pixelation effect due to the low pixel fill factor. Our approach leverages a mature technology based on silicon micromirrors delivering 720p-resolution displays in a small form factor with high fill factor. Supporting chip sets allow rapid integration of these devices into wearable displays with high-definition resolution and low power consumption, and many of the design methods developed for DMD projector applications can be adapted to display use. Potential applications include night driving with natural depth perception, piloting of UAVs, fusion of multiple sensors for pilots, training, vision diagnostics and consumer gaming. We describe a design concept in which light from the DMD is imaged to infinity and the user's own eye lens forms a real image on the user's retina, resulting in a virtual retinal display.
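
The final sentence describes a collimated ("image at infinity") display: a pixel at the focal plane of the collimating optics subtends an angle of roughly pitch/f, and the relaxed eye focuses that collimated beam onto the retina. A back-of-the-envelope sketch of the geometry (the micromirror pitch and collimator focal length below are assumed for illustration, not taken from the paper):

```python
import math

# Hypothetical parameters (not from the paper): a 1280x720 DMD with an
# assumed 7.6 um micromirror pitch behind a collimator of assumed focal
# length 20 mm, so the virtual image appears at infinity.
pitch_m = 7.6e-6      # assumed micromirror pitch [m]
focal_m = 20e-3       # assumed collimator focal length [m]
h_pixels = 1280       # horizontal resolution of a 720p DMD

# A pixel at the focal plane subtends roughly pitch/f radians once collimated.
pixel_angle_rad = pitch_m / focal_m
pixel_angle_arcmin = math.degrees(pixel_angle_rad) * 60

# Horizontal field of view of the whole display.
fov_deg = 2 * math.degrees(math.atan(h_pixels * pitch_m / 2 / focal_m))

print(f"per-pixel angle: {pixel_angle_arcmin:.2f} arcmin")  # ~1.31 arcmin
print(f"horizontal FOV: {fov_deg:.1f} deg")                 # ~27.3 deg
```

With these assumed numbers each mirror subtends about 1.3 arcmin, which is why the abstract can claim a high-fill-factor display without a visible screen-door effect.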

  10. Sensorized Garment Augmented 3D Pervasive Virtual Reality System

    NASA Astrophysics Data System (ADS)

    Gulrez, Tauseef; Tognetti, Alessandro; de Rossi, Danilo

Virtual reality (VR) technology has matured to a point where humans can navigate in virtual scenes; however, providing them with a comfortable, fully immersive role in VR remains a challenge. Currently available sensing solutions do not provide ease of deployment, particularly in the seated position due to sensor placement restrictions over the body, and optic sensing requires a restricted indoor environment to track body movements. Here we present a 52-sensor-laden garment interfaced with VR, which offers both portability and unencumbered user movement in a VR environment. This chapter addresses the systems engineering aspects of our pervasive computing solution of the interactive sensorized 3D VR and presents initial results and future research directions. Participants navigated in a virtual art gallery using natural body movements that were detected by their wearable sensor shirt; the signals were then mapped to electrical control signals responsible for VR scene navigation. The initial results are positive and offer many opportunities for use in computationally intelligent man-machine multimedia control.
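
As an illustration of the signal-to-control mapping described above, the sketch below reduces a 52-sensor garment reading to a discrete navigation command. The sensor groupings, baseline subtraction and threshold are hypothetical; the authors' actual mapping is not specified in the abstract.

```python
import numpy as np

# Hypothetical sensor groupings on the shirt (not the authors' layout):
# each gesture is assumed to activate a contiguous block of 13 sensors.
GROUPS = {
    "forward": range(0, 13),
    "back":    range(13, 26),
    "left":    range(26, 39),
    "right":   range(39, 52),
}

def navigation_command(reading, baseline, threshold=0.5):
    """Subtract a calibration baseline, score each gesture group by its mean
    absolute activation, and return the winning command if it clears the
    threshold, else 'idle'."""
    delta = np.abs(np.asarray(reading, float) - np.asarray(baseline, float))
    scores = {cmd: float(delta[list(idx)].mean()) for cmd, idx in GROUPS.items()}
    cmd, score = max(scores.items(), key=lambda kv: kv[1])
    return cmd if score >= threshold else "idle"

baseline = np.zeros(52)
lean = np.zeros(52)
lean[26:39] = 1.0                               # strong activation in the "left" group
print(navigation_command(lean, baseline))       # -> left
print(navigation_command(baseline, baseline))   # -> idle
```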

  11. 3-D Sound for Virtual Reality and Multimedia

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Trejo, Leonard J. (Technical Monitor)

    2000-01-01

    Technology and applications for the rendering of virtual acoustic spaces are reviewed. Chapter 1 deals with acoustics and psychoacoustics. Chapters 2 and 3 cover cues to spatial hearing and review psychoacoustic literature. Chapter 4 covers signal processing and systems overviews of 3-D sound systems. Chapter 5 covers applications to computer workstations, communication systems, aeronautics and space, and sonic arts. Chapter 6 lists resources. This TM is a reprint of the 1994 book from Academic Press.

  12. Anesthesiology training using 3D imaging and virtual reality

    NASA Astrophysics Data System (ADS)

    Blezek, Daniel J.; Robb, Richard A.; Camp, Jon J.; Nauss, Lee A.

    1996-04-01

Current training in regional nerve block procedures for anesthesiology residents requires expert supervision and the use of cadavers, both of which are relatively expensive commodities in today's cost-conscious medical environment. We are developing methods to augment and eventually replace these training procedures with real-time, realistic computer visualizations and manipulations of the anatomical structures involved in anesthesiology procedures, such as nerve plexus injections (e.g., celiac blocks). The initial work is focused on visualizations: both static images and rotational renderings. From the initial results, a coherent paradigm for virtual patient and scene representation will be developed.

  13. Three Primary School Students' Cognition about 3D Rotation in a Virtual Reality Learning Environment

    ERIC Educational Resources Information Center

    Yeh, Andy

    2010-01-01

This paper reports on three primary school students' explorations of 3D rotation in a virtual reality learning environment (VRLE) named VRMath. When asked to investigate whether they would face the same direction after turning right 45 degrees and then rolling up 45 degrees as after rolling up 45 degrees and then turning right 45 degrees, the students…

  14. The Learner Characteristics, Features of Desktop 3D Virtual Reality Environments, and College Chemistry Instruction: A Structural Equation Modeling Analysis

    ERIC Educational Resources Information Center

    Merchant, Zahira; Goetz, Ernest T.; Keeney-Kennicutt, Wendy; Kwok, Oi-man; Cifuentes, Lauren; Davis, Trina J.

    2012-01-01

    We examined a model of the impact of a 3D desktop virtual reality environment on the learner characteristics (i.e. perceptual and psychological variables) that can enhance chemistry-related learning achievements in an introductory college chemistry class. The relationships between the 3D virtual reality features and the chemistry learning test as…

  15. Mackay campus of environmental education and digital cultural construction: the application of 3D virtual reality

    NASA Astrophysics Data System (ADS)

    Chien, Shao-Chi; Chung, Yu-Wei; Lin, Yi-Hsuan; Huang, Jun-Yi; Chang, Jhih-Ting; He, Cai-Ying; Cheng, Yi-Wen

    2012-04-01

This study uses 3D virtual reality technology to create the "Mackay campus environmental education and digital cultural 3D navigation system" for local historical sites in the Tamsui (Hoba) area, in hopes of providing tourism information and navigation through historical sites using a 3D navigation system. We used AutoCAD, SketchUp, and SpaceEyes 3D software to construct the virtual reality scenes and create the school's historical sites, such as the House of Reverends, the House of Maidens, the Residence of Mackay, and the Education Hall. The platform we established can indeed achieve the desired function of providing tourism information and historical-site navigation. The interactive multimedia style and the presentation of the information allow users to obtain a direct information response. In addition to showing the external appearance of buildings, the navigation platform also allows users to enter the buildings to view lifelike scenes and textual information related to the historical sites. The historical sites are modeled at their actual size, which gives users a more realistic feel. In terms of the navigation route, the system does not force users along a fixed route, but instead allows them to freely choose the route they take to view the historical sites on the platform.

  16. Early pregnancy placental bed and fetal vascular volume measurements using 3-D virtual reality.

    PubMed

    Reus, Averil D; Klop-van der Aa, Josine; Rifouna, Maria S; Koning, Anton H J; Exalto, Niek; van der Spek, Peter J; Steegers, Eric A P

    2014-08-01

    In this study, a new 3-D Virtual Reality (3D VR) technique for examining placental and uterine vasculature was investigated. The validity of placental bed vascular volume (PBVV) and fetal vascular volume (FVV) measurements was assessed and associations of PBVV and FVV with embryonic volume, crown-rump length, fetal birth weight and maternal parity were investigated. One hundred thirty-two patients were included in this study, and measurements were performed in 100 patients. Using V-Scope software, 100 3-D Power Doppler data sets of 100 pregnancies at 12 wk of gestation were analyzed with 3D VR in the I-Space Virtual Reality system. Volume measurements were performed with semi-automatic, pre-defined parameters. The inter-observer and intra-observer agreement was excellent with all intra-class correlation coefficients >0.93. PBVVs of multiparous women were significantly larger than the PBVVs of primiparous women (p = 0.008). In this study, no other associations were found. In conclusion, V-Scope offers a reproducible method for measuring PBVV and FVV at 12 wk of gestation, although we are unsure whether the volume measured represents the true volume of the vasculature. Maternal parity influences PBVV. PMID:24798392
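
The abstract does not detail how V-Scope computes the volumes, but a generic semi-automatic vascular volume measurement on 3-D Power Doppler data can be sketched as thresholded voxel counting (the threshold and voxel size below are illustrative assumptions, not values from the study):

```python
import numpy as np

def vascular_volume_ml(doppler, voxel_mm, threshold):
    """Generic sketch of a vascular volume measurement: count voxels whose
    Power Doppler amplitude exceeds a preset threshold and multiply by the
    volume of one cubic voxel. Returns the volume in mL (1 mL = 1000 mm^3)."""
    n_vessel = int(np.count_nonzero(doppler > threshold))
    return n_vessel * voxel_mm ** 3 / 1000.0

# Synthetic example: a 20^3 data set containing a 10x10x10 "vessel" block
# of high Doppler amplitude, sampled with an assumed 0.5 mm voxel edge.
vol = np.zeros((20, 20, 20))
vol[5:15, 5:15, 5:15] = 100.0
print(vascular_volume_ml(vol, voxel_mm=0.5, threshold=50.0))  # 1000 voxels * 0.125 mm^3 = 0.125 mL
```

The inter-observer agreement reported above (all ICCs > 0.93) concerns repeated measurements of this kind by different observers; only the volume computation itself is sketched here.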

  17. Using the CAVE virtual-reality environment as an aid to 3-D electromagnetic field computation

    SciTech Connect

Turner, L.R.; Levine, D.; Huang, M.; Papka, M.; Kettunen, L.

    1995-08-01

One of the major problems in three-dimensional (3-D) field computation is visualizing the resulting 3-D field distributions. A virtual-reality environment, such as the CAVE (CAVE Automatic Virtual Environment), is helping to overcome this problem, thus making the results of computation more usable for designers and users of magnets and other electromagnetic devices. As a demonstration of the capabilities of the CAVE, the elliptical multipole wiggler (EMW), an insertion device being designed for the Advanced Photon Source (APS) now being commissioned at Argonne National Laboratory (ANL), was made visible, along with its fields and beam orbits. Other uses of the CAVE in preprocessing and postprocessing computation for electromagnetic applications are also discussed.

  18. Effects of 3D Virtual Reality of Plate Tectonics on Fifth Grade Students' Achievement and Attitude toward Science

    ERIC Educational Resources Information Center

    Kim, Paul

    2006-01-01

This study examines the effects of a teaching method using 3D virtual reality simulations on achievement and attitude toward science. An experiment was conducted with fifth-grade students (N = 41) to examine the effects of 3D simulations designed to support an inquiry-based science curriculum. An ANOVA revealed that the 3D group scored…

  19. Modulation of cortical activity in 2D versus 3D virtual reality environments: an EEG study.

    PubMed

    Slobounov, Semyon M; Ray, William; Johnson, Brian; Slobounov, Elena; Newell, Karl M

    2015-03-01

There is growing empirical evidence that virtual reality (VR) is valuable for education, training, entertainment and medical rehabilitation due to its capacity to represent real-life events and situations. However, the neural mechanisms underlying behavioral confounds in VR environments are still poorly understood. In two experiments, we examined the effect of fully immersive 3D stereoscopic presentations and less immersive 2D VR environments on brain functions and behavioral outcomes. In Experiment 1, we examined the behavioral and neural underpinnings of spatial navigation tasks using electroencephalography (EEG). In Experiment 2, we examined EEG correlates of postural stability and balance. Our major findings showed that fully immersive 3D VR induced a higher subjective sense of presence along with an enhanced success rate in spatial navigation compared with 2D. In Experiment 1, the power of frontal midline theta EEG (FM-theta) was significantly higher during the encoding phase of route presentation in the 3D VR. In Experiment 2, the 3D VR resulted in greater postural instability and modulation of EEG patterns as a function of 3D versus 2D environments. The findings support the inference that the fully immersive 3D enriched environment requires the allocation of more brain and sensory resources for cognitive/motor control in both tasks than 2D presentations. This is further evidence that 3D VR tasks using EEG may be a promising approach for performance enhancement and potential applications in clinical/rehabilitation settings. PMID:25448267
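
A minimal sketch of the FM-theta band-power measure referred to above, assuming a simple FFT-based estimate over the conventional 4-7 Hz theta band (the paper's actual preprocessing and electrode montage are not given in the abstract):

```python
import numpy as np

def theta_power(signal, fs, band=(4.0, 7.0)):
    """Power in the theta band of a single EEG channel: periodogram via the
    real FFT, summed over bins falling inside the band [Hz]."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[mask].sum())

fs = 250.0                                  # assumed sampling rate [Hz]
t = np.arange(0, 4, 1 / fs)                 # one 4 s epoch
eeg_3d = 2.0 * np.sin(2 * np.pi * 6 * t)    # mock channel with strong 6 Hz theta ("3D VR")
eeg_2d = 0.5 * np.sin(2 * np.pi * 6 * t)    # mock channel with weaker theta ("2D VR")
print(theta_power(eeg_3d, fs) > theta_power(eeg_2d, fs))  # True
```

On real data one would average such estimates over epochs and frontal midline electrodes before comparing conditions; the mock signals above only illustrate the band-power contrast.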

  20. 3D Visualization of Cultural Heritage Artefacts with Virtual Reality devices

    NASA Astrophysics Data System (ADS)

    Gonizzi Barsanti, S.; Caruso, G.; Micoli, L. L.; Covarrubias Rodriguez, M.; Guidi, G.

    2015-08-01

Although 3D models are useful for preserving information about historical artefacts, the potential of these digital contents is not fully realized until they are used to interactively communicate their significance to non-specialists. Starting from this consideration, a new way to provide museum visitors with more information was investigated. The research is aimed at valorising and making more accessible the Egyptian funeral objects exhibited in the Sforza Castle in Milan. The results of the research will be used for the renewal of the current exhibition at the Archaeological Museum in Milan, making it more attractive. A 3D virtual interactive scenario regarding the "path of the dead", an important ritual in ancient Egypt, was realized to augment the experience and the comprehension of the public through interactivity. Four important artefacts were considered for this scope: two ushabty, a wooden sarcophagus and a heart scarab. The scenario was realized by integrating low-cost Virtual Reality technologies, such as the Oculus Rift DK2 and the Leap Motion controller, and implementing specific software using Unity. The 3D models were augmented with responsive points of interest in relation to important symbols or features of the artefact. This allows highlighting single parts of the artefact in order to better identify the hieroglyphs and provide their translation. The paper describes the process of optimizing the 3D models, the implementation of the interactive scenario and the results of some tests that were carried out in the lab.

  1. Design and fabrication of concave-convex lens for head mounted virtual reality 3D glasses

    NASA Astrophysics Data System (ADS)

    Deng, Zhaoyang; Cheng, Dewen; Hu, Yuan; Huang, Yifan; Wang, Yongtian

    2015-08-01

As a kind of lightweight and convenient tool for achieving stereoscopic vision, virtual reality glasses are gaining popularity nowadays. For these glasses, molded plastic lenses are often adopted to balance imaging performance against the cost of mass production. However, the as-built performance of the glasses depends on both the optical design and the injection molding process, and maintaining the profile of the lens during injection molding presents particular challenges. In this paper, optical design is combined with process simulation analysis to obtain a design suitable for injection molding. Based on the design and analysis results, different experiments are done using high-quality equipment to optimize the process parameters of injection molding. Finally, a single concave-convex lens with a field of view of 90° is designed for the virtual reality 3D glasses. The as-built profile error of the lens is controlled within 5 μm, which indicates that the designed shape of the lens is faithfully realized and the designed optical performance can thus be achieved.

  2. Design and implementation of a 3D ocean virtual reality and visualization engine

    NASA Astrophysics Data System (ADS)

    Chen, Ge; Li, Bo; Tian, Fenglin; Ji, Pengbo; Li, Wenqing

    2012-12-01

In this study, a 3D virtual reality and visualization engine for rendering the ocean, named VV-Ocean, is designed for marine applications. The design goals of VV-Ocean are high-fidelity simulation of the ocean environment, visualization of massive and multidimensional marine data, and imitation of marine life. VV-Ocean is composed of five modules: a memory management module, a resources management module, a scene management module, a rendering process management module and an interaction management module. VV-Ocean has three core functions: reconstructing vivid virtual ocean scenes, visualizing real data dynamically in real time, and imitating and simulating marine life intuitively. Based on VV-Ocean, we establish a sea-land integration platform that can reproduce the drifting and diffusion processes of oil spilling from the sea bottom to the surface. Environmental factors such as ocean currents and wind fields are considered in this simulation. On this platform the oil spilling process is abstracted as the movement of abundant oil particles. The results show that the oil particles blend well with the water and that the platform meets the requirements for real-time, interactive rendering. VV-Ocean can be widely used in ocean applications such as demonstrating marine operations, facilitating maritime communications, developing ocean games, reducing marine hazards, forecasting the weather over oceans, serving marine tourism, and so on. Finally, further technological improvements of VV-Ocean are discussed.
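
The oil-spill module described above treats the spill as many Lagrangian particles advected by current and wind. A minimal particle step might look like the following; the 3% wind-drift factor and the diffusivity are common textbook assumptions, not values from the paper:

```python
import numpy as np

def advect(pos, current, wind, dt, diffusivity=1.0, wind_factor=0.03, rng=None):
    """One Lagrangian step for oil particles.
    pos: (N, 2) particle positions [m]; current/wind: (2,) velocities [m/s];
    dt: timestep [s]. Drift = current plus a fraction of the wind; turbulent
    diffusion is modeled as a Gaussian random walk with std sqrt(2*K*dt)."""
    if rng is None:
        rng = np.random.default_rng(0)
    drift = (np.asarray(current, float) + wind_factor * np.asarray(wind, float)) * dt
    jitter = rng.normal(0.0, np.sqrt(2.0 * diffusivity * dt), size=pos.shape)
    return pos + drift + jitter

# 1000 particles released at the origin, one 60 s step with an eastward
# 0.2 m/s current and a 10 m/s eastward wind.
particles = np.zeros((1000, 2))
particles = advect(particles, current=[0.2, 0.0], wind=[10.0, 0.0], dt=60.0)
print(particles[:, 0].mean())   # mean drift ~ (0.2 + 0.03*10) * 60 = ~30 m eastward
```

In the full engine this step would read current and wind fields interpolated in space and time rather than constants, but the particle abstraction is the same.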

  3. Re-Dimensional Thinking in Earth Science: From 3-D Virtual Reality Panoramas to 2-D Contour Maps

    ERIC Educational Resources Information Center

    Park, John; Carter, Glenda; Butler, Susan; Slykhuis, David; Reid-Griffin, Angelia

    2008-01-01

    This study examines the relationship of gender and spatial perception on student interactivity with contour maps and non-immersive virtual reality. Eighteen eighth-grade students elected to participate in a six-week activity-based course called "3-D GeoMapping." The course included nine days of activities related to topographic mapping. At the end…

  4. A 3-D Virtual Reality Model of the Sun and the Moon for E-Learning at Elementary Schools

    ERIC Educational Resources Information Center

    Sun, Koun-Tem; Lin, Ching-Ling; Wang, Sheng-Min

    2010-01-01

    The relative positions of the sun, moon, and earth, their movements, and their relationships are abstract and difficult to understand astronomical concepts in elementary school science. This study proposes a three-dimensional (3-D) virtual reality (VR) model named the "Sun and Moon System." This e-learning resource was designed by combining…

  5. Assessing the precision of gaze following using a stereoscopic 3D virtual reality setting.

    PubMed

    Atabaki, Artin; Marciniak, Karolina; Dicke, Peter W; Thier, Peter

    2015-07-01

Despite the ecological importance of gaze following, little is known about the underlying neuronal processes that allow us to extract gaze direction from the geometric features of the eye and head of a conspecific. In order to understand the neuronal mechanisms underlying this ability, a careful description of the capacity and the limitations of gaze following at the behavioral level is needed. Previous studies of gaze following that relied on naturalistic settings have the disadvantage of allowing only very limited control of potentially relevant visual features guiding gaze following, such as the contrast of iris and sclera or the shape of the eyelids; in the case of photographs, they also lack depth. Hence, in order to gain full control of potentially relevant features, we decided to study gaze following of human observers guided by the gaze of a human avatar seen stereoscopically. To this end we established a stereoscopic 3D virtual reality setup in which we tested human subjects' ability to detect which target a human avatar was looking at. Following the gaze of the avatar showed all the features of the gaze following of a natural person, namely a substantial degree of precision associated with a consistent pattern of systematic deviations from the target. Poor stereo vision affected performance surprisingly little (only in certain experimental conditions). Only gaze following guided by targets at larger downward eccentricities exhibited a differential effect of the presence or absence of accompanying movements of the avatar's eyelids and eyebrows. PMID:25982719

  6. The Input-Interface of Webcam Applied in 3D Virtual Reality Systems

    ERIC Educational Resources Information Center

    Sun, Huey-Min; Cheng, Wen-Lin

    2009-01-01

Our research explores a virtual reality application based on a Web camera (Webcam) input interface. The interface can replace the mouse, inferring the user's intended direction by the method of frame difference. We divide a frame from the Webcam into nine grids and make use of background registration to compute the moving object. In order to…
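
The frame-difference scheme described above can be sketched as follows; the 3x3 cell-to-direction mapping and the threshold are illustrative assumptions, since the truncated abstract does not give them:

```python
import numpy as np

# Hypothetical mapping from the nine grid cells to direction commands.
CELLS = [["up-left", "up", "up-right"],
         ["left", "center", "right"],
         ["down-left", "down", "down-right"]]

def detect_direction(frame, background, threshold=30):
    """Subtract the registered background frame, mark pixels whose absolute
    difference exceeds the threshold as motion, split the motion mask into a
    3x3 grid, and return the direction of the cell with the most motion."""
    diff = np.abs(frame.astype(int) - background.astype(int)) > threshold
    h, w = diff.shape
    counts = [[int(diff[r * h // 3:(r + 1) * h // 3,
                        c * w // 3:(c + 1) * w // 3].sum())
               for c in range(3)] for r in range(3)]
    r, c = max(((r, c) for r in range(3) for c in range(3)),
               key=lambda rc: counts[rc[0]][rc[1]])
    return CELLS[r][c] if counts[r][c] > 0 else "none"

bg = np.zeros((90, 90), dtype=np.uint8)      # registered background
frame = bg.copy()
frame[0:30, 60:90] = 255                     # motion in the top-right cell
print(detect_direction(frame, bg))           # -> up-right
print(detect_direction(bg, bg))              # -> none
```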

  7. Exploring 3-D Virtual Reality Technology for Spatial Ability and Chemistry Achievement

    ERIC Educational Resources Information Center

    Merchant, Z.; Goetz, E. T.; Keeney-Kennicutt, W.; Cifuentes, L.; Kwok, O.; Davis, T. J.

    2013-01-01

    We investigated the potential of Second Life® (SL), a three-dimensional (3-D) virtual world, to enhance undergraduate students' learning of a vital chemistry concept. A quasi-experimental pre-posttest control group design was used to conduct the study. A total of 387 participants completed three assignment activities either in SL or using…

  8. Vasculogenesis and angiogenesis in the first trimester human placenta: an innovative 3D study using an immersive Virtual Reality system.

    PubMed

    van Oppenraaij, R H F; Koning, A H J; Lisman, B A; Boer, K; van den Hoff, M J B; van der Spek, P J; Steegers, E A P; Exalto, N

    2009-03-01

First trimester human villous vascularization has mainly been studied by conventional two-dimensional (2D) microscopy. With this 2D technique it is not possible to observe the spatial arrangement of the haemangioblastic cords and vessels, the transition of cords into vessels, or the transition of vasculogenesis to angiogenesis. Confocal Laser Scanning Microscopy (CLSM) allows a three-dimensional (3D) reconstruction of images of early pregnancy villous vascularization. These 3D reconstructions, however, are normally analyzed on a 2D medium, lacking depth perception. We performed a descriptive morphologic study using an immersive Virtual Reality system to utilize the third dimension fully. This innovative 3D technique visualizes 3D datasets as enlarged 3D holograms and provided detailed insight into the spatial arrangement of first trimester villous vascularization, the beginning of lumen formation within various junctions of haemangioblastic cords between 5 and 7 weeks gestational age, and the gradual transition of vasculogenesis to angiogenesis. This innovative immersive Virtual Reality system enables new perspectives for vascular research and will be implemented in future investigations. PMID:19185915

  9. M3D (Media 3D): a new programming language for web-based virtual reality in E-Learning and Edutainment

    NASA Astrophysics Data System (ADS)

    Chakaveh, Sepideh; Skaley, Detlef; Laine, Patricia; Haeger, Ralf; Maad, Soha

    2003-01-01

Today, interactive multimedia educational systems are well established, as they have proved useful instruments for enhancing one's learning capabilities. Hitherto, the main difficulty with almost all E-Learning systems has lain in the rich-media implementation techniques: each system had to be created individually, as reapplying the media, be it only a part or the whole content, was not directly possible, since everything had to be applied mechanically, i.e. by hand. This makes E-Learning systems exceedingly expensive to generate, in terms of both time and money. Media-3D, or M3D, is a new platform-independent programming language, developed at the Fraunhofer Institute for Media Communication, that enables visualisation and simulation of E-Learning multimedia content. M3D is an XML-based language capable of distinguishing the 3D models from the 3D scenes, as well as handling provisions for animations within the programme. Here we give a technical account of the M3D programming language and briefly describe two specific application scenarios in which M3D is applied to create virtual reality E-Learning content for training technical personnel.

  10. Generic precise augmented reality guiding system and its calibration method based on 3D virtual model.

    PubMed

    Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua

    2016-05-30

    Augmented reality systems can provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. The system is realized with a digital projector, and a general back projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. A corresponding calibration method is designed to obtain the parameters of the projector. To validate the proposed back projection model, coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final projecting indication accuracy of the system is verified with a subpixel pattern-projection technique. PMID:27410124
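The abstract gives the back projection model only in outline. As a rough, hypothetical sketch of the geometry involved (a textbook pinhole model with intrinsics K and extrinsics [R | t], not the calibration model from the paper; all numbers are illustrative):

```python
# Minimal pinhole-projection sketch: map a 3D world point into projector
# pixel coordinates via intrinsic matrix K and extrinsics [R | t].
# Illustrative values only, not calibration results from the paper.

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def project(point, K, R, t):
    """Return 2D pixel coordinates (u, v) of a 3D world point."""
    cam = [c + ti for c, ti in zip(mat_vec(R, point), t)]  # world -> camera frame
    x, y, z = mat_vec(K, cam)                              # apply intrinsics
    return x / z, y / z                                    # perspective divide

K = [[800, 0, 320], [0, 800, 240], [0, 0, 1]]  # fx, fy, principal point (cx, cy)
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]          # identity rotation
t = [0, 0, 5]                                   # scene 5 units in front of the lens

u, v = project([0, 0, 0], K, R, t)
print(u, v)  # a point on the optical axis lands at the principal point
```

Calibration, in this framing, amounts to solving for K, R, and t from known 3D-to-2D correspondences rather than assuming them as above.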

  11. Using virtual reality technology and hand tracking technology to create software for training surgical skills in 3D game

    NASA Astrophysics Data System (ADS)

    Zakirova, A. A.; Ganiev, B. A.; Mullin, R. I.

    2015-11-01

    The lack of visible and accessible ways of training surgical skills is one of the main problems in medical education. Existing simulation training devices are not designed to teach students and are often unavailable because of the high cost of the equipment. Using modern technologies such as virtual reality and hand-tracking, we want to create an innovative method for learning operative techniques in a 3D game format, which can make the education process interesting and effective. Creating a 3D virtual simulator will solve several conceptual problems at once: practical skills can be improved without time limits and without risk to the patient; the operating-room environment and anatomic body structures are rendered with high realism; game mechanics ease the perception of information and accelerate the memorization of methods; and the program is widely accessible.

  12. 3D graphics, virtual reality, and motion-onset visual evoked potentials in neurogaming.

    PubMed

    Beveridge, R; Wilson, S; Coyle, D

    2016-01-01

    A brain-computer interface (BCI) offers movement-free control of a computer application and is achieved by reading and translating the cortical activity of the brain into semantic control signals. Motion-onset visual evoked potentials (mVEP) are neural potentials employed in BCIs that occur when motion-related stimuli are attended visually. mVEP dynamics are correlated with the position and timing of the moving stimuli. To investigate the feasibility of utilizing the mVEP paradigm with video games of various graphical complexities, including those of commercial quality, we conducted three studies over four separate sessions, comparing the performance of classifying five mVEP responses under variations in graphical complexity and style, in-game distractions, and display parameters surrounding the mVEP stimuli. To investigate the feasibility of contemporary presentation modalities in neurogaming, one of the studies compared mVEP classification performance when stimuli were presented using the Oculus Rift virtual reality headset. Results from 31 independent subjects were analyzed offline. The results show classification performances of up to 90%, with variations in graphical complexity having limited effect on mVEP performance, thus demonstrating the feasibility of using the mVEP paradigm within BCI-based neurogaming. PMID:27590974

  13. Visuomotor learning in immersive 3D virtual reality in Parkinson's disease and in aging.

    PubMed

    Messier, Julie; Adamovich, Sergei; Jack, David; Hening, Wayne; Sage, Jacob; Poizner, Howard

    2007-05-01

    Successful adaptation to novel sensorimotor contexts critically depends on efficient sensory processing and integration mechanisms, particularly those required to combine visual and proprioceptive inputs. If the basal ganglia are a critical part of specialized circuits that adapt motor behavior to new sensorimotor contexts, then patients suffering from basal ganglia dysfunction, as in Parkinson's disease, should show sensorimotor learning impairments. However, this issue has been under-explored. We tested the ability of eight patients with Parkinson's disease (PD), off medication, ten healthy elderly subjects, and ten healthy young adults to reach to a remembered 3D location presented in an immersive virtual environment. A multi-phase learning paradigm was used, with four conditions: baseline, initial learning, reversal learning, and aftereffect. In initial learning, the computer altered the position of a simulated arm endpoint used for movement feedback by shifting its apparent location diagonally, thereby requiring both horizontal and vertical compensations. This visual distortion forced subjects to learn new coordinations between what they saw in the virtual environment and the actual position of their limbs, which they had to derive from proprioceptive information (or efference copy). In reversal learning, the sign of the distortion was reversed. Both elderly subjects and PD patients showed learning phase-dependent difficulties. First, elderly controls were slower than young subjects when learning both dimensions of the initial biaxial discordance. However, their performance improved during reversal learning, and as a result elderly and young controls showed similar adaptation rates during reversal learning. Second, in striking contrast to healthy elderly subjects, PD patients were more profoundly impaired during the reversal phase of learning. PD patients were able to learn the initial biaxial discordance but were on average slower than age-matched controls.

  14. 3D Elevation Program—Virtual USA in 3D

    USGS Publications Warehouse

    Lukas, Vicki; Stoker, J.M.

    2016-01-01

    The U.S. Geological Survey (USGS) 3D Elevation Program (3DEP) uses a laser system called 'lidar' (light detection and ranging) to create a highly accurate virtual reality map of the Nation. 3D maps have many uses, with new uses being discovered all the time.

  15. Three-dimensional immersive virtual reality for studying cellular compartments in 3D models from EM preparations of neural tissues.

    PubMed

    Calì, Corrado; Baghabra, Jumana; Boges, Daniya J; Holst, Glendon R; Kreshuk, Anna; Hamprecht, Fred A; Srinivasan, Madhusudhanan; Lehväslaiho, Heikki; Magistretti, Pierre J

    2016-01-01

    Advances in the application of electron microscopy (EM) to serial imaging are opening doors to new ways of analyzing cellular structure. New and improved algorithms and workflows for manual and semiautomated segmentation allow us to observe the spatial arrangement of the smallest cellular features with unprecedented detail in full three dimensions. From larger samples, higher-complexity models can be generated; however, they pose new challenges to data management and analysis. Here we review some currently available solutions and present our approach in detail. We use CAVE (cave automatic virtual environment), a fully immersive virtual reality room in which we can project and visualize a cellular reconstruction in 3D, to step into a world created with Blender, a free, fully customizable 3D modeling package with NeuroMorph plug-ins for visualization and analysis of EM preparations of brain tissue. Our workflow allows full and fast reconstruction of volumes of brain neuropil using ilastik, a software tool for semiautomated segmentation of EM stacks. With this visualization environment, we can walk into the model containing neuronal and astrocytic processes to study the spatial distribution of glycogen granules, a major energy source that is selectively stored in astrocytes. The use of CAVE was key to the observation of a nonrandom distribution of glycogen, and led us to develop tools to quantitatively analyze glycogen clustering and proximity to other subcellular features. PMID:26179415
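The abstract describes its glycogen-clustering analysis only at a high level. One standard way to quantify whether point-like features are clustered is a mean nearest-neighbor distance statistic; the sketch below is an illustrative stand-in, not the authors' NeuroMorph tooling, and the coordinates are invented:

```python
# Nearest-neighbor distance sketch for testing spatial clustering:
# a clustered point set has a smaller mean nearest-neighbor distance
# than a spread-out one with the same number of points.
from math import dist

def mean_nn_distance(points):
    """Mean distance from each 3D point to its nearest neighbor."""
    total = 0.0
    for i, p in enumerate(points):
        total += min(dist(p, q) for j, q in enumerate(points) if j != i)
    return total / len(points)

clustered = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0.1, 0.1, 0)]
spread = [(0, 0, 0), (5, 0, 0), (0, 5, 0), (5, 5, 0)]

print(mean_nn_distance(clustered) < mean_nn_distance(spread))  # True
```

In practice the observed statistic would be compared against values from randomly placed points in the same volume to decide whether the distribution is nonrandom.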

  16. Design and application of a virtual reality 3D engine based on rapid indices

    NASA Astrophysics Data System (ADS)

    Jiang, Nan; Mai, Jin

    2007-06-01

    This article proposes a data structure for a 3D engine based on rapid indices. Taking a model as the construction unit, the data structure can rapidly build an array of 3D vertex coordinates and arrange the vertices in triangle strips or triangle fans, which OpenGL can render quickly. The data structure is easy to extend: it can hold texture coordinates, vertex normals, and a model matrix, and models can be added to it, deleted from it, or transformed by the model matrix, so it is flexible. It also improves OpenGL rendering speed when holding a large amount of data.
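The triangle-strip ordering the abstract relies on can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: `row_strip_indices` is a hypothetical helper that interleaves one row of a regular vertex grid with the row below it, producing the index sequence OpenGL's GL_TRIANGLE_STRIP primitive expects.

```python
# Build triangle-strip indices for one row pair of a (rows x cols) vertex grid.
# In a strip, every index after the first two closes one triangle, so a row
# pair is drawn with 2*cols indices (2*(cols-1) triangles) instead of the
# 6*(cols-1) indices needed for separate triangles.

def row_strip_indices(row, cols):
    """Interleave vertices of `row` and `row + 1` in strip order."""
    indices = []
    for c in range(cols):
        indices.append(row * cols + c)        # vertex in the current row
        indices.append((row + 1) * cols + c)  # vertex directly below it
    return indices

print(row_strip_indices(0, 4))  # [0, 4, 1, 5, 2, 6, 3, 7]
```

The saving in index count (and in repeated vertex processing) is what makes strip and fan ordering faster to render than independent triangles.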

  17. A Learner-Centered Approach for Training Science Teachers through Virtual Reality and 3D Visualization Technologies: Practical Experience for Sharing

    ERIC Educational Resources Information Center

    Yeung, Yau-Yuen

    2004-01-01

    This paper presentation will report on how some science educators at the Science Department of The Hong Kong Institute of Education have successfully employed an array of innovative learning media such as three-dimensional (3D) and virtual reality (VR) technologies to create seven sets of resource kits, most of which are being placed on the…

  18. Virtual Reality.

    ERIC Educational Resources Information Center

    Newby, Gregory B.

    1993-01-01

    Discusses the current state of the art in virtual reality (VR), its historical background, and future possibilities. Highlights include applications in medicine, art and entertainment, science, business, and telerobotics; and VR for information science, including graphical display of bibliographic data, libraries and books, and cyberspace.…

  19. 3D chromosome rendering from Hi-C data using virtual reality

    NASA Astrophysics Data System (ADS)

    Zhu, Yixin; Selvaraj, Siddarth; Weber, Philip; Fang, Jennifer; Schulze, Jürgen P.; Ren, Bing

    2015-01-01

    Most genome browsers display DNA linearly, using single-dimensional depictions that are useful to examine certain epigenetic mechanisms such as DNA methylation. However, these representations are insufficient to visualize intrachromosomal interactions and relationships between distal genome features. Relationships between DNA regions may be difficult to decipher or missed entirely if those regions are distant in one dimension but could be spatially proximal when mapped to three-dimensional space. For example, the visualization of enhancers folding over genes is only fully expressed in three-dimensional space. Thus, to accurately understand DNA behavior during gene expression, a means to model chromosomes is essential. Using coordinates generated from Hi-C interaction frequency data, we have created interactive 3D models of whole chromosome structures and their respective domains. We have also rendered information on genomic features such as genes, CTCF binding sites, and enhancers. The goal of this article is to present the procedure, findings, and conclusions of our models and renderings.

  20. Visualization and Interpretation in 3D Virtual Reality of Topographic and Geophysical Data from the Chicxulub Impact Crater

    NASA Astrophysics Data System (ADS)

    Rosen, J.; Kinsland, G. L.; Borst, C.

    2011-12-01

    We have assembled Shuttle Radar Topography Mission (SRTM) data (Borst and Kinsland, 2005), gravity data (Bedard, 1977), horizontal gravity gradient data (Hildebrand et al., 1995), magnetic data (Pilkington et al., 2000) and GPS topography data (Borst and Kinsland, 2005) from the Chicxulub Impact Crater buried on the Yucatan Peninsula of Mexico. These data sets are imaged as gridded surfaces and are all georegistered within an interactive 3D virtual reality (3DVR) visualization and interpretation system created and maintained in the Center for Advanced Computer Studies at the University of Louisiana at Lafayette. We are able to view and interpret the data sets individually or together, and to scale and move the data or move our physical head position so as to achieve the best viewing perspective for interpretation. A feature that is especially valuable for understanding the relationships between the various data sets is our ability to "interlace" the 3D images. "Interlacing" is a technique we have developed whereby the data surfaces are moved along a common axis so that they interpenetrate. This technique leads to rapid and positive identification of spatially corresponding features in the various data sets. We present several images from the 3D system that demonstrate spatial relationships amongst the features in the data sets. Some of the anomalies in gravity are very nearly coincident with anomalies in the magnetic data, as one might suspect if the causal bodies are the same. Other gravity and magnetic anomalies are not spatially coincident, indicating different causal bodies. Topographic anomalies display a strong spatial correspondence with many gravity anomalies. In some cases small gravity anomalies and topographic valleys are caused by shallow dissolution within the Tertiary cover along faults or fractures propagated upward from the buried structure. In other cases the sources of the gravity anomalies are in the more deeply buried structure from which

  1. Transforming Clinical Imaging and 3D Data for Virtual Reality Learning Objects: HTML5 and Mobile Devices Implementation

    ERIC Educational Resources Information Center

    Trelease, Robert B.; Nieder, Gary L.

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android…

  2. The use of a low-cost visible light 3D scanner to create virtual reality environment models of actors and objects

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    A low-cost 3D scanner has been developed with a parts cost of approximately US$5,000. This scanner uses visible light sensing to capture structural as well as texture and color data of a subject. This paper discusses the use of this type of scanner to create 3D models for incorporation into a virtual reality environment. It describes the basic scanning process (which takes under a minute for a single scan), which can be repeated to collect multiple positions, if needed, for actor model creation. The efficacy of visible light versus other scanner types is also discussed.

  3. On the Usability and Usefulness of 3d (geo)visualizations - a Focus on Virtual Reality Environments

    NASA Astrophysics Data System (ADS)

    Çöltekin, A.; Lokka, I.; Zahner, M.

    2016-06-01

    Whether and when we should show data in 3D is an ongoing debate in communities conducting visualization research. A strong opposition exists in the information visualization (Infovis) community, where seemingly unnecessary or unwarranted use of 3D, e.g., in plots, bar or pie charts, is heavily criticized. The scientific visualization (Scivis) community, on the other hand, is more supportive of the use of 3D, as it allows `seeing' invisible phenomena, or designing and printing things that are used in, e.g., surgeries and educational settings. Geographic visualization (Geovis) stands between the Infovis and Scivis communities. In geographic information science, most visuo-spatial analyses have been sufficiently conducted in 2D or 2.5D, including analyses related to terrain and much of the urban phenomena. On the other hand, there has always been a strong interest in 3D, with motivations similar to those in the Scivis community. Among many types of 3D visualizations, a popular one that is exploited both for visual analysis and visualization is the highly realistic (geo)virtual environment. Such environments may be engaging and memorable for viewers because they offer highly immersive experiences. However, it is not yet well established whether we should opt to show data in 3D; and if yes, a) what type of 3D we should use, b) for what task types, and c) for whom. In this paper, we identify some of the central arguments for and against the use of 3D visualizations around these three considerations in a concise interdisciplinary literature review.

  4. Quality of Grasping and the Role of Haptics in a 3-D Immersive Virtual Reality Environment in Individuals With Stroke.

    PubMed

    Levin, Mindy F; Magdalon, Eliane C; Michaelsen, Stella M; Quevedo, Antonio A F

    2015-11-01

    Reaching and grasping parameters with and without haptic feedback were characterized in people with chronic stroke. Twelve individuals (67 ± 10 years) with chronic stroke and arm/hand paresis (Fugl-Meyer Assessment-Arm: ≥ 46/66 pts) participated. Three-dimensional (3-D) temporal and spatial kinematics of reaching and grasping movements to three objects (can: cylindrical grasp; screwdriver: power grasp; pen: precision grasp) were recorded in a physical environment (PE), with and without additional haptic feedback, and in a 3-D virtual environment (VE) with haptic feedback. Participants reached, grasped, and transported physical and virtual objects using similar movement strategies in all conditions. Reaches made in the VE were less smooth and slower than in the PE. Arm and trunk kinematics were similar in both environments and glove conditions. For grasping, stroke subjects preserved aperture scaling to object size but used wider hand apertures, with longer delays between the times of maximal reaching velocity and maximal grasp aperture. Wearing the glove decreased reaching velocity. Our results in a small group of subjects suggest that providing haptic information in the VE did not affect the validity of reaching and grasping movements. Small disparities in movement parameters between environments may be due to differences in the perception of object distance in the VE. Reach-to-grasp kinematics for smaller objects may be improved by better 3-D rendering. The comparable kinematics between environments and conditions are encouraging for the incorporation of high-quality VEs in rehabilitation programs aimed at improving upper limb recovery. PMID:25594971

  5. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurements of the maximum reach of occupants of a microgravity environment provide knowledge about optimum functional placement for tasking situations. Calculations for a full-body functional reach envelope for microgravity environments are therefore imperative. To this end, three-dimensional computer-modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full-body functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  6. 3D augmented reality with integral imaging display

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-06-01

    In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be transferred into the identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for augmented reality display is generated. The newly generated elemental images contain both the virtual objects and real world scene with desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.

  7. Transforming clinical imaging and 3D data for virtual reality learning objects: HTML5 and mobile devices implementation.

    PubMed

    Trelease, Robert B; Nieder, Gary L

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android tablets. This article describes complementary methods for creating comparable, multiplatform VR learning objects in the new HTML5 standard format, circumventing platform-specific limitations imposed by the QuickTime VR multimedia file format. Multiple types or "dimensions" of anatomical information can be embedded in such learning objects, supporting different kinds of online learning applications, including interactive atlases, examination questions, and complex, multi-structure presentations. Such HTML5 VR learning objects are usable on new mobile devices that do not support QuickTime VR, as well as on personal computers. Furthermore, HTML5 VR learning objects can be embedded in "ebook" document files, supporting the development of new types of electronic textbooks on mobile devices that are increasingly popular and self-adopted for mobile learning. PMID:23212750

  8. 3D Virtual Reality Applied in Tectonic Geomorphic Study of the Gombori Range of Greater Caucasus Mountains

    NASA Astrophysics Data System (ADS)

    Sukhishvili, Lasha; Javakhishvili, Zurab

    2016-04-01

    The Gombori Range forms the southern part of the young Greater Caucasus Mountains and stretches from NW to SE, separating the Alazani and Iori basins within the eastern Georgian province of Kakheti. The active phase of the Caucasian orogeny started in the Pliocene, but according to the alluvial sediments of the Gombori Range (mapped in the Soviet geologic map), its uplift appears to be a Quaternary event. The highest peak of the Gombori Range has an absolute elevation of 1991 m, while the neighboring Alazani valley reaches only 400 m. We assume the range has a very fast uplift rate, which could have reversed stream flow directions in the Quaternary. To check these preliminary assumptions we will use tectonic, fluvial geomorphic, and stratigraphic approaches, including paleocurrent analyses and various affordable absolute dating techniques, to detect and date evidence of river course reversals. For these purposes we have selected the river Turdo outcrop. The river flows northwards from the Gombori Range and, near the region's main city of Telavi, generates a 30-40 m high continuous outcrop along a 1 km section. The Turdo outcrop has very steep walls and requires special climbing skills to work on. The goal of this particular study is to avoid the time- and resource-consuming ground survey of this steep, high, and wide outcrop, and to test 3D aerial and ground-based photogrammetric modelling and analysis approaches in the initial stage of the tectonic geomorphic study. Using this type of remote sensing and virtual-lab analysis of the 3D outcrop model, we roughly delineated stratigraphic layers, selected exact locations for applying various research techniques, and planned safe and suitable climbing routes to the investigation sites.

  9. Virtual Reality as Metaphor.

    ERIC Educational Resources Information Center

    Gozzi, Raymond, Jr.

    1996-01-01

    Suggests that virtual reality technology has become popular because it is a miniaturization, a model, of something that already exists. Compares virtual reality to the news media, which centers on the gory, the sensational, and the distorted. (PA)

  10. Computer Vision Assisted Virtual Reality Calibration

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1999-01-01

    A computer vision assisted semi-automatic virtual reality (VR) calibration technology has been developed that can accurately match a virtual environment of graphically simulated three-dimensional (3-D) models to the video images of the real task environment.

  11. Virtual Reality: An Overview.

    ERIC Educational Resources Information Center

    Franchi, Jorge

    1994-01-01

    Highlights of this overview of virtual reality include optics; interface devices; virtual worlds; potential applications, including medicine and archaeology; problems, including costs; current research and development; future possibilities; and a listing of vendors and suppliers of virtual reality products. (Contains 11 references.) (LRW)

  12. Robotics and virtual reality: the development of a life-sized 3-D system for the rehabilitation of motor function.

    PubMed

    Patton, J L; Dawe, G; Scharver, C; Mussa-Ivaldi, F A; Kenyon, R

    2004-01-01

    We have been developing and combining state-of-the-art devices that allow humans to visualize and feel synthetic objects superimposed on the real world. This effort stems from the need for a platform to extend experiments on motor control and learning to realistic human motor tasks and environments not currently represented in research practice. This paper's goal is to outline our motivations, progress, and objectives. Because the system is a general tool, we also hope to motivate researchers in related fields to join in. The platform under development, an augmented reality system combined with a haptic-interface robot, will be a new tool for contributing to the scientific knowledge base in the areas of human movement control and rehabilitation robotics. Because this is a prototype, the system will also guide new methods by probing the levels of quality necessary for future design cycles and related technology. Inevitably, it should also lead the way to commercialization of such systems. PMID:17271395

  13. Virtual Reality Enhanced Instructional Learning

    ERIC Educational Resources Information Center

    Nachimuthu, K.; Vijayakumari, G.

    2009-01-01

    Virtual Reality (VR) is the creation of a virtual 3D world in which one can feel and sense the world as if it were real. It allows engineers to design machines and educationists to design AV [audiovisual] equipment in real time, in a 3-dimensional hologram, as if the actual material were being made and worked upon. VR allows a least-cost (energy…

  14. New weather depiction technology for night vision goggle (NVG) training: 3D virtual/augmented reality scene-weather-atmosphere-target simulation

    NASA Astrophysics Data System (ADS)

    Folaron, Michelle; Deacutis, Martin; Hegarty, Jennifer; Vollmerhausen, Richard; Schroeder, John; Colby, Frank P.

    2007-04-01

    US Navy and Marine Corps pilots receive Night Vision Goggle (NVG) training as part of their overall training to maintain the superiority of our forces. This training must incorporate realistic targets, backgrounds, and representative atmospheric and weather effects they may encounter under operational conditions. One approach to pilot NVG training is the Night Imaging and Threat Evaluation Laboratory (NITE Lab) concept. The NITE Labs utilize a 10' by 10' static terrain model equipped with both natural and cultural lighting, used to demonstrate various illumination conditions and visual phenomena that might be experienced when utilizing night vision goggles. With this technology, the military can safely, systematically, and reliably expose pilots to the large number of potentially dangerous environmental conditions that will be experienced in their NVG training flights. A previous SPIE presentation described our work for NAVAIR to add realistic atmospheric and weather effects to the NVG NITE Lab training facility using the NVG-WDT (Weather Depiction Technology) system (Colby et al.). NVG-WDT consists of a high-end multiprocessor server with weather simulation software and several fixed and goggle-mounted Heads-Up Displays (HUDs). Atmospheric and weather effects are simulated using state-of-the-art computer codes such as the WRF (Weather Research & Forecasting) model and the US Air Force Research Laboratory MODTRAN radiative transport model. Imagery for a variety of natural and man-made obscurations (e.g., rain, clouds, snow, dust, smoke, chemical releases) is calculated and injected into the scene observed through the NVG via the fixed and goggle-mounted HUDs. This paper expands on the work described in the previous presentation and describes the 3D Virtual/Augmented Reality Scene-Weather-Atmosphere-Target Simulation part of the NVG-WDT. The 3D virtual reality software is a complete simulation system to generate realistic

  15. Virtual sound for virtual reality

    SciTech Connect

    Blattner, M.M.; Papp, A.L., III

    1993-02-01

    The computational limitations of real-time interactive computing do not meet our requirements for producing realistic images for virtual reality in a convincing manner. Regardless of the real-time restrictions on virtual reality interfaces, the representations can be no better than the graphics. Computer graphics is still limited in its ability to generate complex objects such as landscapes and humans. Nevertheless, useful and convincing visualizations are made through a variety of techniques. The central theme of this article is that a similar situation holds for sound in virtual reality. It is beyond our ability to create interactive soundscapes that faithfully reproduce real-world sounds; however, by choosing one's application carefully and using sound to enhance a display rather than only mimic real-world scenes, very effective use of sound can be made.

  17. Virtual reality via photogrammetry

    NASA Astrophysics Data System (ADS)

    Zahrt, John D.; Papcun, George; Childers, Randy A.; Rubin, Naama

    1996-03-01

    We wish to walk into a photograph just as Alice walked into the looking glass. From a mathematical perspective, this problem is exceedingly ill-posed (e.g., is that a large, distant object or a small, nearby object?). A human expert can supply a large amount of a priori information that can function as mathematical constraints. The constrained problem can then be attacked with photogrammetry to obtain a great deal of quantitative information which is otherwise only qualitatively apparent. The user determines whether the object to be analyzed contains two or three vanishing points, then selects an appropriate number of points from the photograph to enable the code to compute the locations of the vanishing points. Using this information and standard photogrammetric geometric algorithms, the location of the camera relative to the structure is determined. The user must also enter information regarding an absolute sense of scale. As the vectors from the camera to the various points chosen from the photograph are determined, the vector components (coordinates) are handed to a virtual reality software package. Once the objects are entered, the appropriate surfaces of the 3D object are `wallpapered' with the surface from the photograph. The user is then able to move through the virtual scene. A video will demonstrate our work.
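
    The vanishing-point step described in this abstract rests on a standard projective-geometry fact: image lines that are projections of parallel scene edges intersect at a common vanishing point. A minimal sketch in Python (an illustration of the geometry, not the authors' code; the line endpoints are hypothetical) intersects two such image lines in homogeneous coordinates:

```python
import numpy as np

def vanishing_point(line1, line2):
    """Intersect two image lines, each given as two (x, y) points.

    Lines and points are lifted to homogeneous coordinates, where the
    line through two points and the intersection of two lines are both
    cross products.
    """
    def homog_line(p, q):
        # Homogeneous line through image points p and q.
        return np.cross([*p, 1.0], [*q, 1.0])

    vp = np.cross(homog_line(*line1), homog_line(*line2))
    return vp[:2] / vp[2]  # back to Cartesian image coordinates
```

    For example, the lines through (0, 0)-(1, 1) and (0, 2)-(2, 2) meet at the image point (2, 2); with real photographs, the two lines would be hand-picked edges of the same parallel scene structure.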

  18. Virtual Reality Lab Assistant

    NASA Technical Reports Server (NTRS)

    Saha, Hrishikesh; Palmer, Timothy A.

    1996-01-01

    The Virtual Reality Lab Assistant (VRLA) is a demonstration model designed for engineering and materials science experiments to be performed by undergraduate and graduate students as a pre-lab simulation experience. It helps students get a preview of how to use the lab equipment and run experiments without using the actual lab hardware and software. The quality of the time available for laboratory experiments can be significantly improved through the use of virtual reality technology.

  19. Learning in Virtual Reality.

    ERIC Educational Resources Information Center

    Bricken, William

    The essence of the computer revolution is yet to come, for computers are essentially generators of realities. Virtual reality (VR) is the next step in the evolutionary path; the user is placed inside the image and becomes a participant within the computational space. A VR computer generates a direct experience of the computational environment. The…

  20. Inertial Motion-Tracking Technology for Virtual 3-D

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In the 1990s, NASA pioneered virtual reality research. The concept was present long before, but, prior to this, the technology did not exist to make a viable virtual reality system. Scientists had theories and ideas, and they knew that the concept had potential, but the computers of the 1970s and 1980s were not fast enough, sensors were heavy and cumbersome, and people had difficulty blending fluidly with the machines. Scientists at Ames Research Center built upon the research of previous decades and put the necessary technology behind it, making the theories of virtual reality a reality. Virtual reality systems depend on complex motion-tracking sensors to convey information between the user and the computer to give the user the feeling that he is operating in the real world. These motion-tracking sensors measure and report an object's position and orientation as it changes. A simple example of motion tracking would be the cursor on a computer screen moving in correspondence to the shifting of the mouse. Tracking in 3-D, necessary to create virtual reality, however, is much more complex. To be successful, the perspective of the virtual image seen on the computer must be an accurate representation of what is seen in the real world. As the user's head or camera moves, turns, or tilts, the computer-generated environment must change accordingly with no noticeable lag, jitter, or distortion. Historically, the lack of smooth and rapid tracking of the user's motion has thwarted the widespread use of immersive 3-D computer graphics. NASA uses virtual reality technology for a variety of purposes, mostly training of astronauts. The actual missions are costly and dangerous, so any opportunity the crews have to practice their maneuvering in accurate situations before the mission is valuable and instructive. For that purpose, NASA has funded a great deal of virtual reality research, and benefited from the results.

  1. Spacecraft 3D Augmented Reality Mobile App

    NASA Technical Reports Server (NTRS)

    Hussey, Kevin J.; Doronila, Paul R.; Kumanchik, Brian E.; Chan, Evan G.; Ellison, Douglas J.; Boeck, Andrea; Moore, Justin M.

    2013-01-01

    The Spacecraft 3D application allows users to learn about and interact with iconic NASA missions in a new and immersive way using common mobile devices. Using Augmented Reality (AR) techniques to project 3D renditions of the mission spacecraft into real-world surroundings, users can interact with and learn about Curiosity, GRAIL, Cassini, and Voyager. Additional updates on future missions, animations, and information will be ongoing. Using a printed AR Target and camera on a mobile device, users can get up close with these robotic explorers, see how some move, and learn about these engineering feats, which are used to expand knowledge and understanding about space. The software receives input from the mobile device's camera to recognize the presence of an AR marker in the camera's field of view. It then displays a 3D rendition of the selected spacecraft in the user's physical surroundings, on the mobile device's screen, while it tracks the device's movement in relation to the physical position of the spacecraft's 3D image on the AR marker.
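
    The overlay step this abstract describes — drawing a 3-D spacecraft model so it appears anchored to a physical AR marker — ultimately reduces to projecting model points through the estimated camera pose. A minimal pinhole-projection sketch (an illustration of the principle, not the app's code; the intrinsics and pose values below are invented):

```python
import numpy as np

def project_to_screen(point_3d, R, t, K):
    """Project a 3-D model point into pixel coordinates.

    R, t: rotation and translation taking marker coordinates into
    camera coordinates (the pose the tracker estimates each frame).
    K: 3x3 camera intrinsic matrix.
    """
    p_cam = R @ np.asarray(point_3d, float) + t  # marker frame -> camera frame
    p_img = K @ p_cam                            # perspective projection
    return p_img[:2] / p_img[2]                  # divide out depth

# Hypothetical example: camera looking straight at a marker 2 m away.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
uv = project_to_screen([0.0, 0.0, 0.0], np.eye(3), np.array([0.0, 0.0, 2.0]), K)
# A point at the marker origin lands on the principal point: (320, 240).
```

    Re-running this projection for every model vertex each frame, with R and t updated from the marker detection, is what keeps the rendered spacecraft registered to the printed target as the device moves.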

  2. Virtual Reality and the Virtual Library.

    ERIC Educational Resources Information Center

    Oppenheim, Charles

    1993-01-01

    Explains virtual reality, including proper and improper uses of the term, and suggests ways that libraries might be affected by it. Highlights include elements of virtual reality systems; possible virtual reality applications, including architecture, the chemical industry, transport planning, armed forces, and entertainment; and the virtual…

  3. Identifying Virtual 3D Geometric Shapes with a Vibrotactile Glove.

    PubMed

    Martínez, Jonatan; García, Arturo; Oliver, Miguel; Molina, José Pascual; González, Pascual

    2016-01-01

    The emergence of off-screen interaction devices is bringing the field of virtual reality to a broad range of applications where virtual objects can be manipulated without the use of traditional peripherals. However, to facilitate object interaction, other stimuli such as haptic feedback are necessary to improve the user experience. To enable the identification of virtual 3D objects without visual feedback, a haptic display based on a vibrotactile glove and multiple points of contact gives users an enhanced sensation of touching a virtual object with their hands. Experimental results demonstrate the capacity of this technology in practical applications. PMID:25137722

  4. A Comparative Analysis of 2D and 3D Tasks for Virtual Reality Therapies Based on Robotic-Assisted Neurorehabilitation for Post-stroke Patients.

    PubMed

    Lledó, Luis D; Díez, Jorge A; Bertomeu-Motos, Arturo; Ezquerro, Santiago; Badesa, Francisco J; Sabater-Navarro, José M; García-Aracil, Nicolás

    2016-01-01

    Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to create a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. Alternatively, 2D virtual environments are used to represent the tasks with a low degree of realism using bidimensional graphics techniques. However, the type of visualization can influence the quality of perception of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in patterns of kinematic movements when post-stroke patients performed a reaching task viewing a virtual therapeutic game with two different types of visualization of the virtual environment: 2D and 3D. Nine post-stroke patients participated in the study, receiving a virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aim of the tasks, which consisted in reaching peripheral or perspective targets depending on the virtual environment shown. Various parameters, such as maximum speed, reaction time, path length, and initial movement, were analyzed from the data acquired objectively by the robotic device to evaluate the influence of the task visualization. At the end of the study, a usability survey was provided to each patient to analyze his/her satisfaction level. For all patients, the movement trajectories were enhanced when they completed the therapy. This fact suggests that the patients' motor recovery increased. Despite the similarity in the majority of the kinematic parameters, differences in reaction time and path length were higher using the 3D task. Regarding the success rates

  5. A Comparative Analysis of 2D and 3D Tasks for Virtual Reality Therapies Based on Robotic-Assisted Neurorehabilitation for Post-stroke Patients

    PubMed Central

    Lledó, Luis D.; Díez, Jorge A.; Bertomeu-Motos, Arturo; Ezquerro, Santiago; Badesa, Francisco J.; Sabater-Navarro, José M.; García-Aracil, Nicolás

    2016-01-01

    Post-stroke neurorehabilitation based on virtual therapies is performed by completing repetitive exercises shown on visual electronic devices, whose content represents imaginary or daily-life tasks. Currently, there are two ways of visualizing these tasks. 3D virtual environments are used to create a three-dimensional space that represents the real world with a high level of detail, whose realism is determined by the resolution and fidelity of the objects in the task. Alternatively, 2D virtual environments are used to represent the tasks with a low degree of realism using bidimensional graphics techniques. However, the type of visualization can influence the quality of perception of the task, affecting the patient's sensorimotor performance. The purpose of this paper was to evaluate whether there were differences in patterns of kinematic movements when post-stroke patients performed a reaching task viewing a virtual therapeutic game with two different types of visualization of the virtual environment: 2D and 3D. Nine post-stroke patients participated in the study, receiving a virtual therapy assisted by the PUPArm rehabilitation robot. Horizontal movements of the upper limb were performed to complete the aim of the tasks, which consisted in reaching peripheral or perspective targets depending on the virtual environment shown. Various parameters, such as maximum speed, reaction time, path length, and initial movement, were analyzed from the data acquired objectively by the robotic device to evaluate the influence of the task visualization. At the end of the study, a usability survey was provided to each patient to analyze his/her satisfaction level. For all patients, the movement trajectories were enhanced when they completed the therapy. This fact suggests that the patients' motor recovery increased. Despite the similarity in the majority of the kinematic parameters, differences in reaction time and path length were higher using the 3D task. Regarding the success rates

  6. Virtual reality for emergency training

    SciTech Connect

    Altinkemer, K.

    1995-12-31

    Virtual reality is a sequence of scenes generated by a computer as a response to the five different senses. These senses are sight, sound, taste, touch, and smell. Other senses that can be used in virtual reality include balance, pheromonal, and immunological senses. Application areas include leisure and entertainment, medicine, architecture, engineering, manufacturing, and training. Virtual reality is especially important when it is used for emergency training and management of natural disasters including earthquakes, floods, tornadoes, and other situations which are hard to emulate. Classical training methods for these extraordinary environments lack the realistic surroundings that virtual reality can provide. In order for virtual reality to be a successful training tool, the design needs to address certain aspects, such as how real virtual reality should be and how much fixed cost is entailed in setting up the virtual reality trainer. There are also pricing questions regarding the price per training session on the virtual reality trainer, and the appropriate training time length(s).

  7. Virtual Reality Calibration for Telerobotic Servicing

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1994-01-01

    A virtual reality calibration technique of matching a virtual environment of simulated graphics models in 3-D geometry and perspective with actual camera views of the remote site task environment has been developed to enable high-fidelity preview/predictive displays with calibrated graphics overlay on live video.

  8. Magical Stories: Blending Virtual Reality and Artificial Intelligence.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    Artificial intelligence (AI) techniques and virtual reality (VR) make possible powerful interactive stories, and this paper focuses on examples of virtual characters in three dimensional (3-D) worlds. Waldern, a virtual reality game designer, has theorized about and implemented software design of virtual teammates and opponents that incorporate AI…

  9. Virtual reality systems

    NASA Technical Reports Server (NTRS)

    Johnson, David W.

    1992-01-01

    Virtual realities are a type of human-computer interface (HCI) and as such may be understood from a historical perspective. In the earliest era, the computer was a very simple, straightforward machine. Interaction was human manipulation of an inanimate object, little more than the provision of an explicit instruction set to be carried out without deviation. In short, control resided with the user. In the second era of HCI, some level of intelligence and control was imparted to the system to enable a dialogue with the user. Simple context sensitive help systems are early examples, while more sophisticated expert system designs typify this era. Control was shared more equally. In this, the third era of the HCI, the constructed system emulates a particular environment, constructed with rules and knowledge about 'reality'. Control is, in part, outside the realm of the human-computer dialogue. Virtual reality systems are discussed.

  10. Constructing Meaning with Virtual Reality.

    ERIC Educational Resources Information Center

    Iaonnou-Georgiou, Sophie

    2002-01-01

    Presents a constructivist rationale for introducing virtual reality in language learning and teaching and describes various virtual reality environments that are available. Ways of implementing constructivist learning through virtual reality are suggested as well as basic guidelines for successful implementation in the classroom. (Author/VWL)

  11. Novel interactive virtual showcase based on 3D multitouch technology

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Liu, Yue; Lu, You; Wang, Yongtian

    2009-11-01

    A new interactive virtual showcase is proposed in this paper. With the help of virtual reality technology, the user of the proposed system can watch virtual objects floating in the air from all four sides and interact with them by touching the four surfaces of the virtual showcase. Unlike traditional multi-touch systems, this system can not only realize multi-touch on a plane to implement 2D translation, 2D scaling, and 2D rotation of the objects; it can also realize 3D interaction with the virtual objects by recognizing and analyzing the multi-touch input captured simultaneously from the four planes. Experimental results show the potential of the proposed system to be applied in the exhibition of historical relics and other precious goods.

  12. Virtual Representations in 3D Learning Environments

    ERIC Educational Resources Information Center

    Shonfeld, Miri; Kritz, Miki

    2013-01-01

    This research explores the extent to which virtual worlds can serve as online collaborative learning environments for students by increasing social presence and engagement. 3D environments enable learning, which simulates face-to-face encounters while retaining the advantages of online learning. Students in Education departments created avatars…

  13. Transparent 3D display for augmented reality

    NASA Astrophysics Data System (ADS)

    Lee, Byoungho; Hong, Jisoo

    2012-11-01

    Two types of transparent three-dimensional display systems applicable to augmented reality are demonstrated. One of them is a head-mounted-display-type implementation which utilizes the principle of a system adopting a concave floating lens for virtual-mode integral imaging. Such a configuration has the advantage that the three-dimensional image can be displayed at a sufficiently far distance, resolving the accommodation conflict with the real-world scene. Incorporating a convex half mirror, which shows partial transparency, instead of the concave floating lens makes it possible to implement the transparent three-dimensional display system. The other type is the projection-type implementation, which is more appropriate for general use than the head-mounted-display-type implementation. Its imaging principle is based on the well-known reflection-type integral imaging. We realize the feature of transparent display by imposing partial transparency on the array of concave mirrors which is used for the screen of reflection-type integral imaging. Two types of configurations, relying on incoherent and coherent light sources, are both possible. For the incoherent configuration, we introduce the concave half-mirror array, whereas the coherent one adopts a holographic optical element which replicates the functionality of the lenslet array. Though the projection-type implementation is in principle more beneficial than the head-mounted display, the present state of the art of the spatial light modulator still does not provide satisfactory visual quality of the displayed three-dimensional image. Hence we expect that the head-mounted-display-type and projection-type implementations will come to market in sequence.

  14. Virtual reality at work

    NASA Technical Reports Server (NTRS)

    Brooks, Frederick P., Jr.

    1991-01-01

    The utility of virtual reality computer graphics in telepresence applications is not hard to grasp and promises to be great. When the virtual world is entirely synthetic, as opposed to real but remote, the utility is harder to establish. Vehicle simulators for aircraft, vessels, and motor vehicles are proving their worth every day. Entertainment applications such as Disney World's StarTours are technologically elegant, good fun, and economically viable. Nevertheless, some of us have no real desire to spend our lifework serving the entertainment craze of our sick culture; we want to see this exciting technology put to work in medicine and science. The topics covered include the following: testing a force display for scientific visualization -- molecular docking; and testing a head-mounted display for scientific and medical visualization.

  15. Virtual Reality, Combat, and Communication.

    ERIC Educational Resources Information Center

    Thrush, Emily Austin; Bodary, Michael

    2000-01-01

    Presents a brief examination of the evolution of virtual reality devices that illustrates how the development of this new medium is influenced by emerging technologies and by marketing pressures. Notes that understanding these influences may help prepare for the role of technical communicators in building virtual reality applications for education…

  16. [3D virtual endoscopy of heart].

    PubMed

    Du, Aan; Yang, Xin; Xue, Haihong; Yao, Liping; Sun, Kun

    2012-10-01

    In this paper, we present a virtual endoscopy (VE) system for the diagnosis of heart diseases, which proves efficient, affordable, and easy to popularize for viewing the interior of the heart. Dual-source CT (DSCT) data were used as the primary data in our system. The 3D structure of the virtual heart was reconstructed with 3D texture-mapping technology based on the graphics processing unit (GPU) and could be displayed dynamically in real time. During real-time display, we could not only observe the inside of the chambers of the heart but also examine it from new viewing angles using 3D data already clipped according to the doctor's wishes. For observation we used both an interactive mode and an automatic mode. In the automatic mode, we used Dijkstra's algorithm, which treats the 3D Euclidean distance as the weighting factor, to find the view path quickly, and then used the view path to calculate the four-chamber plane. PMID:23198444
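
    The view-path search this abstract describes can be sketched generically: Dijkstra's algorithm over a graph of 3-D points, with the Euclidean distance between connected points as the edge weight. A minimal Python illustration (the toy point graph is invented for the example, not the paper's heart data):

```python
import heapq
import math

def dijkstra_path(points, edges, start, goal):
    """Shortest path over a graph of 3-D points, weighting each edge
    by the Euclidean distance between its endpoints."""
    adj = {i: [] for i in range(len(points))}
    for u, v in edges:                 # undirected graph
        adj[u].append(v)
        adj[v].append(u)
    best = {start: 0.0}                # best known distance per node
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > best.get(u, math.inf):
            continue                   # stale queue entry
        for v in adj[u]:
            nd = d + math.dist(points[u], points[v])
            if nd < best.get(v, math.inf):
                best[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [goal]                      # walk predecessors back to start
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], best[goal]

# Toy graph: the straight chain 0-1-2 (length 2.0) beats the detour via 3.
pts = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (0, 1, 0)]
path, length = dijkstra_path(pts, [(0, 1), (1, 2), (0, 3), (3, 2)], 0, 2)
```

    In a VE system the nodes would be candidate camera positions along the vessel or chamber centerline, and the resulting path gives the automatic fly-through trajectory.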

  17. Engineering applications of virtual reality

    NASA Astrophysics Data System (ADS)

    Smith, James R.; Grimes, Robert V.; Plant, Tony A.

    1996-04-01

    This paper addresses some of the practical applications, advantages and difficulties associated with the engineering applications of virtual reality. The paper tracks actual investigative work in progress on this subject at the BNR research lab in RTP, NC. This work attempts to demonstrate the actual value added to the engineering process by using existing 3D CAD data for interactive information navigation and evaluation of design concepts and products. Specifically, the work includes translation of Parametric Technology's Pro/ENGINEER models into a virtual world to evaluate potential attributes such as multiple concept exploration and product installation assessment. Other work discussed in this paper includes extensive evaluation of two new tools, VRML and SGI's/Template Graphics' WebSpace, for navigation through Pro/ENGINEER models with links to supporting technical documentation and data. The benefits of using these tools for 3D interactive navigation and exploration throughout three key phases of the physical design process are discussed in depth. The three phases are Design Concept Development, Product Design Evaluation and Product Design Networking. The predicted values added include reduced time to `concept ready', reduced prototype iterations, increased `design readiness' and shorter manufacturing introduction cycles.

  18. Virtual VMASC: A 3D Game Environment

    NASA Technical Reports Server (NTRS)

    Manepalli, Suchitra; Shen, Yuzhong; Garcia, Hector M.; Lawsure, Kaleen

    2010-01-01

    The advantages of creating interactive 3D simulations that allow viewing, exploring, and interacting with land improvements, such as buildings, in digital form are manifold and range from allowing individuals from anywhere in the world to explore those virtual land improvements online, to training military personnel in dealing with war-time environments, and to making those land improvements available in virtual worlds such as Second Life. While we haven't fully explored the true potential of such simulations, we have identified a requirement within our organization to use simulations like those to replace our front-desk personnel and allow visitors to query, navigate, and communicate virtually with various entities within the building. We implemented the Virtual VMASC 3D simulation of the Virginia Modeling Analysis and Simulation Center (VMASC) office building to not only meet our front-desk requirement but also to evaluate the effort required in designing such a simulation and, thereby, leverage the experience we gained in future projects of this kind. This paper describes the goals we set for our implementation, the software approach taken, the modeling contribution made, and the technologies used such as XNA Game Studio, .NET framework, Autodesk software packages, and, finally, the applicability of our implementation on a variety of architectures including Xbox 360 and PC. This paper also summarizes the result of our evaluation and the lessons learned from our effort.

  19. Surgery applications of virtual reality

    NASA Technical Reports Server (NTRS)

    Rosen, Joseph

    1994-01-01

    Virtual reality is a computer-generated technology which allows information to be displayed in a simulated, but lifelike, environment. In this simulated 'world', users can move and interact as if they were actually a part of that world. This new technology will be useful in many different fields, including the field of surgery. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and perform surgical procedures (telesurgery), and predict the outcomes of surgery. The authors of this paper describe the basic components of a virtual reality surgical system. These components include: the virtual world, the virtual tools, the anatomical model, the software platform, the host computer, the interface, and the head-coupled display. They also review the progress towards using virtual reality for surgical training, planning, telesurgery, and predicting outcomes. Finally, the authors present a training system being developed for the practice of new procedures in abdominal surgery.

  20. Augmented Virtual Reality Laboratory

    NASA Technical Reports Server (NTRS)

    Tully-Hanson, Benjamin

    2015-01-01

    Real-time motion-tracking hardware has for the most part been cost-prohibitive for research to take place regularly until recently. With the release of the Microsoft Kinect in November 2010, researchers now have access to a device that, for a few hundred dollars, is capable of providing red-green-blue (RGB), depth, and skeleton data. It is also capable of tracking multiple people in real time. For its original intended purpose, i.e., gaming with the Xbox 360 and eventually the Xbox One, it performs quite well. However, researchers soon found that although the sensor is versatile, it has limitations in real-world applications. I was brought aboard this summer by William Little in the Augmented Virtual Reality (AVR) Lab at Kennedy Space Center to find solutions to these limitations.

  1. Overestimation of heights in virtual reality is influenced more by perceived distal size than by the 2-D versus 3-D dimensionality of the display

    NASA Technical Reports Server (NTRS)

    Dixon, Melissa W.; Proffitt, Dennis R.; Kaiser, M. K. (Principal Investigator)

    2002-01-01

    One important aspect of the pictorial representation of a scene is the depiction of object proportions. Yang, Dixon, and Proffitt (1999 Perception 28 445-467) recently reported that the magnitude of the vertical-horizontal illusion was greater for vertical extents presented in three-dimensional (3-D) environments compared to two-dimensional (2-D) displays. However, because all of the 3-D environments were large and all of the 2-D displays were small, the question remains whether the observed magnitude differences were due solely to the dimensionality of the displays (2-D versus 3-D) or to the perceived distal size of the extents (small versus large). We investigated this question by comparing observers' judgments of vertical relative to horizontal extents on a large but 2-D display compared to the large 3-D and the small 2-D displays used by Yang et al (1999). The results confirmed that the magnitude differences for vertical overestimation between display media are influenced more by the perceived distal object size rather than by the dimensionality of the display.

  2. Virtual Reality: The Promise of the Future.

    ERIC Educational Resources Information Center

    Lanier, Jaron

    1992-01-01

    Defines virtual reality and describes the equipment or clothing necessary to achieve the illusion of being in a virtual world. Recent developments with this technology and current virtual reality applications are discussed, including experiential prototyping, telepresence, and educational applications. (MES)

  3. Telemedicine, virtual reality, and surgery

    NASA Technical Reports Server (NTRS)

    Mccormack, Percival D.; Charles, Steve

    1994-01-01

    Two types of synthetic experience are covered: virtual reality (VR) and surgery, and telemedicine. The topics are presented in viewgraph form and include the following: geometric models; physiological sensors; surgical applications; virtual cadaver; VR surgical simulation; telesurgery; VR Surgical Trainer; abdominal surgery pilot study; advanced abdominal simulator; examples of telemedicine; and telemedicine spacebridge.

  4. A Collaborative Virtual Environment for Situated Language Learning Using VEC3D

    ERIC Educational Resources Information Center

    Shih, Ya-Chun; Yang, Mau-Tsuen

    2008-01-01

    A 3D virtually synchronous communication architecture for situated language learning has been designed to foster communicative competence among undergraduate students who have studied English as a foreign language (EFL). We present an innovative approach that offers better e-learning than the previous virtual reality educational applications. The…

  5. Virtual hand: a 3D tactile interface to virtual environments

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.; Borrel, Paul

    2008-02-01

    We introduce a novel system that allows users to experience the sensation of touch in a computer graphics environment. In this system, the user places his/her hand on an array of pins, which is moved about space on a 6 degree-of-freedom robot arm. The surface of the pins defines a surface in the virtual world. This "virtual hand" can move about the virtual world. When the virtual hand encounters an object in the virtual world, the heights of the pins are adjusted so that they represent the object's shape, surface, and texture. A control system integrates pin and robot arm motions to transmit information about objects in the computer graphics world to the user. It also allows the user to edit, change and move the virtual objects, shapes and textures. This system provides a general framework for touching, manipulating, and modifying objects in a 3-D computer graphics environment, which may be useful in a wide range of applications, including computer games, computer aided design systems, and immersive virtual worlds.
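
    The pin-array rendering this abstract describes can be sketched as a sampling problem: under each pin of the array, evaluate the virtual object's surface height and clamp it to the pins' travel range. A minimal illustration (the grid size, pin pitch, travel range, and surface function below are invented for the example, not taken from the paper):

```python
import numpy as np

def pin_heights(surface_fn, hand_xy, grid=(4, 4), pitch=0.01, max_h=0.02):
    """Heights for a pin array centred at hand_xy (metres), sampling a
    virtual surface z = f(x, y) and clamping to the pins' travel range."""
    cx, cy = hand_xy
    # Pin centre coordinates, laid out symmetrically about the hand.
    xs = cx + (np.arange(grid[0]) - (grid[0] - 1) / 2) * pitch
    ys = cy + (np.arange(grid[1]) - (grid[1] - 1) / 2) * pitch
    X, Y = np.meshgrid(xs, ys)
    return np.clip(surface_fn(X, Y), 0.0, max_h)

# Hypothetical virtual object: a shallow bump centred at the origin.
bump = lambda x, y: 0.015 - (x**2 + y**2)
heights = pin_heights(bump, (0.0, 0.0))
```

    Re-evaluating these heights as the robot arm moves the array through the virtual world, and adding texture terms to `surface_fn`, is the kind of update loop the control system described above would perform each frame.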

  6. Learning in 3-D Virtual Worlds: Rethinking Media Literacy

    ERIC Educational Resources Information Center

    Qian, Yufeng

    2008-01-01

    3-D virtual worlds, as a new form of learning environments in the 21st century, hold great potential in education. Learning in such environments, however, demands a broader spectrum of literacy skills. This article identifies a new set of media literacy skills required in 3-D virtual learning environments by reviewing exemplary 3-D virtual…

  7. Virtual Reality: The Future of Animated Virtual Instructor, the Technology and Its Emergence to a Productive E-Learning Environment.

    ERIC Educational Resources Information Center

    Jiman, Juhanita

    This paper discusses the use of Virtual Reality (VR) in e-learning environments where an intelligent three-dimensional (3D) virtual person plays the role of an instructor. With the existence of this virtual instructor, it is hoped that the teaching and learning in the e-environment will be more effective and productive. This virtual 3D animated…

  8. A specification of 3D manipulation in virtual environments

    NASA Technical Reports Server (NTRS)

    Su, S. Augustine; Furuta, Richard

    1994-01-01

    In this paper we discuss the modeling of three basic kinds of 3-D manipulations in the context of a logical hand device and our virtual panel architecture. The logical hand device is a useful software abstraction representing hands in virtual environments. The virtual panel architecture is the 3-D counterpart of 2-D window systems. Both abstractions are intended to form the foundation for adaptable 3-D manipulation.

  9. Molecular Rift: Virtual Reality for Drug Designers.

    PubMed

    Norrby, Magnus; Grebner, Christoph; Eriksson, Joakim; Boström, Jonas

    2015-11-23

    Recent advances in interaction design have created new ways to use computers. One example is the ability to create enhanced 3D environments that simulate physical presence in the real world--a virtual reality. This is relevant to drug discovery since molecular models are frequently used to obtain deeper understandings of, say, ligand-protein complexes. We have developed a tool (Molecular Rift), which creates a virtual reality environment steered with hand movements. Oculus Rift, a head-mounted display, is used to create the virtual settings. The program is controlled by gesture-recognition, using the gaming sensor MS Kinect v2, eliminating the need for standard input devices. The Open Babel toolkit was integrated to provide access to powerful cheminformatics functions. Molecular Rift was developed with a focus on usability, including iterative test-group evaluations. We conclude with reflections on virtual reality's future capabilities in chemistry and education. Molecular Rift is open source and can be downloaded from GitHub. PMID:26558887

  10. Virtual Reality: Ready or Not!

    ERIC Educational Resources Information Center

    Lewis, Joan E.

    1994-01-01

    Describes the development and current status of virtual reality (VR) and VR research. Market potentials for VR are discussed, including the entertainment industry, health care and medical training, flight and other simulators, and educational possibilities. A glossary of VR-related terms is included. (LRW)

  11. Embedding speech into virtual realities

    NASA Astrophysics Data System (ADS)

    Bohn, Christian-Arved; Krueger, Wolfgang

    1993-05-01

    In this work a speaker-independent speech recognition system is presented that is suitable for implementation in Virtual Reality applications. The use of an artificial neural network, in connection with a special compression of the acoustic input, leads to a system that is robust, fast, easy to use, and needs no additional hardware besides common VR equipment.

  12. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.

  13. Embedding speech into virtual realities

    NASA Technical Reports Server (NTRS)

    Bohn, Christian-Arved; Krueger, Wolfgang

    1993-01-01

    In this work a speaker-independent speech recognition system is presented that is suitable for implementation in Virtual Reality applications. The use of an artificial neural network, in connection with a special compression of the acoustic input, leads to a system that is robust, fast, easy to use, and needs no additional hardware besides common VR equipment.

  14. Virtual reality and stereoscopic telepresence

    SciTech Connect

    Mertens, E.P.

    1994-12-01

    Virtual reality technology is commonly thought to have few, if any, applications beyond the national research laboratories, the aerospace industry, and the entertainment world. A team at Westinghouse Hanford Company (WHC) is developing applications for virtual reality technology that make it a practical, viable, portable, and cost-effective business and training tool. The technology transfer is particularly applicable to the waste management industry and has become a tool that can serve the entire work force spectrum, from industrial sites to business offices. For three and a half years, a small team of WHC personnel has been developing an effective and practical method of bringing virtual reality technology to the job site. The applications are practical, the results are repeatable, and the equipment costs are within the range of present-day office machines. That combination can evolve into a competitive advantage for commercial business interests. The WHC team has contained system costs by using commercially available equipment and personal computers to create effective virtual reality work stations for less than $20,000.

  15. Immersive virtual reality simulations in nursing education.

    PubMed

    Kilmon, Carol A; Brown, Leonard; Ghosh, Sumit; Mikitiuk, Artur

    2010-01-01

    This article explores immersive virtual reality as a potential educational strategy for nursing education and describes an immersive learning experience now being developed for nurses. This pioneering project is a virtual reality application targeting speed and accuracy of nurse response in emergency situations requiring cardiopulmonary resuscitation. Other potential uses and implications for the development of virtual reality learning programs are discussed. PMID:21086871

  16. Virtual Reality: You Are There

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Telepresence or "virtual reality," allows a person, with assistance from advanced technology devices, to figuratively project himself into another environment. This technology is marketed by several companies, among them Fakespace, Inc., a former Ames Research Center contractor. Fakespace developed a teleoperational motion platform for transmitting sounds and images from remote locations. The "Molly" matches the user's head motion and, when coupled with a stereo viewing device and appropriate software, creates the telepresence experience. Its companion piece is the BOOM-the user's viewing device that provides the sense of involvement in the virtual environment. Either system may be used alone. Because suits, gloves, headphones, etc. are not needed, a whole range of commercial applications is possible, including computer-aided design techniques and virtual reality visualizations. Customers include Sandia National Laboratories, Stanford Research Institute and Mattel Toys.

  17. Simulated maintenance a virtual reality

    SciTech Connect

    Lirvall, P.

    1995-10-01

    The article describes potential applications of personal computer-based virtual reality software. The applications are being investigated by Atomic Energy of Canada Limited's (AECL) Chalk River Laboratories for the Canadian deuterium-uranium (Candu) reactor. Objectives include: (1) reduction of outage duration and improved safety, (2) cost-effective and safe maintenance of equipment, (3) reduction of exposure times and identification of overexposure situations, (4) cost-effective training in a virtual control room simulator, (5) human factors evaluation of design interface, and (6) visualization of conceptual and detailed designs of critical nuclear field environments. A demonstration model of a typical reactor control room, the use of virtual reality in outage planning, and safety issues are outlined.

  18. Marshall Engineers Use Virtual Reality

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers, and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment, and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  19. Virtual Libraries: Service Realities.

    ERIC Educational Resources Information Center

    Novak, Jan

    This paper discusses client service issues to be considered when transitioning to a virtual library situation. Themes related to the transitional nature of society in the knowledge era are presented, including: paradox and a contradictory nature; blurring of boundaries; networks, systems, and holistic thinking; process/not product, becoming/not…

  20. Mobile viewer system for virtual 3D space using infrared LED point markers and camera

    NASA Astrophysics Data System (ADS)

    Sakamoto, Kunio; Taneji, Shoto

    2006-09-01

    The authors have developed a 3D workspace system using collaborative imaging devices. A stereoscopic display enables this system to project 3D information. In this paper, we describe the position-detecting system for a see-through 3D viewer. 3D display systems are a useful technology for virtual reality, mixed reality, and augmented reality. We have researched spatial imaging and interaction systems, and have previously proposed 3D displays using a slit as a parallax barrier, a lenticular screen, and holographic optical elements (HOEs) for displaying active images 1)2)3)4). The purpose of this paper is to propose an interactive system using these 3D imaging technologies. The observer can view virtual images in the real world while watching the screen of a see-through 3D viewer. The goal of our research is a display system in which, when users see the real world through the mobile viewer, they are shown virtual 3D images floating in the air, and can touch and interact with these floating images much as children play with modeling clay. The key technologies of this system are the position recognition system and the spatial imaging display. The 3D images are presented by the improved parallax-barrier 3D display. Here the authors discuss the measuring method for the mobile viewer, using infrared LED point markers and a camera, in the 3D workspace (augmented reality world). The authors present the geometric analysis of the proposed measuring method, the simplest approach in that it uses a single camera rather than a stereo camera, and the results of our viewer system.
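    The single-camera measurement the abstract mentions can be illustrated with the basic pinhole-camera relation: two markers a known distance apart subtend an image separation inversely proportional to range. This is a hedged sketch of the principle only; the paper's actual geometric analysis recovers the full viewer position, and all names and values here are invented.

```python
import math

def marker_distance(px_a, px_b, baseline_m, focal_px):
    """Estimate viewer distance from two IR LED markers seen by one camera.

    Under a pinhole model, markers a known baseline B apart project to an
    image separation d (pixels), giving Z = f * B / d for a camera with
    focal length f in pixels. Names are illustrative assumptions.
    """
    d = math.dist(px_a, px_b)          # marker separation in the image (px)
    return focal_px * baseline_m / d   # distance along the optical axis (m)

# Two markers 10 cm apart, imaged 80 px apart by a camera with f = 800 px:
z = marker_distance((600, 400), (680, 400), baseline_m=0.10, focal_px=800.0)
# z = 800 * 0.10 / 80 = 1.0 m
```

    Full pose recovery from point markers would additionally use the markers' in-image positions, but the inverse-proportionality above is the core of single-camera ranging.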

  1. Learning in 3D Virtual Environments: Collaboration and Knowledge Spirals

    ERIC Educational Resources Information Center

    Burton, Brian G.; Martin, Barbara N.

    2010-01-01

    The purpose of this case study was to determine if learning occurred within a 3D virtual learning environment by determining if elements of collaboration and Nonaka and Takeuchi's (1995) knowledge spiral were present. A key portion of this research was the creation of a Virtual Learning Environment. This 3D VLE utilized the Torque Game Engine…

  2. Virtual Reality--Learning by Immersion.

    ERIC Educational Resources Information Center

    Dunning, Jeremy

    1998-01-01

    Discusses the use of virtual reality in educational software. Topics include CAVE (Computer-Assisted Virtual Environments); cost-effective virtual environment tools including QTVR (Quick Time Virtual Reality); interactive exercises; educational criteria for technology-based educational tools; and examples of screen displays. (LRW)

  3. Virtual reality visualization of accelerator magnets

    SciTech Connect

    Huang, M.; Papka, M.; DeFanti, T.; Levine, D.; Turner, L.; Kettunen, L.

    1995-05-01

    The authors describe the use of the CAVE virtual reality visualization environment as an aid to the design of accelerator magnets. They have modeled an elliptical multipole wiggler magnet being designed for use at the Advanced Photon Source at Argonne National Laboratory. The CAVE environment allows the authors to explore and interact with the 3-D visualization of the magnet. Capabilities include changing the number of periods the magnet displayed, changing the icons used for displaying the magnetic field, and changing the current in the electromagnet and observing the effect on the magnetic field and particle beam trajectory through the field.
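    The interaction described, varying the electromagnet current and observing the effect on the field, can be illustrated with the textbook ideal 2D multipole field, whose strength scales linearly with excitation current. This is an idealized sketch, not the CAVE application's field solver; the function and coefficient names are illustrative.

```python
def multipole_field(x, y, order, coeff):
    """Ideal 2D multipole field in the transverse plane:
    By + i*Bx = C * (x + i*y)^(n-1), with n=1 a dipole, n=2 a quadrupole,
    n=3 a sextupole. The coefficient C scales linearly with excitation
    current, which is what lets a viewer vary the current and redraw the
    field icons. Textbook idealization; names are assumptions.
    """
    z = complex(x, y) ** (order - 1)
    return coeff * z.imag, coeff * z.real  # (Bx, By) in tesla

# Doubling the current (coeff) doubles the field at every point:
bx1, by1 = multipole_field(0.01, 0.02, order=2, coeff=100.0)
bx2, by2 = multipole_field(0.01, 0.02, order=2, coeff=200.0)
```

    Evaluating this on a grid and drawing arrows (or tracing particle trajectories through it) is the kind of visualization the CAVE environment makes interactive.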

  4. Virtual reality applied to teletesting

    NASA Astrophysics Data System (ADS)

    van den Berg, Thomas J.; Smeenk, Roland J. M.; Mazy, Alain; Jacques, Patrick; Arguello, Luis; Mills, Simon

    2003-05-01

    The activity "Virtual Reality applied to Teletesting" is related to a wider European Space Agency (ESA) initiative of cost reduction, in particular the reduction of test costs. Costs of space-related projects have to be reduced both in test centre operations and at customer companies. This can be accomplished by increasing the automation and remote-testing ("teletesting") capabilities of the test centre. The main problems related to teletesting are a lack of situational awareness and the separation of control over the test environment. The objective of the activity is to evaluate the use of distributed computing and Virtual Reality technology to support the teletesting of a payload under vacuum conditions, and to provide a unified man-machine interface for the monitoring and control of payload, vacuum chamber, and robotics equipment. The activity includes the development and testing of a "Virtual Reality Teletesting System" (VRTS). The VRTS is deployed at one of the ESA-certified test centres to perform an evaluation and test campaign using a real payload. The VRTS is entirely written in the Java programming language, using the J2EE application model. The Graphical User Interface runs as an applet in a Web browser, enabling easy access from virtually any place.

  5. Direct Manipulation in Virtual Reality

    NASA Technical Reports Server (NTRS)

    Bryson, Steve

    2003-01-01

    Virtual Reality interfaces offer several advantages for scientific visualization such as the ability to perceive three-dimensional data structures in a natural way. The focus of this chapter is direct manipulation, the ability for a user in virtual reality to control objects in the virtual environment in a direct and natural way, much as objects are manipulated in the real world. Direct manipulation provides many advantages for the exploration of complex, multi-dimensional data sets, by allowing the investigator the ability to intuitively explore the data environment. Because direct manipulation is essentially a control interface, it is better suited for the exploration and analysis of a data set than for the publishing or communication of features found in that data set. Thus direct manipulation is most relevant to the analysis of complex data that fills a volume of three-dimensional space, such as a fluid flow data set. Direct manipulation allows the intuitive exploration of that data, which facilitates the discovery of data features that would be difficult to find using more conventional visualization methods. Using a direct manipulation interface in virtual reality, an investigator can, for example, move a data probe about in space, watching the results and getting a sense of how the data varies within its spatial volume.

  6. Designing Virtual Museum Using Web3D Technology

    NASA Astrophysics Data System (ADS)

    Zhao, Jianghai

    VRT has inherent potential for constructing effective learning environments thanks to its 3I characteristics: Interaction, Immersion and Imagination, and it is being applied to education in ever more profound ways as the technology develops. The Virtual Museum is one such application. The Virtual Museum is based on Web3D technology, and extensibility is its most important factor. Considering the advantages and disadvantages of each Web3D technology, VRML, Cult3D and Viewpoint were chosen. A web chatroom based on Flash and ASP technology has also been created in order to make the Virtual Museum an interactive learning environment.

  7. 3D Viewing: Odd Perception - Illusion? reality? or both?

    NASA Astrophysics Data System (ADS)

    Kisimoto, K.; Iizasa, K.

    2008-12-01

    We live in three-dimensional space, don't we? It could be at least four dimensions, but that is another story. Either way, our capability for 3D viewing is constrained by our intrinsically 2D tools of perception. I carried out a few visual experiments using topographic data to show our intrinsic (or biological) shortcomings in 3D recognition of our world. The results of the experiments suggest: (1) a 3D surface model displayed on a 2D computer screen (or paper) always has two interpretations of the 3D surface geometry; if we choose one of the interpretations (in other words, if we are hooked by one of the two percepts), we maintain that perception even as the model's viewing perspective changes over time on the screen; (2) more interestingly, a real 3D solid object (e.g., made of clay) also admits these two interpretations of its geometry if we observe the object with one eye. The most famous example of this viewing illusion comes from the magician Jerry Andrus (who died in 2007), who made a super-cool paper-crafted dragon that causes a visual illusion for one-eyed viewers. By these experiments, I confirmed this phenomenon in another perceptually persuasive (deceptive?) way. My conclusion is that this illusion is intrinsic, i.e. reality for humans, because even though we live in 3D space, our perceptual tools (the eyes) are composed of 2D sensors whose information is reconstructed into 3D by our experience-trained brain. So, (3) when we observe a 3D surface model on a computer screen, we are always one eye short, even if we use both eyes. One last suggestion from my experiments is that recent highly sophisticated 3D models might include more information than human perception can handle properly; i.e., we might not be understanding the 3D world (geospace) at all, just experiencing an illusion.

  8. Mixed reality orthognathic surgical simulation by entity model manipulation and 3D-image display

    NASA Astrophysics Data System (ADS)

    Shimonagayoshi, Tatsunari; Aoki, Yoshimitsu; Fushima, Kenji; Kobayashi, Masaru

    2005-12-01

    In orthognathic surgery, the framing of a 3D surgical plan that considers the balance between the front and back positions and the symmetry of the jawbone, as well as the dental occlusion of the teeth, is essential. In this study, a support system for orthognathic surgery has been developed to visualize the changes in the mandible and the occlusal condition and to determine the optimum position in mandibular osteotomy. By integrating the operating portion, an entity (physical) tooth model manipulated to determine the optimum occlusal position, with the 3D-CT skeletal images (the 3D image display portion) that are simultaneously displayed in real time, the mandibular position and posture can be determined while considering the improvement of skeletal morphology and occlusal condition. The realistic operation of the entity model and the virtual 3D image display enabled the construction of a surgical simulation system that involves augmented reality.

  9. Development of visual 3D virtual environment for control software

    NASA Technical Reports Server (NTRS)

    Hirose, Michitaka; Myoi, Takeshi; Amari, Haruo; Inamura, Kohei; Stark, Lawrence

    1991-01-01

    Virtual environments for software visualization may enable complex programs to be created and maintained. A typical application might be control of regional electric power systems. As these encompass broader computer networks than ever, construction of such systems becomes very difficult. Conventional text-oriented environments are useful for programming individual processors. However, they are obviously insufficient for programming a large and complicated system that includes large numbers of computers connected to each other; such programming is called 'programming in the large.' As a solution to this problem, the authors are developing a graphic programming environment in which one can visualize complicated software in a virtual 3D world. One of the major features of the environment is the 3D representation of concurrent processes. The 3D representation is used to supply both network-wide interprocess programming capability (the capability for 'programming in the large') and real-time programming capability. The authors' idea is to fuse both the block diagram (useful for checking relationships among large numbers of processes or processors) and the time chart (useful for checking precise timing for synchronization) into a single 3D space. The 3D representation gives us a capability for direct and intuitive planning and understanding of complicated relationships among many concurrent processes. Realizing the 3D representation requires a technology for easy handling of virtual 3D objects. Using a stereo display system and a gesture input device (VPL DataGlove), our prototype of the virtual workstation has been implemented. The workstation can supply the 'sensation' of the virtual 3D space to a programmer. Software for the 3D programming environment is implemented on the workstation. According to preliminary assessments, a 50 percent reduction of programming effort is achieved by using the virtual 3D environment. The authors expect that the 3D

  10. Virtual reality: Avatars in human spaceflight training

    NASA Astrophysics Data System (ADS)

    Osterlund, Jeffrey; Lawrence, Brad

    2012-02-01

    With advancements in high spatial and temporal resolution graphics, along with advancements in 3D display capabilities to model, simulate, and analyze human-to-machine interfaces and interactions, virtual environments are being used to develop everything from gaming, movie special effects, and animations to the design of automobiles. The use of multiple-object motion capture technology and digital human tools in aerospace has been demonstrated to be a more cost-effective alternative to physical prototypes; it provides a more efficient, flexible, and responsive environment for changes in design and training, and supports early human factors considerations concerning the operation of a complex launch vehicle or spacecraft. United Space Alliance (USA) has deployed this technique and tool under Research and Development (R&D) activities on both spacecraft assembly and ground processing operations design and training on the Orion Crew Module. USA utilizes specialized products that were chosen based on functionality, including software and fixed-base hardware (e.g., infrared and visible-red cameras), along with cyber gloves to capture the fine motor dexterity of the hands. The key findings of the R&D were: mock-ups should be built so as not to obstruct the cameras' view of the markers being tracked; a mock-up toolkit should be assembled to facilitate dynamic design changes; markers should be placed in accurate positions on humans and flight hardware to help with tracking; 3D models used in the virtual environment should be stripped of non-essential data; workstations with high computational capability are required to handle the large model data sets; and Technology Interchange Meetings with vendors and other industries also utilizing virtual reality applications need to occur on a continual basis, enabling USA to maintain its leading edge within this technology. Parameters of interest and benefit in human spaceflight simulation training that utilizes virtual reality technologies are to

  11. Introduction to augmented and virtual reality

    NASA Astrophysics Data System (ADS)

    Caudell, Thomas P.

    1995-12-01

    This paper introduces the field of augmented reality as a prolog to the body of papers in the remainder of this session. I describe the use of head-mounted display technologies to improve the efficiency and quality of human workers in their performance of engineering design, manufacturing, construction, testing, and maintenance activities. This technology is used to 'augment' the visual field of the wearer with information necessary for the performance of the current task. The enabling technology is head-up (see-through) display head sets (HUDsets) combined with head position sensing, real-world registration systems, and database access software. A primary difference between virtual reality (VR) and 'augmented reality' (AR) is in the complexity of the perceived graphical objects: in AR systems, only simple wire frames, template outlines, designators, and text are displayed. An immediate result of this difference is that augmented reality systems can be driven by standard and inexpensive microprocessors. Many research issues must be addressed before this technology can be widely used, including tracking and registration, human 3D perception and reasoning, and human task performance issues.

  12. ESL Teacher Training in 3D Virtual Worlds

    ERIC Educational Resources Information Center

    Kozlova, Iryna; Priven, Dmitri

    2015-01-01

    Although language learning in 3D Virtual Worlds (VWs) has become a focus of recent research, little is known about the knowledge and skills teachers need to acquire to provide effective task-based instruction in 3D VWs and the type of teacher training that best prepares instructors for such an endeavor. This study employs a situated learning…

  13. What Are the Learning Affordances of 3-D Virtual Environments?

    ERIC Educational Resources Information Center

    Dalgarno, Barney; Lee, Mark J. W.

    2010-01-01

    This article explores the potential learning benefits of three-dimensional (3-D) virtual learning environments (VLEs). Drawing on published research spanning two decades, it identifies a set of unique characteristics of 3-D VLEs, which includes aspects of their representational fidelity and aspects of the learner-computer interactivity they…

  14. Educational Visualizations in 3D Collaborative Virtual Environments: A Methodology

    ERIC Educational Resources Information Center

    Fominykh, Mikhail; Prasolova-Forland, Ekaterina

    2012-01-01

    Purpose: Collaborative virtual environments (CVEs) have become increasingly popular in educational settings and the role of 3D content is becoming more and more important. Still, there are many challenges in this area, such as lack of empirical studies that provide design for educational activities in 3D CVEs and lack of norms of how to support…

  15. Virtual 3d City Modeling: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2013-08-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to an urban area. Various terms are used for 3D city models, such as "Cybertown", "Cybercity", "Virtual City", or "Digital City". 3D city models are basically computerized or digital models of a city containing graphic representations of buildings and other objects in 2.5D or 3D. Generally, three main Geomatics approaches are used for generating virtual 3D city models: in the first, researchers use conventional techniques such as vector map data, DEMs, and aerial images; the second is based on high-resolution satellite images with laser scanning; in the third, many researchers use terrestrial images, applying close-range photogrammetry with DSMs and texture mapping. We start this paper with an introduction to the various Geomatics techniques for 3D city modeling. These techniques are divided into two main categories: one based on automation (automatic, semi-automatic, and manual methods), and another based on data-input techniques (photogrammetry and laser techniques). After a detailed study, we give the conclusions of this research paper, along with a short justification and analysis and the present trend in 3D city modeling. This paper gives an overview of techniques for generating virtual 3D city models using Geomatics techniques, and of the applications of virtual 3D city models. Photogrammetry (close-range, aerial, satellite), lasergrammetry, GPS, or a combination of these modern Geomatics techniques play a major role in creating a virtual 3D city model. Every technique and method has some advantages and some drawbacks. Point-cloud models are a modern trend for virtual 3D city models. Photo-realistic, Scalable, Geo-referenced virtual 3

  16. Game-Like Language Learning in 3-D Virtual Environments

    ERIC Educational Resources Information Center

    Berns, Anke; Gonzalez-Pardo, Antonio; Camacho, David

    2013-01-01

    This paper presents our recent experiences with the design of game-like applications in 3-D virtual environments as well as its impact on student motivation and learning. Therefore our paper starts with a brief analysis of the motivational aspects of videogames and virtual worlds (VWs). We then go on to explore the possible benefits of both in the…

  17. Rapid prototyping 3D virtual world interfaces within a virtual factory environment

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.

  18. Integration of the virtual 3D model of a control system with the virtual controller

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2015-11-01

    Nowadays the design process includes simulation analysis of the different components of a constructed object, which creates the need to integrate different virtual objects in order to simulate the whole investigated technical system. The paper presents issues related to the integration of a virtual 3D model of a chosen control system with a virtual controller. The goal of the integration is to verify the operation of the adopted object in accordance with the established control program. The object of the simulation work is the drive system of a tunneling machine for trenchless work. In the first stage of the work, an interactive visualization of the functioning of the 3D virtual model of the tunneling machine was created. For this purpose, software of the VR (Virtual Reality) class was applied. In the interactive application, procedures were created for controlling the drive system of translatory motion, the drive system of rotary motion, and the drive system of a manipulator. Additionally, a procedure was created for turning on and off the crushing head mounted on the last element of the manipulator. Procedures were also established for receiving input data from external software, on the basis of dynamic data exchange (DDE), which allow controlling the actuators of the particular control systems of the considered machine. In the next stage of the work, the program for the virtual controller was created in the ladder diagram (LD) language. The control program was developed on the basis of the adopted work cycle of the tunneling machine. The element integrating the virtual model of the tunneling machine for trenchless work with the virtual controller is an application written in a high-level language (Visual Basic). In the developed application, procedures were created that are responsible for collecting data from the virtual controller running in simulation mode and transferring them to the interactive application, in which the
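The controller-to-visualization exchange described in this abstract can be sketched as a poll loop: a virtual controller scans its ladder-like rungs each cycle and a visualization reads its outputs. This Python sketch is purely illustrative; the class names, inputs, and rung logic are hypothetical stand-ins for the DDE link and LD program, not the authors' code.

```python
class VirtualController:
    """Minimal ladder-logic-like controller: one 'rung' per output."""
    def __init__(self):
        self.inputs = {"start": False}
        self.outputs = {"drive_forward": False, "crushing_head": False}

    def scan(self):
        # Rung 1: the forward drive runs while 'start' is held.
        self.outputs["drive_forward"] = self.inputs["start"]
        # Rung 2: the crushing head runs only when the drive runs.
        self.outputs["crushing_head"] = self.outputs["drive_forward"]

class Visualization:
    def poll(self, controller):
        # The interactive application reads controller outputs here and
        # would apply them to the 3D model's actuators.
        return dict(controller.outputs)

plc = VirtualController()
plc.inputs["start"] = True
plc.scan()
print(Visualization().poll(plc))
```

In the real system the poll would cross a process boundary (DDE between the controller simulator and the VR application) rather than a method call.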

  19. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    NASA-Ames is intensively developing virtual-reality (VR) capabilities that can extend and augment computer-generated and remote spatial environments. VR is envisioned not only as a basis for improving human/machine interactions involved in planetary exploration, but also as a medium for the more widespread sharing of the experience of exploration, thereby broadening the support base for lunar and planetary-exploration endeavors. Imagery representative of Mars is being gathered for VR presentation at such terrestrial sites as Antarctica and Death Valley.

  20. Virtual reality in laparoscopic surgery.

    PubMed

    Uranüs, Selman; Yanik, Mustafa; Bretthauer, Georg

    2004-01-01

    Although the many advantages of laparoscopic surgery have made it an established technique, training in laparoscopic surgery posed problems not encountered in conventional surgical training. Virtual reality simulators open up new perspectives for training in laparoscopic surgery. Under realistic conditions in real time, trainees can tailor their sessions with the VR simulator to suit their needs and goals, and can repeat exercises as often as they wish. VR simulators reduce the number of experimental animals needed for training purposes and are suited to the pursuit of research in laparoscopic surgery. PMID:15747974

  1. Improvements in education in pathology: virtual 3D specimens.

    PubMed

    Kalinski, Thomas; Zwönitzer, Ralf; Jonczyk-Weber, Thomas; Hofmann, Harald; Bernarding, Johannes; Roessner, Albert

    2009-01-01

    Virtual three-dimensional (3D) specimens correspond to 3D visualizations of real pathological specimens on a computer display. We describe a simple method for the digitalization of such specimens from high-quality digital images. The images were taken during a whole rotation of a specimen and merged into a JPEG2000 multi-document file. The files were made available on the Internet (http://patho.med.uni-magdeburg.de/research.shtml) and received very positive ratings from medical students. Virtual 3D specimens expand the application of digital techniques in pathology and will contribute significantly to the successful introduction of knowledge databases and electronic learning platforms. PMID:19457621

  2. New Desktop Virtual Reality Technology in Technical Education

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2008-01-01

    Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…

  3. PC-Based Virtual Reality for CAD Model Viewing

    ERIC Educational Resources Information Center

    Seth, Abhishek; Smith, Shana S.-F.

    2004-01-01

    Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…

  4. A new e-commerce platform based on virtual reality

    NASA Astrophysics Data System (ADS)

    Shen, Xiaoyun; Wang, Huo; Ma, Lan; Wan, Di; Ye, Lu

    2004-03-01

    With the rapid development of computer hardware, software and network technologies, applying virtual reality and 3D imaging techniques in E-commerce has become affordable and practical. This paper focuses on setting up a new E-commerce platform based on the above techniques. A new method for speeding up modeling and texture mapping is introduced.

  5. Transforming Clinical Imaging Data for Virtual Reality Learning Objects

    ERIC Educational Resources Information Center

    Trelease, Robert B.; Rosset, Antoine

    2008-01-01

    Advances in anatomical informatics, three-dimensional (3D) modeling, and virtual reality (VR) methods have made computer-based structural visualization a practical tool for education. In this article, the authors describe streamlined methods for producing VR "learning objects," standardized interactive software modules for anatomical sciences…

  6. Virtual reality in radiation therapy training.

    PubMed

    Boejen, Annette; Grau, Cai

    2011-09-01

    Integration of virtual reality (VR) in clinical training programs is a novel tool in radiotherapy. This paper presents a review of the experience with VR and immersive visualization in 3D perspective for planning and delivery of external radiotherapy. Planning and delivering radiation therapy is a complex process involving physicians, physicists, radiographers and radiation therapists/nurses (RTTs). The specialists must be able to understand spatial relationships in the patient anatomy. Although still in its infancy, VR tools have become available for radiotherapy training, enabling students to simulate and train for clinical situations without interfering with the clinical workflow and without the risk of making errors. Immersive tools like a 3D linear accelerator and 3D display of dose distributions have been integrated into training, together with IT labs with clinical software. Training in a VR environment seems to be cost-effective for the clinic. Initial reports suggest that 3D display of dose distributions may improve treatment planning and decision making. Whether VR training qualifies students better than conventional training is still unsettled, but the first results are encouraging. PMID:20724144

  7. Virtual Reality and Its Potential Application in Education and Training.

    ERIC Educational Resources Information Center

    Milheim, William D.

    1995-01-01

    An overview is provided of current trends in virtual reality research and development, including discussion of hardware, types of virtual reality, and potential problems with virtual reality. Implications for education and training are explored. (Author/JKP)

  8. Visualizing Compound Rotations with Virtual Reality

    ERIC Educational Resources Information Center

    Flanders, Megan; Kavanagh, Richard C.

    2013-01-01

    Mental rotations are among the most difficult of all spatial tasks to perform, and even those with high levels of spatial ability can struggle to visualize the result of compound rotations. This pilot study investigates the use of the virtual reality-based Rotation Tool, created using the Virtual Reality Modeling Language (VRML) together with…

  9. Defining Virtual Reality: Dimensions Determining Telepresence.

    ERIC Educational Resources Information Center

    Steuer, Jonathan

    1992-01-01

    Defines virtual reality as a particular type of experience (in terms of "presence" and "telepresence") rather than as a collection of hardware. Maintains that media technologies can be classified and studied in terms of vividness and interactivity, two attributes on which virtual reality ranks very high. (SR)

  10. Sweaty Palms! Virtual Reality Applied to Training.

    ERIC Educational Resources Information Center

    Treiber, Karin

    A qualitative case study approach was used to identify the psychosocial effects of the high-fidelity, virtual reality simulation provided in the college-level air traffic control (ATC) training program offered at the Minnesota Air Traffic Control Training Center and to evaluate the applicability of virtual reality to academic/training situations.…

  11. 3D interactive augmented reality-enhanced digital learning systems for mobile devices

    NASA Astrophysics Data System (ADS)

    Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie

    2013-03-01

    With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving enhanced user experiences (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. UX of the real-world environment via the display can be effectively enhanced with the adoption of interactive AR technology. Enhancement of UX can be beneficial for digital learning systems. There are existing research works based on AR targeting the design of e-learning systems. However, none of these works focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, 3-D interactive augmented reality-enhanced learning (IARL) systems are proposed to provide enhanced UX for digital learning. The proposed IARL systems consist of two major components: markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. A realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. UX in digital learning can be greatly improved with the adoption of the proposed IARL systems.
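The velocity-based tracking idea named above can be sketched with a constant-velocity predictor: estimate the pattern's image velocity from its two most recent positions and predict where to search in the next frame. This is a generic illustration under an assumed constant-velocity model, not the paper's actual VOT algorithm.

```python
def predict_next(p_prev, p_curr, dt_prev=1.0, dt_next=1.0):
    """Constant-velocity prediction for a 2D image position.

    p_prev, p_curr: (x, y) positions in the two most recent frames.
    dt_prev: time between those frames; dt_next: time to the next frame.
    """
    vx = (p_curr[0] - p_prev[0]) / dt_prev
    vy = (p_curr[1] - p_prev[1]) / dt_prev
    return (p_curr[0] + vx * dt_next, p_curr[1] + vy * dt_next)

# A pattern moved from (100, 50) to (104, 53) between frames, so the
# recognizer can center its next search window on the prediction:
print(predict_next((100, 50), (104, 53)))  # -> (108.0, 56.0)
```

Centering the recognition search window on the prediction is what makes tracking cheaper than re-detecting over the full frame each time.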

  12. A 3-D mixed-reality system for stereoscopic visualization of medical dataset.

    PubMed

    Ferrari, Vincenzo; Megali, Giuseppe; Troia, Elena; Pietrabissa, Andrea; Mosca, Franco

    2009-11-01

    We developed a simple, light, and cheap 3-D visualization device based on mixed reality that can be used by physicians to see preoperative radiological exams in a natural way. The system allows the user to see stereoscopic "augmented images," which are created by mixing 3-D virtual models of anatomies, obtained by processing preoperative volumetric radiological images (computed tomography or MRI), with live images of the real patient, grabbed by means of cameras. The interface of the system consists of a head-mounted display equipped with two high-definition cameras. The cameras are mounted in correspondence with the user's eyes and grab live images of the patient from the same point of view as the user. The system does not use any external tracker to detect movements of the user or the patient. Tracking of the user's head and alignment of the virtual patient with the real one are performed using machine vision methods applied to pairs of live images. Experimental results concerning frame rate and alignment precision between the virtual and real patient demonstrate that the machine vision methods used for localization are appropriate for this application and that systems based on stereoscopic mixed reality are feasible and can be proficiently adopted in clinical practice. PMID:19651551
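The two eye-aligned cameras in such a system form a stereo pair, and the basic geometry behind recovering depth from a stereo pair is the standard disparity relation: with parallel cameras of focal length f (in pixels) and baseline B (in meters), a feature seen at horizontal disparity d pixels lies at depth Z = f·B/d. The numeric values below are illustrative, not the system's calibration.

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth Z = f * B / d for a parallel stereo camera pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / disparity_px

# Illustrative: 800 px focal length, 6.4 cm baseline (roughly eye
# separation), a feature at 16 px disparity:
print(depth_from_disparity(800.0, 0.064, 16.0))  # ~3.2 meters
```

Smaller disparities mean larger depths, which is why alignment precision degrades with distance from the cameras.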

  13. Realistic terrain visualization based on 3D virtual world technology

    NASA Astrophysics Data System (ADS)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2010-11-01

    Rapid advances in information technologies, e.g., networking, graphics processing, and virtual worlds, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments that support geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographic visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, and explores the integration of realistic terrain with other geographic objects and phenomena of the natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation for constructing a mirror world or a sandbox model of the earth's landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed on the basis of this foundational work on realistic terrain visualization in virtual environments.

  14. Realistic terrain visualization based on 3D virtual world technology

    NASA Astrophysics Data System (ADS)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2009-09-01

    Rapid advances in information technologies, e.g., networking, graphics processing, and virtual worlds, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments that support geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographic visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, and explores the integration of realistic terrain with other geographic objects and phenomena of the natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation for constructing a mirror world or a sandbox model of the earth's landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed on the basis of this foundational work on realistic terrain visualization in virtual environments.

  15. [Development of a software for 3D virtual phantom design].

    PubMed

    Zou, Lian; Xie, Zhao; Wu, Qi

    2014-02-01

    In this paper, we present a 3D virtual phantom design software package, developed using object-oriented programming methodology and dedicated to medical physics research. The software, named Magical Phantom (MPhantom), is composed of a 3D visual builder module and a virtual CT scanner. Users can conveniently construct any complex 3D phantom and then export the phantom as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful software package for 3D phantom configuration and has passed real-scene application tests. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and research on X-ray imaging reconstruction algorithms. PMID:24804488
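What a phantom builder like the one described produces internally can be sketched as a voxel grid of CT numbers (Hounsfield units), which could then be exported slice by slice as CT images. The sketch below builds a water sphere in air; the grid size, radius, and HU values are illustrative assumptions, not MPhantom's design.

```python
def sphere_phantom(n=32, radius=10, inside_hu=0, outside_hu=-1000):
    """Build an n*n*n voxel grid: a sphere of 'water' (0 HU) in 'air'
    (-1000 HU), centered in the volume."""
    c = (n - 1) / 2.0
    return [[[inside_hu
              if (x - c) ** 2 + (y - c) ** 2 + (z - c) ** 2 <= radius ** 2
              else outside_hu
              for x in range(n)]
             for y in range(n)]
            for z in range(n)]

vol = sphere_phantom()
center = vol[15][15][15]  # inside the sphere -> water (0 HU)
corner = vol[0][0][0]     # outside the sphere -> air (-1000 HU)
print(center, corner)  # -> 0 -1000
```

Exporting such a grid as DICOM would additionally require per-slice metadata (spacing, rescale slope/intercept), which a dedicated tool handles for the user.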

  16. An Onboard ISS Virtual Reality Trainer

    NASA Technical Reports Server (NTRS)

    Miralles, Evelyn

    2013-01-01

    Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the Station to perform these specific repairs. With the retirement of the shuttle, this is no longer an available option. As such, the need for ISS crew members to review scenarios while in flight, either for tasks they already trained for on the ground or for contingency operations, has become a very critical issue. NASA astronauts prepare for Extra-Vehicular Activities (EVAs) or spacewalks through numerous training media, such as self-study, part-task training, underwater training in the Neutral Buoyancy Laboratory (NBL), hands-on hardware reviews, and training at the Virtual Reality Laboratory (VRLab). In many situations, the time between the last training session and an EVA task might be 6 to 8 months. EVA tasks are critical for a mission, and as time passes the crew members may lose proficiency on previously trained tasks; their options to refresh or learn a new skill while in flight are limited to reading training materials and watching videos. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the Station ages. In order to help ISS crew members maintain EVA proficiency or train for contingency repairs during their mission, the Johnson Space Center's VRLab designed an immersive ISS Virtual Reality Trainer (VRT). The VRT incorporates a unique optical system that makes use of the already successful Dynamic On-board Ubiquitous Graphics (DOUG) software to assist crew members with procedure reviews and contingency EVAs while on board the Station. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not been trained and potentially have never seen before. The Virtual Reality Trainer (VRT

  17. Consultation virtual collaborative environment for 3D medicine.

    PubMed

    Krsek, Premysl; Spanel, Michal; Svub, Miroslav; Stancl, Vít; Siler, Ondrej; Sára, Vítezslav

    2008-01-01

    This article focuses on the problems of a consultation virtual collaborative environment designed to support 3D medical applications. The system allows loading CT/MR data from a PACS system, segmenting tissues, and building 3D models of them. It allows remote 3D consultations on the data between technicians and surgeons. The system is designed as a three-layer client-server architecture. Communication between clients and the server is done via the HTTP/HTTPS protocol. Results and tests have confirmed that today's standard network latency and dataflow do not affect the usability of our system. PMID:19162770

  18. Web-based Three-dimensional Virtual Body Structures: W3D-VBS

    PubMed Central

    Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex

    2002-01-01

    Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user’s progress through evaluation tools helps customize lesson plans. A self-guided “virtual tour” of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it. PMID:12223495

  19. Web-based three-dimensional Virtual Body Structures: W3D-VBS.

    PubMed

    Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex

    2002-01-01

    Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user's progress through evaluation tools helps customize lesson plans. A self-guided "virtual tour" of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it. PMID:12223495

  20. The SEE Experience: Edutainment in 3D Virtual Worlds.

    ERIC Educational Resources Information Center

    Di Blas, Nicoletta; Paolini, Paolo; Hazan, Susan

    Shared virtual worlds are innovative applications where several users, represented by Avatars, simultaneously access via Internet a 3D space. Users cooperate through interaction with the environment and with each other, manipulating objects and chatting as they go. Apart from in the well documented online action games industry, now often played…

  1. Cognitive Aspects of Collaboration in 3d Virtual Environments

    NASA Astrophysics Data System (ADS)

    Juřík, V.; Herman, L.; Kubíček, P.; Stachoň, Z.; Šašinka, Č.

    2016-06-01

    Human-computer interaction has entered the 3D era. The most important models representing spatial information, maps, are being transferred into 3D versions with regard to the specific content to be displayed. Virtual worlds (VWs) have become a promising area of interest because of the possibility of dynamically modifying content and of multi-user cooperation when solving tasks, regardless of physical presence. They can be used for sharing and elaborating information via virtual images or avatars. The attractiveness of VWs is emphasized also by the possibility of measuring operators' actions and complex strategies. Collaboration in 3D environments is a crucial issue in many areas where visualizations are important for group cooperation. Within a specific 3D user interface, the operators' ability to manipulate the displayed content is explored with regard to such phenomena as situation awareness, cognitive workload and human error. For this purpose, VWs offer a great number of tools for measuring operators' responses, such as recording virtual movement or spots of interest in the visual field. The study focuses on the methodological issues of measuring the usability of 3D VWs and comparing them with the existing principles of 2D maps. We explore operators' strategies for reaching and interpreting information with regard to the specific type of visualization and different levels of immersion.

  2. Measuring Knowledge Acquisition in 3D Virtual Learning Environments.

    PubMed

    Nunes, Eunice P dos Santos; Roque, Licínio G; Nunes, Fatima de Lourdes dos Santos

    2016-01-01

    Virtual environments can contribute to the effective learning of various subjects for people of all ages. Consequently, they assist in reducing the cost of maintaining physical structures of teaching, such as laboratories and classrooms. However, the measurement of how learners acquire knowledge in such environments is still incipient in the literature. This article presents a method to evaluate the knowledge acquisition in 3D virtual learning environments (3D VLEs) by using the learner's interactions in the VLE. Three experiments were conducted that demonstrate the viability of using this method and its computational implementation. The results suggest that it is possible to automatically assess learning in predetermined contexts and that some types of user interactions in 3D VLEs are correlated with the user's learning differential. PMID:26915117

  3. Virtual view adaptation for 3D multiview video streaming

    NASA Astrophysics Data System (ADS)

    Petrovic, Goran; Do, Luat; Zinger, Sveta; de With, Peter H. N.

    2010-02-01

    Virtual views in 3D-TV and multi-view video systems are reconstructed images of the scene generated synthetically from the original views. In this paper, we analyze the performance of streaming virtual views over IP networks with a limited and time-varying available bandwidth. We show that the average video quality perceived by the user can be improved with an adaptive streaming strategy aiming at maximizing the average video quality. Our adaptive 3D multi-view streaming can provide a quality improvement of 2 dB on average over non-adaptive streaming. We demonstrate that an optimized virtual view adaptation algorithm needs to be view-dependent, achieving an improvement of up to 0.7 dB. We analyze our adaptation strategies under dynamically varying available bandwidth in the network.
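View-dependent rate adaptation of the kind described can be sketched as a budget problem: given the currently available bandwidth, pick one quality level per reference view so the total rate fits, favoring the views the virtual-view renderer depends on most. The greedy policy, per-layer rate, and view weights below are illustrative assumptions, not the authors' algorithm.

```python
def adapt(views, layer_rate_kbps, bandwidth_kbps, max_layers=3):
    """views: {name: weight}; higher weight = more important for the
    virtual view being synthesized. Returns layers chosen per view."""
    chosen = {v: 0 for v in views}
    budget = bandwidth_kbps
    # Upgrade the most important view first, as far as the budget allows.
    for v in sorted(views, key=views.get, reverse=True):
        while budget >= layer_rate_kbps and chosen[v] < max_layers:
            chosen[v] += 1
            budget -= layer_rate_kbps
    return chosen

# 2400 kbps available, 500 kbps per quality layer, two reference views:
print(adapt({"left": 0.6, "right": 0.4}, layer_rate_kbps=500,
            bandwidth_kbps=2400))  # -> {'left': 3, 'right': 1}
```

When the available bandwidth drops, the same call simply returns fewer layers, which is the adaptation behavior the abstract measures in dB of reconstructed-view quality.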

  4. Virtual reality training improves balance function

    PubMed Central

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-01-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training is widely used in rehabilitation therapy for balance dysfunction. This paper summarizes articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal and parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating cortical control of balance and increasing motor function. PMID:25368651

  5. Virtual reality: an intuitive approach to robotics

    NASA Astrophysics Data System (ADS)

    Natonek, Emerico; Flueckiger, Lorenzo; Zimmerman, Thierry; Baur, Charles

    1995-12-01

    Task definitions for manipulators or robotic systems (conventional or mobile) usually lack performance and are sometimes impossible to design. 'On-line' programming methods are often time-consuming or risky for the human operator or the robot itself. On the other hand, 'off-line' techniques are tedious and complex. In a virtual reality robotics environment (VRRE), users are not asked to write down complicated functions to specify robotic tasks. However, a VRRE is only effective if all the environment changes and object movements are fed back to the virtual manipulating system, so some kind of visual or multi-sensor feedback is needed. This paper describes a semi-autonomous robot system composed of an industrial 5-axis robot and its virtual equivalent. The user is immersed in a 3-D space built from models of the robot's environment. He directly interacts with the virtual 'components' in an intuitive way, creating trajectories and tasks and dynamically optimizing them. A vision system is used to recognize the position and orientation of the objects in the real robot workspace, and updates the VRRE through a bi-directional communication link. Once the tasks have been optimized in the VRRE, they are sent to the real robot, and a semi-autonomous process ensures their correct execution thanks to a camera mounted directly on the robot's end effector. Therefore, errors and drifts due to transmission delays can be locally processed and successfully avoided. The system can execute the tasks autonomously, independently of small environmental changes due to transmission delays. If the environmental changes are too important, the robot stops, re-actualizes the VRRE with the new environmental configuration, and waits for task redesign.
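Turning user-defined waypoints from a virtual environment into a dense trajectory for the real robot can be sketched with simple linear interpolation between consecutive waypoints. The waypoints and step counts below are made up for illustration; a real system would also enforce velocity and acceleration limits.

```python
def interpolate(waypoints, steps_per_segment=4):
    """Linearly interpolate a path through a list of waypoints.

    Each waypoint is a tuple of coordinates (Cartesian or joint-space).
    Returns a denser list of points including the final waypoint.
    """
    path = []
    for a, b in zip(waypoints, waypoints[1:]):
        for s in range(steps_per_segment):
            t = s / steps_per_segment
            path.append(tuple(p + t * (q - p) for p, q in zip(a, b)))
    path.append(waypoints[-1])
    return path

traj = interpolate([(0.0, 0.0), (1.0, 2.0)], steps_per_segment=2)
print(traj)  # -> [(0.0, 0.0), (0.5, 1.0), (1.0, 2.0)]
```

The densified path is what would be streamed to the real robot, with the end-effector camera providing the corrective feedback the abstract describes.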

  6. Seamless 3D interaction for virtual tables, projection planes, and CAVEs

    NASA Astrophysics Data System (ADS)

    Encarnacao, L. M.; Bimber, Oliver; Schmalstieg, Dieter; Barton, Robert J., III

    2000-08-01

    The Virtual Table presents stereoscopic graphics to a user in a workbench-like setting. This device shares with other large-screen display technologies (such as data walls and surround-screen projection systems) the lack of human-centered, unencumbered user interfaces and 3D interaction technologies. Such shortcomings present severe limitations for the application of virtual reality (VR) technology to time-critical applications, as well as to deployment scenarios that involve heterogeneous groups of end-users without high levels of computer familiarity and expertise. Traditionally, such deployment scenarios are common in planning-related application areas such as mission rehearsal and command and control. For these applications, a high grade of flexibility with respect to the system requirements (display and I/O devices), as well as the ability to seamlessly and intuitively switch between different interaction modalities, is sought. Conventional VR techniques may be insufficient to meet this challenge. This paper presents novel approaches for human-centered interfaces to Virtual Environments, focusing on the Virtual Table visual input device. It introduces new paradigms for 3D interaction in virtual environments (VEs) for a variety of application areas based on pen-and-clipboard, mirror-in-hand, and magic-lens metaphors, and introduces new concepts for combining VR and augmented reality (AR) techniques. It finally describes approaches toward hybrid and distributed multi-user interaction environments and concludes by hypothesizing about possible use cases for defense applications.

  7. STS-133 Crew Trains in Virtual Reality

    NASA Video Gallery

    In this episode of NASA "Behind the Scenes," STS-133 Pilot Eric Boe and space station Flight Director Royce Renfrew discuss how the virtual reality laboratory at the Johnson Space Center is helping...

  8. Virtual reality applications in T and D engineering

    SciTech Connect

    Breen, P.T. Jr.; Scott, W.G.

    1995-12-31

    Visualization Technology (VT), the authors' more meaningful term for Virtual Reality, is a commercial reality. Visualization technology can provide a realistic model of the real world, place a user within the synthetic space, and allow him or her to interact within that space through head-mounted displays, CRTs, data gloves, and 3D mice. Existing commercial applications of VT include the emulation of power plant control room panels, 3D models of commercial and industrial buildings, and virtual models of transportation systems to train the handicapped. The authors believe that VT can greatly reduce the costs and increase the productivity of training T and D personnel, especially for hazardous assignments such as live-line maintenance. VT can also reduce the costs of design, construction and maintenance of major facilities such as power plants, substations, vaults, transmission lines and underground facilities.

  9. Virtual Reality at the PC Level

    NASA Technical Reports Server (NTRS)

    Dean, John

    1998-01-01

    The main objective of my research has been to incorporate virtual reality at the desktop level; i.e., create virtual reality software that can be run fairly inexpensively on standard PCs. The standard language used for virtual reality on PCs is VRML (Virtual Reality Modeling Language). It is a new language, so it is still undergoing many changes. VRML 1.0 came out only a couple of years ago, and VRML 2.0 came out around last September. VRML is an interpreted language that is run by a web browser plug-in. It is fairly flexible in terms of allowing you to create different shapes and animations. Before this summer, I knew very little about virtual reality and I did not know VRML at all. I learned the VRML language by reading two books and experimenting on a PC. The following topics are presented: CAD to VRML, VRML 1.0 to VRML 2.0, VRML authoring tools, VRML browsers, finding virtual reality applications, the AXAF project, the VRML generator program, web communities, and future plans.

  10. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

    The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional data parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life, and more recently VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.), in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D Game Engine, coded using C#, JavaScript, and the Unity Scripting Language. This visualization tool can be used through a standard web browser, or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects of various shapes, colors, and sizes, as well as XYZ positions, each encoding a dimension of the parameter space that can be assigned interactively. Multiple users can navigate through this data space simultaneously, either with their own, independent vantage points, or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added. 
We expect to make this
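
    The abstract's core idea of mapping extra data dimensions onto visual channels can be sketched as below. This is an illustrative reconstruction, not the authors' code; all names (`encode_point`, the channel order) are assumptions:

    ```python
    # Hypothetical sketch of the dimension-encoding idea described above:
    # the first three data dimensions become XYZ position, and further
    # dimensions map onto visual channels (shape, color, size).

    SHAPES = ["sphere", "cube", "cone", "cylinder"]

    def encode_point(values):
        """Map a data vector onto a displayable glyph description."""
        glyph = {"position": (values[0], values[1], values[2])}
        if len(values) > 3:  # 4th dim -> shape, binned into categories
            glyph["shape"] = SHAPES[int(values[3]) % len(SHAPES)]
        if len(values) > 4:  # 5th dim -> color value clamped to [0, 1]
            glyph["color"] = max(0.0, min(1.0, values[4]))
        if len(values) > 5:  # 6th dim -> size, kept strictly positive
            glyph["size"] = abs(values[5]) + 1e-6
        return glyph

    print(encode_point([1.0, 2.0, 3.0, 5, 0.4, 2.5]))
    ```

    A renderer (e.g., a game engine) would then instantiate one glyph per data point from such descriptions.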

  11. Virtual image display as a backlight for 3D.

    PubMed

    Travis, Adrian; MacCrann, Niall; Emerton, Neil; Kollin, Joel; Georgiou, Andreas; Lanier, Jaron; Bathiche, Stephen

    2013-07-29

    We describe a device which has the potential to be used both as a virtual image display and as a backlight. The pupil of the emitted light fills the device approximately to its periphery and the collimated emission can be scanned both horizontally and vertically in the manner needed to illuminate an eye in any position. The aim is to reduce the power needed to illuminate a liquid crystal panel but also to enable a smooth transition from 3D to a virtual image as the user nears the screen. PMID:23938645

  12. The Application of Virtual Reality on Distance Education

    NASA Astrophysics Data System (ADS)

    Zhan, Zehui

    The features and classifications of Virtual Reality Techniques have been summarized and recommendation of applying Virtual Reality on distance education has been made. Future research is needed on the design and implementation of virtual classroom and courseware.

  13. Virtual Reality: A New Learning Environment.

    ERIC Educational Resources Information Center

    Ferrington, Gary; Loge, Kenneth

    1992-01-01

    Discusses virtual reality (VR) technology and its possible uses in military training, medical education, industrial design and development, the media industry, and education. Three primary applications of VR in the learning process--visualization, simulation, and construction of virtual worlds--are described, and pedagogical and moral issues are…

  14. Special Section: New Ways to Detect Colon Cancer 3-D virtual screening now being used

    MedlinePlus

    ... tech medical fields of biomedical visualization, computer graphics, virtual reality, and multimedia. The year was 1994. Kaufman's "two- ... organ, like the colon—and view it in virtual reality." Later, he and his team used it with ...

  15. Virtual reality and hallucination: a technoetic perspective

    NASA Astrophysics Data System (ADS)

    Slattery, Diana R.

    2008-02-01

    Virtual Reality (VR), especially in a technologically focused discourse, is defined by a class of hardware and software, among them head-mounted displays (HMDs), navigation and pointing devices, and stereoscopic imaging. This presentation examines the experiential aspect of VR. Putting "virtual" in front of "reality" modifies the ontological status of a class of experience-that of "reality." Reality has also been modified [by artists, new media theorists, technologists and philosophers] as augmented, mixed, simulated, artificial, layered, and enhanced. Modifications of reality are closely tied to modifications of perception. Media theorist Roy Ascott creates a model of three "VRs": Verifiable Reality, Virtual Reality, and Vegetal (entheogenically induced) Reality. The ways in which we shift our perceptual assumptions, create and verify illusions, and enter "the willing suspension of disbelief" that allows us entry into imaginal worlds are central to the experience of VR worlds, whether those worlds are explicitly representational (robotic manipulations by VR) or explicitly imaginal (VR artistic creations). The early rhetoric surrounding VR was interwoven with psychedelics, a perception amplified by Timothy Leary's presence on the historic SIGGRAPH panel, and the Wall Street Journal's tag of VR as "electronic LSD." This paper discusses the connections-philosophical, social-historical, and psychological-perceptual-between these two domains.

  16. Virtual embryology: a 3D library reconstructed from human embryo sections and animation of development process.

    PubMed

    Komori, M; Miura, T; Shiota, K; Minato, K; Takahashi, T

    1995-01-01

    The volumetric shape of a human embryo and its development are hard to comprehend when viewed as 2D schemes in a textbook or as microscopic sectional images. In this paper, a CAI and research support system for human embryology using multimedia presentation techniques is described. In this system, 3D data is acquired from a series of sliced specimens. Its 3D structure can be viewed interactively by rotating, extracting, and truncating the whole body or an organ. Moreover, the development process of embryos can be animated using a morphing technique applied to specimens at several stages. The system is intended to be used interactively, like a virtual reality system. Hence, the system is called Virtual Embryology. PMID:8591413

  17. D3D augmented reality imaging system: proof of concept in mammography

    PubMed Central

    Douglas, David B; Petricoin, Emanuel F; Liotta, Lance; Wilson, Eugene

    2016-01-01

    Purpose The purpose of this article is to present images from simulated breast microcalcifications and assess the pattern of the microcalcifications with a technical development called “depth 3-dimensional (D3D) augmented reality”. Materials and methods A computer, head display unit, joystick, D3D augmented reality software, and an in-house script of simulated data of breast microcalcifications in a ductal distribution were used. No patient data was used and no statistical analysis was performed. Results The D3D augmented reality system demonstrated stereoscopic depth perception by presenting a unique image to each eye, focal point convergence, head position tracking, 3D cursor, and joystick fly-through. Conclusion The D3D augmented reality imaging system offers image viewing with depth perception and focal point convergence. The D3D augmented reality system should be tested to determine its utility in clinical practice. PMID:27563261

  18. Virtual Reality: A Syllabus for a Course on Virtual Reality and Education.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    This document contains a slightly-revised syllabus for a Virtual Reality course taught in spring 1994. The syllabus begins with an introduction which contains information on the software used in the course and examples of schools that have introduced virtual reality technology in the curriculum. The remainder of the document is composed of the…

  19. Dual Reality: Merging the Real and Virtual

    NASA Astrophysics Data System (ADS)

    Lifton, Joshua; Paradiso, Joseph A.

    This paper proposes the convergence of sensor networks and virtual worlds not only as a possible solution to their respective limitations, but also as the beginning of a new creative medium. In such a "dual reality," both real and virtual worlds are complete unto themselves, but also enhanced by the ability to mutually reflect, influence, and merge by means of sensor/actuator networks deeply embedded in everyday environments. This paper describes a full implementation of a dual reality system using a popular online virtual world and a human-centric sensor network designed around a common electrical power strip. Example applications (e.g., browsing sensor networks in online virtual worlds), interaction techniques, and design strategies for the dual reality domain are demonstrated and discussed.

  20. 2D virtual texture on 3D real object with coded structured light

    NASA Astrophysics Data System (ADS)

    Molinier, Thierry; Fofi, David; Salvi, Joaquim; Gorria, Patrick

    2008-02-01

    Augmented reality is used to improve color segmentation on the human body or on precious, no-touch artifacts. We propose a technique to project a synthesized texture onto a real object without contact. Our technique can be used in medical or archaeological applications. By projecting a suitable set of light patterns onto the surface of a 3D real object and by capturing images with a camera, a large number of correspondences can be found and the 3D points can be reconstructed. We aim to determine these points of correspondence between cameras and projector from a scene without explicit points and normals. We then project an adjusted texture onto the real object surface. We propose a global and automatic method to virtually texture a 3D real object.
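
    The camera-projector correspondence step described above typically relies on binary coded patterns: each projector column displays a unique on/off sequence across the projected patterns, so a camera pixel's observed bit sequence identifies which column illuminated it. A minimal sketch of that encoding/decoding (plain binary; real systems often use Gray codes for robustness; function names are illustrative):

    ```python
    # Illustrative coded-structured-light sketch, not the authors'
    # implementation: identify a projector column from the sequence of
    # bits a camera pixel observes across N projected binary patterns.

    def encode_column(col, n_patterns):
        """Bits (MSB first) that column `col` shows across the patterns."""
        return [(col >> (n_patterns - 1 - i)) & 1 for i in range(n_patterns)]

    def decode_column(bits):
        """Recover the projector column index from observed bits."""
        col = 0
        for b in bits:
            col = (col << 1) | (1 if b else 0)
        return col

    # 10 patterns address 2**10 = 1024 projector columns.
    bits = encode_column(722, 10)
    assert decode_column(bits) == 722
    ```

    With the column index known per pixel (and a row code or epipolar constraint), triangulating camera ray against projector plane yields the 3D point.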

  1. Mobile Virtual Reality : A Solution for Big Data Visualization

    NASA Astrophysics Data System (ADS)

    Marshall, E.; Seichter, N. D.; D'sa, A.; Werner, L. A.; Yuen, D. A.

    2015-12-01

    Pursuits in the geological sciences and other branches of quantitative science often require data visualization frameworks that are in continual need of improvement and new ideas. Virtual reality is a visualization medium with large audiences, originally designed for gaming purposes. Virtual reality can be delivered in Cave-like environments, but these are unwieldy and expensive to maintain. Recent efforts by major companies such as Facebook have focused on a larger market; the Oculus is the first of this kind of mobile device. The Unity engine makes it possible for us to convert data files into a mesh of isosurfaces to be rendered in 3D. A user is immersed inside the virtual reality and is able to move within and around the data using arrow keys and other steering devices, similar to those employed with an Xbox. With the introduction of products like the Oculus Rift and HoloLens, combined with ever-increasing mobile computing strength, mobile virtual reality data visualization can be implemented for better analysis of 3D geological and mineralogical data sets. As new products like the Surface Pro 4 and other high-powered yet very mobile computers are introduced to the market, the RAM and graphics card capacity necessary to run these models is more available, opening doors to this new reality. The computing requirements needed to run these models are a mere 8 GB of RAM and a 2 GHz CPU, which many mobile computers now exceed. Using the Unity 3D software to create a virtual environment containing a visual representation of the data, any data set converted into FBX or OBJ format can be traversed by wearing the Oculus Rift device. This new method for analysis, in conjunction with 3D scanning, has potential applications in many fields, including the analysis of precious stones or jewelry. Using hologram technology to capture in high resolution the 3D shape, color, and imperfections of minerals and stones, detailed review and
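
    The pipeline above depends on getting mesh data into a format Unity can import, such as Wavefront OBJ. A minimal sketch of that export step (the function name and file layout are illustrative, not from the authors' toolchain; note OBJ uses 1-based vertex indices):

    ```python
    # Hedged sketch: serialize a triangle mesh (e.g., an extracted
    # isosurface) to Wavefront OBJ text, which Unity can import.

    def mesh_to_obj(vertices, faces):
        """vertices: list of (x, y, z); faces: 0-based index triples.
        OBJ face records are 1-based, hence the +1 below."""
        lines = ["v %f %f %f" % tuple(v) for v in vertices]
        lines += ["f %d %d %d" % (a + 1, b + 1, c + 1) for a, b, c in faces]
        return "\n".join(lines) + "\n"

    # A single triangle as the simplest possible mesh:
    verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
    tris = [(0, 1, 2)]
    print(mesh_to_obj(verts, tris))
    ```

    In practice the vertices and faces would come from an isosurface extraction (e.g., marching cubes) over the volumetric data set.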

  2. Gravity and spatial orientation in virtual 3D-mazes.

    PubMed

    Vidal, Manuel; Lipshits, Mark; McIntyre, Joseph; Berthoz, Alain

    2003-01-01

    In order to bring new insights into the processing of 3D spatial information, we conducted experiments on the capacity of human subjects to memorize 3D-structured environments, such as buildings with several floors or the potentially complex 3D structure of an orbital space station. We had subjects move passively, in one of two different exploration modes, through a visual virtual environment that consisted of a series of connected tunnels. In upright displacement, self-rotation when going around corners in the tunnels was limited to yaw rotations. For horizontal translations, subjects faced forward in the direction of motion. When moving up or down through vertical segments of the 3D tunnels, however, subjects faced the tunnel wall, remaining upright as if moving up and down in a glass elevator. In the unconstrained displacement mode, subjects would appear to climb or dive face-forward when moving vertically; thus, in this mode subjects could experience visual flow consistent with rotations about any of the 3 canonical axes. In a previous experiment, subjects were asked to determine whether a static, outside view of a test tunnel corresponded or not to the tunnel through which they had just passed. Results showed that performance was better on this task for the upright than for the unconstrained displacement mode; i.e. when subjects remained "upright" with respect to the virtual environment as defined by the subject's posture in the first segment. This effect suggests that gravity may provide a key reference frame used in the shift between egocentric and allocentric representations of the 3D virtual world. To check whether it is the polarizing effect of gravity that leads to the favoring of the upright displacement mode, the experimental paradigm was adapted for orbital flight and performed by cosmonauts onboard the International Space Station. 
For these flight experiments the previous recognition task was replaced by a computerized reconstruction task, which proved

  3. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data ranges. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  4. Research on 3D virtual campus scene modeling based on 3ds Max and VRML

    NASA Astrophysics Data System (ADS)

    Kang, Chuanli; Zhou, Yanliu; Liang, Xianyue

    2015-12-01

    With the rapid development of modern technology, digital information management and virtual reality simulation technology have become research hotspots. A 3D virtual campus model can not only express real-world objects naturally, realistically, and vividly, but can also expand the real time and space dimensions of the campus, combining the school environment with information. This paper mainly uses 3ds Max to create three-dimensional models of campus buildings, special land parcels, etc. Dynamic interactive functions are then realized by programming the object models from 3ds Max in VRML. This research focuses on virtual campus scene modeling technology and VRML scene design, and on optimization strategies for a variety of real-time processing techniques in the scene design process. This paper preserves texture map image quality while improving the running speed of image texture mapping. According to the features and architecture of Guilin University of Technology, 3ds Max, AutoCAD, and VRML were used to model the different objects of the virtual campus. Finally, the resulting virtual campus scene is summarized.

  5. Geometry and Texture Measures for Interactive Virtualized Reality Indoor Modeler

    NASA Astrophysics Data System (ADS)

    Thangamania, K.; Ichikari, R.; Okuma, T.; Ishikawa, T.; Kurata, T.

    2015-05-01

    This paper discusses an algorithm to detect distorted textures in virtualized reality indoor models and automatically generate the 3D planes necessary to hold the undistorted textures. Our previous contribution, the virtualized reality (VR) interactive indoor modeler, enables the user to interactively create their desired indoor VR model from a single 2D image. The interactive modeler uses projective texture mapping to map the textures over the manually created 3D planes. If the user has not created the necessary 3D planes, then the textures that belong to various objects are projected onto the available 3D planes, which leads to the presence of distorted textures. In this paper, those distorted textures are detected automatically using suitable principles from shape-from-texture research. The texture distortion features, such as the slant, tilt, and curvature parameters, are calculated from the 2D image by means of the affine transformation measured between neighboring texture patches within the single image. This kind of affine transform calculation from a single image is useful when multiple-view images are lacking. The use of superpixels in clustering the textures corresponding to different objects reduces the modeling labor cost. A standby database also stores the repeated basic textures found in the indoor model and provides texture choices for distorted floor, wall, and other regions. Finally, this paper documents the prototype implementation and experiments with automatic 3D plane creation and distortion detection based on the above-mentioned principles in the virtualized reality indoor environment.

  6. Virtual environment display for a 3D audio room simulation

    NASA Technical Reports Server (NTRS)

    Chapin, William L.; Foster, Scott H.

    1992-01-01

    The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations.

  7. From Multi-User Virtual Environment to 3D Virtual Learning Environment

    ERIC Educational Resources Information Center

    Livingstone, Daniel; Kemp, Jeremy; Edgar, Edmund

    2008-01-01

    While digital virtual worlds have been used in education for a number of years, advances in the capabilities and spread of technology have fed a recent boom in interest in massively multi-user 3D virtual worlds for entertainment, and this in turn has led to a surge of interest in their educational applications. In this paper we briefly review the…

  8. Virtual reality and telepresence for military medicine.

    PubMed

    Satava, R M

    1995-03-01

    The profound changes brought about by technology in the past few decades are leading to a total revolution in medicine. The advanced technologies of telepresence and virtual reality are but two of the manifestations emerging from our new information age; now all of medicine can be empowered because of this digital technology. The leading edge is on the digital battlefield, where an entire new concept in military medicine is evolving. Using remote sensors, intelligent systems, telepresence surgery and virtual reality surgical simulations, combat casualty care is prepared for the 21st century. PMID:7554840

  9. From Vesalius to Virtual Reality: How Embodied Cognition Facilitates the Visualization of Anatomy

    ERIC Educational Resources Information Center

    Jang, Susan

    2010-01-01

    This study examines the facilitative effects of embodiment of a complex internal anatomical structure through three-dimensional ("3-D") interactivity in a virtual reality ("VR") program. Since Shepard and Metzler's influential 1971 study, it has been known that 3-D objects (e.g., multiple-armed cube or external body parts) are visually and…

  10. ICCE/ICCAI 2000 Full & Short Papers (Virtual Reality in Education).

    ERIC Educational Resources Information Center

    2000

    This document contains the full text of the following full and short papers on virtual reality in education from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A CAL System for Appreciation of 3D Shapes by Surface Development (C3D-SD)" (Stephen C. F. Chan, Andy…

  11. Using Virtual Reality Computer Models to Support Student Understanding of Astronomical Concepts

    ERIC Educational Resources Information Center

    Barnett, Michael; Yamagata-Lynch, Lisa; Keating, Tom; Barab, Sasha A.; Hay, Kenneth E.

    2005-01-01

    The purpose of this study was to examine how 3-dimensional (3-D) models of the Solar System supported student development of conceptual understandings of various astronomical phenomena that required a change in frame of reference. In the course described in this study, students worked in teams to design and construct 3-D virtual reality computer…

  12. 3D Reconstruction of virtual colon structures from colonoscopy images.

    PubMed

    Hong, DongHo; Tavanapong, Wallapak; Wong, Johnny; Oh, JungHwan; de Groen, Piet C

    2014-01-01

    This paper presents the first fully automated reconstruction technique of 3D virtual colon segments from individual colonoscopy images. It is the basis of new software applications that may offer great benefits for improving quality of care for colonoscopy patients. For example, a 3D map of the areas inspected and uninspected during colonoscopy can be shown on request of the endoscopist during the procedure. The endoscopist may revisit the suggested uninspected areas to reduce the chance of missing polyps that reside in these areas. The percentage of the colon surface seen by the endoscopist can be used as a coarse objective indicator of the quality of the procedure. The derived virtual colon models can be stored for post-procedure training of new endoscopists to teach navigation techniques that result in a higher level of procedure quality. Our technique does not require a prior CT scan of the colon or any global positioning device. Our experiments on endoscopy images of an Olympus synthetic colon model reveal encouraging results with small average reconstruction errors (4.1 mm for the fold depths and 12.1 mm for the fold circumferences). PMID:24225230

  13. Applied virtual reality at the Research Triangle Institute

    NASA Technical Reports Server (NTRS)

    Montoya, R. Jorge

    1994-01-01

    Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric data bases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object data base is created using data captured by hand or electronically. The object's realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the data base using software tools which also support interactions with and immersivity in the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and the maintenance training simulator for the National Guard.

  14. 3D Technology Selection for a Virtual Learning Environment by Blending ISO 9126 Standard and AHP

    ERIC Educational Resources Information Center

    Cetin, Aydin; Guler, Inan

    2011-01-01

    Web3D presents many opportunities for learners in a virtual world or virtual environment over the web. This is a great opportunity for open-distance education institutions to benefit from web3d technologies to create courses with interactive 3d materials. There are many open source and commercial products offering 3d technologies over the web…

  15. Surgery, Virtual Reality, and the Future*

    PubMed Central

    Vosburgh, Kirby G.; Golby, Alexandra; Pieper, Steven D.

    2014-01-01

    MMVR has provided the leading forum for the multidisciplinary interaction and development of the use of Virtual Reality (VR) techniques in medicine, particularly in surgical practice. Here we look back at the foundations of our field, focusing on the use of VR in Surgery and similar interventional procedures, sum up the current status, and describe the challenges and opportunities going forward. PMID:23653952

  16. Evaluation of Virtual Reality Training Using Affect

    ERIC Educational Resources Information Center

    Tichon, Jennifer

    2012-01-01

    Training designed to support and strengthen higher-order mental abilities now often involves immersion in Virtual Reality (VR) where dangerous real world scenarios can be safely replicated. However, despite the growing popularity of VR to train cognitive skills such as decision-making and situation awareness, methods for evaluating their use rely…

  17. NASA employee utilizes Virtual Reality (VR) equipment

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Bebe Ly of the Information Systems Directorate's Software Technology Branch at JSC gives virtual reality a try. The stero video goggles and headphones allow her to see and hear in a computer-generated world and the gloves allow her to move around and grasp objects.

  18. Are Learning Styles Relevant to Virtual Reality?

    ERIC Educational Resources Information Center

    Chen, Chwen Jen; Toh, Seong Chong; Ismail, Wan Mohd Fauzy Wan

    2005-01-01

    This study aims to investigate the effects of a virtual reality (VR)-based learning environment on learners with different learning styles. The findings of the aptitude-by-treatment interaction study have shown that learners benefit most from the VR (guided exploration) mode, irrespective of their learning styles. This shows that the VR-based…

  19. Virtual Reality Training Environments: Contexts and Concerns.

    ERIC Educational Resources Information Center

    Harmon, Stephen W.; Kenney, Patrick J.

    1994-01-01

    Discusses the contexts where virtual reality (VR) training environments might be appropriate; examines the advantages and disadvantages of VR as a training technology; and presents a case study of a VR training environment used at the NASA Johnson Space Center in preparation for the repair of the Hubble Space Telescope. (AEF)

  20. Virtual Reality: Is It for Real?

    ERIC Educational Resources Information Center

    Dowding, Tim J.

    1994-01-01

    Defines virtual reality and describes its application to psychomotor skills training. A description of a system that could be used to teach a college course in physical therapy, including the use of miniature computer workstation, sensory gloves, a programmable mannequin, and other existing technology, is provided. (Contains 10 references.) (KRN)

  1. Applications of Virtual Reality to Nuclear Safeguards

    SciTech Connect

    Stansfield, S.

    1998-11-03

    This paper explores two potential applications of Virtual Reality (VR) to international nuclear safeguards: training and information organization and navigation. The applications are represented by two existing prototype systems, one for training nuclear weapons dismantlement and one utilizing a VR model to facilitate intuitive access to related sets of information.

  2. Integrated Data Visualization and Virtual Reality Tool

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  3. Second Life, a 3-D Animated Virtual World: An Alternative Platform for (Art) Education

    ERIC Educational Resources Information Center

    Han, Hsiao-Cheng

    2011-01-01

    3-D animated virtual worlds are no longer only for gaming. With the advance of technology, animated virtual worlds not only are found on every computer, but also connect users with the internet. Today, virtual worlds are created not only by companies, but also through the collaboration of users. Online 3-D animated virtual worlds provide a new…

  4. Controlling Social Stress in Virtual Reality Environments

    PubMed Central

    Hartanto, Dwi; Kampmann, Isabel L.; Morina, Nexhmedin; Emmelkamp, Paul G. M.; Neerincx, Mark A.; Brinkman, Willem-Paul

    2014-01-01

    Virtual reality exposure therapy has been proposed as a viable alternative in the treatment of anxiety disorders, including social anxiety disorder. Therapists could benefit from extensive control of anxiety-eliciting stimuli during virtual exposure. Two stimulus controls are examined in this study: the social dialogue situation, and the dialogue feedback responses (negative or positive) between a human and a virtual character. In the first study, 16 participants were exposed to three virtual reality scenarios: a neutral virtual world, a blind date scenario, and a job interview scenario. Results showed a significant difference between the three virtual scenarios in the level of self-reported anxiety and heart rate. In the second study, 24 participants were exposed to a job interview scenario in a virtual environment where the ratio between negative and positive dialogue feedback responses of a virtual character was systematically varied on the fly. Results showed that within a dialogue, more positive dialogue feedback resulted in less self-reported anxiety, lower heart rate, and longer answers, while more negative dialogue feedback of the virtual character resulted in the opposite. The correlations between the dialogue stressor ratio on the one hand and the means of SUD score, heart rate and audio length in the eight dialogue conditions on the other showed a strong relationship: r(6) = 0.91, p = 0.002; r(6) = 0.76, p = 0.028; and r(6) = −0.94, p = 0.001, respectively. Furthermore, more anticipatory anxiety reported before exposure was found to coincide with more self-reported anxiety and shorter answers during the virtual exposure. These results demonstrate that social dialogues in a virtual environment can be effectively manipulated for therapeutic purposes. PMID:24671006

  5. Virtual Reality in Schools: The Ultimate Educational Technology.

    ERIC Educational Resources Information Center

    Reid, Robert D.; Sykes, Wylmarie

    1999-01-01

    Discusses the use of virtual reality as an educational tool. Highlights include examples of virtual reality in public schools that lead to a more active learning process, simulated environments, integrating virtual reality into any curriculum, benefits to teachers and students, and overcoming barriers to implementation. (LRW)

  6. Transportation planning: A virtual reality

    SciTech Connect

    Bradley, J.; Hefele, J.; Dolin, R.M.

    1994-07-01

    An important factor in the development of any base technology is generating it in such a way that these technologies will continue to be useful through systems upgrades and implementation philosophy metamorphoses. Base technologies of traffic engineering, including transportation modeling, traffic impact forecasting, traffic operation management, emergency situation routing and re-routing, and signal systems optimization, should all be designed with the future in mind. Advanced traffic engineering topics, such as Intelligent Vehicle Highway Systems, are designed with advanced engineering concepts such as rules-based design and artificial intelligence. All aspects of development of base technologies must include Total Quality Engineering as the primary factor in order to succeed. This philosophy for development of base technologies for the County of Los Alamos is being developed leveraging the resources of the Center for Advanced Engineering Technology (CAET) at the Los Alamos National Laboratory. The mission of the CAET is to develop next-generation engineering technology that supports the Los Alamos National Laboratory's mission and to transfer that technology to industry and academia. The CAET's goal is to promote industrial, academic, and government interactions in diverse areas of engineering technology, such as design, analysis, manufacturing, virtual enterprise, robotics, telepresence, rapid prototyping, and virtual environment technology. The Center is expanding, enhancing, and increasing core competencies at the Los Alamos National Laboratory. The CAET has three major thrust areas: development of base technologies, virtual environment technology applications, and educational outreach and training. Virtual environment technology immerses a user in a nonexistent or augmented environment for research or training purposes. Virtual environment technology illustrates the axiom, "The best way to learn is by doing."

  7. Physics and 3D in Flash Simulations: Open Source Reality

    NASA Astrophysics Data System (ADS)

    Harold, J. B.; Dusenbery, P.

    2009-12-01

    Over the last decade our ability to deliver simulations over the web has steadily advanced. The improvements in speed of the Adobe Flash engine, and the development of open source tools to expand it, allow us to deliver increasingly sophisticated simulation-based games through the browser, with no additional downloads required. In this paper we will present activities we are developing as part of two asteroids education projects: Finding NEO (funded through NSF and NASA SMD), and Asteroids! (funded through NSF). The first activity is Rubble!, an asteroid deflection game built on the open source Box2D physics engine. This game challenges players to push asteroids into safe orbits before they crash into the Earth. The Box2D engine allows us to go well beyond simple 2-body orbital calculations and incorporate “rubble piles”. These objects, which are representative of many asteroids, are composed of 50 or more individual rocks which gravitationally bind and separate in realistic ways. Even bombs can be modeled with sufficient physical accuracy to convince players of the hazards of trying to “blow up” incoming asteroids. The ability to easily build games based on underlying physical models allows us to address physical misconceptions in a natural way: by having the player operate in a world that directly collides with those misconceptions. Rubble! provides a particularly compelling example of this due to the variety of well-documented misconceptions regarding gravity. The second activity is a Light Curve challenge, which uses the open source PaperVision3D tools to analyze 3D asteroid models. The goal of this activity is to introduce the player to the concept of “light curves”, measurements of asteroid brightness over time which are used to calculate the asteroid’s period. These measurements can even be inverted to generate three-dimensional models of asteroids that are otherwise too small and distant to directly image. Through the use of the Paper…
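    The “rubble piles” described above are held together by pairwise Newtonian attraction between individual rocks. The actual game runs on Box2D in Flash; as a language-neutral sketch of only the underlying gravity step (function and parameter names are our own, not the game's code), the per-rock forces can be computed as:

```python
import math

def gravity_forces(bodies, G=6.674e-11):
    """Pairwise Newtonian attraction between rocks in a 2D rubble pile.

    bodies: list of dicts with keys 'm' (mass, kg) and 'pos' ((x, y), m).
    Returns one [fx, fy] force vector per body (newtons).
    """
    forces = [[0.0, 0.0] for _ in bodies]
    for i in range(len(bodies)):
        for j in range(i + 1, len(bodies)):
            dx = bodies[j]["pos"][0] - bodies[i]["pos"][0]
            dy = bodies[j]["pos"][1] - bodies[i]["pos"][1]
            r2 = dx * dx + dy * dy
            r = math.sqrt(r2)
            f = G * bodies[i]["m"] * bodies[j]["m"] / r2   # F = G m1 m2 / r^2
            fx, fy = f * dx / r, f * dy / r
            forces[i][0] += fx; forces[i][1] += fy   # i pulled toward j
            forces[j][0] -= fx; forces[j][1] -= fy   # equal and opposite
    return forces
```

    In a Box2D-style loop these forces would be applied to each body once per physics tick, letting the pile bind or separate on its own.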

  8. Acoustic simulation in realistic 3D virtual scenes

    NASA Astrophysics Data System (ADS)

    Gozard, Patrick; Le Goff, Alain; Naz, Pierre; Cathala, Thierry; Latger, Jean

    2003-09-01

    The simulation workshop CHORALE developed in collaboration with OKTAL SE company for the French MoD is used by government services and industrial companies for weapon system validation and qualification trials in the infrared domain. The main operational reference for CHORALE is the assessment of the infrared guidance system of the Storm Shadow missile French version, called Scalp. The use of the CHORALE workshop is now extended to the acoustic domain. The main objective is the simulation of the detection of moving vehicles in realistic 3D virtual scenes. This article briefly describes the acoustic model in CHORALE. The 3D scene is described by a set of polygons. Each polygon is characterized by its acoustic resistivity or its complex impedance. Sound sources are associated with moving vehicles and are characterized by their spectra and directivities. A microphone sensor is defined by its position, its frequency band and its sensitivity. The purpose of the acoustic simulation is to calculate the incoming acoustic pressure on microphone sensors. CHORALE is based on a generic ray tracing kernel. This kernel possesses original capabilities: computation time is nearly independent of the scene complexity, especially the number of polygons; databases are enhanced with precise physical data; and special antialiasing mechanisms have been developed that make it possible to manage very fine details. The ray tracer takes into account the wave geometrical divergence and the atmospheric transmission. The sound wave refraction is simulated and rays cast in the 3D scene are curved according to the air temperature gradient. Finally, sound diffraction by edges (hills, walls, etc.) is also taken into account.
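    The incoming-pressure computation can be illustrated, in heavily simplified form, by combining spherical geometrical divergence with a linear atmospheric-absorption term. This sketch omits CHORALE's ray tracing, refraction and diffraction entirely; the function name and the 1 m reference distance are assumptions, not the workshop's API:

```python
import math

def received_level(source_level_db, r, alpha_db_per_m):
    """Sound pressure level at a microphone r metres from a point source.

    source_level_db: level at a 1 m reference distance (dB SPL).
    Applies spherical (1/r) geometrical divergence plus a linear
    atmospheric-absorption term; refraction and diffraction are omitted.
    """
    divergence = 20.0 * math.log10(r / 1.0)   # 6 dB per doubling of distance
    absorption = alpha_db_per_m * r           # frequency-dependent in reality
    return source_level_db - divergence - absorption
```

    For example, a 94 dB source heard at 10 m with no absorption arrives at 74 dB, the familiar 20 dB drop per decade of distance.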

  9. 3D virtual colonoscopy with real-time volume rendering

    NASA Astrophysics Data System (ADS)

    Wan, Ming; Li, Wei J.; Kreeger, Kevin; Bitter, Ingmar; Kaufman, Arie E.; Liang, Zhengrong; Chen, Dongqing; Wax, Mark R.

    2000-04-01

    In our previous work, we developed a virtual colonoscopy system on a high-end 16-processor SGI Challenge with an expensive hardware graphics accelerator. The goal of this work is to port the system to a low cost PC in order to increase its availability for mass screening. Recently, Mitsubishi Electric has developed a volume-rendering PC board, called VolumePro, which includes 128 MB of RAM and the vg500 rendering chip. The vg500 chip, based on Cube-4 technology, can render a 256³ volume at 30 frames per second. High image quality of volume rendering inside the colon is guaranteed by the full lighting model and 3D interpolation supported by the vg500 chip. However, the VolumePro board lacks some features required by our interactive colon navigation. First, VolumePro currently does not support perspective projection, which is paramount for interior colon navigation. Second, the patient colon data is usually much larger than 256³ and cannot be rendered in real-time. In this paper, we present our solutions to these problems, including simulated perspective projection and axis-aligned boxing techniques, and demonstrate the high performance of our virtual colonoscopy system on low cost PCs.
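    The axis-aligned boxing technique is not detailed in the abstract; a plausible minimal sketch is to split the oversized colon volume into axis-aligned bricks and submit only those near the current viewpoint to the renderer. Everything here (the distance-based selection criterion and the names) is illustrative, not the authors' implementation:

```python
import math

def visible_blocks(volume_dims, block=256, camera=(0.0, 0.0, 0.0), radius=300.0):
    """Split a large volume into axis-aligned blocks of block^3 voxels and
    keep only those whose centre lies within `radius` voxels of the camera.

    A stand-in for axis-aligned boxing; the selection rule is our assumption.
    """
    counts = [math.ceil(d / block) for d in volume_dims]
    keep = []
    for i in range(counts[0]):
        for j in range(counts[1]):
            for k in range(counts[2]):
                centre = ((i + 0.5) * block,
                          (j + 0.5) * block,
                          (k + 0.5) * block)
                if math.dist(centre, camera) <= radius:
                    keep.append((i, j, k))
    return keep
```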

  10. Building virtual 3D bone fragment models to control diaphyseal fracture reduction

    NASA Astrophysics Data System (ADS)

    Leloup, Thierry; Schuind, Frederic; Lasudry, Nadine; Van Ham, Philippe

    1999-05-01

    Most fractures of the long bones are displaced and need to be surgically reduced. External fixation is often used, but the crucial point of this technique is the control of reduction, which is performed under an image intensifier ("brilliance amplifier"). This system, which instantly provides an X-ray image, has many disadvantages: it exposes the patient and the surgical team to frequent irradiation, its visual field is limited, the supplied images are distorted, and it gives only 2D information. Consequently, the reduction is occasionally imperfect although intraoperatively it appears acceptable. Using the pins inserted in each fragment as markers and an optical tracker, it is possible to build a virtual 3D model for each principal fragment and to follow its movement during the reduction. This system will supply a 3D image of the fracture in real time and without irradiation. The image intensifier could then be replaced by such a virtual reality system to provide the surgeon with an accurate tool facilitating the reduction of the fracture. The purpose of this work is to show how to build the 3D model for each principal bone fragment.
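    The abstract does not describe the registration algorithm, but a fragment's rigid pose can be recovered from three non-collinear tracked pin markers by building an orthonormal frame in the reference and current marker sets. A self-contained sketch (all names are ours; a real system would use a least-squares fit over noisy markers):

```python
import math

def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _add(a, b): return tuple(x + y for x, y in zip(a, b))
def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _scale(a, s): return tuple(x * s for x in a)
def _norm(a): return _scale(a, 1.0 / math.sqrt(_dot(a, a)))
def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def _frame(p1, p2, p3):
    """Orthonormal frame (three column vectors) from three non-collinear points."""
    u = _norm(_sub(p2, p1))
    w = _sub(p3, p1)
    v = _norm(_sub(w, _scale(u, _dot(w, u))))   # Gram-Schmidt step
    return u, v, _cross(u, v)

def fragment_pose(ref, cur):
    """Rotation matrix (as rows) and translation mapping the reference marker
    positions `ref` to the currently tracked positions `cur` (3 pins each)."""
    e = _frame(*ref)
    f = _frame(*cur)
    # R = F E^T with columns f_k, e_k: R[i][j] = sum_k f_k[i] * e_k[j]
    R = [[sum(f[k][i] * e[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    t = _sub(cur[0], tuple(_dot(R[i], ref[0]) for i in range(3)))
    return R, t

def transform(R, t, p):
    """Apply the recovered pose to any point of the fragment model."""
    return _add(tuple(_dot(R[i], p) for i in range(3)), t)
```

    Once the pose is known, the whole preoperative 3D fragment model can be redrawn at its tracked position without any further irradiation.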

  11. Virtual environment display for a 3D audio room simulation

    NASA Astrophysics Data System (ADS)

    Chapin, William L.; Foster, Scott

    1992-06-01

    Recent developments in virtual 3D audio and synthetic aural environments have produced a complex acoustical room simulation. The acoustical simulation models a room with walls, ceiling, and floor of selected sound reflecting/absorbing characteristics and unlimited independent localizable sound sources. This non-visual acoustic simulation, implemented with four audio Convolvotrons™ by Crystal River Engineering and coupled to the listener with a Polhemus Isotrak™, tracking the listener's head position and orientation, and stereo headphones returning binaural sound, is quite compelling to most listeners with eyes closed. This immersive effect should be reinforced when properly integrated into a full, multi-sensory virtual environment presentation. This paper discusses the design of an interactive, visual virtual environment, complementing the acoustic model and specified to: 1) allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; 2) reinforce the listener's feeling of telepresence into the acoustical environment with visual and proprioceptive sensations; 3) enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and 4) serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, is discussed through the development of four iterative configurations. The installed system implements a head-coupled, wide-angle, stereo-optic tracker/viewer and multi-computer simulation control. The portable demonstration system implements a head-mounted wide-angle, stereo-optic display, separate head and pointer electro-magnetic position trackers, a heterogeneous parallel graphics processing system, and object-oriented C++ program code.
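    The Convolvotron's HRTF convolution is proprietary, but the dominant horizontal-plane localization cue it reproduces, the interaural time difference, can be approximated with the classic Woodworth spherical-head formula ITD = (a/c)(θ + sin θ). A sketch with an assumed head radius of 8.75 cm:

```python
import math

def itd_seconds(azimuth_deg, head_radius=0.0875, c=343.0):
    """Interaural time difference for a source at azimuth_deg
    (0 = straight ahead, +90 = directly to the right),
    Woodworth spherical-head approximation: ITD = (a / c) * (theta + sin theta).
    Head radius (m) and speed of sound (m/s) are assumed typical values.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius / c) * (theta + math.sin(theta))
```

    A binaural renderer would delay the far-ear signal by this amount (and attenuate it) before headphone playback; a full HRTF additionally encodes spectral cues.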

  12. Enabling virtual reality on mobile devices: enhancing students' learning experience

    NASA Astrophysics Data System (ADS)

    Feisst, Markus E.

    2011-05-01

    Nowadays, mobile devices are increasingly powerful in terms of processing power, main memory and storage, as well as graphical output capability, with 3D support mostly via OpenGL ES. Modern devices can therefore run Virtual Reality (VR) applications. Most students own (or will own in the future) one of these more powerful mobile devices, and those who do already use them to communicate (SMS, Twitter, etc.) and/or to listen to podcasts. Taking this into account, it makes sense to improve the students' learning experience by enabling mobile devices to display VR content.

  13. Virtual reality for dragline planners

    SciTech Connect

    Cobcroft, T.

    2007-03-15

    3d-Dig is an invaluable mine planning and communication tool, developed by Earth Technology Pty Ltd, that makes it possible to easily communicate a mine plan through the use of animations and other graphics. An Australian company has been using it to plan pits and strips in detail for up to five years in advance; a US operator is using it to optimise dragline stripping around inside corners and to accurately plan the traverse of ramps. The new system offers a better prediction of rehandled volumes, linear coal advance and dig time within a strip. It is useful for optimising waste stripping and the timing of uncovered coal to enhance blending and shipping reliability. It presents volumetric, spoil placement and positioning data while generating animations that communicate the plan. 5 figs.

  14. Fully Three-Dimensional Virtual-Reality System

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1994-01-01

    Proposed virtual-reality system presents visual displays to simulate free flight in three-dimensional space. System, virtual space pod, is testbed for control and navigation schemes. Unlike most virtual-reality systems, virtual space pod would not depend for orientation on ground plane, which hinders free flight in three dimensions. Space pod provides comfortable seating, convenient controls, and dynamic virtual-space images for virtual traveler. Controls include buttons plus joysticks with six degrees of freedom.

  15. An optical tracking system for virtual reality

    NASA Astrophysics Data System (ADS)

    Hrimech, Hamid; Merienne, Frederic

    2009-03-01

    In this paper we present a low-cost 3D tracking system which we have developed and tested in order to move away from traditional 2D interaction techniques (keyboard and mouse) in an attempt to improve the user's experience while using a CVE. Such a tracking system is used to implement 3D interaction techniques that augment the user experience, promote the user's sense of transportation into the virtual world, and increase users' awareness of their partners. The tracking system is a passive optical tracking system using stereoscopy, a technique allowing the reconstruction of three-dimensional information from a pair of images. We have currently deployed our 3D tracking system on a collaborative research platform for investigating 3D interaction techniques in CVEs.
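    For a calibrated and rectified camera pair, stereoscopic reconstruction of the kind described reduces to triangulation from disparity: Z = fB/d. A minimal sketch (the system's actual calibration and matching pipeline is not given in the abstract; focal length and baseline here are illustrative inputs):

```python
def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Depth of a point seen by a rectified stereo pair:
    Z = f * B / d, with disparity d = x_left - x_right (pixels)."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity

def point_3d(x_left, y, x_right, focal_px, baseline_m):
    """Back-project to metric camera coordinates (origin at the left camera)."""
    z = depth_from_disparity(x_left, x_right, focal_px, baseline_m)
    return (x_left * z / focal_px, y * z / focal_px, z)
```

    For example, with an 800 px focal length and a 10 cm baseline, a 20 px disparity places the point 4 m from the cameras.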

  16. Applying Virtual Reality to commercial Edutainment

    NASA Technical Reports Server (NTRS)

    Grissom, F.; Goza, Sharon P.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR), when defined as a computer-generated, immersive, three-dimensional graphics environment which provides varying degrees of interactivity, remains an expensive, highly specialized application, yet to find its way into the school, home, or business. As a novel approach to a theme park-type attraction, though, its use can be justified. This paper describes how a virtual reality 'tour of the human digestive system' was created for the Omniplex Science Museum of Oklahoma City, Oklahoma. The customer's main objectives were: (1) to educate; (2) to entertain; (3) to draw visitors; and (4) to generate revenue. The 'Edutainment' system ultimately delivered met these goals. As more such systems come into existence, the resulting library of licensable programs will greatly reduce development costs to individual institutions.

  17. Feedback from video for virtual reality Navigation

    SciTech Connect

    Tsap, L V

    2000-10-27

    Important preconditions for wide acceptance of virtual reality (VR) systems include their comfort, ease and naturalness of use. Most existing trackers suffer from discomfort-related issues. For example, body-based trackers (hand controllers, joysticks, helmet attachments, etc.) restrict spontaneity and naturalness of motion, while ground-based devices (e.g., hand controllers) limit the workspace by literally binding an operator to the ground. There are similar problems with controls. This paper describes using real-time video with registered depth information (from a commercially available camera) for virtual reality navigation. A camera-based setup can replace cumbersome trackers. The method includes selective depth processing for increased speed, and a robust skin-color segmentation to account for illumination variations.

  18. Objective and subjective quality assessment of geometry compression of reconstructed 3D humans in a 3D virtual room

    NASA Astrophysics Data System (ADS)

    Mekuria, Rufael; Cesar, Pablo; Doumanis, Ioannis; Frisiello, Antonella

    2015-09-01

    Compression of 3D object based video is relevant for 3D immersive applications. Nevertheless, the perceptual aspects of the degradation introduced by codecs for meshes and point clouds are not well understood. In this paper we evaluate the subjective and objective degradations introduced by such codecs in a state-of-the-art 3D immersive virtual room. In the 3D immersive virtual room, users are captured with multiple cameras, and their surfaces are reconstructed as photorealistic colored/textured 3D meshes or point clouds. To test the perceptual effect of compression and transmission, we render degraded versions with different frame rates in different contexts (near/far) in the scene. A quantitative subjective study with 16 users shows that negligible distortion of decoded surfaces compared to the original reconstructions can be achieved in the 3D virtual room. In addition, a qualitative task-based analysis in a full prototype field trial shows increased presence, emotion, and user and state recognition for the reconstructed 3D human representation compared to animated computer avatars.
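    Objective geometry metrics for such codecs are typically point-to-point distances between the original and decoded surfaces. As an illustrative stand-in (not necessarily the metric used in the paper), a brute-force symmetric nearest-neighbour RMSE between two point clouds:

```python
import math

def _nn_dist2(p, cloud):
    """Squared distance from point p to its nearest neighbour in cloud."""
    return min((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 + (p[2] - q[2]) ** 2
               for q in cloud)

def symmetric_rmse(a, b):
    """Symmetric point-to-point RMSE between point clouds a and b:
    average of the two one-sided nearest-neighbour RMS errors.
    Brute force, O(len(a) * len(b)); real evaluations use a k-d tree."""
    rms_ab = math.sqrt(sum(_nn_dist2(p, b) for p in a) / len(a))
    rms_ba = math.sqrt(sum(_nn_dist2(q, a) for q in b) / len(b))
    return 0.5 * (rms_ab + rms_ba)
```

    Point-cloud codec evaluations commonly report this kind of distance converted to a PSNR against the bounding-box diagonal.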

  19. Participatory GIS: Experimentations for a 3D Social Virtual Globe

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Zamboni, G.

    2013-08-01

    The dawn of GeoWeb 2.0, the geographic extension of Web 2.0, has opened new possibilities in terms of online dissemination and sharing of geospatial contents, thus laying the foundations for a fruitful development of Participatory GIS (PGIS). The purpose of the study is to investigate the extension of PGIS applications, which are quite mature in the traditional bi-dimensional framework, up to the third dimension. More specifically, the system should couple a powerful 3D visualization with increased public participation by means of a tool allowing data collection from mobile devices (e.g. smartphones and tablets). The PGIS application, built using the open source NASA World Wind virtual globe, is focused on the cultural and tourism heritage of Como city, located in Northern Italy. An authentication mechanism was implemented, which allows users to create and manage customized projects through cartographic mash-ups of Web Map Service (WMS) layers. Saved projects populate a catalogue which is available to the entire community. Together with historical maps and the current cartography of the city, the system is also able to manage geo-tagged multimedia data, which come from user field-surveys performed through mobile devices and report POIs (Points Of Interest). Each logged-in user can then contribute to POI characterization by adding textual and multimedia information (e.g. images, audios and videos) directly on the globe. All in all, the resulting application allows users to create and share contributions as usually happens on social platforms, additionally providing a realistic 3D representation enhancing the expressive power of data.
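    The cartographic mash-ups described are built from standard WMS GetMap requests. A hedged sketch of constructing such a request (the endpoint and layer names are placeholders, not the project's actual services):

```python
from urllib.parse import urlencode

def getmap_url(endpoint, layers, bbox, size=(800, 600),
               crs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL.

    endpoint and layers are hypothetical placeholders; bbox is
    (min_lat, min_lon, max_lat, max_lon), the WMS 1.3.0 axis order
    for EPSG:4326.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical layer over the Como area:
url = getmap_url("https://example.org/wms", ["historical_maps"],
                 (45.8, 9.0, 45.9, 9.1))
```

    A virtual-globe client would tile such requests and drape the returned images over the terrain.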

  20. Presence Pedagogy: Teaching and Learning in a 3D Virtual Immersive World

    ERIC Educational Resources Information Center

    Bronack, Stephen; Sanders, Robert; Cheney, Amelia; Riedl, Richard; Tashner, John; Matzen, Nita

    2008-01-01

    As the use of 3D immersive virtual worlds in higher education expands, it is important to examine which pedagogical approaches are most likely to bring about success. AET Zone, a 3D immersive virtual world in use for more than seven years, is one embodiment of pedagogical innovation that capitalizes on what virtual worlds have to offer to social…

  1. Sound For Animation And Virtual Reality

    NASA Technical Reports Server (NTRS)

    Hahn, James K.; Docter, Pete; Foster, Scott H.; Mangini, Mark; Myers, Tom; Wenzel, Elizabeth M.; Null, Cynthia (Technical Monitor)

    1995-01-01

    Sound is an integral part of the experience in computer animation and virtual reality. In this course, we will present some of the important technical issues in sound modeling, rendering, and synchronization as well as the "art" and business of sound that are being applied in animations, feature films, and virtual reality. The central theme is to bring leading researchers and practitioners from various disciplines to share their experiences in this interdisciplinary field. The course will give the participants an understanding of the problems and techniques involved in producing and synchronizing sounds, sound effects, dialogue, and music. The problem spans a number of domains including computer animation and virtual reality. Since sound has been an integral part of animations and films much longer than for computer-related domains, we have much to learn from traditional animation and film production. By bringing leading researchers and practitioners from a wide variety of disciplines, the course seeks to give the audience a rich mixture of experiences. It is expected that the audience will be able to apply what they have learned from this course in their research or production.

  2. Virtual reality disaster training: translation to practice.

    PubMed

    Farra, Sharon L; Miller, Elaine T; Hodgson, Eric

    2015-01-01

    Disaster training is crucial to the mitigation of both mortality and morbidity associated with disasters. Just as clinical practice needs to be grounded in evidence, effective disaster education is dependent upon the development and use of andragogic and pedagogic evidence. Educational research findings must be transformed into useable education strategies. Virtual reality simulation is a teaching methodology that has the potential to be a powerful educational tool. The purpose of this article is to translate research findings related to the use of virtual reality simulation in disaster training into education practice. The Ace Star Model serves as a valuable framework to translate the VRS teaching methodology and improve disaster training of healthcare professionals. Using the Ace Star Model as a framework to put evidence into practice, strategies for implementing a virtual reality simulation are addressed. Practice guidelines, implementation recommendations, integration to practice and evaluation are discussed. It is imperative that health educators provide more exemplars of how research evidence can be moved through the various stages of the model to advance practice and sustain learning outcomes. PMID:24063793

  3. Virtual reality training for health-care professionals.

    PubMed

    Mantovani, Fabrizia; Castelnuovo, Gianluca; Gaggioli, Andrea; Riva, Giuseppe

    2003-08-01

    Emerging changes in health-care delivery are having a significant impact on the structure of health-care professionals' education. Today it is recognized that medical knowledge doubles every 6-8 years, with new medical procedures emerging everyday. While the half-life of medical information is so short, the average physician practices 30 years and the average nurse 40 years. Continuing education thus represents an important challenge to face. Recent advances in educational technology are offering an increasing number of innovative learning tools. Among these, Virtual Reality represents a promising area with high potential of enhancing the training of health-care professionals. Virtual Reality Training can provide a rich, interactive, engaging educational context, thus supporting experiential learning-by-doing; it can, in fact, contribute to raise interest and motivation in trainees and to effectively support skills acquisition and transfer, since the learning process can be settled within an experiential framework. Current virtual training applications for health-care differ a lot as to both their technological/multimedia sophistication and to the types of skills trained, varying for example from telesurgical applications to interactive simulations of human body and brain, to virtual worlds for emergency training. Other interesting applications include the development of immersive 3D environments for training psychiatrists and psychologists in the treatment of mental disorders. This paper has the main aim of discussing the rationale and main benefits for the use of virtual reality in health-care education and training. Significant research and projects carried out in this field will also be presented, followed by discussion on key issues concerning current limitations and future development directions. PMID:14511451

  4. Selected Applications of Virtual Reality in Manufacturing

    NASA Astrophysics Data System (ADS)

    Novak-Marcincin, Jozef

    2011-01-01

    Virtual reality (VR) has become an important and useful tool in science and engineering. VR applications cover a wide range of industrial areas from product design to analysis, from product prototyping to manufacturing. The design and manufacturing of a product can be viewed, evaluated and improved in a virtual environment before its prototype is made, which is an enormous cost saving. Virtual Manufacturing (VM) is the use of computer models and simulations of manufacturing processes to aid in the design and production of manufactured products. VM is the use of manufacturing-based simulations to optimize the design of product and processes for a specific manufacturing goal such as: design for assembly; quality; lean operations; and/or flexibility.

  5. Virtual reality applications to agoraphobia: a protocol.

    PubMed

    Cárdenas, Georgina; Muñoz, Sandra; González, Maribel; Uribarren, Guillermo

    2006-04-01

    Recently, educators and instructional designers have focused on the development and implementation of virtual learning environments that effectively combine theoretical and applied knowledge to teach university students. One of the thrusts of the Psychology Virtual Teaching Laboratory, in collaboration with the IXTLI observatory, is to develop dissemination programs that promote the insertion of virtual reality (VR) technologies applied to rehabilitation into clinical practice. This paper describes the development of (1) agoraphobia VR learning objects to be used as teaching support tools in class and (2) a multimedia teaching program that incorporates digital video and VR scenarios addressed to students in the field of mental health. Promotion among professors and students of the use of this technology will allow us to initiate research in our country as well as to validate contextualized applications for our culture, thereby contributing new advances in this field. PMID:16640489

  6. Virtual reality in the operating room of the future.

    PubMed

    Müller, W; Grosskopf, S; Hildebrand, A; Malkewitz, R; Ziegler, R

    1997-01-01

    In cooperation with the Max-Delbrück-Centrum/Robert-Rössle-Klinik (MDC/RRK) in Berlin, the Fraunhofer Institute for Computer Graphics is currently designing and developing a scenario for the operating room of the future. The goal of this project is to integrate new analysis, visualization and interaction tools in order to optimize and refine tumor diagnostics and therapy in combination with laser technology and remote stereoscopic video transfer. Hence, a human 3-D reference model is reconstructed using CT, MR, and anatomical cryosection images from the National Library of Medicine's Visible Human Project. Applying segmentation algorithms and surface-polygonization methods, a 3-D representation is obtained. In addition, a "fly-through" of the virtual patient is realized using 3-D input devices (data glove, tracking system, 6-DOF mouse). In this way, the surgeon can experience really new perspectives of the human anatomy. Moreover, using a virtual cutting plane, any cut of the CT volume can be interactively placed and visualized in realtime. In conclusion, this project delivers visions for the application of effective visualization and VR systems. Commonly known as Virtual Prototyping and applied by the automotive industry long ago, this project shows that the use of VR techniques can also prototype an operating room. After evaluating design and functionality of the virtual operating room, MDC plans to build real ORs in the near future. The use of VR techniques provides a more natural interface for the surgeon in the OR (e.g., controlling interactions by voice input). Besides preoperative planning, future work will focus on supporting the surgeon in performing surgical interventions. An optimal synthesis of real and synthetic data, and the inclusion of visual, aural, and tactile senses in virtual environments, can meet these requirements. This Augmented Reality could represent the environment for the surgeons of tomorrow. PMID:10173059

  7. I'm Not a Real Doctor, but I Play One in Virtual Reality: Implications of Virtual Reality for Judgments about Reality.

    ERIC Educational Resources Information Center

    Shapiro, Michael A.; McDonald, Daniel G.

    1992-01-01

    Shows that communication and social psychology research in the past 100 years have identified 2 different aspects of reality evaluation. Outlines the critical elements to form a theory of media reality effects. Extends that theory to include virtual reality, and shows how virtual reality will be an important tool for investigating these effects.…

  8. DJ Sim: a virtual reality DJ simulation game

    NASA Astrophysics Data System (ADS)

    Tang, Ka Yin; Loke, Mei Hwan; Chin, Ching Ling; Chua, Gim Guan; Chong, Jyh Herng; Manders, Corey; Khan, Ishtiaq Rasool; Yuan, Miaolong; Farbiz, Farzam

    2009-02-01

    This work describes the process of developing a 3D Virtual Reality (VR) DJ simulation game intended to be displayed on a stereoscopic display. Using a DLP projector and shutter glasses, the user of the system plays a game in which he or she is a DJ in a night club. The night club's music is playing, and the DJ is "scratching" in correspondence to this music. Much in the flavor of Guitar Hero or Dance Dance Revolution, a virtual turntable projects information about how the user should perform. The user needs only a small set of hand gestures, corresponding to the turntable scratch movements, to play the game. As the music plays, a series of moving arrows approaching the DJ's turntable instructs the user when and how to perform the scratches.

  9. Development of real-time motion capture system for 3D on-line games linked with virtual character

    NASA Astrophysics Data System (ADS)

    Kim, Jong Hyeong; Ryu, Young Kee; Cho, Hyung Suck

    2004-10-01

    Motion tracking is becoming an essential part of entertainment, medical, sports, education and industrial applications with the development of 3-D virtual reality. Virtual human characters in digital animation and game applications have traditionally been controlled through interface devices such as mice, joysticks and MIDI sliders. These devices cannot make a virtual human character move smoothly and naturally. Furthermore, high-end human motion capture systems on the commercial market are expensive and complicated. In this paper, we propose a practical and fast motion capture system consisting of optical sensors, and link the captured data to a 3-D game character in real time. The prototype experimental setup was successfully applied to a boxing game, which requires very fast movement of the human character.

  10. Intelligent virtual reality in the setting of fuzzy sets

    NASA Technical Reports Server (NTRS)

    Dockery, John; Littman, David

    1992-01-01

    The authors have previously introduced the concept of virtual reality worlds governed by artificial intelligence. Creation of an intelligent virtual reality was further proposed as a universal interface for the handicapped. This paper extends consideration of intelligent virtual reality to a context in which fuzzy set principles are explored as a major tool for implementing the theory in the domain of applications for the disabled.

  11. Virtual Reality: A Dream Come True or a Nightmare.

    ERIC Educational Resources Information Center

    Cornell, Richard; Bailey, Dan

    Virtual Reality (VR) is a new medium which allows total stimulation of one's senses through human/computer interfaces. VR has applications in training simulators, nano-science, medicine, entertainment, electronic technology, and manufacturing. This paper focuses on some current and potential problems of virtual reality and virtual environments…

  12. Importance of Virtual Reality to Virtual Reality Exposure Therapy, Study Design of a Randomized Trial.

    PubMed

    McLay, Robert N; Baird, Alicia; Murphy, Jennifer; Deal, William; Tran, Lily; Anson, Heather; Klam, Warren; Johnston, Scott

    2015-01-01

    Post Traumatic Stress Disorder (PTSD) can be a debilitating problem in service members who have served in Iraq or Afghanistan. Virtual Reality Exposure Therapy (VRET) is one of the few interventions demonstrated in randomized controlled trials to be effective for PTSD in this population. There are theoretical reasons to expect that Virtual Reality (VR) adds to the effectiveness of exposure therapy, but there is also added expense and difficulty in using VR. Described is a trial comparing outcomes from VRET and a control exposure therapy (CET) protocol in service members with PTSD. PMID:26799904

  13. iVirtualWorld: A Domain-Oriented End-User Development Environment for Building 3D Virtual Chemistry Experiments

    ERIC Educational Resources Information Center

    Zhong, Ying

    2013-01-01

    Virtual worlds are well-suited for building virtual laboratories for educational purposes to complement hands-on physical laboratories. However, educators may face technical challenges because developing virtual worlds requires skills in programming and 3D design. Current virtual world building tools are developed for users who have programming…

  14. Virtual reality and consciousness inference in dreaming.

    PubMed

    Hobson, J Allan; Hong, Charles C-H; Friston, Karl J

    2014-01-01

    This article explores the notion that the brain is genetically endowed with an innate virtual reality generator that - through experience-dependent plasticity - becomes a generative or predictive model of the world. This model, which is most clearly revealed in rapid eye movement (REM) sleep dreaming, may provide the theater for conscious experience. Functional neuroimaging evidence for brain activations that are time-locked to rapid eye movements (REMs) endorses the view that waking consciousness emerges from REM sleep - and dreaming lays the foundations for waking perception. In this view, the brain is equipped with a virtual model of the world that generates predictions of its sensations. This model is continually updated and entrained by sensory prediction errors in wakefulness to ensure veridical perception, but not in dreaming. In contrast, dreaming plays an essential role in maintaining and enhancing the capacity to model the world by minimizing model complexity and thereby maximizing both statistical and thermodynamic efficiency. This perspective suggests that consciousness corresponds to the embodied process of inference, realized through the generation of virtual realities (in both sleep and wakefulness). In short, our premise or hypothesis is that the waking brain engages with the world to predict the causes of sensations, while in sleep the brain's generative model is actively refined so that it generates more efficient predictions during waking. We review the evidence in support of this hypothesis - evidence that grounds consciousness in biophysical computations whose neuronal and neurochemical infrastructure has been disclosed by sleep research. PMID:25346710

  15. Virtual reality and consciousness inference in dreaming

    PubMed Central

    Hobson, J. Allan; Hong, Charles C.-H.; Friston, Karl J.

    2014-01-01

    This article explores the notion that the brain is genetically endowed with an innate virtual reality generator that – through experience-dependent plasticity – becomes a generative or predictive model of the world. This model, which is most clearly revealed in rapid eye movement (REM) sleep dreaming, may provide the theater for conscious experience. Functional neuroimaging evidence for brain activations that are time-locked to rapid eye movements (REMs) endorses the view that waking consciousness emerges from REM sleep – and dreaming lays the foundations for waking perception. In this view, the brain is equipped with a virtual model of the world that generates predictions of its sensations. This model is continually updated and entrained by sensory prediction errors in wakefulness to ensure veridical perception, but not in dreaming. In contrast, dreaming plays an essential role in maintaining and enhancing the capacity to model the world by minimizing model complexity and thereby maximizing both statistical and thermodynamic efficiency. This perspective suggests that consciousness corresponds to the embodied process of inference, realized through the generation of virtual realities (in both sleep and wakefulness). In short, our premise or hypothesis is that the waking brain engages with the world to predict the causes of sensations, while in sleep the brain’s generative model is actively refined so that it generates more efficient predictions during waking. We review the evidence in support of this hypothesis – evidence that grounds consciousness in biophysical computations whose neuronal and neurochemical infrastructure has been disclosed by sleep research. PMID:25346710

  16. Sensorimotor Training in Virtual Reality: A Review

    PubMed Central

    Adamovich, Sergei V.; Fluet, Gerard G.; Tunik, Eugene; Merians, Alma S.

    2010-01-01

    Recent experimental evidence suggests that rapid advancement of virtual reality (VR) technologies has great potential for the development of novel strategies for sensorimotor training in neurorehabilitation. First, we discuss what adaptive and engaging virtual environments can provide in terms of the massive and intensive sensorimotor stimulation needed to induce brain reorganization. Second, discrepancies between the veridical and virtual feedback can be introduced in VR to facilitate activation of targeted brain networks, which in turn can potentially speed up the recovery process. Here we review the existing experimental evidence regarding the beneficial effects of training in virtual environments on the recovery of function in the areas of gait, upper extremity function and balance, in various patient populations. We also discuss possible mechanisms underlying these effects. We feel that future research in the area of virtual rehabilitation should follow several important paths. Imaging studies to evaluate the effects of sensory manipulation on brain activation patterns and the effect of various training parameters on long term changes in brain function are needed to guide future clinical inquiry. Larger clinical studies are also needed to establish the efficacy of sensorimotor rehabilitation using VR approaches in various clinical populations and most importantly, to identify VR training parameters that are associated with optimal transfer into real-world functional improvements. PMID:19713617

  17. Summer Students in Virtual Reality: A Pilot Study on Educational Applications of Virtual Reality Technology.

    ERIC Educational Resources Information Center

    Bricken, Meredith; Byrne, Chris M.

    The goal of this study was to take a first step in evaluating the potential of virtual reality (VR) as a learning environment. The context of the study was The Technology Academy, a technology-oriented summer day camp for students ages 5-18, where student activities center around hands-on exploration of new technology (e.g., robotics, MIDI digital…

  18. A rapid algorithm for realistic human reaching and its use in a virtual reality system

    NASA Technical Reports Server (NTRS)

    Aldridge, Ann; Pandya, Abhilash; Goldsby, Michael; Maida, James

    1994-01-01

    The Graphics Analysis Facility (GRAF) at JSC has developed a rapid algorithm for computing realistic human reaching. The algorithm was applied to GRAF's anthropometrically correct human model and used in a 3D computer graphics system and a virtual reality system. The nature of the algorithm and its uses are discussed.

  19. Linking Audio and Visual Information while Navigating in a Virtual Reality Kiosk Display

    ERIC Educational Resources Information Center

    Sullivan, Briana; Ware, Colin; Plumlee, Matthew

    2006-01-01

    3D interactive virtual reality museum exhibits should be easy to use, entertaining, and informative. If the interface is intuitive, it will allow the user more time to learn the educational content of the exhibit. This research deals with interface issues concerning activating audio descriptions of images in such exhibits while the user is…

  20. Photographer: Digital Telepresence: Dr Murial Ross's Virtual Reality Application for Neuroscience

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Photographer: Digital Telepresence: Dr Murial Ross's Virtual Reality Application for Neuroscience Research (Biocomputation), used to study human disorders of balance and space motion sickness. Shown here is a 3D reconstruction of a nerve ending in the inner ear, nature's wiring of the balance organs.

  1. Virtual reality for the treatment of autism.

    PubMed

    Strickland, D

    1997-01-01

    Autism is a mental disorder which has received attention in several unrelated studies using virtual reality. One of the first attempts, at Tokyo University, was to diagnose children with special needs using a sandbox playing technique. Although operating the computer controls proved too difficult for the individuals with autism in the Tokyo study, research at the University of Nottingham, UK, has been successful in using VR as a learning aid for children with a variety of disorders, including autism. Both centers used flat-screen computer systems with virtual scenes. Another study, which concentrated on using VR as a learning aid with an immersive headset system, is described in detail in this chapter. Perhaps because of the seriousness of the disorder and the lack of effective treatments, autism has received more study than attention deficit disorders, although both would appear to benefit from many of the same technology features. PMID:10184809

  2. Virtual reality for automotive design evaluation

    NASA Technical Reports Server (NTRS)

    Dodd, George G.

    1995-01-01

    A general description of Virtual Reality technology and possible applications was given from publicly available material. A video tape was shown demonstrating the use of multiple large-screen stereoscopic displays, configured in a 10' x 10' x 10' room, to allow a person to evaluate and interact with a vehicle which exists only as mathematical data, and is made only of light. The correct viewpoint of the vehicle is maintained by tracking special glasses worn by the subject. Interior illumination was changed by moving a virtual light around by hand; interior colors are changed by pointing at a color on a color palette, then pointing at the desired surface to change. We concluded by discussing research needed to move this technology forward.

  3. Virtual reality in construction industry: a requirement compatibility analysis approach

    NASA Astrophysics Data System (ADS)

    Ye, Jilin; Shulgin, Boris V.; Raja, Vinesh H.

    2006-02-01

    Virtual Reality (VR) is regarded as a high-end user-computer interface that involves real-time simulation and interaction through multiple sensory channels. It is expected that VR will reshape the interfaces between users and computer technology by offering new approaches for the communication of information, the visualisation of processes and the creative expression of ideas. VR applications in construction have a relatively long history, but success stories are not often heard. In this paper, the authors explore how much further the construction industry could be supported by new three dimensional (3D) VR technologies in different construction processes. Design information in the construction industry is discussed first, followed by a detailed analysis of construction processes. A questionnaire survey was conducted, and its results are presented and discussed. As an investigation into the application of 3D VR technologies in the context of construction processes, the benefits and challenges of current and potential applications of 3D VR in the construction industry are identified. The study also reveals the strengths and weaknesses of 3D VR technology applications in construction processes. Suggestions and future work are also provided in this paper.

  4. The Virtual Radiopharmacy Laboratory: A 3-D Simulation for Distance Learning

    ERIC Educational Resources Information Center

    Alexiou, Antonios; Bouras, Christos; Giannaka, Eri; Kapoulas, Vaggelis; Nani, Maria; Tsiatsos, Thrasivoulos

    2004-01-01

    This article presents Virtual Radiopharmacy Laboratory (VR LAB), a virtual laboratory accessible through the Internet. VR LAB is designed and implemented in the framework of the VirRAD European project. This laboratory represents a 3D simulation of a radio-pharmacy laboratory, where learners, represented by 3D avatars, can experiment on…

  5. 3D Inhabited Virtual Worlds: Interactivity and Interaction between Avatars, Autonomous Agents, and Users.

    ERIC Educational Resources Information Center

    Jensen, Jens F.

    This paper addresses some of the central questions currently related to 3-Dimensional Inhabited Virtual Worlds (3D-IVWs), their virtual interactions, and communication, drawing from the theory and methodology of sociology, interaction analysis, interpersonal communication, semiotics, cultural studies, and media studies. First, 3D-IVWs--seen as a…

  6. Issues and Challenges of Teaching and Learning in 3D Virtual Worlds: Real Life Case Studies

    ERIC Educational Resources Information Center

    Pfeil, Ulrike; Ang, Chee Siang; Zaphiris, Panayiotis

    2009-01-01

    We aimed to study the characteristics and usage patterns of 3D virtual worlds in the context of teaching and learning. To achieve this, we organised a full-day workshop to explore, discuss and investigate the educational use of 3D virtual worlds. Thirty participants took part in the workshop. All conversations were recorded and transcribed for…

  7. Interaction Design and Usability of Learning Spaces in 3D Multi-user Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Minocha, Shailey; Reeves, Ahmad John

    Three-dimensional virtual worlds are multimedia, simulated environments, often managed over the Web, which users can 'inhabit' and within which they interact via their own graphical self-representations known as 'avatars'. 3D virtual worlds are being used in many applications: education/training, gaming, social networking, marketing and commerce. Second Life is the most widely used 3D virtual world in education. However, problems associated with usability, navigation and wayfinding in 3D virtual worlds may impact on student learning and engagement. Based on empirical investigations of learning spaces in Second Life, this paper presents design guidelines to improve the usability and ease of navigation in 3D spaces. Methods of data collection include semi-structured interviews with Second Life students, educators and designers. The findings have revealed that design principles from the fields of urban planning, Human-Computer Interaction, Web usability, geography and psychology can influence the design of spaces in 3D multi-user virtual environments.

  8. The Engelbourg's ruins: from 3D TLS point cloud acquisition to 3D virtual and historic models

    NASA Astrophysics Data System (ADS)

    Koehl, Mathieu; Berger, Solveig; Nobile, Sylvain

    2014-05-01

    The Castle of Engelbourg was built at the beginning of the 13th century at the top of the Schlossberg. It is situated in the municipality of Thann (France), at the crossroads of Alsace and Lorraine, and dominates the outlet of the valley of the Thur. Its strategic position was one of the causes of its repeated destruction during the 17th century, and Louis XIV sealed its fate by ordering its demolition in 1673. Today only a few vestiges remain, among them a section of the main tower, about 7 m in diameter and 4 m wide, lying on its side, a feature unique in the regional castle landscape. Visible from the valley, it was named "the Eye of the Witch" and became a key attraction of the region. The site, which extends over approximately one hectare, has for several years been the object of numerous archaeological studies and is today at the heart of a project to enhance the value of the vestiges. Among the numerous planned works, a key objective was to produce a 3D model of the site in its current state, in other words a virtual "as-surveyed" model, usable from a cultural and tourist point of view as well as by scientists in archaeological research. The ICube/INSA lab team was responsible for producing this model, from data acquisition to delivery of the virtual model, using 3D TLS (terrestrial laser scanning) and topographic surveying methods. It was also planned to integrate into this 3D model 2D archival data stemming from series of former excavations. The objectives of this project were the following: • Acquisition of 3D digital data of the site and 3D modelling • Digitization of the 2D archaeological data and integration into the 3D model • Implementation of a database connected to the 3D model • Virtual visit of the site The obtained results allowed us to visualize every 3D object individually, in several forms (point clouds, 3D meshed objects and models, etc.) and at several levels of detail.

  9. Virtual reality applications in robotic simulations

    NASA Technical Reports Server (NTRS)

    Homan, David J.; Gott, Charles J.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR) provides a means to practice integrated extravehicular activities (EVA)/remote manipulator system (RMS) operations in the on-orbit configuration with no discomfort or risk to crewmembers. VR afforded the STS-61 crew the luxury of practicing the integrated EVA/RMS operations in an on-orbit configuration prior to the actual flight. The VR simulation was developed by the Automation and Robotics Division's Telepresence/Virtual Reality Lab and Integrated Graphics, Operations, and Analysis Lab (IGOAL) at JSC. The RMS Part Task Trainer (PTT) was developed by the IGOAL for RMS training in 1988 as a fully functional, kinematic simulation of the shuttle RMS and served as the RMS portion of the integrated VR simulation. Because the EVA crewmember could get a realistic view of the shuttle and payload bay in the VR simulation, he/she could explore different positions and views to determine the best method for performing a specific task, thus greatly increasing the efficiency of use of the neutral buoyancy facilities.

  10. Virtual and Printed 3D Models for Teaching Crystal Symmetry and Point Groups

    ERIC Educational Resources Information Center

    Casas, Lluís; Estop, Eugènia

    2015-01-01

    Both, virtual and printed 3D crystal models can help students and teachers deal with chemical education topics such as symmetry and point groups. In the present paper, two freely downloadable tools (interactive PDF files and a mobile app) are presented as examples of the application of 3D design to study point-symmetry. The use of 3D printing to…

  11. Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine.

    PubMed

    Lee, S; Lee, J; Lee, A; Park, N; Lee, S; Song, S; Seo, A; Lee, H; Kim, J-I; Eom, K

    2013-05-01

    Augmented reality (AR) is a technology which enables users to see the real world, with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were scanned using a 64-channel multidetector. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group) and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire which was administered after using the simulator. The group that was trained using an AR simulator were more proficient at IV injection technique using real dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education. PMID:23103217
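
    The record above describes a pipeline that segments CT images before converting them into 3D models. As a minimal, hypothetical sketch of the segmentation step only (the record does not specify the algorithm; `segment_by_threshold`, the Hounsfield cutoff of 300, and the toy volume are illustrative assumptions, not the study's method), an intensity threshold yields the binary mask that a surface-extraction step would then polygonize:

    ```python
    import numpy as np

    def segment_by_threshold(volume, hu_min):
        """Return a boolean mask of voxels at or above hu_min (Hounsfield units)."""
        return volume >= hu_min

    # Toy stand-in for a CT volume: random integers in a plausible HU range.
    rng = np.random.default_rng(0)
    ct = rng.integers(-1000, 2000, size=(4, 4, 4))

    # 300 HU is a commonly cited rough cutoff for bone; illustrative only.
    bone_mask = segment_by_threshold(ct, 300)
    print(bone_mask.shape, bone_mask.dtype)
    ```

    A real pipeline would apply such a mask slice by slice to the scanned volume and pass the result to a mesher (e.g., marching cubes) before STL export.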

  12. Contextual EFL Learning in a 3D Virtual Environment

    ERIC Educational Resources Information Center

    Lan, Yu-Ju

    2015-01-01

    The purposes of the current study are to develop virtually immersive EFL learning contexts for EFL learners in Taiwan to preview and review English materials beyond the regular English class schedule. A 2-iteration action research study lasting one semester was conducted to evaluate the effects of virtual contexts on learners' EFL learning. 132…

  13. Virtual Reality Exposure Therapy Using a Virtual Iraq: Case Report

    PubMed Central

    Gerardi, Maryrose; Rothbaum, Barbara Olasov; Ressler, Kerry; Heekin, Mary; Rizzo, Albert

    2013-01-01

    Posttraumatic stress disorder (PTSD) has been estimated to affect up to 18% of returning Operation Iraqi Freedom (OIF) veterans. Soldiers need to maintain constant vigilance to deal with unpredictable threats, and an unprecedented number of soldiers are surviving serious wounds. These risk factors are significant for development of PTSD; therefore, early and efficient intervention options must be identified and presented in a form acceptable to military personnel. This case report presents the results of treatment utilizing virtual reality exposure (VRE) therapy (virtual Iraq) to treat an OIF veteran with PTSD. Following brief VRE treatment, the veteran demonstrated improvement in PTSD symptoms as indicated by clinically and statistically significant changes in scores on the Clinician Administered PTSD Scale (CAPS; Blake et al., 1990) and the PTSD Symptom Scale Self-Report (PSS-SR; Foa, Riggs, Dancu, & Rothbaum, 1993). These results indicate preliminary promise for this treatment. PMID:18404648

  14. Virtual reality exposure therapy using a virtual Iraq: case report.

    PubMed

    Gerardi, Maryrose; Rothbaum, Barbara Olasov; Ressler, Kerry; Heekin, Mary; Rizzo, Albert

    2008-04-01

    Posttraumatic stress disorder (PTSD) has been estimated to affect up to 18% of returning Operation Iraqi Freedom (OIF) veterans. Soldiers need to maintain constant vigilance to deal with unpredictable threats, and an unprecedented number of soldiers are surviving serious wounds. These risk factors are significant for development of PTSD; therefore, early and efficient intervention options must be identified and presented in a form acceptable to military personnel. This case report presents the results of treatment utilizing virtual reality exposure (VRE) therapy (virtual Iraq) to treat an OIF veteran with PTSD. Following brief VRE treatment, the veteran demonstrated improvement in PTSD symptoms as indicated by clinically and statistically significant changes in scores on the Clinician Administered PTSD Scale (CAPS; Blake et al., 1990) and the PTSD Symptom Scale Self-Report (PSS-SR; Foa, Riggs, Dancu, & Rothbaum, 1993). These results indicate preliminary promise for this treatment. PMID:18404648

  15. Evaluation of Home Delivery of Lectures Utilizing 3D Virtual Space Infrastructure

    ERIC Educational Resources Information Center

    Nishide, Ryo; Shima, Ryoichi; Araie, Hiromu; Ueshima, Shinichi

    2007-01-01

    Evaluation experiments have been essential in exploring home delivery of lectures for which users can experience campus lifestyle and distant learning through 3D virtual space. This paper discusses the necessity of virtual space for distant learners by examining the effects of virtual space. The authors have pursued the possibility of…

  16. Treatment of Complicated Grief Using Virtual Reality: A Case Report

    ERIC Educational Resources Information Center

    Botella, C.; Osma, J.; Palacios, A. Garcia; Guillen, V.; Banos, R.

    2008-01-01

    This is the first work exploring the application of new technologies, specifically virtual reality, to facilitate emotional processing in the treatment of Complicated Grief. Our research team has designed a virtual reality environment (EMMA's World) to foster the expression and processing of emotions. In this study the authors present a description…

  17. A Desktop Virtual Reality Earth Motion System in Astronomy Education

    ERIC Educational Resources Information Center

    Chen, Chih Hung; Yang, Jie Chi; Shen, Sarah; Jeng, Ming Chang

    2007-01-01

    In this study, a desktop virtual reality earth motion system (DVREMS) is designed and developed to be applied in the classroom. The system is implemented to assist elementary school students to clarify earth motion concepts using virtual reality principles. A study was conducted to observe the influences of the proposed system in learning.…

  18. Designing a Virtual-Reality-Based, Gamelike Math Learning Environment

    ERIC Educational Resources Information Center

    Xu, Xinhao; Ke, Fengfeng

    2016-01-01

    This exploratory study examined the design issues related to a virtual-reality-based, gamelike learning environment (VRGLE) developed via OpenSimulator, an open-source virtual reality server. The researchers collected qualitative data to examine the VRGLE's usability, playability, and content integration for math learning. They found it important…

  19. Role of virtual reality simulation in endoscopy training.

    PubMed

    Harpham-Lockyer, Louis; Laskaratos, Faidon-Marios; Berlingieri, Pasquale; Epstein, Owen

    2015-12-10

    Recent advancements in virtual reality graphics and models have allowed virtual reality simulators to be incorporated into a variety of endoscopic training programmes. Use of virtual reality simulators in training programmes is thought to improve skill acquisition amongst trainees which is reflected in improved patient comfort and safety. Several studies have already been carried out to ascertain the impact that usage of virtual reality simulators may have upon trainee learning curves and how this may translate to patient comfort. This article reviews the available literature in this area of medical education which is particularly relevant to all parties involved in endoscopy training and curriculum development. Assessment of the available evidence for an optimal exposure time with virtual reality simulators and the long-term benefits of their use are also discussed. PMID:26675895

  20. The 'mad scientists': psychoanalysis, dream and virtual reality.

    PubMed

    Leclaire, Marie

    2003-04-01

    The author explores the concept of reality-testing as a means of assessing the relationship with reality that prevails in dream and in virtual reality. Based on a model developed by Jean Laplanche, she compares these activities in detail in order to determine their respective independence from the function of reality-testing. By carefully examining the concept of hallucination in the writings of Freud and Daniel Dennett, the author seeks to pinpoint the specific modalities of interaction between perceptions, ideas, wishes and actions that converge in the 'belief' and in the 'sense of reality'. The paper's main thesis consists of the distinction that it draws between immediacy-testing and reality-testing, with the further argument that this distinction not only dissipates the conceptual vagueness that generally surrounds the latter of the two concepts but also that it promotes a more precise analysis of the function of reality in dream and in virtual reality. PMID:12856355

  1. 3-D Virtual and Physical Reconstruction of Bendego Iron

    NASA Astrophysics Data System (ADS)

    Belmonte, S. L. R.; Zucolotto, M. E.; Fontes, R. C.; dos Santos, J. R. L.

    2012-09-01

    3D laser scanning was applied to meteorites to preserve their original shape before cutting; saving the data in STL (stereolithography) format makes it possible both to print three-dimensional physical models and to generate a digital replica.

  2. Virtual Presence: One Step Beyond Reality

    NASA Technical Reports Server (NTRS)

    Budden, Nancy Ann

    1997-01-01

    Our primary objective was to team up a group consisting of scientists and engineers from two different NASA cultures, and simulate an interactive teleoperated robot conducting geologic field work on the Moon or Mars. The information derived from the experiment will benefit both the robotics team and the planetary exploration team in the areas of robot design and development, and mission planning and analysis. The Earth Sciences and Space and Life Sciences Division combines the past with the future, contributing experience from Apollo crews exploring the lunar surface, knowledge of reduced gravity environments, the performance limits of EVA suits, and future goals for human exploration beyond low Earth orbit. The Automation, Robotics, and Simulation Division brings to the table the technical expertise of robotic systems and the future goals of highly interactive robotic capabilities, treading on the edge of technology by joining for the first time a unique combination of telepresence with virtual reality.

  3. Dissociation in virtual reality: depersonalization and derealization

    NASA Astrophysics Data System (ADS)

    Garvey, Gregory P.

    2010-01-01

    This paper looks at virtual worlds such as Second Life (SL) as possible incubators of dissociation disorders as classified by the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (also known as the DSM-IV). Depersonalization is where "a person feels that he or she has changed in some way or is somehow unreal"; derealization is when "the same beliefs are held about one's surroundings." Dissociative Identity Disorder (DID), previously known as multiple personality disorder, fits users of Second Life who adopt "in-world" avatars and, in effect, enact multiple distinct identities or personalities (known as alter egos or alters). Select questions from the Structured Clinical Interview for Depersonalization (SCI-DER) will be discussed as they might apply to the user's experience in Second Life. Finally, I would like to consider the hypothesis that, rather than a pathological disorder, dissociation is a normal response to the "artificial reality" of Second Life.

  4. Virtual reality treatment of flying phobia.

    PubMed

    Baños, Rosa M; Botella, Cristina; Perpiñá, Concepción; Alcañiz, Mariano; Lozano, Jose Antonio; Osma, Jorge; Gallardo, Myriam

    2002-09-01

    Flying phobia (FP) might become a very incapacitating and disturbing problem in a person's social, working, and private areas. Psychological interventions based on exposure therapy have proved to be effective, but given the particular nature of this disorder they bear important limitations. Exposure therapy for FP might be excessively costly in terms of time, money, and efforts. Virtual reality (VR) overcomes these difficulties as different significant environments might be created, where the patient can interact with what he or she fears while in a totally safe and protected environment-the therapist's consulting room. This paper intends, on one hand, to show the different scenarios designed by our team for the VR treatment of FP, and on the other, to present the first results supporting the effectiveness of this new tool for the treatment of FP in a multiple baseline study. PMID:12381036

  5. Sustained Efficacy of Virtual Reality Distraction

    PubMed Central

    Rutter, Charles E.; Dahlquist, Lynnda M.; Weiss, Karen E.

    2011-01-01

    The current study tested whether the effectiveness of distraction using virtual reality (VR) technology in reducing cold pressor pain would maintain over the course of eight weekly exposures. Twenty-eight adults, 18 to 23 years of age, underwent one baseline cold pressor trial and one VR distraction trial in randomized order each week. VR distraction led to significant increases in pain threshold and pain tolerance, and significant decreases in pain intensity, time spent thinking about pain, and self-reported anxiety, relative to baseline. Repeated exposure did not appear to affect the benefits of VR. Implications for the long-term use of VR distraction as a non-pharmacological analgesic are discussed. PMID:19231295

  6. Applied virtual reality in aerospace design

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    1995-01-01

    A virtual reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. The objectives of the MSFC VR Applications Program are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training and science training. Before VR can be used with confidence in a particular application, VR must be validated for that class of applications. For that reason, specific validation studies for selected classes of applications have been proposed and are currently underway. These include macro-ergonomic 'control room class' design analysis, Spacelab stowage reconfiguration training, a full-body microgravity functional reach simulator, a gross anatomy teaching simulator, and micro-ergonomic design analysis. This paper describes the MSFC VR Applications Program and the validation studies.

  7. Multimodal event streams for virtual reality

    NASA Astrophysics Data System (ADS)

    von Spiczak, J.; Samset, E.; DiMaio, S.; Reitmayr, G.; Schmalstieg, D.; Burghart, C.; Kikinis, R.

    2007-01-01

    Applications in the fields of virtual and augmented reality as well as image-guided medical applications make use of a wide variety of hardware devices. Existing frameworks for interconnecting low-level devices and high-level application programs do not exploit the full potential for processing events coming from arbitrary sources and are not easily generalizable. In this paper, we will introduce a new multi-modal event processing methodology using dynamically-typed event attributes for event passing between multiple devices and systems. The existing OpenTracker framework was modified to incorporate a highly flexible and extensible event model, which can store data that is dynamically created and arbitrarily typed at runtime. The main factors impacting the library's throughput were determined and the performance was shown to be sufficient for most typical applications. Several sample applications were developed to take advantage of the new dynamic event model provided by the library, thereby demonstrating its flexibility and expressive power.
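    The dynamically-typed event model described above can be illustrated with a short sketch. This is a hypothetical Python analogue, not OpenTracker's actual C++ API: the class names, attribute names, and filter chain are assumptions made for illustration. The key idea is that attributes are created and typed at runtime, so arbitrary devices can contribute fields without changing the event type.

    ```python
    # Hypothetical sketch of a dynamically-typed event model (not OpenTracker's API):
    # events carry arbitrarily named, arbitrarily typed attributes created at
    # runtime, plus a timestamp, and flow through a chain of filter callables.

    import time
    from typing import Any, Callable

    class Event:
        """An event whose attributes are created and typed dynamically at runtime."""
        def __init__(self, **attributes: Any) -> None:
            self.timestamp = time.time()
            self._attrs: dict[str, Any] = dict(attributes)

        def set(self, name: str, value: Any) -> None:
            self._attrs[name] = value      # the type is whatever the producer supplies

        def get(self, name: str, default: Any = None) -> Any:
            return self._attrs.get(name, default)

        def has(self, name: str) -> bool:
            return name in self._attrs

    class Pipeline:
        """Chains filters; each may read, add, or transform event attributes."""
        def __init__(self) -> None:
            self._filters: list[Callable[[Event], Event]] = []

        def add_filter(self, f: Callable[[Event], Event]) -> None:
            self._filters.append(f)

        def process(self, event: Event) -> Event:
            for f in self._filters:
                event = f(event)
            return event

    # A tracker event carrying a 3-D position; a filter adds a derived attribute.
    def add_magnitude(ev: Event) -> Event:
        x, y, z = ev.get("position")
        ev.set("magnitude", (x * x + y * y + z * z) ** 0.5)
        return ev

    pipeline = Pipeline()
    pipeline.add_filter(add_magnitude)
    ev = pipeline.process(Event(position=(3.0, 4.0, 0.0), button=True))
    print(ev.get("magnitude"))  # → 5.0
    ```

    Keeping attributes in a runtime map, rather than fixed struct fields, is what makes such a model extensible to new device types without recompilation.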

  8. An Onboard ISS Virtual Reality Trainer

    NASA Technical Reports Server (NTRS)

    Miralles, Evelyn

    2013-01-01

    Prior to the retirement of the Space Shuttle, many exterior repairs on the International Space Station (ISS) were carried out by shuttle astronauts, trained on the ground and flown to the station to perform these repairs. After the retirement of the shuttle, this is no longer an available option. As such, the need for ISS crew members to review scenarios while on flight, either for tasks in which they have already trained or for contingency operations, has become a critical subject. In many situations, the time between the last session of Neutral Buoyancy Laboratory (NBL) training and an Extravehicular Activity (EVA) task might be 6 to 8 months. In order to help with training for contingency repairs and to maintain EVA proficiency while on flight, the Johnson Space Center Virtual Reality Lab (VRLab) designed an onboard immersive ISS Virtual Reality Trainer (VRT), incorporating a unique optical system and making use of the already successful Dynamic Onboard Ubiquitous Graphical (DOUG) graphics software, to assist crew members with current procedures and contingency EVAs while on flight. The VRT provides an immersive environment similar to the one experienced at the VRLab crew training facility at NASA Johnson Space Center. EVA tasks are critical for a mission, since as time passes crew members may lose proficiency in previously trained tasks. In addition, there is an increased need for unplanned contingency repairs to fix problems arising as the ISS ages. The need to train and re-train crew members for EVAs and contingency scenarios is crucial and extremely demanding. ISS crew members are now asked to perform EVA tasks for which they have not been trained and potentially have never seen before.

  9. Virtual Reality: A Distraction Intervention for Chemotherapy

    PubMed Central

    Schneider, Susan M.; Hood, Linda E.

    2007-01-01

    Purpose/Objectives To explore virtual reality (VR) as a distraction intervention to relieve symptom distress in adults receiving chemotherapy treatments for breast, colon, and lung cancer. Design Crossover design in which participants served as their own control. Setting Outpatient clinic at a comprehensive cancer center in the southeastern United States. Sample 123 adults receiving initial chemotherapy treatments. Methods Participants were randomly assigned to receive the VR distraction intervention during one chemotherapy treatment and then received no intervention (control) during an alternate matched chemotherapy treatment. The Adapted Symptom Distress Scale–2, Revised Piper Fatigue Scale, and State Anxiety Inventory were used to measure symptom distress. The Presence Questionnaire and an open-ended questionnaire were used to evaluate the subjects’ VR experience. The influence of type of cancer, age, and gender on symptom outcomes was explored. Mixed models were used to test for differences in levels of symptom distress. Main Research Variables Virtual reality and symptom distress. Findings Patients had an altered perception of time (p < 0.001) when using VR, which validates the distracting capacity of the intervention. Evaluation of the intervention indicated that patients believed the head-mounted device was easy to use, they experienced no cybersickness, and 82% would use VR again. However, analysis demonstrated no significant differences in symptom distress immediately or two days following chemotherapy treatments. Conclusions Patients stated that using VR made the treatment seem shorter and that chemotherapy treatments with VR were better than treatments without the distraction intervention. However, positive experiences did not result in a decrease in symptom distress. The findings support the idea that using VR can help to make chemotherapy treatments more tolerable, but clinicians should not assume that use of VR will improve chemotherapy

  10. Serious games for screening pre-dementia conditions: from virtuality to reality? A pilot project.

    PubMed

    Zucchella, Chiara; Sinforiani, Elena; Tassorelli, Cristina; Cavallini, Elena; Tost-Pardell, Daniela; Grau, Sergi; Pazzi, Stefania; Puricelli, Stefano; Bernini, Sara; Bottiroli, Sara; Vecchi, Tomaso; Sandrini, Giorgio; Nappi, Giuseppe

    2014-01-01

    Conventional cognitive assessment is based on a pencil-and-paper neuropsychological evaluation, which is time consuming, expensive and requires the involvement of several professionals. Information and communication technology could be exploited to allow the development of tools that are easy to use, reduce the amount of data processing, and provide controllable test conditions. Serious games (SGs) have the potential to be new and effective tools in the management and treatment of cognitive impairments in the elderly. Moreover, by adopting SGs in 3D virtual reality settings, cognitive functions might be evaluated using tasks that simulate daily activities, increasing the "ecological validity" of the assessment. In this commentary we report our experience in the creation of the Smart Aging platform, a 3D SG- and virtual-environment-based platform for the early identification and characterization of mild cognitive impairment. PMID:25473734

  11. Virtual reality, augmented reality, and robotics applied to digestive operative procedures: from in vivo animal preclinical studies to clinical use

    NASA Astrophysics Data System (ADS)

    Soler, Luc; Marescaux, Jacques

    2006-04-01

    Technological innovations of the 20th century provided medicine and surgery with new tools, among which virtual reality and robotics are among the most revolutionary. Our work aims at setting up new techniques for detection, 3D delineation and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems that make tumor resection or treatment easier with the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time long-distance connection between practitioners so they can share the same 3D reconstructed patient and interact on the same patient, virtually before the intervention and in reality during the surgical procedure thanks to a telesurgical robot. In preclinical studies, our first results obtained from a micro-CT scanner show that these technologies provide an efficient and precise 3D modeling of anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility of improving the therapeutic choice thanks to a better detection and representation of the patient before performing the surgical gesture. They also show the efficiency of augmented reality, which provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and check on the virtual patient clone an optimal procedure without errors, which will be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.

  12. Virtual reality in medical education and assessment

    NASA Technical Reports Server (NTRS)

    Sprague, Laurie A.; Bell, Brad; Sullivan, Tim; Voss, Mark; Payer, Andrew F.; Goza, Stewart Michael

    1994-01-01

    The NASA Johnson Space Center (JSC)/LinCom Corporation, the University of Texas Medical Branch at Galveston (UTMB), and the Galveston Independent School District (GISD) have teamed up to develop a virtual visual environment display (VIVED) that provides a unique educational experience using virtual reality (VR) technologies. The VIVED end product will be a self-contained educational experience allowing students a new method of learning as they interact with the subject matter through VR. This type of interface is intuitive and utilizes spatial and psychomotor abilities which are now constrained or reduced by current two-dimensional terminals and keyboards. The perpetual challenge to educators remains the identification and development of methodologies that conform to learners' abilities and preferences. The unique aspects of VR provide an opportunity to explore a new educational experience. Endowing medical students with an understanding of the human body poses some difficult challenges. One of the most difficult is to convey the three-dimensional nature of anatomical structures. The ideal environment for addressing this problem would be one that allows students to become small enough to enter the body and travel through it - much like a person walks through a building. By using VR technology, this effect can be achieved; when VR is combined with multimedia technologies, the effect can be spectacular.

  13. Using voice input and audio feedback to enhance the reality of a virtual experience

    SciTech Connect

    Miner, N.E.

    1994-04-01

    Virtual Reality (VR) is a rapidly emerging technology which allows participants to experience a virtual environment through stimulation of the participant's senses. Intuitive and natural interactions with the virtual world help to create a realistic experience. Typically, a participant is immersed in a virtual environment through the use of a 3-D viewer. Realistic, computer-generated environment models and accurate tracking of a participant's view are important factors for adding realism to a virtual experience. Stimulating a participant's sense of sound and providing a natural form of communication for interacting with the virtual world are equally important. This paper discusses the advantages and importance of incorporating voice recognition and audio feedback capabilities into a virtual world experience. Various approaches and levels of complexity are discussed. Examples of the use of voice and sound are presented through the description of a research application developed in the VR laboratory at Sandia National Laboratories.

  14. Implementing virtual reality interfaces for the geosciences

    SciTech Connect

    Bethel, W.; Jacobsen, J.; Austin, A.; Lederer, M.; Little, T.

    1996-06-01

    For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum running the gamut from command-line interfaces to completely immersive environments. At one end, the user types three- or six-dimensional parameters at the keyboard; at the other, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g. attached to a head-mounted display. Rich, extensible and often complex languages are a vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is the obstacle: typing is cumbersome, error-prone and typically slow. With trackers, the user can instead interact with these parameters by means of highly developed motor skills. Two specific geoscience application areas will be highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.
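    At its core, a streamline "rake" of the kind described in the second application is a line of seed points whose endpoints the user positions in 3-D. A minimal sketch of that idea follows; the function name and parameters are illustrative, not taken from the paper.

    ```python
    # Hedged sketch of a streamline "rake": two endpoints, positioned by the
    # tracked hand, define a segment of evenly spaced seed points that would be
    # handed to a streamline-integration algorithm. Illustrative API only.

    def rake_seed_points(start, end, n):
        """Return n evenly spaced (x, y, z) seeds on the segment start-end."""
        if n == 1:
            return [tuple(start)]
        seeds = []
        for i in range(n):
            t = i / (n - 1)                 # 0.0 at start, 1.0 at end
            seeds.append(tuple(s + t * (e - s) for s, e in zip(start, end)))
        return seeds

    # A 5-seed rake from the origin to a point 10 units away in x, 5 in z.
    seeds = rake_seed_points((0.0, 0.0, 0.0), (10.0, 0.0, 5.0), 5)
    print(seeds[2])  # → (5.0, 0.0, 2.5)
    ```

    Moving either endpoint with a tracked device then amounts to recomputing this list each frame and re-seeding the streamlines.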

  15. Special Section: New Ways to Detect Colon Cancer 3-D virtual screening now being used

    MedlinePlus

    ... body) from the National Library of Medicine's Visible Human project (www.nlm.nih.gov). By 1996, Kaufman and his colleagues had patented a pioneering computer software system and techniques for 3-D virtual ...

  16. Spilling the beans on java 3D: a tool for the virtual anatomist.

    PubMed

    Guttmann, G D

    1999-04-15

    The computing world has just provided the anatomist with another tool: Java 3D, within the Java 2 platform. On December 9, 1998, Sun Microsystems released Java 2. Java 3D classes are now included in the jar (Java Archive) archives of the extensions directory of Java 2. Java 3D is also a part of the Java Media Suite of APIs (Application Programming Interfaces). But what is Java? How does Java 3D work? How do you view Java 3D objects? A brief introduction to the concepts of Java and object-oriented programming is provided, along with a short description of the tools of Java 3D and of the Java 3D viewer. Thus, the virtual anatomist has another set of computer tools to use for modeling various aspects of anatomy, such as embryological development. The virtual anatomist will also be able to assist the surgeon with virtual surgery using the tools found in Java 3D. Java 3D will be able to fill gaps currently existing in many anatomical computer-aided learning programs, such as the lack of platform independence, interactivity, and manipulability of 3D images. PMID:10321435

  17. A Parameterizable Framework for Replicated Experiments in Virtual 3D Environments

    NASA Astrophysics Data System (ADS)

    Biella, Daniel; Luther, Wolfram

    This paper reports on a parameterizable 3D framework that provides 3D content developers with an initial spatial starting configuration, metaphorical connectors for accessing exhibits or interactive 3D learning objects or experiments, and other optional 3D extensions, such as a multimedia room, a gallery, username identification tools and an avatar selection room. The framework is implemented in X3D and uses a Web-based content management system. It has been successfully used for an interactive virtual museum for key historical experiments and in two additional interactive e-learning implementations: an African arts museum and a virtual science centre. It can be shown that, by reusing the framework, the production costs for the latter two implementations can be significantly reduced and content designers can focus on developing educational content instead of producing cost-intensive out-of-focus 3D objects.

  18. Combination of Virtual Tours, 3d Model and Digital Data in a 3d Archaeological Knowledge and Information System

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Brigand, N.

    2012-08-01

    The site of the Engelbourg ruined castle in Thann, Alsace, France, has been for some years the object of all the attention of the city, which is the owner, and also of partners such as historians and archaeologists who are in charge of its study. The valuation of the site is one of the main objectives, as well as its conservation and its knowledge. The aim of this project is to use the environment of the virtual tour viewer as a new base for an Archaeological Knowledge and Information System (AKIS). With available development tools we add functionalities, in particular through diverse scripts that convert the viewer into a real 3D interface. Beginning with a first virtual tour that contains about fifteen panoramic images, the site of about 150 by 150 meters can be completely documented, offering the user real interactivity that makes visualization very concrete, almost lively. After the choice of pertinent points of view, panoramic images were produced. For the documentation, other sets of images were acquired in various seasons and climate conditions, which allows documenting the site in different environments and states of vegetation. The final virtual tour was derived from them. The initial 3D model of the castle, which is virtual too, was also included in the form of panoramic images to complete the understanding of the site. A variety of types of hotspots were used to connect the whole digital documentation to the site, including videos (reports during the acquisition phases, during the restoration works, during the excavations, etc.), digital georeferenced documents (archaeological reports on the various constituent elements of the castle, interpretation of the excavations and the searches, description of the sets of collected objects, etc.).
The completely personalized interface of the system allows either to switch from a panoramic image to another one, which is the classic case of the virtual tours, or to go from a panoramic photographic image

  19. Design of Learning Spaces in 3D Virtual Worlds: An Empirical Investigation of "Second Life"

    ERIC Educational Resources Information Center

    Minocha, Shailey; Reeves, Ahmad John

    2010-01-01

    "Second Life" (SL) is a three-dimensional (3D) virtual world, and educational institutions are adopting SL to support their teaching and learning. Although the question of how 3D learning spaces should be designed to support student learning and engagement has been raised among SL educators and designers, there is hardly any guidance or research…

  20. Employing Virtual Humans for Education and Training in X3D/VRML Worlds

    ERIC Educational Resources Information Center

    Ieronutti, Lucio; Chittaro, Luca

    2007-01-01

    Web-based education and training provides a new paradigm for imparting knowledge; students can access the learning material anytime by operating remotely from any location. Web3D open standards, such as X3D and VRML, support Web-based delivery of Educational Virtual Environments (EVEs). EVEs have a great potential for learning and training…

  1. Virtual 3D interactive system with embedded multiwavelength optical sensor array and sequential devices

    NASA Astrophysics Data System (ADS)

    Wang, Guo-Zhen; Huang, Yi-Pai; Hu, Kuo-Jui

    2012-06-01

    We proposed a virtual 3D-touch system operated by a bare finger, which can detect the 3-axis (x, y, z) position of the finger. The system has a multi-wavelength optical sensor array embedded on the backplane of the TFT panel and sequential devices on the border of the TFT panel. We developed a reflecting mode that enables bare-finger 3D interaction. A 4-inch mobile 3D-LCD with the proposed system was successfully demonstrated.

  2. Applications of Panoramic Images: from 720° Panorama to Interior 3d Models of Augmented Reality

    NASA Astrophysics Data System (ADS)

    Lee, I.-C.; Tsai, F.

    2015-05-01

    A series of panoramic images is usually used to generate a 720° panorama image. Although panoramic images are typically used for establishing tour guiding systems, in this research we demonstrate the potential of using panoramic images acquired from multiple sites to create not only 720° panoramas, but also three-dimensional (3D) point clouds and 3D indoor models. Since 3D modeling is one of the goals of this research, the locations of the panoramic sites needed to be carefully planned in order to maintain a robust result for close-range photogrammetry. After the images are acquired, they are processed into 720° panoramas, and these panoramas can be used directly in panorama guiding systems or other applications. In addition to these straightforward applications, interior orientation parameters can also be estimated while generating the 720° panorama. These parameters are focal length, principal point, and lens radial distortion. The panoramic images can then be processed with close-range photogrammetry procedures to extract the exterior orientation parameters and generate 3D point clouds. In this research, VisualSFM, a structure-from-motion software package, is used to estimate the exterior orientation, and the CMVS toolkit is used to generate 3D point clouds. Next, the 3D point clouds are used as references to create building interior models. In this research, Trimble SketchUp was used to build the model, and the 3D point cloud supported the determination of the locations of building objects using a plane-finding procedure. In the texturing process, the panorama images are used as the data source for creating model textures. This 3D indoor model was used as an Augmented Reality model, replacing the guide map or floor plan commonly used in an online touring guide system. The 3D indoor model generating procedure has been utilized in two research projects: a cultural heritage site at Kinmen, and the Taipei Main Station pedestrian zone guidance and navigation system. The
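    As background on the geometry involved, a 360°×180° equirectangular panorama maps pixel columns to longitude and rows to latitude. The following sketch of that standard mapping is illustrative only and is not code from this research; the function name is an assumption.

    ```python
    # Illustrative sketch of the equirectangular mapping behind a full panorama:
    # each pixel column maps to a longitude, each row to a latitude, yielding a
    # unit viewing ray for that pixel. Not the paper's code.

    import math

    def pixel_to_ray(col, row, width, height):
        """Map an equirectangular pixel to a unit direction vector (x, y, z)."""
        lon = (col / width) * 2.0 * math.pi - math.pi      # -pi .. +pi
        lat = math.pi / 2.0 - (row / height) * math.pi     # +pi/2 (top) .. -pi/2
        x = math.cos(lat) * math.sin(lon)
        y = math.sin(lat)
        z = math.cos(lat) * math.cos(lon)
        return (x, y, z)

    # The centre pixel looks straight ahead along +z.
    print(pixel_to_ray(2048, 1024, 4096, 2048))  # → (0.0, 0.0, 1.0)
    ```

    Inverting this mapping (ray to pixel) is what lets photogrammetry software treat a panorama like a calibrated camera when matching features between sites.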

  3. A computer-based training system combining virtual reality and multimedia

    NASA Technical Reports Server (NTRS)

    Stansfield, Sharon A.

    1993-01-01

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment. The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.

  4. A computer-based training system combining virtual reality and multimedia

    SciTech Connect

    Stansfield, S.A.

    1993-04-28

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment. The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.

  5. A method of 3-D data information storage with virtual holography

    NASA Astrophysics Data System (ADS)

    Huang, Zhen; Liu, Guodong; Ren, Zhong; Zeng, Lüming

    2008-12-01

    In this paper, a new method of 3-D data storage based on a virtual holographic data cube is presented. First, the data is encoded in the form of a 3-D data cube with a certain algorithm, in which the interval between neighboring data points along each coordinate is d. Using the plane-scanning method, the 3-D cube can be described as an assembly of slices: parallel planes along the coordinates at an interval of d. Each dot on a slice represents a bit; a bright dot means "1", while a dark dot means "0". Second, a hologram of the 3-D cube is computed by computer with virtual optics technology, so that all the information of the 3-D cube is described by a single 2-D hologram. Finally, the hologram is input to the SLM and recorded in the recording material by intersecting two coherent laser beams. When the 3-D data is read out, a reference light illuminates the hologram, and a CCD captures the object image, which is a hologram of the 3-D data; the 3-D data is then recovered with virtual optics technology. Compared with 2-D data-page storage, 3-D data cube storage offers outstanding performance in both storage capacity and data security.
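    The bit-per-dot encoding of a 3-D data cube can be sketched as follows. This is a toy illustration of the encoding idea only; the packing order, padding scheme, and function names are assumptions, not the authors' algorithm.

    ```python
    # Toy sketch (assumed encoding) of packing bytes into a 3-D data cube of
    # bits: each lattice point holds a bright dot ("1") or dark dot ("0"), and
    # the cube is organized as parallel slices.

    def bytes_to_cube(data, n):
        """Pack data into an n*n*n cube of 0/1 bits, zero-padded at the end."""
        bits = [(byte >> (7 - i)) & 1 for byte in data for i in range(8)]
        bits += [0] * (n * n * n - len(bits))   # pad the remaining lattice points
        it = iter(bits)
        return [[[next(it) for _ in range(n)]   # rows within a slice
                 for _ in range(n)]             # slices (parallel planes)
                for _ in range(n)]

    def cube_to_bytes(cube, nbytes):
        """Read the bits back slice by slice and reassemble the first nbytes."""
        bits = [b for plane in cube for row in plane for b in row]
        out = bytearray()
        for i in range(nbytes):
            byte = 0
            for bit in bits[8 * i: 8 * i + 8]:
                byte = (byte << 1) | bit
            out.append(byte)
        return bytes(out)

    msg = b"3D"
    cube = bytes_to_cube(msg, 4)            # 4*4*4 = 64 bits = 8 bytes capacity
    print(cube_to_bytes(cube, len(msg)))    # → b'3D'
    ```

    In the scheme above, the cube would then be rendered slice by slice and turned into a single 2-D hologram by the virtual-optics step the abstract describes.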

  6. Toward virtual anatomy: a stereoscopic 3-D interactive multimedia computer program for cranial osteology.

    PubMed

    Trelease, R B

    1996-01-01

    Advances in computer visualization and user interface technologies have enabled development of "virtual reality" programs that allow users to perceive and interact with objects in artificial three-dimensional environments. Such technologies were used to create an image database and program for studying the human skull, a specimen that has become increasingly expensive and scarce. Stereoscopic image pairs of a museum-quality skull were digitized from multiple views. For each view, the stereo pairs were interlaced into a single, field-sequential stereoscopic picture using an image-processing program. The resulting interlaced image files are organized in an interactive multimedia program. At run time, gray-scale 3-D images are displayed on a large-screen computer monitor and observed through liquid-crystal shutter goggles. Users can then control the program and change views with a mouse, pointing and clicking on screen-level control words ("buttons"). For each view of the skull, an ID control button can be used to overlay pointers and captions for important structures. Pointing and clicking on "hidden buttons" overlying certain structures triggers digitized spoken-word descriptions or mini-lectures. PMID:8793223
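    The interlacing step mentioned above can be illustrated with a simplified sketch. The abstract says only that an image-processing program was used; representing images as lists of rows and alternating whole rows is an assumption made for illustration.

    ```python
    # Simplified sketch of row-interlacing a stereo pair: the left and right
    # images are merged line by line, so shutter goggles synchronized to the
    # display can route alternate fields to each eye. Images are plain
    # row-lists here; the real work used an image-processing program.

    def interlace(left, right):
        """Merge two equally sized images: even rows from left, odd from right."""
        assert len(left) == len(right)
        return [left[r] if r % 2 == 0 else right[r] for r in range(len(left))]

    left  = [["L"] * 4 for _ in range(4)]   # stand-in for the left-eye view
    right = [["R"] * 4 for _ in range(4)]   # stand-in for the right-eye view
    frame = interlace(left, right)
    print([row[0] for row in frame])  # → ['L', 'R', 'L', 'R']
    ```

    Because each eye only ever sees its own set of rows, a single stored file can drive a field-sequential stereoscopic display.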

  7. Teaching dentistry by means of virtual reality--the Geneva project.

    PubMed

    Curnier, François

    2010-01-01

    After a brief historical introduction of virtual reality, the article focuses on why virtual reality is the next step in dental education. Contrary to existing systems for preclinical courses, such as plastic teeth and dummies, virtual reality has no limitations in terms of clinical case studies, objective evaluation, and interactivity. For the past six years we have been developing innovative concepts using force feedback arms and computer 3D simulation at the University of Geneva. After describing the simulator itself, we discuss the results of a preliminary survey we initiated in 2006. The survey concerns the teaching of dental anatomy using 3D rendering capabilities of the simulator for third-year students of the University of Geneva. The aim was to validate the added value of IT integration into our curriculum. The results showed that 70% of the students were satisfied or very satisfied with this module and that the simulation boosted their motivation to learn anatomy. It also became evident that IT did not introduce a supplemental complexity that reduced teaching efficiency. This was a clear message for us to develop a second-generation virtual reality dental simulator with improved tactile features to teach drilling procedures. PMID:20879463

  8. Human four-dimensional spatial intuition in virtual reality.

    PubMed

    Ambinder, Michael S; Wang, Ranxiao Frances; Crowell, James A; Francis, George K; Brinkmann, Peter

    2009-10-01

    It is a long-standing question whether human beings, who evolved in a physical world of three dimensions, are capable of overcoming this fundamental limitation to develop an intuitive understanding of four-dimensional space. Techniques of analogy and graphical illustration have been developed with some subjective reports of success. However, there has been no objective evaluation of such achievements. Here, we show evidence that people with basic geometric knowledge can learn to make spatial judgments on the length of, and angle between, line segments embedded in four-dimensional space viewed in virtual reality with minimal exposure to the task and no feedback on their responses. Their judgments incorporated information from both the three-dimensional (3-D) projection and the fourth dimension, and the underlying representations were not algebraic in nature but based on visual imagery, although primitive and short lived. These results suggest that human spatial representations are not completely constrained by our evolution and development in a 3-D world. Illustration of the stimuli and experimental procedure (as video clips) and the instruction to participants (as a PDF file) may be downloaded from http://pbr.psychonomic-journals.org/content/supplemental. PMID:19815783

  9. The Virtual-casing Principle For 3D Toroidal Systems

    SciTech Connect

    Lazerson, Samuel A.

    2014-02-24

    The capability to calculate the magnetic field due to the plasma currents in a toroidally confined magnetic fusion equilibrium is of manifest relevance to equilibrium reconstruction and stellarator divertor design. Two methodologies arise for calculating such quantities. The first is a volume integral over the plasma current density for a given equilibrium; such an integral is computationally expensive. The second is a surface integral over a surface current on the equilibrium boundary. This method is computationally desirable because its cost does not grow with the radial resolution, as the volume integral's does. This surface-integral method has come to be known as the "virtual-casing principle". In this paper, a full derivation of this method is presented along with a discussion regarding its optimal application.
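    The two calculations contrasted above can be written out explicitly. The following is a commonly quoted form, hedged: unit systems and sign conventions vary between references, and the surface form assumes the observation point x lies outside the boundary S with B·n = 0 on S.

```latex
% Biot--Savart volume integral over the plasma current density J:
\mathbf{B}_{\mathrm{pl}}(\mathbf{x})
  = \frac{\mu_0}{4\pi}\int_{V}
    \frac{\mathbf{J}(\mathbf{x}')\times(\mathbf{x}-\mathbf{x}')}
         {\lvert\mathbf{x}-\mathbf{x}'\rvert^{3}}\,\mathrm{d}^{3}x'

% Virtual casing: the same field, for x outside the boundary S (with
% B . n = 0 on S), from an equivalent surface current on S:
\mathbf{B}_{\mathrm{pl}}(\mathbf{x})
  = \frac{1}{4\pi}\oint_{S}
    \frac{\bigl(\hat{\mathbf{n}}'\times\mathbf{B}(\mathbf{x}')\bigr)
          \times(\mathbf{x}-\mathbf{x}')}
         {\lvert\mathbf{x}-\mathbf{x}'\rvert^{3}}\,\mathrm{d}S'
```

    The surface integral replaces the 3-D sum over the current density with a 2-D sum over the boundary, which is the source of the cost advantage noted in the abstract.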

  10. 3D structure of nucleon with virtuality distributions

    NASA Astrophysics Data System (ADS)

    Radyushkin, Anatoly

    2014-09-01

    We describe a new approach to transverse momentum dependence in hard processes. Our starting point is the coordinate representation for matrix elements of operators (in the simplest case, bilocal O(0, z)) describing a hadron with momentum p. Treated as functions of (pz) and z2, they are parametrized through the parton virtuality distribution (PVD) Φ(x, σ), with x being Fourier-conjugate to (pz) and σ Laplace-conjugate to z2. For intervals with z+ = 0, we introduce the transverse momentum distribution (TMD) f(x, k⊥) and write it in terms of the PVD Φ(x, σ). The results of covariant calculations, written in terms of Φ(x, σ), are converted into expressions involving f(x, k⊥). We propose models for soft PVDs/TMDs, and describe how one can generate high-k⊥ tails of TMDs from primordial soft distributions. Supported by Jefferson Science Associates, LLC under U.S. DOE Contract #DE-AC05-06OR23177 and by U.S. DOE Grant #DE-FG02-97ER41028.

  11. Use of Virtual Reality for Space Flight

    NASA Technical Reports Server (NTRS)

    Harm, Deborah; Taylor, L. C.; Reschke, M. F.

    2011-01-01

    Virtual environments offer unique training opportunities, particularly for training astronauts and preadapting them to the novel sensory conditions of microgravity. Two unresolved human factors issues in virtual reality (VR) systems are: 1) potential "cybersickness", and 2) maladaptive sensorimotor performance following exposure to VR systems. Interestingly, these aftereffects are often quite similar to adaptive sensorimotor responses observed in astronauts during and/or following space flight. Active exploratory behavior in a new environment, with resulting feedback and the formation of new associations between sensory inputs and response outputs, promotes appropriate perception and motor control in the new environment. Thus, people adapt to consistent, sustained alterations of sensory input such as those produced by microgravity. Our research examining the effects of repeated exposures to a full field of view dome VR system showed that motion sickness and initial decrements in eye movement and postural control were greatly diminished following three exposures. These results suggest that repeated transitions between VR and the normal environment preflight might be a useful countermeasure for neurosensory and sensorimotor effects of space flight. The range of VR applications is enormous, extending from ground-based VR training for extravehicular activities at NASA, to medical and educational uses. It seems reasonable to suggest that other space related uses of VR should be investigated. For example, 1) use of head-mounted VR on orbit to rehearse/practice upcoming operational activities, and 2) ground-based VR training for emergency egress procedures. 
We propose that combining VR designed for operational activities preflight with an appropriate schedule to facilitate sensorimotor adaptation and improve spatial orientation would potentially accomplish two important goals for astronauts and cosmonauts: preflight sensorimotor adaptation and enhanced operational

  12. The VRFurnace: A Virtual Reality Application for Energy System Data Analysis

    SciTech Connect

    Peter Eric Johnson

    2001-05-01

    This paper presents the Virtual Reality Furnace (VRFurnace) application, an interactive 3-D visualization platform for pulverized coal furnace analysis. The VRFurnace is a versatile toolkit in which a variety of different CFD data sets related to pulverized coal furnaces can be studied interactively. The toolkit combines standard CFD analysis techniques with tools that more effectively utilize the 3-D capabilities of a virtual environment. Interaction with data is achieved through a dynamic instructional menu system. The application has been designed for use in a projection-based system, which allows engineers, management, and operators to see and interact with the data at the same time. Future developments are discussed, including the ability to combine multiple power plant components into a single application, to allow remote collaboration between different virtual environments, and to let users change a flow field and see the results as the changes are made, creating a complete virtual power plant.

  13. Virtual reality and women's health: a breast biopsy system.

    PubMed

    Vahora, F; Temkin, B; Marcy, W; Gorman, P J; Krummel, T M; Heinrichs, W L

    1999-01-01

    Minimally invasive procedures are becoming much more common in surgical practice because of their many advantages for patient comfort and convenience and for improved surgical access. However, some of the major problems leading to occasional surgical errors with this minimal-access method are restricted vision, a limited sense of touch, difficulty identifying the 3D position of the instrument tips, and their handling during delicate, short-distance movements toward the surgical target area. These factors emphasize the need for computer-simulated training in surgical manipulations and procedures in preparation for conducting them in patients. The key new feature of our proof-of-concept training simulator is a preventive mechanism that serves at least two functions. As the surgical target (or a critical structure) is approached, a haptically generated preventive force forewarns the surgeon, making it possible to abort maneuvers that may lead to adverse results. By announcing a potential collision of a virtual instrument tip with a surgical target, the time spent searching for the target is shortened, and the haptic signal minimizes the potential for tissue damage. This real-time, interactive, virtual-reality-based haptic breast biopsy training simulation is a PC/NT-based multitasking, multithreading system built upon an advanced force feedback device. The system monitors and indirectly guides the surgeon's movements while providing high-fidelity visual and force feedback cues as the area of surgical interest is approached. Our first application is the human breast. PMID:10538389

  14. Retinal imaging with virtual reality stimulus for studying Salticidae retinas

    NASA Astrophysics Data System (ADS)

    Schiesser, Eric; Canavesi, Cristina; Long, Skye; Jakob, Elizabeth; Rolland, Jannick P.

    2014-12-01

    We present a 3-path optical system for studying the retinal movement of jumping spiders: a visible OLED virtual reality system presents stimulus, while NIR illumination and imaging systems observe retinal movement.

  15. Virtual reality and project management for astronomy

    NASA Astrophysics Data System (ADS)

    Martínez, L. A.; Villarreal, J. L.; Angeles, F.; Bernal, A.; Bribiesca, E.; Flores, R.

    2010-07-01

    Over the years, astronomical instrumentation projects have become increasingly complex, making it necessary to find efficient ways to manage project communication. While all projects share the need to communicate project information, the required information and the methods of distribution vary widely between projects and project staff. A particular problem experienced on many projects, regardless of their size, is the amount of design and planning information and how it is distributed among the project stakeholders. One way to improve project communications management is to use a workflow that offers a predefined way to share information in a project. Virtual Reality (VR) offers the possibility of getting visual feedback on designed components without the expense of prototype building, giving an experience that mimics real-life situations using a computer. In this contribution we explore VR as a communication technology that helps manage instrumentation projects by means of a workflow implemented in a software package called Discut, designed at the Universidad Nacional Autónoma de México (UNAM). The workflow can integrate VR environments generated from CAD models.

  16. Using virtual reality to study food cravings.

    PubMed

    Ledoux, Tracey; Nguyen, Anthony S; Bakos-Block, Christine; Bordnick, Patrick

    2013-12-01

    Food cravings (FCs) are associated with overeating and obesity and are triggered by environmental cues. The study of FCs is challenged by the difficulty of replicating the natural environment in a laboratory. Virtual reality (VR) could be used to deliver naturalistic cues in a laboratory. The purpose of this study was to investigate whether food-related cues delivered by VR could induce greater FCs than neutral VR cues, photographic food cues, or real food. Sixty normal-weight, non-dieting women were recruited; to prevent a floor effect, half were primed with a monotonous diet (MD). Experimental procedures involved delivering neutral cues via VR and food-related cues via VR, photographs, and real food in counterbalanced order while measuring subjective (self-report) and objective (salivation) FCs. FCs produced by VR were marginally greater than those from a neutral cue, not significantly different from picture cues, and significantly less than those from real food. The modest effects may have been due to the quality of the VR system and/or the measures of FC (i.e., self-report and salivation). The FC threshold among non-dieting, normal-weight women was lowered with the use of the MD condition. Weight loss programs with monotonous diets may inadvertently increase FCs, making diet compliance more difficult. PMID:24055758

  17. Measuring performance in virtual reality phacoemulsification surgery

    NASA Astrophysics Data System (ADS)

    Söderberg, Per; Laurell, Carl-Gustaf; Simawi, Wamidh; Skarman, Eva; Nordh, Leif; Nordqvist, Per

    2008-02-01

    We have developed a virtual reality (VR) simulator for phacoemulsification surgery. The current work aimed at developing a relative performance index that characterizes the performance of an individual trainee. We recorded measurements of 28 response variables during three iterated surgical sessions in 9 experienced cataract surgeons, separately for the sculpting phase and the evacuation phase of phacoemulsification surgery, and compared their outcome to that of a reference group of naive trainees. We defined an individual overall performance index, an individual class-specific performance index, and an individual variable-specific performance index. We found that, on average, the experienced surgeons performed at a lower level than the reference group of naive trainees, but that this was largely attributable to a few surgeons. When their overall performance index was further analyzed as class-specific and variable-specific performance indices, the low-level performance was found to be attributable to behavior that is acceptable for an experienced surgeon but not for a naive trainee. It was concluded that relative performance indices should use a reference group that corresponds to the measured individual, since the definition of optimal surgery may vary among trainee groups depending on their level of experience.

  18. Performance index for virtual reality phacoemulsification surgery

    NASA Astrophysics Data System (ADS)

    Söderberg, Per; Laurell, Carl-Gustaf; Simawi, Wamidh; Skarman, Eva; Nordqvist, Per; Nordh, Leif

    2007-02-01

    We have developed a virtual reality (VR) simulator for phacoemulsification (phaco) surgery. The current work aimed at developing a performance index that characterizes the performance of an individual trainee. We recorded measurements of 28 response variables during three iterated surgical sessions in 9 subjects naive to cataract surgery and 6 experienced cataract surgeons, separately for the sculpting phase and the evacuation phase of phacoemulsification surgery. We further defined a specific performance index for a specific measurement variable and a total performance index for a specific trainee. The distribution function for the total performance index was relatively even for both the sculpting and the evacuation phases, indicating that parametric statistics can be used for future comparisons of average total performance indices between groups. The current total performance index for an individual weights all included measurement variables equally. It is possible that future development of the system will show that a trainee is better characterized if the various measurement variables are given specific weights. The currently developed total performance index for a trainee is, statistically, an independent observation of that particular trainee.
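    As an illustration only (not the authors' exact formulas), a variable-specific index can be computed as a trainee's deviation from a reference group in standard deviations, with the total index as the unweighted mean, mirroring the equal weighting described above; all names are hypothetical:

```python
# Illustrative sketch of a per-variable and total performance index.
# The normalization and the reference data are assumptions for demonstration.
from statistics import mean, stdev

def variable_index(value, ref_values):
    """Deviation of one response variable from the reference group, in SDs."""
    mu, sd = mean(ref_values), stdev(ref_values)
    return (value - mu) / sd if sd else 0.0

def total_index(trainee, reference):
    """Unweighted mean of the variable-specific indices (all weights = 1)."""
    return mean(variable_index(trainee[v], ref) for v, ref in reference.items())
```

    Giving the variables specific weights, as the abstract suggests for future work, would replace the plain mean with a weighted mean.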

  19. STS-118 Astronaut Dave Williams Trains Using Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2007-01-01

    STS-118 astronaut and mission specialist Dafydd R. 'Dave' Williams, representing the Canadian Space Agency, uses virtual reality hardware in the Space Vehicle Mock Up Facility at the Johnson Space Center to rehearse some of his duties for the upcoming mission. This type of virtual reality training allows the astronauts to wear special gloves and other gear while looking at a computer display that simulates actual movements around the various locations on the station hardware with which they will be working.

  20. Understanding Human Perception of Building Categories in Virtual 3d Cities - a User Study

    NASA Astrophysics Data System (ADS)

    Tutzauer, P.; Becker, S.; Niese, T.; Deussen, O.; Fritsch, D.

    2016-06-01

    Virtual 3D cities are becoming increasingly important as a means of visually communicating diverse urban-related information. To get a deeper understanding of a human's cognitive experience of virtual 3D cities, this paper presents a user study on the human ability to perceive building categories (e.g. residential home, office building, building with shops etc.) from geometric 3D building representations. The study reveals various dependencies between geometric properties of the 3D representations and the perceptibility of the building categories. Knowledge about which geometries are relevant, helpful or obstructive for perceiving a specific building category is derived. The importance and usability of such knowledge is demonstrated based on a perception-guided 3D building abstraction process.

  1. Visualization of three-dimensional ultra-high resolution OCT in virtual reality.

    PubMed

    Schulze, Jürgen P; Schulze-Döbold, Claudia; Erginay, Ali; Tadayoni, Ramin

    2013-01-01

    Three-dimensional reconstruction of optical coherence tomography (OCT) images is a modern technique that helps interpret the images and understand the underlying disease. However, the 3D reconstruction displayed on commercial devices is of limited quality: images are shown on 2D screens and it is difficult or impossible to adjust the view point and capture the data set from a meaningful perspective. We did a preliminary study to evaluate the applicability of a novel, 3D TV-based virtual reality system with interactive volume rendering software to clinical diagnostics and present a workflow, which can incorporate virtual reality technology at various levels of immersion into the daily medical practice, from interactive VR systems to printed media. PMID:23400189

  2. Role of virtual reality for cerebral palsy management.

    PubMed

    Weiss, Patrice L Tamar; Tirosh, Emanuel; Fehlings, Darcy

    2014-08-01

    Virtual reality is the use of interactive simulations to present users with opportunities to perform in virtual environments that appear, sound, and less frequently, feel similar to real-world objects and events. Interactive computer play refers to the use of a game where a child interacts and plays with virtual objects in a computer-generated environment. Because of their distinctive attributes that provide ecologically realistic and motivating opportunities for active learning, these technologies have been used in pediatric rehabilitation over the past 15 years. The ability of virtual reality to create opportunities for active, repetitive motor/sensory practice adds to its potential for neuroplasticity and learning in individuals with neurologic disorders. The objectives of this article are to provide an overview of how virtual reality and gaming are used clinically, to present the results of several example studies that demonstrate their use in research, and to briefly remark on future developments. PMID:24799367

  3. Hybrid 3D reconstruction and image-based rendering techniques for reality modeling

    NASA Astrophysics Data System (ADS)

    Sequeira, Vitor; Wolfart, Erik; Bovisio, Emanuele; Biotti, Ester; Goncalves, Joao G. M.

    2000-12-01

    This paper presents a component approach that combines in a seamless way the strong features of laser range acquisition with the visual quality of purely photographic approaches. The relevant components of the system are: (i) Panoramic images for distant background scenery where parallax is insignificant; (ii) Photogrammetry for background buildings and (iii) High detailed laser based models for the primary environment, structure of exteriors of buildings and interiors of rooms. These techniques have a wide range of applications in visualization, virtual reality, cost effective as-built analysis of architectural and industrial environments, building facilities management, real-estate, E-commerce, remote inspection of hazardous environments, TV production and many others.

  4. The Usability of Online Geographic Virtual Reality for Urban Planning

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Moore, A. B.

    2013-08-01

    Virtual reality (VR) technology is starting to become widely and freely available (for example the online OpenSimulator tool), with potential for use in 3D urban planning and design tasks but still needing rigorous assessment to establish this. A previous study consulted with a small group of urban professionals, who concluded in a satisfaction usability test that online VR had potential value as a usable 3D communication and remote marketing tool but acknowledged that visual quality and geographic accuracy were obstacles to overcome. This research takes the investigation a significant step further to also examine the usability aspects of efficiency (how quickly tasks are completed) and effectiveness (how successfully tasks are completed), relating to OpenSimulator in an urban planning situation. The comparative study pits a three-dimensional VR model (with increased graphic fidelity and geographic content to address the feedback of the previous study) of a subdivision design (in a Dunedin suburb) against 3D models built with GIS (ArcGIS) and CAD (BricsCAD) tools, two types of software environment well established in urban professional practice. Urban professionals participated in the study by attempting to perform timed tasks correctly in each of the environments before being asked questions about the technologies involved and their perceived importance to their professional work. The results reinforce the positive feedback for VR of the previous study, with the graphical and geographic data issues being somewhat addressed (though participants stressed the need for accurate and precise object and terrain modification capabilities in VR). Ease of use and the fastest task-completion speed were significant positive outcomes emerging from the comparison with GIS and CAD, pointing to a strong future for VR in an urban planning context.
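    The two usability measures the study adds, efficiency and effectiveness, reduce to simple statistics over per-task records; a minimal sketch with hypothetical record fields:

```python
# Hypothetical sketch: effectiveness = fraction of tasks completed
# successfully; efficiency = mean completion time over successful tasks.
# The record fields ("success", "seconds") are assumptions.

def effectiveness(tasks):
    """Fraction of tasks completed successfully."""
    return sum(1 for t in tasks if t["success"]) / len(tasks)

def efficiency(tasks):
    """Mean completion time (seconds) over successful tasks only."""
    times = [t["seconds"] for t in tasks if t["success"]]
    return sum(times) / len(times) if times else float("inf")
```

    Computed per environment (VR, GIS, CAD), these two numbers support exactly the comparison the study reports.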

  5. Integration of virtual and real scenes within an integral 3D imaging environment

    NASA Astrophysics Data System (ADS)

    Ren, Jinsong; Aggoun, Amar; McCormick, Malcolm

    2002-11-01

    The Imaging Technologies group at De Montfort University has developed an integral 3D imaging system, which is seen as the most likely vehicle for 3D television that avoids adverse psychological effects. To create truly engaging three-dimensional television programs, a virtual studio is required that performs the tasks of generating, editing, and integrating 3D content involving virtual and real scenes. The paper presents, for the first time, the procedures, factors, and methods of integrating computer-generated virtual scenes with real objects captured using the 3D integral imaging camera system. The method of computer generation of 3D integral images, in which the lens array is modelled instead of the physical camera, is described. In the model, each micro-lens that captures different elemental images of the virtual scene is treated as an extended pinhole camera. An integration process named integrated rendering is illustrated. The discussion then focuses on depth extraction from captured integral 3D images. The depth calculation method from disparity, and the multiple-baseline method used to improve the precision of depth estimation, are also presented. The concept of colour SSD and a further improvement in its precision are proposed and verified.
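    The disparity-based depth step described above can be sketched with a 1-D grayscale SSD block match, a simplified stand-in for the paper's colour SSD (window size and search range are assumptions), followed by the usual pinhole relation depth = f·B/d:

```python
# Hedged sketch of depth from disparity via SSD block matching on
# 1-D grayscale scanlines. A real implementation would work on 2-D
# colour elemental images and multiple baselines, as in the paper.

def ssd(a, b):
    """Sum of squared differences between two equal-length windows."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_disparity(left, right, x, win, max_d):
    """Disparity at column x minimising the SSD over a window of width win."""
    ref = left[x:x + win]
    costs = [(ssd(ref, right[x - d:x - d + win]), d)
             for d in range(min(max_d, x) + 1)]
    return min(costs)[1]          # disparity with the lowest matching cost

def depth(f, baseline, disparity):
    """Pinhole-camera depth from focal length, baseline, and disparity."""
    return f * baseline / disparity
```

    The multiple-baseline idea amounts to summing the SSD cost curves from several camera pairs before taking the minimum, which sharpens the cost minimum and improves depth precision.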

  6. A new approach towards image based virtual 3D city modeling by using close range photogrammetry

    NASA Astrophysics Data System (ADS)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2014-05-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to urban areas. The demand for 3D city modeling is increasing daily for various engineering and non-engineering applications. Three main image-based approaches are generally used to generate virtual 3D city models: sketch-based modeling, procedural-grammar-based modeling, and close-range-photogrammetry-based modeling. The literature shows that, to date, no complete solution is available to create a complete 3D city model from images, and these image-based methods have limitations. This paper presents a new approach to image-based virtual 3D city modeling using close range photogrammetry, divided into three sections: data acquisition, 3D data processing, and data combination. In the data acquisition process, a multi-camera setup was developed and used for video recording of an area; image frames were created from the video data, and the minimum required, most suitable frames were selected for 3D processing. In the second section, a 3D model of the area was created based on close-range photogrammetric principles and computer vision techniques. In the third section, this 3D model was exported for adding and merging with other pieces of the larger area; scaling and alignment of the model were done, and after texturing and rendering, a final photo-realistic textured 3D model was created and transferred into a walk-through model or movie form. Most of the processing steps are automatic, so the method is cost effective and less laborious, and the accuracy of the model is good. For this research, the study area is the campus of the Department of Civil Engineering, Indian Institute of Technology Roorkee, which acts as a prototype for a city.
Aerial photography is restricted in many countries
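    The frame-selection step in the acquisition stage can be sketched as picking an evenly spaced subset of the extracted video frames (the spacing heuristic is an assumption; the paper's actual selection criteria are not detailed):

```python
# Hypothetical sketch: choose a minimal, evenly spaced subset of video
# frame indices for 3D processing. Even spacing is an assumed heuristic.

def select_keyframes(n_frames, n_keep):
    """Indices of n_keep frames spread evenly across n_frames frames."""
    if n_keep >= n_frames:
        return list(range(n_frames))
    step = (n_frames - 1) / (n_keep - 1)
    return [round(i * step) for i in range(n_keep)]
```

    A production pipeline would refine this with overlap and sharpness checks before feeding the frames to the photogrammetric reconstruction.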

  7. Virtual Reality as a Tool in the Education

    ERIC Educational Resources Information Center

    Piovesan, Sandra Dutra; Passerino, Liliana Maria; Pereira, Adriana Soares

    2012-01-01

    Virtual reality is being used more and more in education, enabling students to discover, explore, and build their own knowledge. This paper presents educational software for classroom or distance education in Formal Languages courses, where the student can virtually manipulate the target, which must be explored, analyzed and…

  8. Using Immersive Virtual Reality for Electrical Substation Training

    ERIC Educational Resources Information Center

    Tanaka, Eduardo H.; Paludo, Juliana A.; Cordeiro, Carlúcio S.; Domingues, Leonardo R.; Gadbem, Edgar V.; Euflausino, Adriana

    2015-01-01

    Usually, distribution electricians are called upon to solve technical problems found in electrical substations. In this project, we apply problem-based learning to a training program for electricians, with the help of a virtual reality environment that simulates a real substation. Using this virtual substation, users may safely practice maneuvers…

  9. Soldier evaluation of the virtual reality Iraq.

    PubMed

    Reger, Greg M; Gahm, Gregory A; Rizzo, Albert A; Swanson, Robert; Duma, Susan

    2009-01-01

    Repeated combat deployments to Iraq and Afghanistan are resulting in increased rates of posttraumatic stress disorder (PTSD) among military personnel. Although exposure therapy is an effective treatment for this disorder, some personnel do not significantly respond to treatment, possibly due to poor activation of the trauma memory or a lack of emotional engagement during therapy. In addition, some service members do not seek mental healthcare due to treatment stigma. Researchers recently developed a virtual reality (VR) Iraq to attempt to improve activation of the traumatic memory during exposure therapy and to provide a treatment approach that may be more appealing to some service members, relative to traditional face-to-face talk therapy. Initial validation of the application requires an assessment of how well it represents the experiences of previously deployed service members. This study evaluated the realism of the VR Iraq application according to the subjective evaluation of 93 U.S. Army soldiers who returned from Iraq in the last year. Those screening negative for PTSD used and evaluated a VR tactical convoy and a VR dismounted patrol in a simulated Middle Eastern city. Results indicated that 86% of soldiers rated the overall realism of the VR convoy as ranging from adequate to excellent. Eighty-two percent of soldiers reported adequate-to-excellent overall realism of the city environment. Results provide evidence that the VR Iraq presents a realistic context in which VR exposure therapy can be conducted. However, clinical trials are needed to assess the efficacy of VR exposure therapy for Iraq veterans with PTSD. PMID:19199854

  10. Virtual reality and telepresence for military medicine.

    PubMed

    Satava, R M

    1997-01-01

    For decades, warfighters have been putting in place a sophisticated "digital battlefield", an electronic communication and information system to support advanced technology. Medicine is now in a position to leverage these technologies to produce a fundamental revolution, and the keystone is the digital physician. Today nearly all information about a patient can be acquired electronically, and with the new technologies of teleoperation and telesurgery we can provide remote treatment and even surgery through telemedicine. The following framework for military medicine will build upon the current electronic battlefield. A personnel status monitor (PSM) will have a global positioning locator to report the position of each soldier and a suite of vital-signs sensors. When a soldier is wounded, the medic will instantly know the soldier's location and how serious the casualty is. This will permit the medic to locate the most critically wounded soldier. Once stabilised, the soldier will be placed in a critical care pod, a fully automated intensive care unit in a stretcher, which will monitor his vital signs, administer fluids and medications, and provide environmental protection. If immediate surgery is needed, a remote telepresence surgery vehicle will come to the wounded soldier, the medic will place him in the vehicle, and a surgeon at a distant Mobile Advance Surgical Hospital (MASH) will operate remotely in the combat zone using telepresence surgery. The expertise of any specialist will also be available from the rear echelons, as far back as the home country. For education and training in combat casualty care, virtual reality simulators are being implemented. This same scenario can be utilised in civilian health care, especially in providing care to patients in remote areas who do not currently have access to simple, let alone sophisticated, health care. PMID:9140589

  11. Evaluating Virtual Reality and Augmented Reality Training for Industrial Maintenance and Assembly Tasks

    ERIC Educational Resources Information Center

    Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco

    2015-01-01

    The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…

  12. TV-view-into-reality metaphor: introducing computer vision into virtual worlds

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Knoche, Horst; Rossmann, Juergen

    1998-10-01

    Smart man-machine interfaces are turning out to be a key technology for service robots, for automation applications in industrial environments, and for future applications in space. In each of these fields, the use of virtual reality (VR) techniques has shown great potential. At the IRF, a virtual reality system was developed and implemented which allows the intuitive control of a multi-robot system and different automation systems under one unified VR framework. As the developed multi-robot system is also employed for space applications, the intuitive commanding of inspection and teleoperation sequences is of great interest. In order to facilitate teleoperation and inspection, we make use of several metaphors and a vision system as an `intelligent sensor'. One major metaphor presented in the paper is the `TV-view into reality', where a TV-set is displayed in the virtual world with images of the real world mapped onto its screen as textures. The user can move the TV-set in the virtual world and, as the image-generating camera is carried by a robot, the camera viewpoint changes accordingly. Thus the user can explore the physical world `behind' the virtual world, which is ideal for inspection and teleoperation tasks. By means of real-world images and the different measurement services provided by the underlying 3D vision system, the user can interactively build up or refine the virtual world according to the physical world he is watching through the TV-set.

  13. Applications and a three-dimensional desktop environment for an immersive virtual reality system

    NASA Astrophysics Data System (ADS)

    Kageyama, Akira; Masada, Youhei

    2013-08-01

    We developed an application launcher called Multiverse for scientific visualizations in a CAVE-type virtual reality (VR) system. Multiverse can be regarded as a type of three-dimensional (3D) desktop environment. In Multiverse, a user in a CAVE room can browse multiple visualization applications with 3D icons and explore movies that float in the air. Touching one of the movies causes "teleportation" into the application's VR space. After analyzing the simulation data using the application, the user can jump back into Multiverse's VR desktop environment in the CAVE.

  14. Two Innovative Steps for Training on Maintenance: 'VIRMAN' Spanish Project based on Virtual Reality 'STARMATE' European Project based on Augmented Reality

    SciTech Connect

    Gonzalez Anez, Francisco

    2002-07-01

    This paper presents two development projects (STARMATE and VIRMAN) focused on supporting training on maintenance. Both projects aim at specifying, designing, developing, and demonstrating prototypes allowing computer-guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. The objective is to create a computer tool for elaborating maintenance training courses and delivering training based on 3D virtual reality models of complex components. The training delivery includes 3D recorded displays of maintenance procedures with all complementary information needed to understand the intervention. Users are requested to perform the maintenance intervention, trying to follow the procedure, and can be evaluated on the level of knowledge achieved. Instructors can check the evaluation records left during the training sessions. VIRMAN is simple software supported by a regular computer and can be used in an Internet framework. STARMATE is a step forward in the area of virtual reality. STARMATE is a European Commission project in the frame of 'Information Society Technologies'. A consortium of five companies and one research institute share their expertise in this new technology. STARMATE provides two main functionalities: (1) user assistance for achieving assembly/disassembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, a growing area of Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene, generated by a computer, augmenting the reality with additional information. The user interface comprises see-through goggles, headphones, a microphone and an optical tracking system. All these devices are integrated in a helmet connected to two regular computers. The user has his hands free for performing the maintenance intervention and he can navigate in the virtual

  15. Embodied collaboration support system for 3D shape evaluation in virtual space

    NASA Astrophysics Data System (ADS)

    Okubo, Masashi; Watanabe, Tomio

    2005-12-01

    Collaboration mainly consists of two tasks: each partner's individual work, and communication with each other. Both are essential objectives for any collaboration support system. In this paper, a collaboration support system for 3D shape evaluation in virtual space is proposed on the basis of studies both in 3D shape evaluation and in communication support in virtual space. The proposed system provides a viewpoint for each task: a view from behind the user's own avatar for smooth communication, and the avatar's-eye view for 3D shape evaluation. Switching between these viewpoints satisfies the task conditions for 3D shape evaluation and communication. The system basically consists of a PC, an HMD and magnetic sensors, and users can share embodied interaction by observing the interaction between their avatars in virtual space. However, the HMD and magnetic sensors worn by the users restrict nonverbal communication. We therefore compensate for the loss of the partner avatar's nodding by introducing the speech-driven embodied interactive actor InterActor. A sensory evaluation by paired comparison of 3D shapes in the collaborative situation, in virtual space and in real space, and a questionnaire were performed. The result demonstrates the effectiveness of InterActor's nodding in the collaborative situation.

  16. Applying a 3D Situational Virtual Learning Environment to the Real World Business--An Extended Research in Marketing

    ERIC Educational Resources Information Center

    Wang, Shwu-huey

    2012-01-01

    In order to understand (1) what kind of students can be facilitated through the help of three-dimensional virtual learning environment (3D VLE), and (2) the relationship between a conventional test (ie, paper and pencil test) and the 3D VLE used in this study, the study designs a 3D virtual supermarket (3DVS) to help students transform their role…

  17. A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills

    NASA Astrophysics Data System (ADS)

    Choi, Kup-Sze

    This paper presents a virtual reality based simulator prototype for learning phacoemulsification in cataract surgery, focusing on the skills required to make a cross-shaped trench in a cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools, and a haptic device serves as the 3D user interface. Phaco-sculpting is simulated by interactively deleting the constituent tetrahedrons of the lens model. Collisions between the virtual probe and the lens are efficiently identified by partitioning the space containing the lens hierarchically with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing a trainee's performance in an objective manner. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
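
    The octree broad phase described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the prototype's code: the class name, the point-per-tetrahedron representation, and parameters such as `max_items` are assumptions.

```python
# Hypothetical sketch: an octree over tetrahedron reference points used to
# prune probe-vs-lens collision tests (broad phase only).

class Octree:
    def __init__(self, center, half, depth=0, max_depth=4, max_items=8):
        self.center, self.half = center, half
        self.depth, self.max_depth, self.max_items = depth, max_depth, max_items
        self.items = []          # (tet_id, point) pairs stored at this leaf
        self.children = None     # 8 sub-octants once the leaf splits

    def _octant(self, p):
        cx, cy, cz = self.center
        return (p[0] > cx) | ((p[1] > cy) << 1) | ((p[2] > cz) << 2)

    def insert(self, tet_id, p):
        if self.children is None:
            self.items.append((tet_id, p))
            if len(self.items) > self.max_items and self.depth < self.max_depth:
                self._split()
        else:
            self.children[self._octant(p)].insert(tet_id, p)

    def _split(self):
        h = self.half / 2.0
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + (h if i & 1 else -h),
                    cy + (h if i & 2 else -h),
                    cz + (h if i & 4 else -h)), h,
                   self.depth + 1, self.max_depth, self.max_items)
            for i in range(8)
        ]
        for tet_id, p in self.items:
            self.children[self._octant(p)].insert(tet_id, p)
        self.items = []

    def query(self, p, radius):
        """Collect candidate tetrahedra whose point lies within `radius`
        of the probe tip p; exact tests would follow in a narrow phase."""
        out = []
        if self.children is None:
            for tet_id, q in self.items:
                if sum((a - b) ** 2 for a, b in zip(p, q)) <= radius ** 2:
                    out.append(tet_id)
            return out
        for c in self.children:
            # descend only into octants the probe sphere can reach
            if all(abs(p[i] - c.center[i]) <= c.half + radius for i in range(3)):
                out.extend(c.query(p, radius))
        return out
```

    Only octants overlapping the probe's sphere are visited, so the per-frame cost stays far below testing every tetrahedron.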

  18. Informatics in radiology: Intuitive user interface for 3D image manipulation using augmented reality and a smartphone as a remote control.

    PubMed

    Nakata, Norio; Suzuki, Naoki; Hattori, Asaki; Hirai, Naoya; Miyamoto, Yukio; Fukuda, Kunihiko

    2012-01-01

    Although widely used as a pointing device on personal computers (PCs), the mouse was originally designed for control of two-dimensional (2D) cursor movement and is not suited to complex three-dimensional (3D) image manipulation. Augmented reality (AR) is a field of computer science that involves combining the physical world and an interactive 3D virtual world; it represents a new 3D user interface (UI) paradigm. A system for 3D and four-dimensional (4D) image manipulation has been developed that uses optical tracking AR integrated with a smartphone remote control. The smartphone is placed in a hard case (jacket) with a 2D printed fiducial marker for AR on the back. It is connected to a conventional PC with an embedded Web camera by means of WiFi. The touch screen UI of the smartphone is then used as a remote control for 3D and 4D image manipulation. Using this system, the radiologist can easily manipulate 3D and 4D images from computed tomography and magnetic resonance imaging in an AR environment with high-quality image resolution. Pilot assessment of this system suggests that radiologists will be able to manipulate 3D and 4D images in the reading room in the near future. Supplemental material available at http://radiographics.rsna.org/lookup/suppl/doi:10.1148/rg.324115086/-/DC1. PMID:22556316

  19. EXPLORING ENVIRONMENTAL DATA IN A HIGHLY IMMERSIVE VIRTUAL REALITY ENVIRONMENT

    EPA Science Inventory

    Geography inherently fills a 3D space, and yet we struggle to display geography using primarily 2D display devices. Virtual environments offer a more realistically dimensioned display space, and this is being realized in the expanding area of research on 3D Geographic Infor...

  20. Virtual Reality and Learning: Where Is the Pedagogy?

    ERIC Educational Resources Information Center

    Fowler, Chris

    2015-01-01

    The aim of this paper was to build upon Dalgarno and Lee's model or framework of learning in three-dimensional (3-D) virtual learning environments (VLEs) and to extend their road map for further research in this area. The enhanced model shares the common goal with Dalgarno and Lee of identifying the learning benefits from using 3-D VLEs. The…

  1. Approach to Constructing 3D Virtual Scene of Irrigation Area Using Multi-Source Data

    NASA Astrophysics Data System (ADS)

    Cheng, S.; Dou, M.; Wang, J.; Zhang, S.; Chen, X.

    2015-10-01

    For an irrigation area that is often complicated by various 3D artificial ground features and natural environment, the disadvantages of traditional 2D GIS in spatial data representation, management, query, analysis and visualization are becoming more and more evident. Building a more realistic 3D virtual scene is thus especially urgent for irrigation area managers and decision makers, so that they can carry out various irrigational operations vividly and intuitively. Based on previous researchers' achievements, a simple, practical and cost-effective approach was proposed in this study by adopting 3D geographic information system (3D GIS) and remote sensing (RS) technology. Based on multi-source data such as Google Earth (GE) high-resolution remote sensing imagery, ASTER G-DEM, hydrological facility maps and so on, a 3D terrain model and ground feature models were created interactively. Both models were then rendered with texture data and integrated under the ArcGIS platform. A vivid, realistic 3D virtual scene of the irrigation area, with good visual effect and primary GIS functions for data query and analysis, was constructed. Yet there is still a long way to go in establishing a true 3D GIS for the irrigation area: open issues of this study are discussed in depth and future research directions are pointed out at the end of the paper.

  2. Interactive graphical model building using telepresence and virtual reality

    SciTech Connect

    Cooke, C.; Stansfield, S.

    1993-10-01

    This paper presents a prototype system developed at Sandia National Laboratories to create and verify computer-generated graphical models of remote physical environments. The goal of the system is to create an interface between an operator and a computer vision system so that graphical models can be created interactively. Virtual reality and telepresence are used to allow interaction between the operator, computer, and remote environment. A stereo view of the remote environment is produced by two CCD cameras. The cameras are mounted on a three-degree-of-freedom platform which is slaved to a mechanically-tracked, stereoscopic viewing device. This gives the operator a sense of immersion in the physical environment. The stereo video is enhanced by overlaying the graphical model onto it. Overlay of the graphical model onto the stereo video allows visual verification of graphical models. Creation of a graphical model is accomplished by allowing the operator to assist the computer in modeling. The operator controls a 3-D cursor to mark objects to be modeled. The computer then automatically extracts positional and geometric information about the object and creates the graphical model.
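
    The "operator marks, computer fits" step can be sketched minimally. The actual system extracted richer positional and geometric information; this hedged example assumes the simplest primitive, an axis-aligned box fit to the operator's 3-D cursor marks, and the name `fit_box` is hypothetical.

```python
# Illustrative sketch: derive an axis-aligned bounding box (center, size)
# from a set of operator-marked 3-D points.

def fit_box(points):
    """points: iterable of (x, y, z) cursor marks -> (center, size)."""
    mins = [min(p[i] for p in points) for i in range(3)]
    maxs = [max(p[i] for p in points) for i in range(3)]
    center = [(lo + hi) / 2.0 for lo, hi in zip(mins, maxs)]
    size = [hi - lo for lo, hi in zip(mins, maxs)]
    return center, size
```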

  3. A Virtual Reality Technique for Multi-phase Flows

    NASA Astrophysics Data System (ADS)

    Loth, Eric; Sherman, William; Auman, Aric; Navarro, Christopher

    2004-04-01

    A virtual reality (VR) technique has been developed to allow user immersion (stereo-graphic rendering, user tracking and object interactivity) in generic unsteady three-dimensional multi-phase flow data sets. This article describes the structure and logic used to design and construct a VR technique that employs a multi-phase flow-field computed a priori as an input (i.e. simulations are conducted beforehand with a researcher's multi-phase CFD code). The input field for this flow visualization is divided into two parts: the Eulerian three-dimensional grid nodes and velocities for the continuous fluid properties (specified using conventional TECPLOT data format) and the Lagrangian time-history trajectory files for the dispersed fluid. While tracking the dispersed phase trajectories as animated spheres of adjustable size and number, the continuous-phase flow can be simultaneously rendered with velocity vectors, iso-contour surfaces and planar flood-contour maps of different variables. The geometric and notional view of the combined visualization of both phases is interactively controlled throughout a user session. The resulting technique is demonstrated with a 3-D unsteady data set of Lagrangian particles dispersing in a Eulerian description of a turbulent boundary layer, stemming from a direct numerical simulation of the Navier-Stokes equations.
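
    Rendering precomputed Lagrangian trajectories as animated spheres requires resampling each particle's time history at the display clock. A minimal sketch of that resampling step, with an assumed (times, positions) layout rather than the paper's actual file format:

```python
# Hedged sketch: linear interpolation of a Lagrangian trajectory time
# history so a dispersed-phase particle can be drawn at any animation time.

import bisect

def position_at(times, positions, t):
    """times: sorted sample times; positions: matching (x, y, z) samples."""
    if t <= times[0]:
        return positions[0]
    if t >= times[-1]:
        return positions[-1]
    j = bisect.bisect_right(times, t)          # first sample after t
    t0, t1 = times[j - 1], times[j]
    a = (t - t0) / (t1 - t0)                   # interpolation weight in [0, 1]
    return tuple(p0 + a * (p1 - p0)
                 for p0, p1 in zip(positions[j - 1], positions[j]))
```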

  4. Implementing Virtual Reality Technology as an Effective Web Based Kiosk: Darulaman's Teacher Training College Tour (Ipda Vr Tour)

    ERIC Educational Resources Information Center

    Fadzil, Azman

    2006-01-01

    At present, the development of Virtual Reality (VR) technology is expanding due to the importance and needs to use the 3D elements and 360 degrees panorama in expressing a clearer picture to consumers in various fields such as education, military, medicine, entertainment and so on. The web based VR kiosk project in Darulaman's Teacher Training…

  5. Valorisation of Cultural Heritage Through Virtual Visit and Augmented Reality: the Case of the Abbey of Epau (France)

    NASA Astrophysics Data System (ADS)

    Simonetto, E.; Froment, C.; Labergerie, E.; Ferré, G.; Séchet, B.; Chédorge, H.; Cali, J.; Polidori, L.

    2013-07-01

    Terrestrial Laser Scanning (TLS), 3-D modeling and Web visualization are the three key steps needed to store cultural heritage and to grant free, wide access to it, as highlighted in many recent examples. The goal of this study is to set up 3-D Web resources for "virtually" visiting the exterior of the Abbaye de l'Epau, an old French abbey with both a rich history and delicate architecture. The virtuality is considered in two ways: flowing navigation around the abbey in a virtual reality environment, and a game activity using augmented reality. First of all, the data acquisition consists of a GPS and tacheometry survey, terrestrial laser scanning and photographic acquisition. After data pre-processing, the meshed and textured 3-D model is generated using 3DReshaper commercial software. The virtual reality visit and augmented reality animation are then created using Unity software. This work shows the interest of such tools in bringing out the regional cultural heritage and making it attractive to the public.

  6. Implementing Virtual Reality Technology as an Effective WEB Based KIOSK: Darulaman's Teacher Training College Tour (IPDA VR Tour)

    ERIC Educational Resources Information Center

    Azman, Fadzil

    2004-01-01

    At present, the development of Virtual Reality (VR) technology is expanding due to the importance of, and need for, 3D elements and 360-degree panoramas in conveying a clearer picture to consumers in various fields such as education, the military, medicine, entertainment and so on. In line with this development, the web based VR kiosk project in…

  7. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments

    NASA Astrophysics Data System (ADS)

    Portalés, Cristina; Lerma, José Luis; Navarro, Santiago

    2010-01-01

    Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to boost human interactions and real-life navigations. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction, far beyond the traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application to integrate buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated in real (physical) urban worlds. The augmented environment presented herein requires a see-through video head-mounted display (HMD) for visualization, whereas the user's movement is tracked in the real world with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper deals with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. There are, however, some software and complexity issues, which are discussed in the paper.

  8. 3D global estimation and augmented reality visualization of intra-operative X-ray dose.

    PubMed

    Rodas, Nicolas Loy; Padoy, Nicolas

    2014-01-01

    The growing use of image-guided minimally-invasive surgical procedures is confronting clinicians and surgical staff with new radiation exposure risks from X-ray imaging devices. The accurate estimation of intra-operative radiation exposure can increase staff awareness of radiation exposure risks and enable the implementation of well-adapted safety measures. The current surgical practice of wearing a single dosimeter at chest level to measure radiation exposure does not provide a sufficiently accurate estimation of radiation absorption throughout the body. In this paper, we propose an approach that combines data from wireless dosimeters with the simulation of radiation propagation in order to provide a global radiation risk map in the area near the X-ray device. We use a multi-camera RGBD system to obtain a 3D point cloud reconstruction of the room. The positions of the table, C-arm and clinician are then used 1) to simulate the propagation of radiation in a real-world setup and 2) to overlay the resulting 3D risk-map onto the scene in an augmented reality manner. By using real-time wireless dosimeters in our system, we can both calibrate the simulation and validate its accuracy at specific locations in real-time. We demonstrate our system in an operating room equipped with a robotised X-ray imaging device and validate the radiation simulation on several X-ray acquisition setups. PMID:25333145

  9. Teaching Digital Natives: 3-D Virtual Science Lab in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Franklin, Teresa J.

    2008-01-01

    This paper presents the development of a 3-D virtual environment in Second Life for the delivery of standards-based science content for middle school students in the rural Appalachian region of Southeast Ohio. A mixed method approach in which quantitative results of improved student learning and qualitative observations of implementation within…

  10. Socialisation for Learning at a Distance in a 3-D Multi-User Virtual Environment

    ERIC Educational Resources Information Center

    Edirisingha, Palitha; Nie, Ming; Pluciennik, Mark; Young, Ruth

    2009-01-01

    This paper reports findings of a pilot study that examined the pedagogical potential of "Second Life" (SL), a popular three-dimensional multi-user virtual environment (3-D MUVE) developed by the Linden Lab. The study is part of a 1-year research and development project titled "Modelling of Secondlife Environments" (http://www.le.ac.uk/moose)…

  11. Supporting Distributed Team Working in 3D Virtual Worlds: A Case Study in Second Life

    ERIC Educational Resources Information Center

    Minocha, Shailey; Morse, David R.

    2010-01-01

    Purpose: The purpose of this paper is to report on a study into how a three-dimensional (3D) virtual world (Second Life) can facilitate socialisation and team working among students working on a team project at a distance. This models the situation in many commercial sectors where work is increasingly being conducted across time zones and between…

  12. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars.

    PubMed

    Patil, Shashidhar; Chintalapalli, Harinadha Reddy; Kim, Dubeom; Chai, Youngho

    2015-01-01

    In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database using hand gesture sequences from a single inertial motion sensor. PMID:26094629
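
    The quaternion-based complementary filter mentioned in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: it blends gyro integration with an accelerometer-derived tilt reference, and the gain `alpha`, the blend scheme, and all names are assumptions.

```python
# Hedged sketch of a quaternion complementary filter: gyro integration
# (responsive, but drifts) gently pulled toward the accelerometer's
# gravity direction (noisy, but drift-free).

import math

def q_mul(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def q_normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def accel_to_quat(ax, ay, az):
    """Tilt-only quaternion from measured gravity (yaw is unobservable)."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    return q_normalize((cr * cp, sr * cp, cr * sp, -sr * sp))

def filter_step(q, gyro, accel, dt, alpha=0.02):
    """One fusion step: integrate angular rate, then correct toward gravity."""
    dq = q_mul(q, (0.0,) + tuple(gyro))          # q_dot = 0.5 * q x omega
    qg = q_normalize(tuple(c + 0.5 * d * dt for c, d in zip(q, dq)))
    qa = accel_to_quat(*accel)                   # tilt reference
    if sum(g * a for g, a in zip(qg, qa)) < 0:
        qa = tuple(-c for c in qa)               # pick the same hemisphere
    # complementary blend: mostly gyro, a small pull toward gravity
    return q_normalize(tuple((1 - alpha) * g + alpha * a
                             for g, a in zip(qg, qa)))
```

    With gyro rates near zero, repeated steps converge to the accelerometer's tilt estimate; with fast motion, the small `alpha` keeps the responsive gyro path dominant.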

  13. The Cognitive Apprenticeship Theory for the Teaching of Mathematics in an Online 3D Virtual Environment

    ERIC Educational Resources Information Center

    Bouta, Hara; Paraskeva, Fotini

    2013-01-01

    Research spanning two decades shows that there is a continuing development of 3D virtual worlds and investment in such environments for educational purposes. Research stresses the need for these environments to be well-designed and for suitable pedagogies to be implemented in the teaching practice in order for these worlds to be fully effective.…

  14. Design and Implementation of a 3D Multi-User Virtual World for Language Learning

    ERIC Educational Resources Information Center

    Ibanez, Maria Blanca; Garcia, Jose Jesus; Galan, Sergio; Maroto, David; Morillo, Diego; Kloos, Carlos Delgado

    2011-01-01

    The best way to learn is by having a good teacher and the best language learning takes place when the learner is immersed in an environment where the language is natively spoken. 3D multi-user virtual worlds have been claimed to be useful for learning, and the field of exploiting them for education is becoming more and more active thanks to the…

  15. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars

    PubMed Central

    Patil, Shashidhar; Chintalapalli, Harinadha Reddy; Kim, Dubeom; Chai, Youngho

    2015-01-01

    In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database using hand gesture sequences from a single inertial motion sensor. PMID:26094629

  16. GEARS a 3D Virtual Learning Environment and Virtual Social and Educational World Used in Online Secondary Schools

    ERIC Educational Resources Information Center

    Barkand, Jonathan; Kush, Joseph

    2009-01-01

    Virtual Learning Environments (VLEs) are becoming increasingly popular in online education environments and have multiple pedagogical advantages over more traditional approaches to education. VLEs include 3D worlds where students can engage in simulated learning activities such as Second Life. According to Claudia L'Amoreaux at Linden Lab, "at…

  17. Virtual rough samples to test 3D nanometer-scale scanning electron microscopy stereo photogrammetry

    NASA Astrophysics Data System (ADS)

    Villarrubia, J. S.; Tondare, V. N.; Vladár, A. E.

    2016-03-01

    The combination of scanning electron microscopy for high spatial resolution, images from multiple angles to provide 3D information, and commercially available stereo photogrammetry software for 3D reconstruction offers promise for nanometer-scale dimensional metrology in 3D. A method is described to test 3D photogrammetry software by the use of virtual samples: mathematical samples from which simulated images are made for use as inputs to the software under test. The virtual sample is constructed by wrapping a rough skin with any desired power spectral density around a smooth near-trapezoidal line with rounded top corners. Reconstruction is performed with images simulated from different angular viewpoints. The software's reconstructed 3D model is then compared to the known geometry of the virtual sample. Three commercial photogrammetry software packages were tested. Two of them produced line height and width results within about 1 nm of the correct values. All of the packages exhibited some difficulty in reconstructing details of the surface roughness.
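
    Generating a rough skin with a prescribed power spectral density is a standard spectral-synthesis exercise, sketched here in 1-D. This is only an illustration of the idea, not the paper's construction: the PSD-to-amplitude relation assumes a one-sided PSD, and the function names and parameters are hypothetical.

```python
# Hedged sketch: synthesize a zero-mean 1-D rough profile whose spectrum
# follows a prescribed PSD, by summing cosines with PSD-derived amplitudes
# and uniformly random phases.

import math
import random

def rough_profile(n, dx, psd, seed=0):
    """Return n samples (spacing dx) of a profile following psd(f)."""
    rng = random.Random(seed)
    length = n * dx
    z = [0.0] * n
    for k in range(1, n // 2):
        f = k / length                              # spatial frequency
        amp = math.sqrt(2.0 * psd(f) / length)      # one-sided PSD -> amplitude
        phase = rng.uniform(0.0, 2.0 * math.pi)
        for i in range(n):
            z[i] += amp * math.cos(2.0 * math.pi * f * i * dx + phase)
    mean = sum(z) / n
    return [v - mean for v in z]                    # remove residual mean
```

    An FFT-based version would scale better; the direct sum keeps the PSD-to-amplitude step explicit.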

  18. A virtual-reality-based haptic surgical training system.

    PubMed

    Weiss, Holger; Ortmaier, Tobias; Maass, Heiko; Hirzinger, Gerd; Kuehnapfel, Uwe

    2003-01-01

    To improve training facilities for surgeons, a surgical training system based on virtual reality techniques has been developed. The goal of the developed system is to improve education of surgeons by making the knowledge of expert surgeons directly available to trainees. The system realizes two different approaches: the library and the driving school paradigm. In its current form, the system consists of two modules. The main module combines the virtual reality kernel KISMET, a visual and haptic display, and a database of different operations and/or techniques. The master station is a copy of the input and output facilities of the main module. Both modules communicate by a TCP/IP-based connection. Initial tests demonstrated the feasibility of the chosen framework. Further developments include the gathering of data not only from virtual reality but also from real operations. Robotic-assisted surgery provides an attractive way of accomplishing this. PMID:15529957

  19. Using virtual 3D audio in multispeech channel and multimedia environments

    NASA Astrophysics Data System (ADS)

    Orosz, Michael D.; Karplus, Walter J.; Balakrishnan, Jerry D.

    2000-08-01

    The advantages and disadvantages of using virtual 3-D audio in mission-critical, multimedia display interfaces were evaluated. The 3D audio platform seems to be an especially promising candidate for aircraft cockpits, flight control rooms, and other command and control environments in which operators must make mission-critical decisions while handling demanding and routine tasks. Virtual audio signal processing creates the illusion for a listener wearing conventional earphones that each of a multiplicity of simultaneous speech or audio channels is originating from a different, program- specified location in virtual space. To explore the possible uses of this new, readily available technology, a test bed simulating some of the conditions experienced by the chief flight test coordinator at NASA's Dryden Flight Research Center was designed and implemented. Thirty test subjects simultaneously performed routine tasks requiring constant hand-eye coordination, while monitoring four speech channels, each generating continuous speech signals, for the occurrence of pre-specified keywords. Performance measures included accuracy in identifying the keywords, accuracy in identifying the speaker of the keyword, and response time. We found substantial improvements on all of these measures when comparing virtual audio with conventional, monaural transmissions. We also explored the effect on operator performance of different spatial configurations of the audio sources in 3-D space, simulated movement (dither) in the source locations, and of providing graphical redundancy. Some of these manipulations were less effective and may even decrease performance efficiency, even though they improve some aspects of the virtual space simulation.
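
    One ingredient of the virtual-audio processing described above is the interaural time difference (ITD) that places a source at an azimuth. The sketch below uses Woodworth's spherical-head approximation of the ITD plus a simple constant-power level panner; the record's test bed used full virtual-audio signal processing, so this is only a minimal illustration with assumed constants.

```python
# Hedged sketch: Woodworth ITD model plus constant-power stereo panning
# for a mono source at a given azimuth (radians, 0 = straight ahead,
# positive toward the right ear).

import math

HEAD_RADIUS = 0.0875     # metres, assumed average head radius
SPEED_OF_SOUND = 343.0   # m/s

def itd_seconds(azimuth_rad):
    """Woodworth's approximation of interaural time difference."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + math.sin(azimuth_rad))

def pan_gains(azimuth_rad):
    """Constant-power (left, right) gains for azimuth in [-pi/2, pi/2]."""
    theta = (azimuth_rad + math.pi / 2) / 2   # map [-pi/2, pi/2] -> [0, pi/2]
    return math.cos(theta), math.sin(theta)
```

    Delaying one channel by `itd_seconds` and scaling both by `pan_gains` gives a crude externalized position; real systems add head-related transfer functions per ear.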

  20. Design and application of real-time visual attention model for the exploration of 3D virtual environments.

    PubMed

    Hillaire, Sébastien; Lécuyer, Anatole; Regia-Corte, Tony; Cozot, Rémi; Royan, Jérôme; Breton, Gaspard

    2012-03-01

    This paper studies the design and application of a novel visual attention model designed to compute a user's gaze position automatically, i.e., without using a gaze-tracking system. The model we propose is specifically designed for real-time first-person exploration of 3D virtual environments. It is the first model adapted to this context which can compute in real time a continuous gaze point position instead of a set of 3D objects potentially observed by the user. To do so, contrary to previous models which use a mesh-based representation of visual objects, we introduce a representation based on surface elements. Our model also simulates visual reflexes and the cognitive processes which take place in the brain, such as the gaze behavior associated with first-person navigation in the virtual environment. Our visual attention model combines both bottom-up and top-down components to compute a continuous gaze point position on screen that is intended to match the user's actual gaze. We conducted an experiment to study and compare the performance of our method with a state-of-the-art approach. Our model's results were significantly better, with accuracy gains sometimes exceeding 100 percent. This suggests that computing a gaze point in a 3D virtual environment in real time is possible and is a valid approach compared to object-based approaches. Finally, we expose different applications of our model when exploring virtual environments. We present different algorithms which can improve or adapt the visual feedback of virtual environments based on gaze information. We first propose a level-of-detail approach that heavily relies on multiple-texture sampling. We show that it is possible to use the gaze information of our visual attention model to increase visual quality where the user is looking, while maintaining a high refresh rate.
Second, we introduce the use of the visual attention model in three visual effects inspired by the human visual system, namely: depth-of-field blur, camera
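    The bottom-up/top-down combination described in the abstract can be sketched as a weighted blend of two saliency maps followed by a continuous gaze estimate. This is a minimal illustration under assumed weights and a centroid-based readout, not the authors' actual model:

```python
def combine_attention(bottom_up, top_down, w_bu=0.6, w_td=0.4):
    """Blend per-pixel bottom-up saliency with top-down task relevance.
    Both maps are equally sized 2D lists with values in [0, 1];
    the weights w_bu/w_td are assumed, not the paper's."""
    return [[w_bu * bu + w_td * td
             for bu, td in zip(row_bu, row_td)]
            for row_bu, row_td in zip(bottom_up, top_down)]

def gaze_point(attention):
    """Continuous gaze estimate: the saliency-weighted centroid of the
    combined map, returned as an (x, y) screen position."""
    total = sum(sum(row) for row in attention)
    if total == 0:  # no saliency anywhere: fall back to screen center
        h, w = len(attention), len(attention[0])
        return (w - 1) / 2, (h - 1) / 2
    x = sum(cx * v for row in attention for cx, v in enumerate(row)) / total
    y = sum(cy * v for cy, row in enumerate(attention) for v in row) / total
    return x, y
```

    Because the readout is a weighted centroid rather than an object label, the estimate moves continuously across the screen, which is what gaze-contingent effects such as depth-of-field blur require.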

  1. Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.

    1999-01-01

    We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated on Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display, which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information in a way that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to interactively perform measurements from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.
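    Terrain models of this kind start from stereo triangulation. A minimal sketch of the standard pinhole-stereo relations (not Pathfinder's actual pipeline; the focal length, baseline, and principal point below are assumed calibration values):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic pinhole-stereo relation: Z = f * B / d, with the focal
    length f and disparity d in pixels and the baseline B in meters."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

def triangulate(x_px, y_px, disparity_px, focal_px, baseline_m, cx, cy):
    """Back-project a left-image pixel into camera-space 3D coordinates,
    given the principal point (cx, cy)."""
    z = depth_from_disparity(disparity_px, focal_px, baseline_m)
    x = (x_px - cx) * z / focal_px
    y = (y_px - cy) * z / focal_px
    return x, y, z
```

    Applying this to every matched pixel pair yields the point cloud that a photorealistic texture-mapped terrain mesh is then built from.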

  2. Future directions for the development of virtual reality within an automotive manufacturer.

    PubMed

    Lawson, Glyn; Salanitri, Davide; Waterfield, Brian

    2016-03-01

    Virtual Reality (VR) can reduce time and costs, and lead to increases in quality, in the development of a product. Given the pressure on car companies to reduce time-to-market and to continually improve quality, the automotive industry has championed the use of VR across a number of applications, including design, manufacturing, and training. This paper describes interviews with 11 engineers and employees of allied disciplines from an automotive manufacturer about their current physical and virtual properties and processes. The results guided a review of research findings and scientific advances from the academic literature, which formed the basis of recommendations for future developments of VR technologies and applications. These include: develop a greater range of virtual contexts; use multi-sensory simulation; address perceived differences between virtual and real cars; improve motion capture capabilities; implement networked 3D technology; and use VR for market research. PMID:26164106

  3. Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters

    NASA Astrophysics Data System (ADS)

    Apostolellis, Panagiotis; Daradoumis, Thanasis

    Informal learning in public spaces like museums, science centers, and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like Virtual Reality systems. An extensive literature review led us to conclude that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments is a promising approach to support effective learning in groups of school-aged children.

  4. Virtual Charter Schools: Realities and Unknowns

    ERIC Educational Resources Information Center

    Torre, Daniela

    2013-01-01

    Virtual charter schools have emerged over the last decade as an increasingly popular alternative to traditional public schooling. Unlike their face-to-face counterparts, virtual charter schools educate students through blended or entirely online curricula. They present a host of new policy issues that should be scrutinized in order to ensure that…

  5. Accuracy of 3D Virtual Planning of Corrective Osteotomies of the Distal Radius.

    PubMed

    Stockmans, Filip; Dezillie, Marleen; Vanhaecke, Jeroen

    2013-11-01

    Corrective osteotomies of the distal radius for symptomatic malunion are time-tested procedures that rely on accurate corrections. Patients with combined intra- and extra-articular malunions present a challenging deformity. Virtual planning and patient-specific instruments (PSIs) to transfer the planning into the operating room have been used both to simplify the surgery and to make it more accurate. This report focuses on the clinically achieved accuracy in four patients treated between 2008 and 2012 with virtual planning and PSIs for a combined intra- and extra-articular malunion of the distal radius. The accuracy of the correction was quantified by comparing the virtual three-dimensional (3D) planning model with the postoperative 3D bone model. For the extra-articular malunion, the 3D volar tilt, 3D radial inclination, and 3D ulnar variance were measured. The volar tilt was undercorrected in all cases, by an average of -6 ± 6°. The average difference between the postoperative and planned 3D radial inclination was -1 ± 5°, and the average difference between the postoperative and planned 3D ulnar variance was 0 ± 1 mm. For the evaluation of the intra-articular malunion, both the arc method of measurement and distance map measurement were used. The average postoperative maximum gap was 2.1 ± 0.9 mm, and the average maximum postoperative step-off was 1.3 ± 0.4 mm. The average distance between the postoperative and planned articular surfaces was 1.1 ± 0.6 mm, as determined in the distance map measurement. There was a tendency to achieve higher accuracy as experience built up, both on the surgeon's side and on the design engineering side. We believe this technology holds the potential to achieve consistent accuracy of very complex corrections. PMID:24436834
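    The distance map measurement reported above can be illustrated as a nearest-neighbor distance from each postoperative surface point to the planned surface. A simplified sketch assuming both surfaces are sampled as registered 3D point clouds (the paper's actual registration and meshing steps are omitted):

```python
import math

def nearest_distance(p, cloud):
    """Euclidean distance from point p to its nearest neighbor in cloud."""
    return min(math.dist(p, q) for q in cloud)

def distance_map_stats(postop_points, planned_points):
    """For each postoperative surface point, compute the distance to the
    nearest planned-surface point; return (mean, max) in the cloud's
    units (e.g. mm)."""
    d = [nearest_distance(p, planned_points) for p in postop_points]
    return sum(d) / len(d), max(d)
```

    The brute-force nearest-neighbor search is O(n·m); for dense surface samplings a k-d tree would normally replace the inner loop.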

  7. An Interactive 3D Virtual Anatomy Puzzle for Learning and Simulation - Initial Demonstration and Evaluation.

    PubMed

    Messier, Erik; Wilcox, Jascha; Dawson-Elli, Alexander; Diaz, Gabriel; Linte, Cristian A

    2016-01-01

    To inspire young students (grades 6-12) to become medical practitioners and biomedical engineers, it is necessary to expose them to key concepts of the field in a way that is both exciting and informative. Recent advances in medical image acquisition, manipulation, processing, visualization, and display have revolutionized the approach in which the human body and internal anatomy can be seen and studied. It is now possible to collect 3D, 4D, and 5D medical images of patient-specific data, and display that data to the end user using consumer-level 3D stereoscopic display technology. Despite such advancements, traditional 2D modes of content presentation such as textbooks and slides are still the standard didactic equipment used to teach young students anatomy. More sophisticated methods of display can help to elucidate the complex 3D relationships between structures that are so often missed when viewing only 2D media, and can instill in students an appreciation for the interconnection between medicine and technology. Here we describe the design, implementation, and preliminary evaluation of a 3D virtual anatomy puzzle dedicated to helping users learn the anatomy of various organs and systems by manipulating 3D virtual data. The puzzle currently comprises several components of the human anatomy and can be easily extended to include additional organs and systems. The 3D virtual anatomy puzzle game was implemented and piloted using three display paradigms - a traditional 2D monitor, a 3D TV with active shutter glasses, and the Oculus Rift DK2 - as well as two different user interaction devices - a space mouse and traditional keyboard controls. PMID:27046584

  8. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve

    2011-03-01

    Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it were a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology, and medicine, as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that users can place their hands inside the virtual environment without occluding the 3D image. Built using open-source software and consumer-level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable, and scalable.

  9. Thermal feedback in virtual reality and telerobotic systems

    NASA Technical Reports Server (NTRS)

    Zerkus, Mike; Becker, Bill; Ward, Jon; Halvorsen, Lars

    1994-01-01

    A new concept has been developed that allows temperature to be part of the virtual world. The Displaced Temperature Sensing System (DTSS) can 'display' temperature in a virtual reality system. The DTSS can also serve as a feedback device for telerobotics. For virtual reality applications the virtual world software would be required to have a temperature map of its world. By whatever means (magnetic tracker, ultrasound tracker, etc.) the hand and fingers, which have been instrumented with thermodes, would be tracked. The temperature associated with the current position would be transmitted to the DTSS via a serial data link. The DTSS would provide that temperature to the fingers. For telerobotic operation the function of the DTSS is to transmit a temperature from a remote location to the fingers where the temperature can be felt.
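    The temperature-map lookup the abstract describes can be sketched as bilinear sampling of a 2D grid at the tracked hand position, followed by framing a set-point for the serial link. The unit grid spacing and the ASCII command format below are assumptions for illustration, not the published DTSS protocol:

```python
def bilinear_temperature(temp_grid, x, y):
    """Sample a 2D temperature map (list of rows, assumed 1-unit cell
    spacing) at a continuous tracked position using bilinear
    interpolation between the four surrounding grid cells."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(temp_grid[0]) - 1)
    y1 = min(y0 + 1, len(temp_grid) - 1)
    fx, fy = x - x0, y - y0
    top = temp_grid[y0][x0] * (1 - fx) + temp_grid[y0][x1] * fx
    bot = temp_grid[y1][x0] * (1 - fx) + temp_grid[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def frame_for_serial(temp_c):
    """Pack a set-point as a one-line ASCII command, a plausible
    stand-in for the serial framing (the real DTSS format is not
    given in the abstract)."""
    return f"T{temp_c:+06.1f}\n".encode("ascii")
```

    Per tracked frame, the host would sample the map at the thermode position and write the framed set-point to the serial port.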

  10. Comparative analysis of video processing and 3D rendering for cloud video games using different virtualization technologies

    NASA Astrophysics Data System (ADS)

    Bada, Adedayo; Alcaraz-Calero, Jose M.; Wang, Qi; Grecos, Christos

    2014-05-01

    This paper describes a comprehensive empirical performance evaluation of 3D video processing employing the physical/virtual architecture implemented in a cloud environment. Different virtualization technologies, virtual video cards and various 3D benchmark tools have been utilized in order to analyse the optimal performance in the context of 3D online gaming applications. This study highlights 3D video rendering performance under each type of hypervisor, and other factors including network I/O, disk I/O and memory usage. Comparisons of these factors under well-known virtual display technologies such as VNC, Spice and Virtual 3D adaptors reveal the strengths and weaknesses of the various hypervisors with respect to 3D video rendering and streaming.

  11. A Calligraphy Mastering Support System Using Virtual Reality Technology and its Learning Effects

    NASA Astrophysics Data System (ADS)

    Muranaka, Noriaki; Yamamoto, Takafumi; Imanishi, Shigeru

    Virtual reality is a key information-media technology. As an application of virtual reality to the education field, we examine a support system for mastering calligraphy. The purpose of this system is to display, in real time, a model of the writing as it progresses. As a result, learners can practice calligraphy casually, without being limited to a particular place or time. We use 3-D computer graphics (CG) for the virtual imagery to reduce memory requirements, and we examine the learning effect of this system. Because the tablet pen of a conventional system does not convey the feel of a writing brush, we developed a brush-type input device that approximates that feel. We also developed automatic animation generation using 3-D CG; with this animation, the learner can experience the subtle stroke progression of calligraphy from the viewpoint of a calligraphy teacher. Instead of the head-mounted display generally used in VR, we use a semitransparent screen and a half mirror. Learners improve in a short time by practicing with the brush-type input pen synchronized with the virtual writing brush.

  12. Visualization of the public transportation infrastructure services using virtual reality standards

    NASA Astrophysics Data System (ADS)

    Gracanin, Denis

    2002-07-01

    Public infrastructure services like transportation, energy, air quality, water quality, etc., are characterized by a presence of rich information that can be leveraged to provide more efficient support to public infrastructure decision-making. That information has significant geospatial and temporal characteristics and consists of large quantities of data from a variety of data sources that should be stored efficiently and then integrated in a common framework. A mapping between the data elements and their spatial and temporal locations may be used to provide service information (e.g. traffic conditions) in a meaningful form. Service information is, in general, complex, multi-dimensional, physical or abstract information that is intrinsically difficult to represent and manipulate. Visualization techniques, in conjunction with simulation and data modeling, can be extremely useful tools to address many public infrastructure issues and problems. In our previous work, a virtual reality-based interface for simulation and evaluation of airport Automated People Movers (APMs) was implemented using the Virtual Reality Modeling Language (VRML). That work has been extended to use the Extensible 3D (X3D) virtual reality standard. Since the X3D standard is based on the Extensible Markup Language (XML), the integration with the available data sources is improved. The paper focuses on issues related to public infrastructure services in transportation and how to include and visualize information from the real world in a virtual environment. The real world in transportation includes building structures, streets and other guide-ways, vehicles, pedestrians, control signals, etc., while the transportation data sources provide information about vehicular and pedestrian traffic and current control strategies.
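    Because X3D is XML-based, scene fragments can be generated directly from transportation data with standard XML tooling. A minimal sketch that places one box per tracked vehicle (the `X3D`, `Scene`, `Transform`, `Shape`, and `Box` element names follow the X3D standard; the `vehicle_` DEF naming and box size are assumed conventions, not the paper's schema):

```python
import xml.etree.ElementTree as ET

def vehicle_scene(vehicles):
    """Build a minimal X3D document placing a box per tracked vehicle.
    `vehicles` is an iterable of (id, x, y, z) tuples in scene units."""
    x3d = ET.Element("X3D", version="3.0")
    scene = ET.SubElement(x3d, "Scene")
    for vid, x, y, z in vehicles:
        t = ET.SubElement(scene, "Transform",
                          DEF=f"vehicle_{vid}",
                          translation=f"{x} {y} {z}")
        shape = ET.SubElement(t, "Shape")
        ET.SubElement(shape, "Box", size="4 1.5 2")  # rough car footprint
    return ET.tostring(x3d, encoding="unicode")
```

    Regenerating (or updating) such fragments from live traffic feeds is what ties the data sources to the virtual environment.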

  13. Virtual Reality Simulation of the International Space Welding Experiment

    NASA Technical Reports Server (NTRS)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer-simulated environment. A true virtual reality experience meets three criteria: (1) It involves 3-dimensional computer graphics; (2) It includes real-time feedback and response to user actions; and (3) It must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment (ISWE). My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) in collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) in collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted display and tested software filters to correct the problem; (5) in collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR workstation, described below.

  14. Serious games for screening pre-dementia conditions: from virtuality to reality? A pilot project

    PubMed Central

    Zucchella, Chiara; Sinforiani, Elena; Tassorelli, Cristina; Cavallini, Elena; Tost-Pardell, Daniela; Grau, Sergi; Pazzi, Stefania; Puricelli, Stefano; Bernini, Sara; Bottiroli, Sara; Vecchi, Tomaso; Sandrini, Giorgio; Nappi, Giuseppe

    2014-01-01

    Conventional cognitive assessment is based on a pencil-and-paper neuropsychological evaluation, which is time consuming, expensive and requires the involvement of several professionals. Information and communication technology could be exploited to allow the development of tools that are easy to use, reduce the amount of data processing, and provide controllable test conditions. Serious games (SGs) have the potential to be new and effective tools in the management and treatment of cognitive impairments in the elderly. Moreover, by adopting SGs in 3D virtual reality settings, cognitive functions might be evaluated using tasks that simulate daily activities, increasing the “ecological validity” of the assessment. In this commentary we report our experience in the creation of the Smart Aging platform, a 3D SG- and virtual environment-based platform for the early identification and characterization of mild cognitive impairment. PMID:25473734

  15. [A new concept in digestive surgery: the computer assisted surgical procedure, from virtual reality to telemanipulation].

    PubMed

    Marescaux, J; Clément, J M; Nord, M; Russier, Y; Tassetti, V; Mutter, D; Cotin, S; Ayache, N

    1997-11-01

    Surgical simulation increasingly appears to be an essential aspect of tomorrow's surgery. The development of a hepatic surgery simulator is an advanced concept calling for a new writing system which will transform the medical world: virtual reality. Virtual reality extends the perception of our five senses by representing more than the real state of things by means of computer science and robotics. It consists of three concepts: immersion, navigation and interaction. Three reasons have led us to develop this simulator: the first is to provide the surgeon with a comprehensive visualisation of the organ. The second is to allow for planning and surgical simulation that could be compared with the detailed flight-plan for a commercial jet pilot. The third lies in the fact that virtual reality is an integrated part of the concept of computer assisted surgical procedure. The project consists of a sophisticated simulator which has to meet five requirements: visual fidelity, interactivity, physical properties, physiological properties, and sensory input and output. In this report we describe how to obtain a realistic 3D model of the liver from two-dimensional (2D) medical images for anatomical and surgical training. The introduction of a tumor and the consequent planning and virtual resection are also described, as are force feedback and real-time interaction. PMID:9554121

  16. Using virtual reality environment to improve joint attention associated with pervasive developmental disorder.

    PubMed

    Cheng, Yufang; Huang, Ruowen

    2012-01-01

    The focus of this study is using a data glove to practice joint attention skills in a virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe setting for people with PDD: when they make errors during practice, there are no painful or dangerous consequences to deal with. Joint attention is a critical skill among the disorder characteristics of children with PDD, and its absence is a deficit that frequently affects their social relationships in daily life. Therefore, this study designed the Joint Attention Skills Learning (JASL) system with a data glove to help children with PDD practice joint attention behaviors. The JASL focuses specifically on the skills of pointing, showing, sharing things, and behavioral interaction with other children with PDD. The system is set in a playroom scene and presented from a first-person perspective. Its functions include pointing and showing, moving virtual objects, 3D animation, text, spoken sounds, and feedback. The study employed a single-subject multiple-probe design across subjects and visual inspection analysis; the experimental phase took 3 months. The results reveal that the participants extended the improved joint attention skills into their daily lives after using the JASL system. The significant potential of this treatment of joint attention for each participant is discussed in detail in this paper. PMID:22776822

  17. fVisiOn: glasses-free tabletop 3D display to provide virtual 3D media naturally alongside real media

    NASA Astrophysics Data System (ADS)

    Yoshida, Shunsuke

    2012-06-01

    A novel glasses-free tabletop 3D display, named fVisiOn, floats virtual 3D objects on an empty, flat, tabletop surface and enables multiple viewers to observe raised 3D images from any angle around 360°. Our glasses-free 3D image reproduction method employs a combination of an optical device and an array of projectors, and produces continuous horizontal parallax in the direction of a circular path located above the table. The optical device shapes a hollow cone and works as an anisotropic diffuser. The circularly arranged projectors cast numerous rays into the optical device. Each ray corresponds to one that passes through a point on a virtual object's surface and is oriented toward a viewing area around the table. At any viewpoint on the ring-shaped viewing area, both eyes collect fractional images from different projectors, and all the viewers around the table can perceive the scene as 3D from their own perspectives because the images include binocular disparity. The entire mechanism is installed beneath the table, so the tabletop area remains clear and no ordinary tabletop activities are disturbed. Many people can naturally share the 3D images displayed together with real objects on the table. In our latest prototype, we employed a handmade optical device and an array of over 100 tiny projectors. This configuration reproduces static and animated 3D scenes for a 130° viewing area and allows 5-cm-tall virtual characters to play soccer and dance on the table.
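    The viewpoint-to-projector mapping implied by the circular projector arrangement can be sketched as follows; the uniform angular spacing and the inter-eye angular separation are assumptions for illustration, not fVisiOn's actual calibration:

```python
def projector_for_ray(viewer_azimuth_deg, num_projectors):
    """Index of the ring projector whose azimuth best matches the
    viewer's horizontal direction, assuming the projectors are spaced
    uniformly around a full circle."""
    step = 360.0 / num_projectors
    return round((viewer_azimuth_deg % 360.0) / step) % num_projectors

def eye_views(head_azimuth_deg, num_projectors, eye_sep_deg=3.0):
    """Each eye samples a slightly different azimuth; receiving images
    from different projectors is what yields binocular disparity in a
    horizontal-parallax-only design."""
    left = projector_for_ray(head_azimuth_deg - eye_sep_deg / 2,
                             num_projectors)
    right = projector_for_ray(head_azimuth_deg + eye_sep_deg / 2,
                              num_projectors)
    return left, right
```

    With enough projectors the angular step falls below the inter-eye separation, so the two eyes almost always land on distinct projector images.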

  18. Representing 3D virtual objects: interaction between visuo-spatial ability and type of exploration.

    PubMed

    Meijer, Frank; van den Broek, Egon L

    2010-03-17

    We investigated individual differences in interactively exploring 3D virtual objects. 36 participants explored 24 simple and 24 difficult objects (composed of three and five Biederman geons, respectively) actively, passively, or not at all. Both their 3D mental representation of the objects and their visuo-spatial ability (VSA) were assessed. Results show that, regardless of the object's complexity, people with a low VSA benefit from active exploration of objects, whereas people with a medium or high VSA do not. These findings extend and refine earlier research on interactively learning visuo-spatial information and underline the importance of taking individual differences into account. PMID:20116394

  19. Teaching Marketing through a Micro-Economy in Virtual Reality

    ERIC Educational Resources Information Center

    Drake-Bridges, Erin; Strelzoff, Andrew; Sulbaran, Tulio

    2011-01-01

    Teaching retailing principles to students is a challenge because although real-world wholesale and retail decision making very heavily depends on dynamic conditions, classroom exercises are limited to abstract discussions and role-playing. This article describes two interlocking class projects taught using the virtual reality of secondlife.com,…

  20. Improving Weight Maintenance Using Virtual Reality (Second Life)

    ERIC Educational Resources Information Center

    Sullivan, Debra K.; Goetz, Jeannine R.; Gibson, Cheryl A.; Washburn, Richard A.; Smith, Bryan K.; Lee, Jaehoon; Gerald, Stephanie; Fincham, Tennille; Donnelly, Joseph E.

    2013-01-01

    Objective: Compare weight loss and maintenance between a face-to-face (FTF) weight management clinic and a clinic delivered via virtual reality (VR). Methods: Participants were randomized to 3 months of weight loss with a weekly clinic delivered via FTF or VR and then 6 months' weight maintenance delivered with VR. Data were collected at baseline…

  1. A Constructivist Approach to Virtual Reality for Experiential Learning

    ERIC Educational Resources Information Center

    Aiello, P.; D'Elia, F.; Di Tore, S.; Sibilio, M.

    2012-01-01

    Consideration of a possible use of virtual reality technologies in school contexts requires gathering together the suggestions of many scientific domains aimed at "understanding" the features of these same tools that let them offer valid support to the teaching-learning processes in educational settings. Specifically, the present study is aimed at…

  2. A Virtual Reality Dance Training System Using Motion Capture Technology

    ERIC Educational Resources Information Center

    Chan, J. C. P.; Leung, H.; Tang, J. K. T.; Komura, T.

    2011-01-01

    In this paper, a new dance training system based on the motion capture and virtual reality (VR) technologies is proposed. Our system is inspired by the traditional way to learn new movements-imitating the teacher's movements and listening to the teacher's feedback. A prototype of our proposed system is implemented, in which a student can imitate…

  3. Virtual Reality in Psychological, Medical and Pedagogical Applications

    ERIC Educational Resources Information Center

    Eichenberg, Christiane, Ed.

    2012-01-01

    This book has an aim to present latest applications, trends and developments of virtual reality technologies in three humanities disciplines: in medicine, psychology and pedagogy. Studies show that people in both educational as well as in the medical therapeutic range expect more and more that modern media are included in the corresponding demand…

  4. Exploration through Virtual Reality: Encounters with the Target Culture

    ERIC Educational Resources Information Center

    O'Brien, Mary Grantham; Levy, Richard M.

    2008-01-01

    This paper presents the results of a study on the use of a virtual reality (VR) world in a German language classroom. After participating in a lesson on the use of commands, students experienced the language and culture through navigation in a VR world. It is argued that this new medium allows for students to be immersed in the target culture and…

  5. Language Learning in Virtual Reality Environments: Past, Present, and Future

    ERIC Educational Resources Information Center

    Lin, Tsun-Ju; Lan, Yu-Ju

    2015-01-01

    This study investigated the research trends in language learning in a virtual reality environment by conducting a content analysis of findings published in the literature from 2004 to 2013 in four top ranked computer-assisted language learning journals: "Language Learning & Technology," "CALICO Journal," "Computer…

  6. Issues Surrounding the Use of Virtual Reality in Geographic Education

    ERIC Educational Resources Information Center

    Lisichenko, Richard

    2015-01-01

    As with all classroom innovations intended to improve geographic education, the adoption of virtual reality (VR) poses issues for consideration prior to endorsing its use. Of these, effectiveness, implementation, and safe use need to be addressed. Traditionally, sense of place, geographic knowledge, and firsthand experiences provided by field…

  7. Using Virtual Reality To Bring Your Instruction to Life.

    ERIC Educational Resources Information Center

    Gaddis, Tony

    Prepared by the manager of a virtual reality (VR) laboratory at North Carolina's Haywood Community College, the three papers collected in this document are designed to help instructors incorporate VR into their classes. The first paper reviews the characteristics of VR, defining it as a computer-generated simulation of a three-dimensional…

  8. Immersive Training Systems: Virtual Reality and Education and Training.

    ERIC Educational Resources Information Center

    Psotka, Joseph

    1995-01-01

    Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR, analyzes cognitive variables connected to immersion, how it is generated in synthetic environments and its benefits. Discusses value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…

  9. Virtual Reality Hypermedia Design Frameworks for Science Instruction.

    ERIC Educational Resources Information Center

    Maule, R. William; Oh, Byron; Check, Rosa

    This paper reports on a study that conceptualizes a research framework to aid software design and development for virtual reality (VR) computer applications for instruction in the sciences. The framework provides methodologies for the processing, collection, examination, classification, and presentation of multimedia information within hyperlinked…

  10. Virtual Reality for Life Skills Education: Program Evaluation

    ERIC Educational Resources Information Center

    Vogel, Jennifer; Bowers, Clint; Meehan, Cricket; Hoeft, Raegan; Bradley, Kristy

    2004-01-01

    A program evaluation was completed for a Virtual Reality (VR) pilot project intended to aid deaf children in learning various life skills which they may be at risk of not adequately learning. Such skills include crossing the street safely, exiting a building during a fire drill, and avoiding situations in which strangers may harm them. The VR was…

  11. Virtual Reality Augmentation for Functional Assessment and Treatment of Stuttering

    ERIC Educational Resources Information Center

    Brundage, Shelley B.

    2007-01-01

    Stuttering characteristics, assessment, and treatment principles present challenges to assessment and treatment that can be addressed with virtual reality (VR) technology. This article describes how VR can be used to assist clinicians in meeting some of these challenges with adults who stutter. A review of current VR research at the Stuttering…

  12. Virtual Reality, a New Tool for a New Educational Paradigm.

    ERIC Educational Resources Information Center

    de Lurdes A S Morais Camacho, Maria

    1998-01-01

    Discusses the use of virtual reality as a new educational technology, its added value, and implementation possibilities. Includes an example in the field of archaeology which is being developed in Portugal that can be used for the reconstruction of architectonic and archaeological heritage. (LRW)

  13. Are Spatial Visualization Abilities Relevant to Virtual Reality?

    ERIC Educational Resources Information Center

    Chen, Chwen Jen

    2006-01-01

    This study aims to investigate the effects of a virtual reality (VR)-based learning environment on learners of different spatial visualization abilities. The findings of the aptitude-by-treatment interaction study have shown that learners benefit most from the Guided VR mode, irrespective of their spatial visualization abilities. This indicates that…

  14. Virtual Reality: Alive and Well in the Inner-City.

    ERIC Educational Resources Information Center

    Willman, Jay

    2001-01-01

    At R. B. Russell Vocational High School (Winnipeg, Manitoba), which serves economically disadvantaged, primarily First Nations students, a student-developed Web site uses virtual reality and digital video technologies to teach auto mechanics in ways that are relevant to students' diverse learning styles and needs. The project has increased student…

  15. Virtual Reality: An Experiential Tool for Clinical Psychology

    ERIC Educational Resources Information Center

    Riva, Giuseppe

    2009-01-01

    Several Virtual Reality (VR) applications for the understanding, assessment and treatment of mental health problems have been developed in the last 15 years. Typically, in VR the patient learns to manipulate problematic situations related to his/her problem. In fact, VR can be described as an advanced form of human-computer interface that is able…

  16. Virtual Reality: Teaching Tool of the Twenty-First Century?

    ERIC Educational Resources Information Center

    Hoffman, Helene; Vu, Dzung

    1997-01-01

    Virtual reality-based procedural and surgical simulations promise to revolutionize medical training. A wide range of simulations representing diverse content areas and varied implementation strategies are under development or in early use. The new systems will make broad-based training experiences available for students at all levels without risks…

  17. Virtual reality as a distraction technique in chronic pain patients.

    PubMed

    Wiederhold, Brenda K; Gao, Kenneth; Sulea, Camelia; Wiederhold, Mark D

    2014-06-01

    We explored the use of virtual reality distraction techniques as adjunctive therapy for chronic pain. Virtual environments were specifically created to provide pleasant and engaging experiences in which patients navigated on their own through rich and varied simulated worlds. Real-time physiological monitoring was used as a guide to determine the effectiveness and sustainability of this intervention. Human factors studies showed that virtual navigation is a safe and effective method for use with chronic pain patients. Chronic pain patients demonstrated significant relief in subjective ratings of pain that corresponded to objective measurements in peripheral, noninvasive physiological measures. PMID:24892196

  18. Virtual Reality as a Distraction Technique in Chronic Pain Patients

    PubMed Central

    Gao, Kenneth; Sulea, Camelia; Wiederhold, Mark D.

    2014-01-01

    We explored the use of virtual reality distraction techniques as adjunctive therapy for chronic pain. Virtual environments were specifically created to provide pleasant and engaging experiences in which patients navigated on their own through rich and varied simulated worlds. Real-time physiological monitoring was used as a guide to determine the effectiveness and sustainability of this intervention. Human factors studies showed that virtual navigation is a safe and effective method for use with chronic pain patients. Chronic pain patients demonstrated significant relief in subjective ratings of pain that corresponded to objective measurements in peripheral, noninvasive physiological measures. PMID:24892196

  19. Learning Science in a Virtual Reality Application: The Impacts of Animated-Virtual Actors' Visual Complexity

    ERIC Educational Resources Information Center

    Kartiko, Iwan; Kavakli, Manolya; Cheng, Ken

    2010-01-01

    As the technology in computer graphics advances, Animated-Virtual Actors (AVAs) in Virtual Reality (VR) applications become increasingly rich and complex. Cognitive Theory of Multimedia Learning (CTML) suggests that complex visual materials could hinder novice learners from attending to the lesson properly. On the other hand, previous studies have…

  20. Learning and Teaching in Virtual Worlds: Implications of Virtual Reality for Education.

    ERIC Educational Resources Information Center

    Moore, Paul

    1995-01-01

    Surveys the research into virtual reality (VR) and focuses on the implications of immersive virtual worlds for learning and teaching. Topics include how VR differs from other forms of interactive multimedia, VR and the development of educational theory and methodology, and case studies in educational VR research. (Author/LRW)

  1. The Potential of Using Virtual Reality Technology in Physical Activity Settings

    ERIC Educational Resources Information Center

    Pasco, Denis

    2013-01-01

    In recent years, virtual reality technology has been successfully used for learning purposes. This article examines current research on the role of virtual reality in physical activity settings and discusses the potential of using virtual reality technology to enhance learning in physical education. The article starts…

  2. Using Virtual Reality Environment to Improve Joint Attention Associated with Pervasive Developmental Disorder

    ERIC Educational Resources Information Center

    Cheng, Yufang; Huang, Ruowen

    2012-01-01

    The focus of this study is using a data glove to practice joint attention skills in a virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe environment for people with PDD. Especially when they make errors during practice in the virtual reality environment, there is no suffering or…

  3. Education about Hallucinations Using an Internet Virtual Reality System: A Qualitative Survey

    ERIC Educational Resources Information Center

    Yellowlees, Peter M.; Cook, James N.

    2006-01-01

    Objective: The authors evaluate an Internet virtual reality technology as an education tool about the hallucinations of psychosis. Method: This is a pilot project using Second Life, an Internet-based virtual reality system, in which a virtual reality environment was constructed to simulate the auditory and visual hallucinations of two patients…

  4. Applications of virtual reality to nuclear safeguards and non-proliferation

    SciTech Connect

    Stansfield, S.

    1996-12-31

    This paper presents several applications of virtual reality relevant to the areas of nuclear safeguards and non-proliferation. Each of these applications was developed to the prototype stage at Sandia National Laboratories' Virtual Reality and Intelligent Simulation laboratory. These applications include the use of virtual reality for facility visualization, training of inspection personnel, and security and monitoring of nuclear facilities.

  5. Versatile, Immersive, Creative and Dynamic Virtual 3-D Healthcare Learning Environments: A Review of the Literature

    PubMed Central

    2008-01-01

    The author provides a critical overview of three-dimensional (3-D) virtual worlds and “serious gaming” that are currently being developed and used in healthcare professional education and medicine. The relevance of this e-learning innovation for teaching students and professionals is debatable, and variables influencing its adoption by academics, healthcare professionals, and business executives, such as increased knowledge, self-directed learning, and peer collaboration, are examined while looking at various Web 2.0/3.0 applications. There is a need for more empirical research in order to unearth the pedagogical outcomes and advantages associated with this e-learning technology. A brief description of Rogers’ Diffusion of Innovations Theory and Siemens’ Connectivism Theory for today’s learners is presented as potential underlying pedagogical tenets to support the use of virtual 3-D learning environments in higher education and healthcare. PMID:18762473

  6. Versatile, immersive, creative and dynamic virtual 3-D healthcare learning environments: a review of the literature.

    PubMed

    Hansen, Margaret M

    2008-01-01

    The author provides a critical overview of three-dimensional (3-D) virtual worlds and "serious gaming" that are currently being developed and used in healthcare professional education and medicine. The relevance of this e-learning innovation for teaching students and professionals is debatable, and variables influencing its adoption by academics, healthcare professionals, and business executives, such as increased knowledge, self-directed learning, and peer collaboration, are examined while looking at various Web 2.0/3.0 applications. There is a need for more empirical research in order to unearth the pedagogical outcomes and advantages associated with this e-learning technology. A brief description of Rogers' Diffusion of Innovations Theory and Siemens' Connectivism Theory for today's learners is presented as potential underlying pedagogical tenets to support the use of virtual 3-D learning environments in higher education and healthcare. PMID:18762473

  7. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy.

    PubMed

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-19

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as the multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high scalability of the HPC 3D-MIP platform when a larger number of cores is available. PMID:24910506
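
    The scalability analysis in this record rests on Amdahl's law. As a rough illustration (my own sketch, not the paper's actual model), the achievable speedup for a workload with parallel fraction p on n cores can be computed as:

```python
def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Classic Amdahl's law: speedup is bounded by the serial fraction."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A 12-fold speedup on 12 cores implies an almost fully parallel workload:
# even a 1% serial fraction caps the speedup below 11x.
print(round(amdahl_speedup(0.99, 12), 2))  # prints 10.81
```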

  8. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    PubMed Central

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as the multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl’s law for symmetric multicore chips showed the potential for high scalability of the HPC 3D-MIP platform when a larger number of cores is available. PMID:24910506

  9. An improved virtual aberration model to simulate mask 3D and resist effects

    NASA Astrophysics Data System (ADS)

    Kanaya, Reiji; Fujii, Koichi; Imai, Motokatsu; Matsuyama, Tomoyuki; Tsuzuki, Takao; Lin, Qun Ying

    2015-03-01

    As shrinkage of design features progresses, the difference in best focus positions among different patterns is becoming a fatal issue, especially when many patterns co-exist in a layer. The problem arises from three major factors: aberrations of projection optics, mask 3D topography effects, and resist thickness effects. Aberrations in projection optics have already been thoroughly investigated, but mask 3D topography effects and resist thickness effects are still under study. It is well known that mask 3D topography effects can be simulated by various Electro-magnetic Field (EMF) analysis methods. However, it is almost impossible to use them for full chip modeling because all of these methods are extremely computationally intensive. Consequently, they usually apply only to a limited range of mask patterns which are about tens of square micrometers in area. Resist thickness effects on best focus positions are rarely treated as a topic of lithography investigations. Resist 3D effects are treated mostly for resist profile prediction, which also requires an intensive EMF analysis when one needs to predict it accurately. In this paper, we present a simplified Virtual Aberration (VA) model to simulate both mask 3D induced effects and resist thickness effects. A conventional simulator, when applied with this simplified method, can factor in both mask 3D topography effects and resist thickness effects. Thus it can be used to model inter-pattern Best Focus Difference (BFD) issues with the least amount of rigorous EMF analysis.

  10. An Interactive Virtual 3D Tool for Scientific Exploration of Planetary Surfaces

    NASA Astrophysics Data System (ADS)

    Traxler, Christoph; Hesina, Gerd; Gupta, Sanjeev; Paar, Gerhard

    2014-05-01

    In this paper we present an interactive 3D visualization tool for scientific analysis and planning of planetary missions. At the moment scientists have to look at individual camera images separately. There is no tool to combine them in three dimensions and look at them seamlessly as a geologist would do (by walking backwards and forwards resulting in different scales). For this reason a virtual 3D reconstruction of the terrain that can be interactively explored is necessary. Such a reconstruction has to consider multiple scales ranging from orbital image data to close-up surface image data from rover cameras. The 3D viewer allows seamless zooming between these various scales, giving scientists the possibility to relate small surface features (e.g. rock outcrops) to larger geological contexts. For a reliable geologic assessment a realistic surface rendering is important. Therefore the material properties of the rock surfaces will be considered for real-time rendering. This is achieved by an appropriate Bidirectional Reflectance Distribution Function (BRDF) estimated from the image data. The BRDF is implemented to run on the Graphical Processing Unit (GPU) to enable realistic real-time rendering, which allows a naturalistic perception for scientific analysis. Another important aspect for realism is the consideration of natural lighting conditions, which means skylight to illuminate the reconstructed scene. In our case we provide skylights from Mars and Earth, which allows switching between these two modes of illumination. This gives geologists the opportunity to perceive rock outcrops from Mars as they would appear on Earth facilitating scientific assessment. Besides viewing the virtual reconstruction on multiple scales, scientists can also perform various measurements, e.g. geo-coordinates of a selected point or distance between two surface points. Rover or other models can be placed into the scene and snapped onto certain locations of the terrain. These are
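
    The rendering step described here evaluates a BRDF per surface point. A minimal CPU-side sketch of the idea, using a simple Lambertian-plus-Phong model rather than the authors' image-estimated BRDF or GPU implementation, might look like:

```python
import numpy as np

def shade(normal, light_dir, view_dir, albedo=0.6, specular=0.2, shininess=16.0):
    """Evaluate a simple Lambertian + Phong BRDF at one surface point.

    All direction vectors are assumed to be unit length. The material
    parameters here are placeholders; a physically based renderer would
    instead use a BRDF estimated from the image data.
    """
    n = np.asarray(normal, dtype=float)
    l = np.asarray(light_dir, dtype=float)
    v = np.asarray(view_dir, dtype=float)
    diffuse = albedo * max(np.dot(n, l), 0.0)
    # Phong specular: reflect the light direction about the normal,
    # then compare the reflected ray with the view direction.
    r = 2.0 * np.dot(n, l) * n - l
    spec = specular * max(np.dot(r, v), 0.0) ** shininess
    return diffuse + spec
```

    In a real viewer this evaluation runs per pixel on the GPU, with the skylight providing the incoming light directions.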

  11. The use of virtual reality in acrophobia research and treatment.

    PubMed

    Coelho, Carlos M; Waters, Allison M; Hine, Trevor J; Wallis, Guy

    2009-06-01

    Acrophobia, or fear of heights, is a widespread and debilitating anxiety disorder affecting perhaps 1 in 20 adults. Virtual reality (VR) technology has been used in the psychological treatment of acrophobia since 1995, and has come to dominate the treatment of numerous anxiety disorders. It is now known that virtual reality exposure therapy (VRET) regimens are highly effective for acrophobia treatment. This paper reviews current theoretical understanding of acrophobia as well as the evolution of its common treatments from the traditional exposure therapies to the most recent virtually guided ones. In particular, the review focuses on recent innovations in the use of VR technology and discusses the benefits it may offer for examining the underlying causes of the disorder, allowing for the systematic assessment of interrelated factors such as the visual, vestibular and postural control systems. PMID:19282142

  12. The assessment of virtual reality for human anatomy instruction

    NASA Technical Reports Server (NTRS)

    Benn, Karen P.

    1994-01-01

    This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the two-dimensional depictions found in textbooks and on the computer screen. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three-dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.

  13. Simulation of color deficiency in virtual reality.

    PubMed

    Jin, Bei; Ai, Zhuming; Rasmussen, Mary

    2005-01-01

    Color deficiency protanopia is simulated in a virtual home environment. A color database is created to set the corresponding relation between each color for normal vision and for protanopia. Based on this database, a second texture system is set up for the home model. The proper texture system is used according to the user's choice on the interactive menu. PMID:15718732
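
    The color database described in this record is essentially a lookup table from each normal-vision color to its protanopia-simulated counterpart, with a second texture system selected from the menu. A toy sketch of that structure (the mapping values below are illustrative placeholders, not a validated protanopia model):

```python
# Toy color lookup table: each normal-vision RGB color maps to a
# precomputed protanopia-simulated color. The values are placeholders
# for illustration only, not output of a validated vision model.
PROTANOPIA_LUT = {
    (255, 0, 0): (120, 110, 0),       # red loses its long-wavelength component
    (0, 255, 0): (200, 180, 0),
    (0, 0, 255): (0, 80, 255),
    (128, 128, 128): (128, 128, 128), # neutral grays map to themselves
}

def simulate_protanopia(color, lut=PROTANOPIA_LUT):
    """Return the protanopia texture color for a normal-vision color,
    falling back to the original color when it is not in the database."""
    return lut.get(tuple(color), tuple(color))

def pick_texture(user_choice, normal_tex, protan_tex):
    """Swap texture systems according to the user's menu choice."""
    return protan_tex if user_choice == "protanopia" else normal_tex
```

    Precomputing the mapping once per color keeps the per-frame cost to a texture swap, which is what makes the approach viable in an interactive virtual environment.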

  14. VIRTUAL FENCING-A CONCEPT INTO REALITY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Virtual fencing is a method of controlling animals without ground-based, natural, or man-made structures. Control occurs by altering an animal's behavior through one or more sensory cues administered to the animal after it has attempted to penetrate an electronically generated 3-dimensional boundary....

  15. Decentralized commanding and supervision: the distributed projective virtual reality approach

    NASA Astrophysics Data System (ADS)

    Rossmann, Juergen

    2000-10-01

    As part of the cooperation between the University of Southern California (USC) and the Institute of Robotics Research (IRF) of the University of Dortmund, experiments regarding the control of robots over long distances by means of virtual reality based man-machine interfaces have been successfully carried out. In this paper, the newly developed virtual reality system that is being used for the control of a multi-robot system for space applications as well as for the control and supervision of industrial robotics and automation applications is presented. The general aim of the development was to provide the framework for Projective Virtual Reality, which allows users to project their actions in the virtual world into the real world, primarily by means of robots but also by other means of automation. The framework is based on a new approach which builds on the task deduction capabilities of a newly developed virtual reality system and a task planning component. The advantage of this new approach is that robots which work at great distances from the control station can be controlled as easily and intuitively as robots that work right next to the control station. Robot control technology now provides the user in the virtual world with a prolonged arm into the physical environment, thus paving the way for a new quality of user-friendly man-machine interfaces for automation applications. Lately, this work has been enhanced by a new structure that allows the virtual reality application to be distributed over multiple computers. With this new step, it is now possible for multiple users to work together in the same virtual room, although they may physically be thousands of miles apart. They only need an Internet or ISDN connection to share this new experience.
    Last but not least, the distribution technology has been further developed not just to let users cooperate but to run the virtual world on many synchronized PCs so that a panorama projection or even a cave can

  16. The cognitive apprenticeship theory for the teaching of mathematics in an online 3D virtual environment

    NASA Astrophysics Data System (ADS)

    Bouta, Hara; Paraskeva, Fotini

    2013-03-01

    Research spanning two decades shows that there is a continuing development of 3D virtual worlds and investment in such environments for educational purposes. Research stresses the need for these environments to be well-designed and for suitable pedagogies to be implemented in the teaching practice in order for these worlds to be fully effective. To this end, we propose a pedagogical framework based on the cognitive apprenticeship for deriving principles and guidelines to inform the design, development and use of a 3D virtual environment. This study examines how the use of a 3D virtual world facilitates the teaching of mathematics in primary education by combining design principles and guidelines based on the Cognitive Apprenticeship Theory and the teaching methods that this theory introduces. We focus specifically on 5th and 6th grade students' engagement (behavioral, affective and cognitive) while learning fractional concepts over a period of two class sessions. Quantitative and qualitative analyses indicate considerable improvement in the engagement of the students who participated in the experiment. This paper presents the findings regarding students' cognitive engagement in the process of comprehending basic fractional concepts - notoriously hard for students to master. The findings are encouraging and suggestions are made for further research.

  17. Multiviewer 3D monitor

    NASA Astrophysics Data System (ADS)

    Kostrzewski, Andrew A.; Aye, Tin M.; Kim, Dai Hyun; Esterkin, Vladimir; Savant, Gajendra D.

    1998-09-01

    Physical Optics Corporation has developed an advanced 3-D virtual reality system for use with simulation tools for training technical and military personnel. This system avoids such drawbacks of other virtual reality (VR) systems as eye fatigue, headaches, and alignment for each viewer, all of which are due to the need to wear special VR goggles. The new system is based on direct viewing of an interactive environment. This innovative holographic multiplexed screen technology makes it unnecessary for the viewer to wear special goggles.

  18. 3D Audio System

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Ames Research Center research into virtual reality led to the development of the Convolvotron, a high speed digital audio processing system that delivers three-dimensional sound over headphones. It consists of a two-card set designed for use with a personal computer. The Convolvotron's primary application is presentation of 3D audio signals over headphones. Four independent sound sources are filtered with large time-varying filters that compensate for motion. The perceived location of the sound remains constant. Possible applications are in air traffic control towers or airplane cockpits, hearing and perception research and virtual reality development.
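
    Spatialization of the kind the Convolvotron performs boils down to convolving each mono source with a left/right head-related impulse response (HRIR) pair and mixing the results. A simplified, static sketch of that signal path (my own illustration; the real system uses large time-varying filters in dedicated hardware):

```python
import numpy as np

def spatialize(sources, hrirs_left, hrirs_right):
    """Mix mono sources into a stereo signal by convolving each source
    with its left/right head-related impulse response (HRIR).

    sources: list of 1-D signals; hrirs_left/right: matching lists of
    HRIRs (each left/right pair assumed equal length). Filters here are
    static; a real system updates them as the listener's head moves so
    the perceived source location stays constant.
    """
    n = max(len(s) + len(h) - 1 for s, h in zip(sources, hrirs_left))
    left = np.zeros(n)
    right = np.zeros(n)
    for s, hl, hr in zip(sources, hrirs_left, hrirs_right):
        yl = np.convolve(s, hl)  # full convolution, length len(s)+len(hl)-1
        yr = np.convolve(s, hr)
        left[:len(yl)] += yl
        right[:len(yr)] += yr
    return np.stack([left, right])
```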

  19. Computer Based Training: Field Deployable Trainer and Shared Virtual Reality

    NASA Technical Reports Server (NTRS)

    Mullen, Terence J.

    1997-01-01

    Astronaut training has traditionally been conducted at specific sites with specialized facilities. Because of its size and nature the training equipment is generally not portable. Efforts are now under way to develop training tools that can be taken to remote locations, including into orbit. Two of these efforts are the Field Deployable Trainer and Shared Virtual Reality projects.

    Field Deployable Trainer: NASA has used the recent shuttle mission by astronaut Shannon Lucid to the Russian space station, Mir, as an opportunity to develop and test a prototype of an on-orbit computer training system. A laptop computer with a customized user interface, a set of specially prepared CD's, and video tapes were taken to the Mir by Ms. Lucid. Based upon the feedback following the launch of the Lucid flight, our team prepared materials for the next Mir visitor. Astronaut John Blaha will fly on NASA/MIR Long Duration Mission 3, set to launch in mid September. He will take with him a customized hard disk drive and a package of compact disks containing training videos, references and maps. The FDT team continues to explore and develop new and innovative ways to conduct offsite astronaut training using personal computers.

    Shared Virtual Reality Training: NASA's Space Flight Training Division has been investigating the use of virtual reality environments for astronaut training. Recent efforts have focused on activities requiring interaction by two or more people, called shared VR. Dr. Bowen Loftin, from the University of Houston, directs a virtual reality laboratory that conducts much of the NASA sponsored research. I worked on a project involving the development of a virtual environment that can be used to train astronauts and others to operate a science unit called a Biological Technology Facility (BTF). Facilities like this will be used to house and control microgravity experiments on the space station.
    It is hoped that astronauts and instructors will ultimately be able to share

  20. Architecture of web services in the enhancement of real-time 3D video virtualization in cloud environment

    NASA Astrophysics Data System (ADS)

    Bada, Adedayo; Wang, Qi; Alcaraz-Calero, Jose M.; Grecos, Christos

    2016-04-01

    This paper proposes a new approach to improving 3D video rendering and streaming by jointly exploring and optimizing both cloud-based virtualization and web-based delivery. The proposed web service architecture first establishes a software virtualization layer based on QEMU (Quick Emulator), open-source virtualization software that can virtualize most system components, although its support for 3D rendering is still in its infancy. The architecture then exploits the cloud environment to boost the speed of rendering at the QEMU software virtualization layer. The capabilities and inherent limitations of Virgil 3D, one of the most advanced 3D virtual Graphics Processing Units (GPUs) available, are analyzed through benchmarking experiments and integrated into the architecture to further speed up rendering. Experimental results are reported and analyzed to demonstrate the benefits of the proposed approach.

  1. 3D Virtual Worlds as Art Media and Exhibition Arenas: Students' Responses and Challenges in Contemporary Art Education

    ERIC Educational Resources Information Center

    Lu, Lilly

    2013-01-01

    3D virtual worlds (3D VWs) are considered one of the emerging learning spaces of the 21st century; however, few empirical studies have investigated educational applications and student learning aspects in art education. This study focused on students' responses to and challenges with 3D VWs in both aspects. The findings show that most…

  2. Elderly Healthcare Monitoring Using an Avatar-Based 3D Virtual Environment

    PubMed Central

    Pouke, Matti; Häkkilä, Jonna

    2013-01-01

    Homecare systems for elderly people are becoming increasingly important for both economic reasons and patients’ preferences. Sensor-based surveillance technologies are an expected future trend, but research so far has devoted little attention to the User Interface (UI) design of such systems and to a user-centric design approach. In this paper, we explore the possibilities of an avatar-based 3D visualization system that exploits wearable sensors and human activity simulations. We present a technical prototype and the evaluation of alternative concept designs for UIs based on a 3D virtual world. The evaluation was conducted with homecare providers through focus groups and an online survey. Our results show, firstly, that systems taking advantage of 3D virtual world visualization techniques have potential, especially due to their privacy-preserving and simplified information presentation style, and, secondly, that simple representations and glanceability should be emphasized in the design. The identified key use cases highlight that avatar-based 3D presentations can be helpful if they provide an overview as well as details on demand. PMID:24351747

  3. Virtual 3D bladder reconstruction for augmented medical records from white light cystoscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Lurie, Kristen L.; Zlatev, Dimitar V.; Angst, Roland; Liao, Joseph C.; Ellerbee, Audrey K.

    2016-02-01

    Bladder cancer has a high recurrence rate that necessitates lifelong surveillance to detect mucosal lesions. Examination with white light cystoscopy (WLC), the standard of care, is inherently subjective, and data storage is limited to clinical notes, diagrams, and still images. A visual history of the bladder wall can enhance clinical and surgical management. To address this clinical need, we developed a tool to transform in vivo WLC videos into virtual 3-dimensional (3D) bladder models using advanced computer vision techniques. WLC videos from rigid cystoscopies (1280 x 720 pixels) were recorded at 30 Hz, followed by immediate camera calibration to control for image distortions. Video data were fed into an automated structure-from-motion algorithm that generated a 3D point cloud, followed by a 3D mesh to approximate the bladder surface. The highest-quality cystoscopic images were projected onto the approximated bladder surface to generate a virtual 3D bladder reconstruction. In intraoperative WLC videos from 36 patients undergoing transurethral resection of suspected bladder tumors, optimal reconstruction was achieved from frames depicting well-focused vasculature, when the bladder was maintained at constant volume with minimal debris, and when regions of the bladder wall were imaged multiple times. A significant innovation of this work is the ability to perform the reconstruction using video from a clinical procedure collected with standard equipment, thereby facilitating rapid clinical translation, application to other forms of endoscopy, and new opportunities for longitudinal studies of cancer recurrence.
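
    Projecting the best frames onto the reconstructed mesh relies on a calibrated pinhole camera model. A minimal numpy sketch of that projection step follows; the intrinsic matrix and pose below are hypothetical illustrations, not values from the study:

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 world points into pixel coordinates with a pinhole camera.

    K: 3x3 intrinsic matrix, R: 3x3 rotation, t: 3-vector translation.
    Returns Nx2 pixel coordinates.
    """
    cam = points_3d @ R.T + t          # world -> camera frame
    proj = cam @ K.T                   # apply intrinsics
    return proj[:, :2] / proj[:, 2:3]  # perspective divide

# Hypothetical intrinsics for a 1280x720 cystoscope after calibration.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                  # camera aligned with the world axes
t = np.array([0.0, 0.0, 0.0])

pts = np.array([[0.0, 0.0, 2.0],   # on the optical axis
                [0.1, 0.0, 2.0]])  # slightly to the right
px = project_points(pts, K, R, t)  # pixel positions of both points
```

    In a real structure-from-motion pipeline the per-frame `R` and `t` come from the reconstruction itself, and lens distortion is removed before this linear projection applies.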

  4. Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation.

    PubMed

    Wang, Junchen; Suenaga, Hideyuki; Liao, Hongen; Hoshi, Kazuto; Yang, Liangjing; Kobayashi, Etsuko; Sakuma, Ichiro

    2015-03-01

    Autostereoscopic 3D image overlay for augmented reality (AR) based surgical navigation has been studied and reported many times. For the purpose of surgical overlay, the 3D image is expected to have the same geometric shape as the original organ and to be transformable to a specified location for image overlay. However, how to generate a 3D image with high geometric fidelity, and how to quantitatively evaluate its geometric accuracy, have not been addressed. This paper proposes a graphics processing unit (GPU) based computer-generated integral imaging pipeline for real-time autostereoscopic 3D display, and an automatic closed-loop 3D image calibration paradigm for displaying undistorted 3D images. Based on the proposed methods, a novel AR device for 3D image surgical overlay is presented, consisting mainly of a 3D display, an AR window, a stereo camera for 3D measurement, and a workstation for information processing. Evaluation of the 3D image rendering performance with 2560×1600 elemental image resolution shows rendering speeds of 50-60 frames per second (fps) for surface models and 5-8 fps for large medical volumes. Evaluation of the undistorted 3D image after calibration yields sub-millimeter geometric accuracy. A phantom experiment simulating oral and maxillofacial surgery was also performed to evaluate the proposed AR overlay device in terms of image registration accuracy, 3D image overlay accuracy, and the visual effects of the overlay. The experimental results show satisfactory image registration and image overlay accuracy, and confirm the system's usability. PMID:25465067
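
    The abstract does not give the mapping equations, but the similar-triangles geometry underlying computer-generated integral imaging can be sketched briefly. Each lenslet images a 3D point onto its own elemental image; the lenslet positions, gap, and units below are hypothetical:

```python
import numpy as np

def elemental_offsets(point, lens_centers, gap):
    """For each lenslet center (Nx2 array, lying in the plane z=0), compute
    where a 3D point (x, y, z with z > 0 in front of the array) images on the
    elemental-image plane located a distance `gap` behind the array.

    The ray from the point through each lenslet center is extended to the
    image plane (similar triangles)."""
    xy, z = np.asarray(point[:2], dtype=float), float(point[2])
    return lens_centers - gap * (xy - lens_centers) / z

lens_centers = np.array([[0.0, 0.0],   # two lenslets, positions in mm
                         [1.0, 0.0]])
point = np.array([0.0, 0.0, 10.0])     # a point 10 mm in front of the array
off = elemental_offsets(point, lens_centers, gap=2.0)
```

    The off-axis lenslet records the point with a small lateral shift relative to its own center; rendering one such view per lenslet is what the GPU pipeline in the paper parallelizes.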

  5. Design Virtual Reality Scene Roam for Tour Animations Based on VRML and Java

    NASA Astrophysics Data System (ADS)

    Cao, Zaihui; Hu, Zhongyan

    Virtual reality has been involved in a wide range of academic and commercial applications. It can give users a natural feeling of the environment by creating realistic virtual worlds. Implementing a virtual tour through a model of a tourist area on the web has become fashionable. In this paper, we present a web-based application that allows a user to walk through, see, and interact with a fully three-dimensional model of the tourist area. Issues regarding navigation and disorientation are addressed, and we suggest a combination of a metro map and an intuitive navigation system. Finally, we present a prototype which implements our ideas. The application of VR techniques integrates the visualization and animation of three-dimensional modelling into landscape analysis. The use of the VRML format makes it possible to obtain views of the 3D model and to explore it in real time. This is an important goal for the spatial information sciences.

  6. Envisioning the future of home care: applications of immersive virtual reality.

    PubMed

    Brennan, Patricia Flatley; Arnott Smith, Catherine; Ponto, Kevin; Radwin, Robert; Kreutz, Kendra

    2013-01-01

    Accelerating the design of technologies to support health in the home requires (1) better understanding of how the household context shapes consumer health behaviors and (2) the opportunity to afford engineers, designers, and health professionals the chance to systematically study the home environment. We developed the Living Environments Laboratory (LEL) with a fully immersive, six-sided virtual reality CAVE to enable recreation of a broad range of household environments. We have successfully developed a virtual apartment, including a kitchen, living space, and bathroom. Over 2000 people have visited the LEL CAVE. Participants use an electronic wand to activate common household affordances such as opening a refrigerator door or lifting a cup. Challenges currently being explored include creating natural gestures for interacting with virtual objects, developing robust, simple procedures to capture actual living environments and render them in a 3D visualization, and devising systematic, stable terminologies to characterize home environments. PMID:23920626

  7. Fast extraction of minimal paths in 3D images and applications to virtual endoscopy.

    PubMed

    Deschamps, T; Cohen, L D

    2001-12-01

    The aim of this article is to build trajectories for virtual endoscopy inside 3D medical images in as automatic a way as possible. Usually the construction of this trajectory is left to the clinician, who must define some points on the path manually using three orthogonal views. But for a complex structure such as the colon, those views give little information on the shape of the object of interest. Path construction in 3D images becomes a very tedious task, and precise a priori knowledge of the structure is needed to determine a suitable trajectory. We propose a more automatic path-tracking method to overcome those drawbacks: we are able to build a path given only one or two end points and the 3D image as inputs. This work is based on previous work by Cohen and Kimmel [Int. J. Comp. Vis. 24 (1) (1997) 57] for extracting paths in 2D images using the Fast Marching algorithm. Our original contribution is twofold. On the one hand, we present a general technical contribution which extends minimal paths to 3D images and gives new improvements of the approach that are relevant in 2D as well as in 3D for extracting linear structures in images. It includes techniques to make the path extraction scheme faster and easier by reducing user interaction. We also develop a new method to extract a centered path in tubular structures. Synthetic and real medical images are used to illustrate each contribution. On the other hand, we show that our method can be efficiently applied to the problem of finding a centered path in tubular anatomical structures with minimum interactivity, and that this path can be used for virtual endoscopy. Results are shown in various anatomical regions (colon, brain vessels, arteries) with different 3D imaging protocols (CT, MR). PMID:11731307
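
    The continuous Fast Marching scheme is beyond a short sketch, but its discrete analogue, Dijkstra's algorithm on a weighted grid, illustrates how a minimal-cost path between two end points is extracted. This is a stand-in for illustration, not the authors' method; the cost grid is hypothetical:

```python
import heapq

def minimal_path(cost, start, end):
    """Extract a minimal-cost path on a 2D grid with Dijkstra's algorithm,
    a discrete stand-in for continuous Fast Marching front propagation."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == end:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Backtrack from the end point (the gradient-descent step in the paper).
    path = [end]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# Low cost along the top row and right column (a "tube"), high cost elsewhere.
grid = [[1, 1, 1],
        [9, 9, 1],
        [9, 9, 1]]
route = minimal_path(grid, (0, 0), (2, 2))  # hugs the cheap cells
```

    In the medical setting the per-voxel cost is derived from image intensities so that the tube interior is cheap, which is why the extracted path stays inside the colon or vessel.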

  8. Virtual Boutique: a 3D modeling and content-based management approach to e-commerce

    NASA Astrophysics Data System (ADS)

    Paquet, Eric; El-Hakim, Sabry F.

    2000-12-01

    The Virtual Boutique consists of three modules: the decor, the market, and the search engine. The decor is the physical space occupied by the Virtual Boutique. It can reproduce any existing boutique. For this purpose, photogrammetry is used: a set of pictures of a real boutique or space is taken, and a virtual 3D representation of this space is calculated from them. Calculations are performed with software developed at NRC. This representation consists of meshes and texture maps. The camera used in the acquisition process determines the resolution of the texture maps. Decorative elements are added, such as paintings, computer-generated objects, and scanned objects. The objects are scanned with a laser scanner developed at NRC that allows simultaneous acquisition of range and color information based on white laser beam triangulation. The second module, the market, is made up of all the merchandise and the manipulators, which are used to manipulate and compare the objects. The third module, the search engine, can search the inventory based on an object shown by the customer in order to retrieve similar objects based on shape and color. The items of interest are displayed in the boutique by reconfiguring the market space, which means that the boutique can be continuously customized according to the customer's needs. The Virtual Boutique is entirely written in Java 3D, can run in mono and stereo mode, and has been optimized to allow high-quality rendering.
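
    The NRC search engine itself is not described in detail, but the color half of such content-based retrieval is commonly done with quantized color histograms. A hypothetical sketch using histogram intersection (bin count and pixel data are invented for illustration):

```python
import numpy as np

def color_histogram(pixels, bins=4):
    """Quantize RGB pixels (Nx3 array, values 0-255) into a normalized
    joint color histogram with bins**3 cells."""
    q = np.clip(pixels // (256 // bins), 0, bins - 1)
    idx = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    hist = np.bincount(idx, minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def similarity(h1, h2):
    """Histogram intersection: 1.0 for identical color distributions."""
    return float(np.minimum(h1, h2).sum())

# Invented inventory: one red item and one blue item, queried with red.
red  = np.tile([[250, 10, 10]], (100, 1))
pink = np.tile([[250, 10, 10]], (80, 1))   # same hue, fewer pixels
blue = np.tile([[10, 10, 250]], (100, 1))
query = color_histogram(red)
scores = [similarity(query, color_histogram(c)) for c in (pink, blue)]
```

    Ranking the inventory by such a score retrieves the red-toned item first; a real system like the one described would combine this with a shape descriptor.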

  9. Virtual Spring-Based 3D Multi-Agent Group Coordination

    NASA Astrophysics Data System (ADS)

    Daneshvar, Roozbeh; Shih, Liwen

    As future personal vehicles start enjoying the ability to fly, ensuring safe transportation coordination will be a tremendous task, far beyond the current challenge of radar screen monitoring for already saturated air traffic control. Our focus is on distributed safe-distance coordination among a group of autonomous flying vehicle agents, where each follows its own current straight-line direction in 3D space with variable speed. A virtual spring-based model is proposed for the group coordination. Within a specified neighborhood radius, each vehicle forms a virtual connection with each neighboring vehicle by a virtual spring. As a vehicle changes its position, speed, and altitude, the resultant forces on its virtual springs are driven toward zero as the system moves to its mechanical equilibrium point. The agents then add the simple total virtual spring constraints to their movements to determine their next positions individually. Together, the multi-agent vehicles reach a group behavior in which each keeps a minimal safe distance from the others; a new safe behavior thus arises at the group level. With the proposed virtual spring coordination model, the vehicles need no direct communication with each other, require only minimal local processing resources, and the control is completely distributed. New behaviors can now be formulated and studied based on the proposed model, e.g., how a fast-moving vehicle can find its way through the crowd by avoiding the other vehicles effortlessly.
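
    The abstract gives no equations, but a minimal Hooke's-law reading of the model can be sketched as follows; the spring constant, rest length, and neighborhood radius are hypothetical parameters, not values from the paper:

```python
import numpy as np

def spring_forces(positions, neighbor_radius, rest_length, k=1.0):
    """Resultant virtual-spring force on each agent (positions: Nx3 array).

    Each pair of agents within neighbor_radius is joined by a virtual spring
    of natural length rest_length; Hooke's law gives the pairwise force."""
    forces = np.zeros_like(positions)
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = positions[j] - positions[i]
            dist = np.linalg.norm(d)
            if 0 < dist <= neighbor_radius:
                # Stretched spring (dist > rest) attracts; compressed repels.
                forces[i] += k * (dist - rest_length) * d / dist
    return forces

# Two vehicles 2 units apart with a 5-unit rest length: the compressed
# spring pushes them apart, restoring the safe distance.
pos = np.array([[0.0, 0.0, 0.0],
                [2.0, 0.0, 0.0]])
f = spring_forces(pos, neighbor_radius=10.0, rest_length=5.0)
```

    Each agent would integrate its own force locally at every time step, which is what makes the scheme communication-free and fully distributed.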

  10. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation, and gives students an environment in which to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of different users.

  11. Designing 3 Dimensional Virtual Reality Using Panoramic Image

    NASA Astrophysics Data System (ADS)

    Wan Abd Arif, Wan Norazlinawati; Wan Ahmad, Wan Fatimah; Nordin, Shahrina Md.; Abdullah, Azrai; Sivapalan, Subarna

    Demand to improve the quality of presentation in the knowledge-sharing field is high, as it must keep pace with rapidly growing technology. The need for technology-based learning and training led to the idea of developing an Oil and Gas Plant Virtual Environment (OGPVE) for the benefit of our future. A panoramic virtual reality learning environment is essential to help educators overcome the limitations of traditional technical writing lessons. Virtual reality helps users understand better by providing simulations of real-world and hard-to-reach environments with a high degree of realism and interactivity. Thus, to create courseware which achieves this objective, accurate images of the intended scenarios must be acquired. The panorama shows the OGPVE and helps users generate ideas about what they have learnt. This paper discusses part of the development of this panoramic virtual reality. The important phases in developing a successful panoramic image are image acquisition and image stitching, or mosaicing. The combination of wide field-of-view (FOV) and close-up images used in this panoramic development is also discussed.
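
    The stitching phase can be illustrated by its simplest ingredient: linear cross-fade ("feathering") blending over the overlap between two adjacent shots. Real mosaicing also aligns the images geometrically first; this sketch assumes pre-aligned grayscale inputs with invented sizes:

```python
import numpy as np

def stitch_pair(left, right, overlap):
    """Stitch two aligned grayscale images (H x W arrays) that share
    `overlap` columns, linearly cross-fading across the seam."""
    h, w = left.shape
    alpha = np.linspace(1.0, 0.0, overlap)  # weight for the left image
    blend = left[:, w - overlap:] * alpha + right[:, :overlap] * (1 - alpha)
    return np.hstack([left[:, :w - overlap], blend, right[:, overlap:]])

# Two tiny synthetic frames of uniform brightness, overlapping by 2 columns.
left  = np.full((2, 4), 100.0)
right = np.full((2, 4), 200.0)
pano = stitch_pair(left, right, overlap=2)
```

    Chaining this operation across a ring of overlapping photographs yields the 360-degree panorama used as the virtual-environment backdrop.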

  12. [Virtual reality in the treatment of mental disorders].

    PubMed

    Malbos, Eric; Boyer, Laurent; Lançon, Christophe

    2013-11-01

    Virtual reality is a medium allowing users to interact in real time with computerized virtual environments. The application of this immersive technology to cognitive behavioral therapies is increasingly exploited for the treatment of mental disorders. The present study is a review of the literature spanning 1992 to 2012. It depicts the utility of this new tool for assessment and therapy through the various clinical studies carried out on subjects exhibiting diverse mental disorders. Most of the studies attest to the significant efficacy of Virtual Reality Exposure Therapy (VRET) for the treatment of distinct mental disorders. Comparative studies of VRET with the treatment of reference (the in vivo exposure component of cognitive behavioral therapy) document equal efficacy of the two methods and, in some cases, a superior therapeutic effect in favor of VRET. Even though larger-scale clinical experiments, extended follow-up, and studies of the factors influencing presence are needed, virtual reality exposure represents an efficacious, confidential, affordable, flexible, and interactive therapeutic method whose application will progressively widen in the field of mental health. PMID:23702202

  13. Virtual reality simulation in neurosurgery: technologies and evolution.

    PubMed

    Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H

    2013-01-01

    Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery. PMID:23254804

  14. Simulation Of Assembly Processes With Techniques Of Virtual Reality

    NASA Astrophysics Data System (ADS)

    García García, Manuel; Arenas Reina, José Manuel; Lite, Alberto Sánchez; Sebastián Pérez, Miguel Ángel

    2009-11-01

    The use of virtual reality techniques in industrial processes provides a realistic approach to the product life cycle. For the manual assembly of components, the use of virtual environments facilitates simultaneous engineering in which variables such as human factors and productivity play a real part. Moreover, the current phase of industrial competition demands rapid adjustment to client needs and to the market situation. In this work, the assembly of the front components of a vehicle is analyzed using virtual reality tools, following a product-process design methodology which covers every service-life stage. The study is based on workstation design, taking productive and human factors into account from an ergonomic point of view by implementing a postural study of every assembly operation, leaving the remaining stages for later study. Design is optimized by applying this methodology together with virtual reality tools. A 15% reduction in assembly time and a 90% reduction in musculoskeletal disorders across assembly operations are also achieved.

  15. Astronauts Prepare for Mission With Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Astronauts John M. Grunsfeld (left), STS-109 payload commander, and Nancy J. Currie, mission specialist, use the virtual reality lab at Johnson Space Center to train for upcoming duties aboard the Space Shuttle Columbia. This type of computer interface, paired with virtual reality training hardware and software, helps prepare the entire team to perform its duties for the fourth Hubble Space Telescope servicing mission. The most familiar form of virtual reality technology is some form of headpiece, which fits over your eyes and displays a three-dimensional computerized image of another place. Turn your head left and right, and you see what would be to your sides; turn around, and you see what might be sneaking up on you. An important part of the technology is some type of data glove that you use to propel yourself through the virtual world. Currently, the medical community is using the new technologies in four major ways: to see parts of the body more accurately, for study, to make better diagnoses of disease, and to plan surgery in more detail; to obtain a more accurate picture of a procedure during surgery; to perform more types of surgery with the most noninvasive, accurate methods possible; and to model interactions among molecules at a molecular level.

  16. Blood Pool Segmentation Results in Superior Virtual Cardiac Models than Myocardial Segmentation for 3D Printing.

    PubMed

    Farooqi, Kanwal M; Lengua, Carlos Gonzalez; Weinberg, Alan D; Nielsen, James C; Sanz, Javier

    2016-08-01

    The method of cardiac magnetic resonance (CMR) three-dimensional (3D) image acquisition and post-processing which should be used to create optimal virtual models for 3D printing has not been studied systematically. Patients (n = 19) who had undergone CMR including both 3D balanced steady-state free precession (bSSFP) imaging and contrast-enhanced magnetic resonance angiography (MRA) were retrospectively identified. Post-processing for the creation of virtual 3D models involved using both myocardial (MS) and blood pool (BP) segmentation, resulting in four groups: Group 1-bSSFP/MS, Group 2-bSSFP/BP, Group 3-MRA/MS and Group 4-MRA/BP. The models created were assessed by two raters for overall quality (1-poor; 2-good; 3-excellent) and ability to identify predefined vessels (1-5: superior vena cava, inferior vena cava, main pulmonary artery, ascending aorta and at least one pulmonary vein). A total of 76 virtual models were created from 19 patient CMR datasets. The mean overall quality scores for Raters 1/2 were 1.63 ± 0.50/1.26 ± 0.45 for Group 1, 2.12 ± 0.50/2.26 ± 0.73 for Group 2, 1.74 ± 0.56/1.53 ± 0.61 for Group 3 and 2.26 ± 0.65/2.68 ± 0.48 for Group 4. The numbers of identified vessels for Raters 1/2 were 4.11 ± 1.32/4.05 ± 1.31 for Group 1, 4.90 ± 0.46/4.95 ± 0.23 for Group 2, 4.32 ± 1.00/4.47 ± 0.84 for Group 3 and 4.74 ± 0.56/4.63 ± 0.49 for Group 4. Models created using BP segmentation (Groups 2 and 4) received significantly higher ratings than those created using MS for both overall quality and number of vessels visualized (p < 0.05), regardless of the acquisition technique. There were no significant differences between Groups 1 and 3. The ratings for Raters 1 and 2 had good correlation for overall quality (ICC = 0.63) and excellent correlation for the total number of vessels visualized (ICC = 0.77). The intra-rater reliability was good for Rater A (ICC = 0.65). Three models were successfully printed

  17. Web GIS in practice VII: stereoscopic 3-D solutions for online maps and virtual globes

    USGS Publications Warehouse

    Boulos, Maged N.K.; Robinson, Larry R.

    2009-01-01

    Because our pupils are about 6.5 cm apart, each eye views a scene from a different angle and sends a unique image to the visual cortex, which then merges the images from both eyes into a single picture. The slight difference between the right and left images allows the brain to properly perceive the 'third dimension' or depth in a scene (stereopsis). However, when a person views a conventional 2-D (two-dimensional) image representation of a 3-D (three-dimensional) scene on a conventional computer screen, each eye receives essentially the same information. Depth in such cases can only be approximately inferred from visual clues in the image, such as perspective, as only one image is offered to both eyes. The goal of stereoscopic 3-D displays is to project a slightly different image into each eye to achieve a much truer and realistic perception of depth, of different scene planes, and of object relief. This paper presents a brief review of a number of stereoscopic 3-D hardware and software solutions for creating and displaying online maps and virtual globes (such as Google Earth) in "true 3D", with costs ranging from almost free to multi-thousand pounds sterling. A practical account is also given of the experience of the USGS BRD UMESC (United States Geological Survey's Biological Resources Division, Upper Midwest Environmental Sciences Center) in setting up a low-cost, full-colour stereoscopic 3-D system.
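
    The depth geometry described above reduces to similar triangles: for a point meant to appear behind the screen plane, the two eye images must be separated on screen by a parallax proportional to the interocular distance. A small sketch (the viewing distances are illustrative):

```python
def screen_parallax(ipd_cm, viewer_to_screen_cm, screen_to_object_cm):
    """On-screen horizontal separation (cm) between the left- and right-eye
    images of a point that should appear screen_to_object_cm behind the
    screen plane, by similar triangles. Positive values are uncrossed
    parallax (object perceived behind the screen)."""
    eye_to_object = viewer_to_screen_cm + screen_to_object_cm
    return ipd_cm * screen_to_object_cm / eye_to_object

# A point 60 cm behind a screen viewed from 60 cm, with the ~6.5 cm
# interocular distance mentioned in the abstract:
p = screen_parallax(6.5, 60.0, 60.0)
```

    As the virtual point recedes to infinity the parallax approaches the full interocular distance, which is why stereoscopic content is authored to keep parallax well below that bound.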

  18. 3D resolution enhancement of deep-tissue imaging based on virtual spatial overlap modulation microscopy.

    PubMed

    Su, I-Cheng; Hsu, Kuo-Jen; Shen, Po-Ting; Lin, Yen-Yin; Chu, Shi-Wei

    2016-07-25

    Over the last few decades, several resolution enhancement methods beyond the diffraction limit have been developed for optical microscopy. Nevertheless, these hardware-based techniques typically require strong illumination and fail to improve resolution in deep tissue. Here we develop a high-speed computational approach, three-dimensional virtual spatial overlap modulation microscopy (3D-vSPOM), which immediately solves the strong-illumination issue. By amplifying only the spatial-frequency component corresponding to the un-scattered point spread function at the focus, plus 3D nonlinear value selection, 3D-vSPOM shows significant resolution enhancement in deep tissue. Since no iteration is required, 3D-vSPOM is much faster than iterative deconvolution. Compared to non-iterative deconvolution, 3D-vSPOM does not need a priori information about the point spread function deep in tissue, and provides much better resolution enhancement along with greatly improved noise immunity. This method is ready to be combined with two-photon microscopy or other laser scanning microscopies to enhance deep-tissue resolution. PMID:27464077
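
    The actual 3D-vSPOM algorithm is more elaborate, but its core idea, selectively amplifying one spatial-frequency band, can be shown on a toy 1D signal; the band limits, gain, and test frequencies are invented for illustration:

```python
import numpy as np

def amplify_band(signal, low, high, gain):
    """Amplify one spatial-frequency band of a real 1D signal via the FFT.
    Frequencies (in cycles per sample) within [low, high] are scaled by
    `gain`; everything else is left untouched."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal))
    band = (freqs >= low) & (freqs <= high)
    spec[band] *= gain
    return np.fft.irfft(spec, n=len(signal))

n = 256
x = np.arange(n)
focus = np.sin(2 * np.pi * (64 / n) * x)  # stand-in for the in-focus band
blur  = np.sin(2 * np.pi * (8 / n) * x)   # stand-in for scattered background
out = amplify_band(focus + blur, low=0.2, high=0.3, gain=4.0)
```

    After the filter, the high-frequency "in-focus" component is boosted fourfold while the low-frequency "scattered" component is unchanged, mimicking how the method favors the un-scattered point-spread-function content.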

  19. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable fast turn-around times, which are often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that needs to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionality. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
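
    The scalability analysis mentioned at the end rests on Amdahl's law, which is easy to state in code. The 95% parallel fraction below is an illustrative assumption, not a figure from the paper:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup of a program whose `parallel_fraction` of the
    work scales perfectly across `cores` while the remainder stays serial."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A 12-fold speedup on 12 cores implies an essentially fully parallel code;
# with only 95% parallelizable work the achievable speedup is far lower,
# and the serial 5% caps it at 20x no matter how many cores are added.
s12 = amdahl_speedup(0.95, 12)
limit = amdahl_speedup(0.95, 10**9)
```

    This is why the reported near-linear 12-core result is notable: it indicates the block-volume parallelization left almost no serial bottleneck.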

  20. A Voice and Mouse Input Interface for 3D Virtual Environments

    NASA Technical Reports Server (NTRS)

    Kao, David L.; Bryson, Steve T.

    2003-01-01

    There have been many success stories about how 3D input devices can be fully integrated into an immersive virtual environment. Electromagnetic trackers, optical trackers, gloves, and flying mice are just some of these input devices. Though we could use existing 3D input devices that are commonly used for VR applications, several factors prevent us from choosing them for our applications. One main factor is that most of these tracking devices are not suitable for prolonged use due to the human fatigue associated with using them. A second factor is that many of them would occupy additional office space. Another factor is that many 3D input devices are expensive due to the unusual hardware that is required. For our VR applications, we want a user interface that works naturally with standard equipment. In this paper, we demonstrate applications of our proposed multimodal interface using a 3D dome display. We also show that effective data analysis can be achieved while scientists view their data rendered inside the dome display and perform user interactions simply using mouse and voice input. Though a spherical coordinate grid seems ideal for interaction with a 3D dome display, other non-spherical grids can be used as well.
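
    Mouse-driven interaction on a dome naturally works in the spherical coordinate grid the abstract mentions. A minimal sketch of mapping an (azimuth, elevation) grid coordinate to the 3D point on the dome surface (the convention and radius are assumptions for illustration):

```python
import math

def dome_to_cartesian(azimuth_deg, elevation_deg, radius=1.0):
    """Map a spherical dome-grid coordinate (azimuth, elevation, degrees)
    to the 3D Cartesian point on a dome of the given radius, with z up."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = radius * math.cos(el) * math.cos(az)
    y = radius * math.cos(el) * math.sin(az)
    z = radius * math.sin(el)
    return x, y, z

# The dome zenith (straight overhead) sits at 90 degrees elevation:
top = dome_to_cartesian(0.0, 90.0)
front = dome_to_cartesian(0.0, 0.0)  # on the horizon, straight ahead
```

    Inverting this mapping for a mouse ray gives the grid cell under the cursor, which is how 2D mouse input can drive selection on a hemispherical display.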

  1. Web GIS in practice VII: stereoscopic 3-D solutions for online maps and virtual globes

    PubMed Central

    Boulos, Maged N Kamel; Robinson, Larry R

    2009-01-01

    Because our pupils are about 6.5 cm apart, each eye views a scene from a different angle and sends a unique image to the visual cortex, which then merges the images from both eyes into a single picture. The slight difference between the right and left images allows the brain to properly perceive the 'third dimension' or depth in a scene (stereopsis). However, when a person views a conventional 2-D (two-dimensional) image representation of a 3-D (three-dimensional) scene on a conventional computer screen, each eye receives essentially the same information. Depth in such cases can only be approximately inferred from visual clues in the image, such as perspective, as only one image is offered to both eyes. The goal of stereoscopic 3-D displays is to project a slightly different image into each eye to achieve a much truer and realistic perception of depth, of different scene planes, and of object relief. This paper presents a brief review of a number of stereoscopic 3-D hardware and software solutions for creating and displaying online maps and virtual globes (such as Google Earth) in "true 3D", with costs ranging from almost free to multi-thousand pounds sterling. A practical account is also given of the experience of the USGS BRD UMESC (United States Geological Survey's Biological Resources Division, Upper Midwest Environmental Sciences Center) in setting up a low-cost, full-colour stereoscopic 3-D system. PMID:19849837

  2. Factory of Realities: On the Emergence of Virtual Spatiotemporal Structures

    NASA Astrophysics Data System (ADS)

    Zapatrin, Romàn R.

    The ubiquitous nature of modern Information Retrieval (IR) and Virtual Worlds gives rise to new realities. To what extent are these `realities' real? Which `physics' should be applied to quantitatively describe them? In this chapter, I dwell on a few examples. The first is adaptive neural networks, which are neither networks nor neural, but still provide a service similar to classical artificial neural networks (ANNs) in an extended fashion. The second is the emergence of objects resembling Einsteinian space-time, which describe the behavior of an Internet surfer as geodesic motion. The third is the demonstration of nonclassical and even stronger-than-quantum probabilities in IR, their use...

  3. Dynamic WIFI-Based Indoor Positioning in 3D Virtual World

    NASA Astrophysics Data System (ADS)

    Chan, S.; Sohn, G.; Wang, L.; Lee, W.

    2013-11-01

    A web-based system based on the 3DTown project was proposed using the Google Earth plug-in that brings information from indoor positioning devices and real-time sensors into an integrated 3D indoor and outdoor virtual world to visualize the dynamics of urban life within the 3D context of a city. We addressed a limitation of the 3DTown project, with particular emphasis on the video surveillance cameras used for indoor tracking purposes. The proposed solution was to utilize wireless local area network (WLAN) WiFi as a replacement technology for localizing objects of interest, due to the widespread availability and large coverage area of WiFi in indoor building spaces. Indoor positioning was performed using WiFi without modifying existing building infrastructure or introducing additional access points (APs). A hybrid probabilistic approach was used for indoor positioning, based on a previously recorded WiFi fingerprint database in the Petrie Science and Engineering building at York University. In addition, we have developed a 3D building modeling module that allows for efficient reconstruction of outdoor building models to be integrated with indoor building models; a sensor module for receiving, distributing, and visualizing real-time sensor data; and a web-based visualization module for users to explore the dynamic urban life in a virtual world. In order to solve the problems in the implementation of the proposed system, we introduce approaches for integrating indoor building models with indoor positioning data, as well as real-time sensor information and visualization on the web-based system. In this paper we report the preliminary results of our prototype system, demonstrating the system's capability for implementing a dynamic 3D indoor and outdoor virtual world that is composed of discrete modules connected through pre-determined communication protocols.
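
    The fingerprinting idea behind WiFi indoor positioning can be sketched as a nearest-neighbour lookup in signal-strength space. Note this is a simplified k-NN illustration with invented RSSI values, not the hybrid probabilistic method the paper actually uses.

```python
# Minimal sketch of WiFi fingerprint positioning by nearest neighbour in
# signal-strength (RSSI) space. Database values are made up; the paper
# itself uses a hybrid probabilistic method, not this simple k-NN.
import math

# fingerprint database: (x, y) position -> RSSI per access point, in dBm
FINGERPRINTS = {
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-70, -42, -75],
    (0.0, 5.0): [-75, -72, -45],
}

def locate(observed_rssi, k=2):
    """Estimate position as the distance-weighted mean of the k closest
    fingerprints in RSSI space."""
    ranked = sorted(
        FINGERPRINTS.items(),
        key=lambda item: math.dist(item[1], observed_rssi),
    )[:k]
    weights = [1.0 / (math.dist(rssi, observed_rssi) + 1e-9)
               for _, rssi in ranked]
    total = sum(weights)
    x = sum(w * pos[0] for (pos, _), w in zip(ranked, weights)) / total
    y = sum(w * pos[1] for (pos, _), w in zip(ranked, weights)) / total
    return x, y

x, y = locate([-42, -68, -78])
print(round(x, 2), round(y, 2))  # close to (0, 0)
```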

  4. Virtual reality and telerobotics applications of an Address Recalculation Pipeline

    NASA Technical Reports Server (NTRS)

    Regan, Matthew; Pose, Ronald

    1994-01-01

    The technology described in this paper was designed to reduce latency in user interactions in immersive virtual reality environments. It is also ideally suited to telerobotic applications such as interaction with remote robotic manipulators in space or in deep-sea operations. In such circumstances, the significant latency in observed response to user stimulus, which is due to communications delays, and the disturbing jerkiness due to low and unpredictable frame rates in compressed video feedback or computationally limited virtual worlds, can be masked by our techniques. The user is provided with highly responsive visual feedback independent of communication or computational delays in providing physical video feedback or in rendering virtual world images. Virtual and physical environments can be combined seamlessly using these techniques.
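
    The core latency-masking trick of an address recalculation pipeline is to render into a buffer wider than the display and, at scan-out time, choose the visible window from the latest head orientation, so viewpoint rotation feels instant even when rendering lags. A sketch of the horizontal window selection; all sizes and the function below are illustrative assumptions, not the paper's hardware design.

```python
# Sketch of display-window selection inside a wide render buffer: head
# rotation shifts the window that is scanned out, without re-rendering.
# Buffer sizes and field of view are illustrative assumptions.

def window_offset(buffer_px, window_px, yaw_deg, buffer_fov_deg):
    """Horizontal pixel offset of the displayed window inside the wide
    render buffer for a given head yaw (0 = buffer centre)."""
    px_per_deg = buffer_px / buffer_fov_deg
    centre = buffer_px / 2 + yaw_deg * px_per_deg
    offset = centre - window_px / 2
    return max(0, min(buffer_px - window_px, round(offset)))

# 4096-px buffer covering 180 deg; 1024-px window. Looking straight ahead:
print(window_offset(4096, 1024, 0, 180))   # 1536
# Turning 10 deg right shifts the window without re-rendering:
print(window_offset(4096, 1024, 10, 180))  # 1764
```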

  5. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    PubMed

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  7. Options in virtual 3D, optical-impression-based planning of dental implants.

    PubMed

    Reich, Sven; Kern, Thomas; Ritter, Lutz

    2014-01-01

    If a 3D radiograph, which in today's dentistry often consists of a CBCT dataset, is available for computerized implant planning, the 3D planning should also consider functional prosthetic aspects. In a conventional workflow, the CBCT is done with a specially produced radiopaque prosthetic setup that makes the desired prosthetic situation visible during virtual implant planning. If an exclusively digital workflow is chosen, intraoral digital impressions are taken. On these digital models, the desired prosthetic suprastructures are designed. The entire datasets are virtually superimposed by a "registration" process on the corresponding structures (teeth) in the CBCTs. Thus, both the osseous and prosthetic structures are visible in one single 3D application and make it possible to consider surgical and prosthetic aspects. After having determined the implant positions on the computer screen, a drilling template is designed digitally. According to this design (CAD), a template is printed or milled in CAM process. This template is the first physically extant product in the entire workflow. The article discusses the options and limitations of this workflow. PMID:25098158

  8. Avalanche for shape and feature-based virtual screening with 3D alignment.

    PubMed

    Diller, David J; Connell, Nancy D; Welsh, William J

    2015-11-01

    This report introduces a new ligand-based virtual screening tool called Avalanche that incorporates both shape- and feature-based comparison with three-dimensional (3D) alignment between the query molecule and test compounds residing in a chemical database. Avalanche proceeds in two steps. The first step is an extremely rapid shape/feature based comparison which is used to narrow the focus from potentially millions or billions of candidate molecules and conformations to a more manageable number that are then passed to the second step. The second step is a detailed yet still rapid 3D alignment of the remaining candidate conformations to the query conformation. Using the 3D alignment, these remaining candidate conformations are scored, re-ranked and presented to the user as the top hits for further visualization and evaluation. To provide further insight into the method, the results from two prospective virtual screens are presented which show the ability of Avalanche to identify hits from chemical databases that would likely be missed by common substructure-based or fingerprint-based search methods. The Avalanche method is extended to enable patent landscaping, i.e., structural refinements to improve the patentability of hits for deployment in drug discovery campaigns. PMID:26458937
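
    The two-step funnel described above, a cheap shape/feature filter followed by detailed 3D alignment of the survivors, is a common screening pattern. A generic sketch, with placeholder scoring functions standing in for Avalanche's actual shape/feature and alignment metrics:

```python
# Generic sketch of a two-stage virtual-screening funnel: a cheap filter
# prunes the library, then an expensive scorer re-ranks the survivors.
# The scoring functions are placeholders, not Avalanche's actual metrics.

def screen(library, cheap_score, costly_score, keep=100, top=10):
    """Return the `top` best candidates by the costly score, evaluated
    only on the `keep` best candidates by the cheap score."""
    shortlist = sorted(library, key=cheap_score)[:keep]
    return sorted(shortlist, key=costly_score)[:top]

# Toy usage: "molecules" are integers; pretend lower score = better match.
def cheap(m):
    return abs(m - 500)   # fast shape/feature surrogate

def costly(m):
    return abs(m - 505)   # slow 3D-alignment surrogate

hits = screen(range(1000), cheap, costly, keep=50, top=3)
print(hits)  # [505, 504, 506]
```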

  9. Exploring conformational search protocols for ligand-based virtual screening and 3-D QSAR modeling.

    PubMed

    Cappel, Daniel; Dixon, Steven L; Sherman, Woody; Duan, Jianxin

    2015-02-01

    3-D ligand conformations are required for most ligand-based drug design methods, such as pharmacophore modeling, shape-based screening, and 3-D QSAR model building. Many studies of conformational search methods have focused on the reproduction of crystal structures (i.e. bioactive conformations); however, for ligand-based modeling the key question is how to generate a ligand alignment that produces the best results for a given query molecule. In this work, we study different conformation generation modes of ConfGen and the impact on virtual screening (Shape Screening and e-Pharmacophore) and QSAR predictions (atom-based and field-based). In addition, we develop a new search method, called common scaffold alignment, that automatically detects the maximum common scaffold between each screening molecule and the query to ensure identical coordinates of the common core, thereby minimizing the noise introduced by analogous parts of the molecules. In general, we find that virtual screening results are relatively insensitive to the conformational search protocol; hence, a conformational search method that generates fewer conformations could be considered "better" because it is more computationally efficient for screening. However, for 3-D QSAR modeling we find that more thorough conformational sampling tends to produce better QSAR predictions. In addition, significant improvements in QSAR predictions are obtained with the common scaffold alignment protocol developed in this work, which focuses conformational sampling on parts of the molecules that are not part of the common scaffold. PMID:25408244
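
    The common scaffold alignment idea, pinning the shared core to identical coordinates so that only the differing substituents contribute noise, can be illustrated with a toy core-RMSD check. The coordinates and function below are invented for illustration, not taken from the study.

```python
# Toy check of how far a candidate conformation's core strays from the
# query core. Under common scaffold alignment this RMSD would be zero by
# construction. Coordinates are made up for illustration.
import math

def core_rmsd(query_core, candidate_core):
    """RMSD between matched core atoms of two conformations."""
    n = len(query_core)
    sq = sum((qx - cx) ** 2 + (qy - cy) ** 2 + (qz - cz) ** 2
             for (qx, qy, qz), (cx, cy, cz) in zip(query_core, candidate_core))
    return math.sqrt(sq / n)

query = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (2.2, 1.2, 0.0)]
aligned = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (2.2, 1.2, 0.0)]
wobbly = [(0.1, 0.0, 0.0), (1.4, 0.1, 0.0), (2.3, 1.2, 0.1)]
print(core_rmsd(query, aligned))           # 0.0
print(round(core_rmsd(query, wobbly), 3))
```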

  10. Virtual Reality Technologies for Research and Education in Obesity and Diabetes: Research Needs and Opportunities

    PubMed Central

    Ershow, Abby G; Peterson, Charles M; Riley, William T; Rizzo, Albert “Skip”; Wansink, Brian

    2011-01-01

    The rising rates, high prevalence, and adverse consequences of obesity and diabetes call for new approaches to the complex behaviors needed to prevent and manage these conditions. Virtual reality (VR) technologies, which provide controllable, multisensory, interactive three-dimensional (3D) stimulus environments, are a potentially valuable means of engaging patients in interventions that foster more healthful eating and physical activity patterns. Furthermore, the capacity of VR technologies to motivate, record, and measure human performance represents a novel and useful modality for conducting research. This article summarizes background information and discussions for a joint July 2010 National Institutes of Health – Department of Defense workshop entitled Virtual Reality Technologies for Research and Education in Obesity and Diabetes. The workshop explored the research potential of VR technologies as tools for behavioral and neuroscience studies in diabetes and obesity, and the practical potential of VR in fostering more effective utilization of diabetes- and obesity-related nutrition and lifestyle information. Virtual reality technologies were considered especially relevant for fostering desirable health-related behaviors through motivational reinforcement, personalized teaching approaches, and social networking. Virtual reality might also be a means of extending the availability and capacity of health care providers. Progress in the field will be enhanced by further developing available platforms and taking advantage of VR’s capabilities as a research tool for well-designed hypothesis-testing behavioral science. Multidisciplinary collaborations are needed between the technology industry and academia, and among researchers in biomedical, behavioral, pedagogical, and computer science disciplines. Research priorities and funding opportunities for use of VR to improve prevention and management of obesity and diabetes can be found at agency websites (National

  11. Immersive virtual reality and environmental noise assessment: An innovative audio–visual approach

    SciTech Connect

    Ruotolo, Francesco; Maffei, Luigi; Di Gabriele, Maria; Iachini, Tina; Masullo, Massimiliano; Ruggiero, Gennaro; Senese, Vincenzo Paolo

    2013-07-15

    Several international studies have shown that traffic noise has a negative impact on people's health and that people's annoyance does not depend only on noise energy levels, but rather on multi-perceptual factors. The combination of virtual reality technology and audio rendering techniques allows us to experiment with a new approach to environmental noise assessment that can help to investigate in advance the potential negative effects of noise associated with a specific project, which in turn can help designers to make educated decisions. In the present study, the audio-visual impact of a new motorway project on people was assessed by means of immersive virtual reality technology. In particular, participants were exposed to 3D reconstructions of an actual landscape without the projected motorway (ante operam condition), and of the same landscape with the projected motorway (post operam condition). Furthermore, individuals' reactions to noise were assessed by means of objective cognitive measures (short-term verbal memory and executive functions) and subjective evaluations (noise and visual annoyance). Overall, the results showed that the introduction of a projected motorway in the environment can have immediate detrimental effects on people's well-being, depending on the distance from the noise source. In particular, noise due to the new infrastructure seems to exert a negative influence on short-term verbal memory and to increase both visual and noise annoyance. The theoretical and practical implications of these findings are discussed. -- Highlights: ► Impact of traffic noise on people's well-being depends on multi-perceptual factors. ► A multisensory virtual reality technology is used to simulate a projected motorway. ► Effects on short-term memory and auditory and visual subjective annoyance were found. ► The closer the distance from the motorway the stronger was the effect. ► Multisensory virtual reality methodologies can be used to study

  12. Virtual reality technologies for research and education in obesity and diabetes: research needs and opportunities.

    PubMed

    Ershow, Abby G; Peterson, Charles M; Riley, William T; Rizzo, Albert Skip; Wansink, Brian

    2011-03-01

    The rising rates, high prevalence, and adverse consequences of obesity and diabetes call for new approaches to the complex behaviors needed to prevent and manage these conditions. Virtual reality (VR) technologies, which provide controllable, multisensory, interactive three-dimensional (3D) stimulus environments, are a potentially valuable means of engaging patients in interventions that foster more healthful eating and physical activity patterns. Furthermore, the capacity of VR technologies to motivate, record, and measure human performance represents a novel and useful modality for conducting research. This article summarizes background information and discussions for a joint July 2010 National Institutes of Health - Department of Defense workshop entitled Virtual Reality Technologies for Research and Education in Obesity and Diabetes. The workshop explored the research potential of VR technologies as tools for behavioral and neuroscience studies in diabetes and obesity, and the practical potential of VR in fostering more effective utilization of diabetes- and obesity-related nutrition and lifestyle information. Virtual reality technologies were considered especially relevant for fostering desirable health-related behaviors through motivational reinforcement, personalized teaching approaches, and social networking. Virtual reality might also be a means of extending the availability and capacity of health care providers. Progress in the field will be enhanced by further developing available platforms and taking advantage of VR's capabilities as a research tool for well-designed hypothesis-testing behavioral science. Multidisciplinary collaborations are needed between the technology industry and academia, and among researchers in biomedical, behavioral, pedagogical, and computer science disciplines. Research priorities and funding opportunities for use of VR to improve prevention and management of obesity and diabetes can be found at agency websites (National

  13. A virtual interface for interactions with 3D models of the human body.

    PubMed

    De Paolis, Lucio T; Pulimeno, Marco; Aloisio, Giovanni

    2009-01-01

    The developed system is the first prototype of a virtual interface designed to avoid contact with the computer, so that the surgeon is able to visualize 3D models of the patient's organs more effectively during a surgical procedure, or to use them in pre-operative planning. The doctor will be able to rotate, translate, and zoom in on 3D models of the patient's organs simply by moving a finger in free space; in addition, it is possible to visualize all of the organs or only some of them. All interactions with the models happen in real time using the virtual interface, which appears as a touch-screen suspended in free space at a position chosen by the user when the application starts up. Finger movements are detected by means of an optical tracking system and are used to simulate touch with the interface and to interact by pressing the buttons present on the virtual screen. PMID:19377116

  14. Location and Longing: The Nicotine Craving Experience in Virtual Reality

    PubMed Central

    Carter, Brian L.; Bordnick, Patrick; Traylor, Amy; Day, Susan X.; Paris, Megan

    2008-01-01

    Considerable research suggests that cigarette craving is complex, with psychological, emotional, cognitive, and behavioral aspects that are inadequately captured by typical craving assessments that focus on level of severity. That is, the experience of craving, for cigarette smokers, remains poorly understood. This study immersed smokers in different virtual reality (VR) scenarios (with and without cigarette cues present), collected detailed craving assessments, and analyzed the data using a multidimensional analytic approach. Non-treatment-seeking, nicotine dependent smokers (N = 22) experienced two different virtual reality scenarios, one with cigarette cues and one without, and rated 24 descriptors related to craving. Multidimensional scaling (MDS) models demonstrate that smokers’ experience of craving is qualitatively, structurally different under VR smoking cue conditions versus neutral conditions. This finding sheds new light on the complexity of craving as well as implications for its measurement. PMID:18243586

  15. Polymer-based actuators for virtual reality devices

    NASA Astrophysics Data System (ADS)

    Bolzmacher, Christian; Hafez, Moustapha; Benali Khoudja, Mohamed; Bernardoni, Paul; Dubowsky, Steven

    2004-07-01

    Virtual Reality (VR) is gaining importance in our society. For many years, VR was limited to entertainment applications. Today, practical applications such as training and prototyping find a promising future in VR. There is therefore an increasing demand for low-cost, lightweight haptic devices for virtual reality environments. Electroactive polymers seem to be a potential actuation technology that could satisfy these requirements. Dielectric polymers developed over the past few years have shown large displacements (more than 300%). This feature makes them quite interesting for integration in haptic devices due to their muscle-like behaviour. Polymer actuators are flexible and lightweight compared to traditional actuators. Using stacks with several layers of elastomeric film increases the force without limiting the output displacement. The paper discusses some design methods for a linear dielectric polymer actuator for VR devices. Experimental results of the actuator performance are presented.
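
    The actuation principle behind dielectric polymers is commonly summarised by the Maxwell-stress relation p = ε0·εr·(V/t)², where V is the applied voltage and t the film thickness. The numbers below are illustrative only, not measurements from the paper.

```python
# Effective Maxwell-stress pressure squeezing a dielectric elastomer film:
#     p = eps0 * eps_r * (V / t)**2
# Parameter values below are illustrative, not from the paper.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def maxwell_pressure(eps_r, voltage_v, thickness_m):
    """Equivalent electrostatic pressure (Pa) acting on the film."""
    field = voltage_v / thickness_m  # electric field, V/m
    return EPS0 * eps_r * field ** 2

# A 50-micron film at 3 kV with relative permittivity 3:
p = maxwell_pressure(3.0, 3000.0, 50e-6)
print(f"{p / 1e3:.0f} kPa")  # ~96 kPa
```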

  16. Human Factors in Virtual Reality Development

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Proffitt, Dennis R.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    This half-day tutorial will provide an overview of basic perceptual functioning as it relates to the design of virtual environment systems. The tutorial consists of three parts. First, basic issues in visual perception will be presented, including discussions of the visual sensations of brightness and color, and the visual perception of depth relationships in three-dimensional space (with a special emphasis on motion-specified depth). The second section will discuss the importance of conducting human-factors user studies and evaluations. Examples and suggestions on how best to get help with user studies will be provided. Finally, we will discuss how, by drawing on their complementary competencies, perceptual psychologists and computer engineers can work as a team to develop optimal VR systems, technologies, and techniques.

  17. A Virtual Reality System Framework for Industrial Product Design Validation

    NASA Astrophysics Data System (ADS)

    Ladeveze, Nicolas; Sghaier, Adel; Fourquet, Jean Yves

    2009-03-01

    This paper presents a virtual reality simulation architecture intended to improve the design quality of product parts and the way manufacturing and maintenance requests are taken into account, in order to reduce the cost and time of product design. This architecture merges previous studies into a unique framework dedicated to product pre-design. Using several interfaces, this architecture allows fast validation of a pre-designed product over a large scope, from multi-physics computation to maintainability studies.

  18. Collaborative virtual reality environments for computational science and design.

    SciTech Connect

    Papka, M. E.

    1998-02-17

    The authors are developing a networked, multi-user, virtual-reality-based collaborative environment coupled to one or more petaFLOPs computers, enabling the interactive simulation of 10^9-atom systems. The purpose of this work is to explore the requirements for this coupling. Through the design, development, and testing of such systems, they hope to gain knowledge that allows computational scientists to discover and analyze their results more quickly and in a more intuitive manner.

  19. Fostering Learning Through Interprofessional Virtual Reality Simulation Development.

    PubMed

    Nicely, Stephanie; Farra, Sharon

    2015-01-01

    This article presents a unique strategy for improving didactic learning and clinical skill while simultaneously fostering interprofessional collaboration and communication. Senior-level nursing students collaborated with students enrolled in the Department of Interactive Media Studies to design a virtual reality simulation based upon disaster management and triage techniques. Collaborative creation of the simulation proved to be a strategy for enhancing students' knowledge of and skill in disaster management and triage while impacting attitudes about interprofessional communication and teamwork. PMID:26521506

  20. Rapid prototyping--when virtual meets reality.

    PubMed

    Beguma, Zubeda; Chhedat, Pratik

    2014-01-01

    Rapid prototyping (RP) describes the customized production of solid models using 3D computer data. Over the past decade, advances in RP have continued to evolve, resulting in the development of new techniques that have been applied to the fabrication of various prostheses. RP fabrication technologies include stereolithography (SLA), fused deposition modeling (FDM), computer numerical controlled (CNC) milling, and, more recently, selective laser sintering (SLS). The applications of RP techniques for dentistry include wax pattern fabrication for dental prostheses, dental (facial) prostheses mold (shell) fabrication, and removable dental prostheses framework fabrication. In the past, a physical plastic shape of the removable partial denture (RPD) framework was produced using an RP machine, and then used as a sacrificial pattern. Yet with the advent of the selective laser melting (SLM) technique, RPD metal frameworks can be directly fabricated, thereby omitting the casting stage. This new approach can also generate the wax pattern for facial prostheses directly, thereby reducing labor-intensive laboratory procedures. Many people stand to benefit from these new RP techniques for producing various forms of dental prostheses, which in the near future could transform traditional prosthodontic practices. PMID:25643461

  1. Elastic registration using 3D ChainMail: application to virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Castro-Pareja, Carlos R.; Daly, Barry; Shekhar, Raj

    2006-03-01

    We present an elastic registration algorithm based on local deformations modeled using cubic B-splines and controlled using 3D ChainMail. Our algorithm eliminates the appearance of folding artifacts and allows local rigidity and compressibility control independent of the image similarity metric being used. 3D ChainMail propagates large internal deformations between neighboring B-Spline control points, thereby preserving the topology of the transformed image without requiring the addition of penalty terms based on rigidity of the transformation field to the equation used to maximize image similarity. A novel application to virtual colonoscopy is presented where the algorithm is used to significantly improve cross-localization between colon locations in prone and supine CT images.
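
    The ChainMail idea of propagating a displacement to neighbours only when link-length constraints are violated can be sketched in one dimension. This is a simplification of the 3D algorithm the paper uses to control the B-spline grid; the link limits below are arbitrary.

```python
# 1D sketch of ChainMail constraint propagation: when a moved element
# stretches or compresses a link beyond its limits, its neighbour is
# dragged just far enough to restore the constraint, and the violation
# propagates outward. The 3D algorithm applies the same rule per axis.

def chainmail_1d(points, index, new_pos, min_link=0.5, max_link=1.5):
    pts = list(points)
    pts[index] = new_pos
    # propagate to the right
    for i in range(index + 1, len(pts)):
        gap = pts[i] - pts[i - 1]
        if gap > max_link:
            pts[i] = pts[i - 1] + max_link
        elif gap < min_link:
            pts[i] = pts[i - 1] + min_link
        else:
            break  # constraint satisfied; no further propagation
    # propagate to the left
    for i in range(index - 1, -1, -1):
        gap = pts[i + 1] - pts[i]
        if gap > max_link:
            pts[i] = pts[i + 1] - max_link
        elif gap < min_link:
            pts[i] = pts[i + 1] - min_link
        else:
            break
    return pts

# Dragging the first of four unit-spaced points far to the left stretches
# each link to its maximum and drags the others along:
print(chainmail_1d([0.0, 1.0, 2.0, 3.0], 0, -2.0))  # [-2.0, -0.5, 1.0, 2.5]
```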

  2. Paranasal sinus surgery planning using CT virtual reality

    NASA Astrophysics Data System (ADS)

    Hopper, Kenneth D.

    2001-05-01

    CT virtual reality using volumetric rendering can tag structures such as the nasofrontal ducts, osteomeatal complexes, and middle turbinates, as well as the planned surgical sites in patients undergoing endoscopic surgery for inflammatory disease. Frequently, anatomical landmarks are obscured by overlying disease, making the endoscopic surgeon's job difficult. We have evaluated the use of CT virtual reality of the paranasal sinuses in assisting the surgeon in these types of cases. This paper reviews 25 patients with 40 sites of significant paranasal sinus disease in whom endoscopic surgery was planned. Volumetric virtual reality, with the various surgical sites chosen from the preoperative 2D CTs, dramatically improved the accuracy of the endoscopic surgeon in localizing the surgical window. In the sphenoid sinus, the addition of CT endoscopy would have allowed the endoscopist to operate on the correct sinus an additional 28% of the time and to avoid vital structures in 25%. In the frontal sinus, CT endoscopy directed the endoscopist to the correct sinus in an additional 44% of cases. The results of this study indicate CT endoscopy may significantly improve the accuracy of endoscopic surgery into the frontal and sphenoid sinuses.

  3. Virtual reality in rhinology-a new dimension of clinical experience.

    PubMed

    Klapan, Ivica; Raos, Pero; Galeta, Tomislav; Kubat, Goranka

    2016-07-01

    There is often a need to more precisely identify the extent of pathology and the fine elements of intracranial anatomic features during the diagnostic process and during many operations in the nose, sinus, orbit, and skull base region. In two case reports, we describe the methods used in the diagnostic workup and surgical therapy in the nose and paranasal sinus region. Besides baseline x-ray, multislice computed tomography, and magnetic resonance imaging, operative field imaging was performed via a rapid prototyping model, virtual endoscopy, and 3-D imaging. Different head tissues were visualized in different colors, showing their anatomic interrelations and the extent of pathologic tissue within the operative field. This approach has not yet been used as a standard preoperative or intraoperative procedure in otorhinolaryngology. In this way, we tried to understand the new, visualized "world of anatomic relations within the patient's head" by creating an impression of perception (virtual perception) of the given position of all elements in a particular anatomic region of the head, which does not exist in the real world (virtual world). This approach was aimed at upgrading the diagnostic workup and surgical therapy by ensuring a faster, safer and, above all, simpler operative procedure. In conclusion, any ENT specialist can provide virtual reality support in implementing surgical procedures, with additional control of risks and within the limits of normal tissue, without additional trauma to the surrounding tissue in the anatomic region. At the same time, the virtual reality support provides an impression of the virtual world as the specialist navigates through it and manipulates virtual objects. PMID:27434481

  4. 3D pulmonary airway color image reconstruction via shape from shading and virtual bronchoscopy imaging techniques

    NASA Astrophysics Data System (ADS)

    Suter, Melissa; Reinhardt, Joseph M.; Hoffman, Eric A.; McLennan, Geoffrey

    2005-04-01

    The dependence on macro-optical imaging of the human body in the assessment of possible disease is rapidly increasing, concurrent with, and as a direct result of, advancements made in medical imaging technologies. Assessing the pulmonary airways through bronchoscopy is performed extensively in clinical practice; however, it remains highly subjective due to limited visualization techniques and the lack of quantitative analyses. The representation of 3D structures in 2D visualization modes, although providing insight into the structural content of the scene, may in fact skew the perception of the structural form. We have developed two methods for visualizing the optically derived airway mucosal features whilst preserving the structural scene integrity. Shape from shading (SFS) techniques can be used to extract 3D structural information from 2D optical images. The SFS technique presented addresses many limitations previously encountered in conventional techniques, resulting in high-resolution 3D color images. The second method presented, which combines both color and structural information, relies on combined CT and bronchoscopy imaging modalities. External imaging techniques such as CT provide a means of determining the gross structural anatomy of the pulmonary airways, but lack the important optically derived mucosal color. Virtual bronchoscopy is used to provide a direct link between the CT-derived structural anatomy and the macro-optically derived mucosal color. Through utilization of a virtual and true bronchoscopy matching technique we are able to directly extract combined, structurally sound 3D color segments of the pulmonary airways. Various pulmonary airway diseases are assessed, and the resulting combined color and texture results are presented, demonstrating the effectiveness of the presented techniques.
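
    Shape from shading rests on inverting a reflectance model; the usual starting point is the Lambertian model, in which observed intensity is proportional to the cosine of the angle between the surface normal and the light direction. A minimal forward-model sketch (values illustrative, not from the study):

```python
# Lambertian shading model underlying shape from shading: the intensity
# of a surface patch is I = albedo * max(0, n . s) for unit normal n and
# unit light direction s. SFS inverts this to recover the normals.
import math

def lambertian_intensity(normal, light, albedo=1.0):
    """Image intensity of a patch with unit `normal` lit from unit
    direction `light` under the Lambertian model."""
    dot = sum(n * s for n, s in zip(normal, light))
    return albedo * max(0.0, dot)

# A patch facing the light is brightest; tilted 60 degrees, half as bright.
head_on = lambertian_intensity((0, 0, 1), (0, 0, 1))
tilted = lambertian_intensity((math.sin(math.radians(60)), 0,
                               math.cos(math.radians(60))), (0, 0, 1))
print(head_on, round(tilted, 2))  # 1.0 0.5
```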

  5. Going Virtual… or Not: Development and Testing of a 3D Virtual Astronomy Environment

    NASA Astrophysics Data System (ADS)

    Ruzhitskaya, L.; Speck, A.; Ding, N.; Baldridge, S.; Witzig, S.; Laffey, J.

    2013-04-01

    We present our preliminary results of a pilot study of students' knowledge transfer of an astronomy concept into a new environment. We also share our discoveries on what aspects of a 3D environment students consider motivational or discouraging for their learning. This study was conducted among 64 non-science major students enrolled in an astronomy laboratory course. During the course, students learned the concept and applications of Kepler's laws using a 2D interactive environment. Later in the semester, the students were placed in a 3D environment in which they were asked to conduct observations and to answer a set of questions pertaining to Kepler's laws of planetary motion. In this study, we were interested in observing, scrutinizing, and assessing students' behavior: from choices that they made while creating their avatars (virtual representations) to the tools they chose to use, to their navigational patterns, to their levels of discourse in the environment. These observations helped us to identify what features of the 3D environment our participants found to be helpful and interesting and what tools created unnecessary clutter and distraction. The students' social behavior patterns in the virtual environment, together with their answers to the questions, helped us to determine how well they understood Kepler's laws, how well they could transfer the concepts to a new situation, and at what point a motivational tool such as a 3D environment becomes a disruption to constructive learning. Our findings confirmed that students construct deeper knowledge of a concept when they are fully immersed in the environment.
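    The concept at the center of both environments, Kepler's laws, can be made concrete with a short computation. The sketch below states Kepler's third law in generic physics terms; it is not code from the study's 2D or 3D environments.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # mass of the Sun, kg

def orbital_period(a_m, m_central=M_SUN):
    """Kepler's third law: T = 2*pi*sqrt(a^3 / (G*M)), with a in meters."""
    return 2 * math.pi * math.sqrt(a_m ** 3 / (G * m_central))

# Earth's orbit (a = 1 AU = 1.496e11 m) should come out near 365 days
period_days = orbital_period(1.496e11) / 86400
```

    Running the same function with Jupiter's semi-major axis (about 5.2 AU) recovers its roughly 12-year period, the kind of relation students verify by observation in such environments.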

  6. Virtual reality hardware and graphic display options for brain-machine interfaces

    PubMed Central

    Marathe, Amar R.; Carey, Holle L.; Taylor, Dawn M.

    2009-01-01

    Virtual reality hardware and graphic displays are reviewed here as a development environment for brain-machine interfaces (BMIs). Two desktop stereoscopic monitors and one 2D monitor were compared in a visual depth discrimination task and in a 3D target-matching task where able-bodied individuals used actual hand movements to match a virtual hand to different target hands. Three graphic representations of the hand were compared: a plain sphere, a sphere attached to the fingertip of a realistic hand and arm, and a stylized pacman-like hand. Several subjects had great difficulty using either stereo monitor for depth perception when perspective size cues were removed. A mismatch in stereo and size cues generated inappropriate depth illusions. This phenomenon has implications for choosing target and virtual hand sizes in BMI experiments. Target matching accuracy was about as good with the 2D monitor as with either 3D monitor. However, users achieved this accuracy by exploring the boundaries of the hand in the target with carefully controlled movements. This method of determining relative depth may not be possible in BMI experiments if movement control is more limited. Intuitive depth cues, such as including a virtual arm, can significantly improve depth perception accuracy with or without stereo viewing. PMID:18006069

  7. Virtual Reality Simulation of Gynecologic Laparoscopy

    PubMed

    Bernstein

    1996-08-01

    Realistic virtual simulation of gynecologic laparoscopy would permit the surgeon to practice any procedure, with any degree of pathology, at any time and as many times as necessary to achieve proficiency before attempting it in the operating room. Effective computer simulation requires accurate anatomy, realistic three-dimensional computer graphics, the ability to cut and deform tissue in response to instruments, and an appropriate hardware interface. The Visible Human Project from the National Library of Medicine has made available extremely accurate, three-dimensional, digital data that computer animation companies have begun to transform to three-dimensional graphic images. The problem of tissue deformation and movement is approached by a software package called TELEOS. Hardware consisting of two scissor-grip laparoscopic handles mounted on a sensor can interface with any simulation program to simulate a multiplicity of laparoscopic instruments. The next step will be to combine TELEOS with the three-dimensional anatomy data and configure it for gynecologic surgery. PMID:9074082

  8. 3-D Imaging In Virtual Environment: A Scientific Clinical and Teaching Tool

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; DeVincenzi, Donald L. (Technical Monitor)

    1996-01-01

    The advent of powerful graphics workstations and computers has led to the advancement of scientific knowledge through three-dimensional (3-D) reconstruction and imaging of biological cells and tissues. The Biocomputation Center at NASA Ames Research Center pioneered the effort to produce an entirely computerized method for reconstruction of objects from serial sections studied in a transmission electron microscope (TEM). The software developed, ROSS (Reconstruction of Serial Sections), is now being distributed to users across the United States through Space Act Agreements. The software is used in widely disparate fields such as geology, botany, biology, and medicine. In the Biocomputation Center, ROSS serves as the basis for development of virtual environment technologies for scientific and medical use. This report will describe the Virtual Surgery Workstation Project that is ongoing with clinicians at Stanford University Medical Center, and the role of the Visible Human data in the project.

  9. Molecular surface point environments for virtual screening and the elucidation of binding patterns (MOLPRINT 3D).

    PubMed

    Bender, Andreas; Mussa, Hamse Y; Gill, Gurprem S; Glen, Robert C

    2004-12-16

    A novel method (MOLPRINT 3D) for virtual screening and the elucidation of ligand-receptor binding patterns is introduced that is based on environments of molecular surface points. The descriptor uses points relative to the molecular coordinates; thus it is translationally and rotationally invariant. Due to its local nature, conformational variations cause only minor changes in the descriptor. If surface point environments are combined with the Tanimoto coefficient and applied to virtual screening, they achieve retrieval rates comparable to those of two-dimensional (2D) fingerprints. The identification of active structures with minimal 2D similarity ("scaffold hopping") is facilitated. In combination with information-gain-based feature selection and a naive Bayesian classifier, information from multiple molecules can be combined and classification performance can be improved. Selected features are consistent with experimentally determined binding patterns. Examples are given for angiotensin-converting enzyme inhibitors, 3-hydroxy-3-methylglutaryl-coenzyme A reductase inhibitors, and thromboxane A2 antagonists. PMID:15588092
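    The Tanimoto coefficient used here compares two fingerprints by the ratio of shared features to total features. A minimal sketch, where the feature labels are purely illustrative and not actual MOLPRINT 3D surface-point descriptors:

```python
def tanimoto(a, b):
    """Tanimoto (Jaccard) coefficient between two binary feature sets."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty fingerprints are treated as identical
    return len(a & b) / len(a | b)

# two molecules sharing 2 of 4 distinct features score 0.5
similarity = tanimoto({"e1", "e2", "e3"}, {"e2", "e3", "e4"})
```

    A coefficient of 1.0 means identical feature sets; screening ranks database molecules by this score against a known active.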

  10. Visuomotor discordance in virtual reality: effects on online motor control.

    PubMed

    Bagce, Hamid F; Saleh, Soha; Adamovich, Sergei V; Tunik, Eugene

    2011-01-01

    Virtual reality (VR) applications are rapidly permeating fields such as medicine, rehabilitation, research, and military training. However, VR-induced effects on human performance remain poorly understood, particularly in relation to fine-grained motor control of the hand and fingers. We designed a novel virtual reality environment suitable for hand-finger interactions and examined the ability to use visual feedback manipulations in VR to affect online motor performance. Ten healthy subjects performed a simple finger flexion movement toward a kinesthetically-defined 45° target angle while receiving one of three types of VR-based visual feedback in real-time: veridical (in which the virtual hand motion corresponded to subjects' actual motion), or scaled-down / scaled-up feedback (in which virtual finger motion was scaled by 25% / 175% relative to actual motion). Scaled-down and scaled-up feedback led to significant online modifications (increases and decreases, respectively) in angular excursion, despite explicit instructions for subjects to maintain constant movements across conditions. The latency of these modifications was similar across conditions. These findings demonstrate that a VR-based platform may be a robust medium for presenting visuomotor discordances to engender a sense of ownership and drive sensorimotor adaptation. This may prove to be particularly important for retraining motor skills in patients with neurologically-based movement disorders. PMID:22256015
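    The direction of the reported effect follows from the feedback geometry: if the displayed finger moves gain × the actual finger, then bringing the virtual finger to the 45° target requires an actual excursion of 45°/gain. A toy illustration of this relationship (not the study's software):

```python
TARGET_DEG = 45.0  # kinesthetically defined target angle from the study

def required_actual_excursion(gain):
    """Actual finger excursion needed for the displayed (virtual) finger to
    reach the target, when virtual motion = gain * actual motion."""
    return TARGET_DEG / gain

# veridical, scaled-down (25%), and scaled-up (175%) feedback conditions
results = {gain: required_actual_excursion(gain) for gain in (1.0, 0.25, 1.75)}
# scaled-down feedback calls for a larger excursion (180 deg at gain 0.25),
# scaled-up for a smaller one (about 25.7 deg at gain 1.75)
```

    This is exactly the pattern observed: excursion increased under scaled-down feedback and decreased under scaled-up feedback.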

  11. Towards a Transcription System of Sign Language for 3D Virtual Agents

    NASA Astrophysics Data System (ADS)

    Do Amaral, Wanessa Machado; de Martino, José Mario

    Accessibility is a growing concern in computer science. Since virtual information is mostly presented visually, it may seem that access for deaf people is not an issue. However, for prelingually deaf individuals, those who have been deaf since before acquiring and formally learning a language, written information is often less accessible than information presented in signing. Further, for this community, signing is their language of choice, and reading text in a spoken language is akin to using a foreign language. Sign language uses gestures and facial expressions and is widely used by deaf communities. To enable efficient production of signed content in virtual environments, it is necessary to make written records of signs. Transcription systems have been developed to describe sign languages in written form, but these systems have limitations. Since they were not originally designed with computer animation in mind, in general, the recognition and reproduction of signs in these systems is an easy task only for those who know the system deeply. The aim of this work is to develop a transcription system to provide signed content in virtual environments. To animate a virtual avatar, a transcription system must encode information explicitly enough, such as movement speed, sign concatenation, the sequence of each hold and movement, and facial expressions, so that articulation comes close to reality. Although many important studies in sign languages have been published, the transcription problem remains a challenge. Thus, a notation to describe, store, and play signed content in virtual environments offers a multidisciplinary study and research tool, which may help linguistic studies to understand the structure and grammar of sign languages.

  12. The dynamics of student learning within a high school virtual reality design class

    NASA Astrophysics Data System (ADS)

    Morales, Teresa M.

    This mixed-method study investigated knowledge and skill development of high school students in a project-based virtual reality (VR) design class, in which 3-D projects were developed within a student-centered, student-directed environment. This investigation focused on student content learning and problem solving. Additionally, the social dynamics of the class and the role of peer mentoring were examined to determine how these factors influenced student behavior and learning. Finally, parents' and teachers' perceptions of the influence of the class were examined. The participants included freshmen through senior students, parents, teachers, and the high school principal. Student interviews and classroom observations were used to collect data from students, while teachers and parents completed surveys. The results of this study suggested that this application of a VR learning environment promoted the development of meaningful cognitive experiences, creativity, leadership, global socialization, problem solving, and a deeper understanding of academic content. The theoretical implications for 3-D virtual reality technology are exceedingly promising and warrant additional research and development of the technology as an instructional tool for practical use.

  13. Virtual Reality-based Telesurgery via Teleprogramming Scheme Combined with Semi-autonomous Control.

    PubMed

    Zhijiang, Du; Zhiheng, Jia; Minxiu, Kong

    2005-01-01

    Telesurgery systems have long suffered from variable and unpredictable Internet communication time delays, operator fatigue, and other drawbacks. Based on virtual reality technology, a teleprogramming scheme combined with semi-autonomous control is introduced to guarantee the robustness and efficiency of teleoperation of HIT-RAOS, a robot-assisted orthopedic surgery system. In this system, without considering time delay, the operator can simply interact with a virtual environment, which provides real-time 3D vision, stereophonic sound, and tactile and force feedback imitated by a parallel master manipulator. Several tasks can be managed simultaneously via semi-autonomous control. Finally, the method is demonstrated experimentally on the locking of intramedullary nails and is shown to effectively provide stability and performance. PMID:17282656

  14. 3D virtual human atria: A computational platform for studying clinical atrial fibrillation.

    PubMed

    Aslanidi, Oleg V; Colman, Michael A; Stott, Jonathan; Dobrzynski, Halina; Boyett, Mark R; Holden, Arun V; Zhang, Henggui

    2011-10-01

    Despite a vast amount of experimental and clinical data on the underlying ionic, cellular and tissue substrates, the mechanisms of common atrial arrhythmias (such as atrial fibrillation, AF) arising from the functional interactions at the whole atria level remain unclear. Computational modelling provides a quantitative framework for integrating such multi-scale data and understanding the arrhythmogenic behaviour that emerges from the collective spatio-temporal dynamics in all parts of the heart. In this study, we have developed a multi-scale hierarchy of biophysically detailed computational models for the human atria--the 3D virtual human atria. Primarily, diffusion tensor MRI reconstruction of the tissue geometry and fibre orientation in the human sinoatrial node (SAN) and surrounding atrial muscle was integrated into the 3D model of the whole atria dissected from the Visible Human dataset. The anatomical models were combined with the heterogeneous atrial action potential (AP) models, and used to simulate the AP conduction in the human atria under various conditions: SAN pacemaking and atrial activation in the normal rhythm, break-down of regular AP wave-fronts during rapid atrial pacing, and the genesis of multiple re-entrant wavelets characteristic of AF. Contributions of different properties of the tissue to mechanisms of the normal rhythm and arrhythmogenesis were investigated. Primarily, the simulations showed that tissue heterogeneity caused the break-down of the normal AP wave-fronts at rapid pacing rates, which initiated a pair of re-entrant spiral waves; and tissue anisotropy resulted in a further break-down of the spiral waves into multiple meandering wavelets characteristic of AF. The 3D virtual atria model itself was incorporated into the torso model to simulate the body surface ECG patterns in the normal and arrhythmic conditions. Therefore, a state-of-the-art computational platform has been developed, which can be used for studying multi

  15. Virtual Reality for the Psychophysiological Assessment of Phobic Fear: Responses during Virtual Tunnel Driving

    ERIC Educational Resources Information Center

    Muhlberger, Andreas; Bulthoff, Heinrich H.; Wiedemann, Georg; Pauli, Paul

    2007-01-01

    An overall assessment of phobic fear requires not only a verbal self-report of fear but also an assessment of behavioral and physiological responses. Virtual reality can be used to simulate realistic (phobic) situations and therefore should be useful for inducing emotions in a controlled, standardized way. Verbal and physiological fear reactions…

  16. The cranial nerve skywalk: A 3D tutorial of cranial nerves in a virtual platform.

    PubMed

    Richardson-Hatcher, April; Hazzard, Matthew; Ramirez-Yanez, German

    2014-01-01

    Visualization of the complex courses of the cranial nerves by students in the health-related professions is challenging through either diagrams in books or plastic models in the gross laboratory. Furthermore, dissection of the cranial nerves in the gross laboratory is an extremely meticulous task. Teaching and learning the cranial nerve pathways is difficult using two-dimensional (2D) illustrations alone. Three-dimensional (3D) models aid the teacher in describing intricate and complex anatomical structures and help students visualize them. The study of the cranial nerves can be supplemented with 3D models, which permit students to fully visualize their distribution within the craniofacial complex. This article describes the construction and usage of a virtual anatomy platform in Second Life™, which contains 3D models of the cranial nerves III, V, VII, and IX. The Cranial Nerve Skywalk features select cranial nerves and the associated autonomic pathways in an immersive online environment. This teaching supplement was introduced to groups of pre-healthcare professional students in gross anatomy courses at both institutions, and student feedback is included. PMID:24678025

  17. Scalable, High-performance 3D Imaging Software Platform: System Architecture and Application to Virtual Colonoscopy

    PubMed Central

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2013-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, clusters, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10 times performance improvement on an 8-core workstation over the original sequential implementation of the system. PMID:23366803
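    The block-volume idea can be sketched in a few lines: split a volume into fixed-size blocks, process the blocks in parallel, and reassemble the result. The function names and the threading backend below are illustrative only, not the platform's real API (the platform targets multi-core, cluster, and cloud back ends).

```python
from concurrent.futures import ThreadPoolExecutor

def process_block(block):
    """Placeholder per-block operation (here: binary thresholding at 128)."""
    return [1 if v > 128 else 0 for v in block]

def process_volume(voxels, block_size=4096, workers=4):
    """Split a flat voxel array into distributable blocks, process the
    blocks in parallel, and reassemble the output in order."""
    blocks = [voxels[i:i + block_size] for i in range(0, len(voxels), block_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        processed = pool.map(process_block, blocks)  # preserves block order
    out = []
    for block in processed:
        out.extend(block)
    return out
```

    Because each block is independent, the same decomposition distributes across cores, cluster nodes, or cloud workers; the scheduler's job is then to balance blocks across workers.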

  18. Towards a 3d Based Platform for Cultural Heritage Site Survey and Virtual Exploration

    NASA Astrophysics Data System (ADS)

    Seinturier, J.; Riedinger, C.; Mahiddine, A.; Peloso, D.; Boï, J.-M.; Merad, D.; Drap, P.

    2013-07-01

    This paper presents a 3D platform that enables both cultural heritage site survey and virtual exploration. It provides a single, easy-to-use framework for merging multi-scaled 3D measurements based on photogrammetry, documentation produced by experts, and the knowledge of the involved domains, leaving the experts able to extract and choose the relevant information to produce the final survey. Taking into account the interpretation of the real world during the process of archaeological surveys is in fact the main goal of a survey. New advances in photogrammetry and the capability to produce dense 3D point clouds do not by themselves solve the problem of surveys. New opportunities for 3D representation are now available, and we must use them and find new ways to link geometry and knowledge. The new platform is able to efficiently manage and process large 3D data (point sets, meshes) thanks to the implementation of state-of-the-art space-partitioning methods such as octrees and kd-trees, and can thus interact with dense point clouds (thousands to millions of points) in real time. The semantisation of raw 3D data relies on geometric algorithms such as geodetic path computation, surface extraction from dense point clouds, and geometric primitive optimization. The platform provides an interface that enables experts to describe geometric representations of interesting objects like ashlar blocks, stratigraphic units, or generic items (contours, lines, …) directly on the 3D representation of the site, without explicit links to the underlying algorithms. The platform provides two ways of describing a geometric representation. If oriented photographs are available, the expert can draw geometry on a photograph and the system computes its 3D representation by projection onto the underlying mesh or point cloud. If photographs are not available, or if the expert wants to use only the 3D representation, then he can simply draw object shapes on it. When 3D
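    The space-partitioning structures cited above (octrees, kd-trees) are what make point queries on dense clouds feasible in real time: a nearest-neighbour lookup prunes whole subtrees instead of scanning every point. A minimal 3D kd-tree sketch, illustrative rather than the platform's implementation:

```python
import math

def build_kdtree(points, depth=0):
    """Recursively build a kd-tree over (x, y, z) points, cycling the split axis."""
    if not points:
        return None
    axis = depth % 3
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nearest(node, target, best=None):
    """Nearest-neighbour query; skips subtrees that cannot hold a closer point."""
    if node is None:
        return best
    if best is None or math.dist(node["point"], target) < math.dist(best, target):
        best = node["point"]
    diff = target[node["axis"]] - node["point"][node["axis"]]
    near, far = ("left", "right") if diff < 0 else ("right", "left")
    best = nearest(node[near], target, best)
    if abs(diff) < math.dist(best, target):  # search ball crosses the split plane
        best = nearest(node[far], target, best)
    return best
```

    For N points, build is O(N log² N) with this simple sort-based splitting, and each query is close to O(log N) on well-distributed data, which is what allows interactive picking in clouds of millions of points.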

  19. Detection and localization of sounds: Virtual tones and virtual reality

    NASA Astrophysics Data System (ADS)

    Zhang, Peter Xinya

    Modern physiologically based binaural models employ internal delay lines in the pathways from left and right peripheries to central processing nuclei. Various models apply the delay lines differently, and give different predictions for the detection of dichotic pitches, wherein listeners hear a virtual tone in the noise background. Two dichotic pitch stimuli (Huggins pitch and binaural coherence edge pitch) with low boundary frequencies were used to test the predictions by two different models. The results from five experiments show that the relative dichotic pitch strengths support the equalization-cancellation model and disfavor the central activity pattern (CAP) model. The CAP model makes predictions for the lateralization of Huggins pitch based on interaural time differences (ITD). By measuring human lateralization for Huggins pitches with two different types of phase boundaries (linear-phase and stepped-phase), and by comparing with lateralization of sine tones, it was shown that the lateralization of Huggins pitch stimuli is similar to that of the corresponding sine tones, and the lateralizations of Huggins pitch stimuli with the two different boundaries were even more similar to one another. The results agreed roughly with the CAP model predictions. Agreement was significantly improved by incorporating individualized scale factors and offsets into the model, and was further improved with a model including compression at large ITDs. Furthermore, ambiguous stimuli, with an interaural phase difference of 180 degrees, were consistently lateralized on the left or right based on individual asymmetries, which introduces the concept of "earedness". Interaural phase difference (IPD) and interaural time difference (ITD) are two different forms of temporal cues. With varying frequency, an auditory system based on IPD or ITD gives different quantitative predictions on lateralization. A lateralization experiment with sine tones tested whether the human auditory system is an
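    The quantitative distinction between IPD- and ITD-based predictions follows from the relation ITD = IPD / (2πf): a fixed phase difference implies a time difference that shrinks as frequency rises, so the two cue types diverge as frequency varies. A small generic illustration (not the dissertation's analysis code):

```python
import math

def itd_from_ipd(ipd_rad, freq_hz):
    """Interaural time difference (seconds) implied by an interaural phase
    difference (radians) at a given frequency: ITD = IPD / (2*pi*f)."""
    return ipd_rad / (2 * math.pi * freq_hz)

# the same 90-degree phase difference maps to different time delays:
itd_500 = itd_from_ipd(math.pi / 2, 500.0)    # 0.0005 s at 500 Hz
itd_1000 = itd_from_ipd(math.pi / 2, 1000.0)  # 0.00025 s at 1 kHz
```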

  20. Instructors' Perceptions of Three-Dimensional (3D) Virtual Worlds: Instructional Use, Implementation and Benefits for Adult Learners

    ERIC Educational Resources Information Center

    Stone, Sophia Jeffries

    2009-01-01

    The purpose of this dissertation research study was to explore instructors' perceptions of the educational application of three-dimensional (3D) virtual worlds in a variety of academic discipline areas and to assess the strengths and limitations this virtual environment presents for teaching adult learners. The guiding research question for this…

  1. Using a Quest in a 3D Virtual Environment for Student Interaction and Vocabulary Acquisition in Foreign Language Learning

    ERIC Educational Resources Information Center

    Kastoudi, Denise

    2011-01-01

    The gaming and interactional nature of the virtual environment of Second Life offers opportunities for language learning beyond the traditional pedagogy. This case study examined the potential of 3D virtual quest games to enhance vocabulary acquisition through interaction, negotiation of meaning and noticing. Four adult students of English at…

  2. Training software using virtual-reality technology and pre-calculated effective dose data.

    PubMed

    Ding, Aiping; Zhang, Di; Xu, X George

    2009-05-01

    This paper describes the development of a software package, called VR Dose Simulator, which aims to provide interactive radiation safety and ALARA training to radiation workers using virtual-reality (VR) simulations. Combined with a pre-calculated effective dose equivalent (EDE) database, a virtual radiation environment was constructed in VR authoring software, EON Studio, using 3-D models of a real nuclear power plant building. Models of avatars representing two workers were adopted, with the avatars' arms and legs controlled in the software to simulate walking and other postures. Collision detection algorithms were developed for various parts of the 3-D power plant building and the avatars to confine the avatars to certain regions of the virtual environment. Ten different camera viewpoints were assigned to conveniently cover the entire virtual scene from different viewing angles. A user can control the avatar to carry out radiological engineering tasks using two modes of avatar navigation. A user can also specify two types of radiation source: Cs and Co. The location of the avatar inside the virtual environment during the course of the avatar's movement is linked to the EDE database. The accumulated dose is calculated and displayed on the screen in real time. Based on the final accumulated dose and the completion status of all virtual tasks, a score is given to evaluate the performance of the user. The paper concludes that VR-based simulation technologies are interactive and engaging, thus potentially useful in improving the quality of radiation safety training. The paper also summarizes several challenges: more streamlined data conversion, realistic avatar movement and posture, more intuitive implementation of the data communication between EON Studio and VB.NET, and more versatile utilization of EDE data such as a source near the body, etc., all of which need to be addressed in future efforts to develop this type of software. PMID:19359853
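    The real-time dose display described above amounts to integrating a position-dependent dose rate along the avatar's path. The sketch below illustrates that accumulation; the lookup function, units, and falloff model are all illustrative stand-ins for the paper's pre-calculated EDE database, not its real interface.

```python
def accumulated_dose(path, dose_rate_at, dt_hours=0.1):
    """Accumulate dose along an avatar's path: the dose rate looked up at each
    sampled position, times the time spent there. dose_rate_at stands in for
    a pre-calculated dose database keyed by location (hypothetical)."""
    return sum(dose_rate_at(pos) * dt_hours for pos in path)

def toy_rate(pos):
    """Toy inverse-square falloff from a point source at the origin (mSv/h)."""
    x, y, z = pos
    r2 = max(x * x + y * y + z * z, 1.0)  # clamp to avoid the singularity
    return 10.0 / r2

# avatar walks away from the source; dose per step drops with distance
dose = accumulated_dose([(1, 0, 0), (2, 0, 0), (5, 0, 0)], toy_rate)
```

    Scoring the trainee then reduces to comparing this accumulated value (and task completion) against a target, which is why path choice through the virtual plant matters for ALARA training.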

  3. Fire training in a virtual-reality environment

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Jurgen; Bucken, Arno

    2005-03-01

    Although fire is very common in our daily environment - as a source of energy at home or as a tool in industry - most people cannot estimate the danger of a conflagration. Therefore it is important to train people in combating fire. Besides training with propane simulators or real fires and real extinguishers, fire training can be performed in virtual reality, a pollution-free and fast way of training. In this paper we describe how to enhance a virtual-reality environment with a real-time fire simulation and visualisation in order to establish a realistic emergency-training system. The presented approach supports extinguishing the virtual fire, including recordable performance data as needed in teletraining environments. We will show how to get realistic impressions of fire using advanced particle simulation and how to use the advantages of particles to trigger states in a modified cellular automaton used for the simulation of fire behaviour. Using particle systems that interact with cellular automata, it is possible to simulate a developing, spreading fire and its reaction to different extinguishing agents like water, CO2 or oxygen. The methods proposed in this paper have been implemented and successfully tested on Cosimir, a commercial robot- and VR-simulation system.
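    The cellular-automaton side of such a fire model can be reduced to a simple update rule: burning cells burn out, and fuel cells adjacent to fire may ignite with some probability. The sketch below shows that rule in isolation (the states, neighbourhood, and probability are illustrative, not the Cosimir implementation, which couples the automaton to the particle system and extinguishing agents):

```python
import random

EMPTY, FUEL, BURNING = 0, 1, 2

def step(grid, ignite_prob=0.6, rng=random.random):
    """One update of a minimal fire cellular automaton: burning cells burn
    out, and fuel cells with a burning 4-neighbour ignite with ignite_prob."""
    h, w = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for y in range(h):
        for x in range(w):
            if grid[y][x] == BURNING:
                new[y][x] = EMPTY  # burned out
            elif grid[y][x] == FUEL:
                neighbours = [grid[ny][nx]
                              for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                              if 0 <= ny < h and 0 <= nx < w]
                if BURNING in neighbours and rng() < ignite_prob:
                    new[y][x] = BURNING
    return new
```

    In a coupled system, extinguishing agents would act by lowering the ignition probability or forcing burning cells back to non-burning states, which is how water, CO2, or oxygen deprivation change the automaton's evolution.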

  4. Astronaut Prepares for Mission With Virtual Reality Hardware

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Astronaut John M. Grunsfeld, STS-109 payload commander, uses virtual reality hardware at Johnson Space Center to rehearse some of his duties prior to the STS-109 mission. The most familiar form of virtual reality technology is some form of headpiece, which fits over your eyes and displays a three-dimensional computerized image of another place. Turn your head left and right, and you see what would be to your sides; turn around, and you see what might be sneaking up on you. An important part of the technology is some type of data glove that you use to propel yourself through the virtual world. This technology allows NASA astronauts to practice International Space Station work missions in advance. Currently, the medical community is using the new technologies in four major ways: to see parts of the body more accurately for study, to make better diagnoses of disease, and to plan surgery in more detail; to obtain a more accurate picture of a procedure during surgery; to perform more types of surgery with the most noninvasive, accurate methods possible; and to model interactions among molecules at a molecular level.

  5. The 3D visualization technology research of submarine pipeline based Horde3D GameEngine

    NASA Astrophysics Data System (ADS)

    Yao, Guanghui; Ma, Xiushui; Chen, Genlang; Ye, Lingjian

    2013-10-01

    With the development of 3D display and virtual reality technology, their applications are becoming more and more widespread. This paper applies 3D display technology to the monitoring of submarine pipelines. We reconstruct a submarine pipeline and its surrounding submarine terrain in the computer using the Horde3D graphics rendering engine, built on the foundation database "submarine pipeline and relative landforms landscape synthesis database", so as to display a virtual-reality scene of the submarine pipeline and show the relevant data collected from monitoring it.

  6. Virtual reality applications in improving postural control and minimizing falls.

    PubMed

    Virk, Sumandeep; McConville, Kristiina M Valter

    2006-01-01

    Maintaining balance under all conditions is an absolute requirement for humans. Orientation in space and balance maintenance requires inputs from the vestibular, the visual, the proprioceptive and the somatosensory systems. All the cues coming from these systems are integrated by the central nervous system (CNS) to employ different strategies for orientation and balance. How the CNS integrates all the inputs and makes cognitive decisions about balance strategies has been an area of interest for biomedical engineers for a long time. More interesting is the fact that in the absence of one or more cues, or when the input from one of the sensors is skewed, the CNS "adapts" to the new environment and gives less weight to the conflicting inputs [1]. The focus of this paper is a review of different strategies and models put forward by researchers to explain the integration of these sensory cues. Also, the paper compares the different approaches used by young and old adults in maintaining balance. Since with age the musculoskeletal, visual and vestibular system deteriorates, the older subjects have to compensate for these impaired sensory cues for postural stability. The paper also discusses the applications of virtual reality in rehabilitation programs not only for balance in the elderly but also in occupational falls. Virtual reality has profound applications in the field of balance rehabilitation and training because of its relatively low cost. Studies will be conducted to evaluate the effectiveness of virtual reality training in modifying the head and eye movement strategies, and determine the role of these responses in the maintenance of balance. PMID:17946975

  7. Virtual Reality Intervention for Older Women with Breast Cancer

    PubMed Central

    SCHNEIDER, SUSAN M.; ELLIS, MATHEW; COOMBS, WILLIAM T.; SHONKWILER, ERIN L.; FOLSOM, LINDA C.

    2013-01-01

    This study examined the effects of a virtual reality distraction intervention on chemotherapy-related symptom distress levels in 16 women aged 50 and older. A cross-over design was used to answer the following research questions: (1) Is virtual reality an effective distraction intervention for reducing chemotherapy-related symptom distress levels in older women with breast cancer? (2) Does virtual reality have a lasting effect? Chemotherapy treatments are intensive and difficult to endure. One way to cope with chemotherapy-related symptom distress is through the use of distraction. For this study, a head-mounted display (Sony PC Glasstron PLM-S700) was used to display encompassing images and block competing stimuli during chemotherapy infusions. The Symptom Distress Scale (SDS), Revised Piper Fatigue Scale (PFS), and the State Anxiety Inventory (SAI) were used to measure symptom distress. For two matched chemotherapy treatments, one pre-test and two post-test measures were employed. Participants were randomly assigned to receive the VR distraction intervention during one chemotherapy treatment and received no distraction intervention (control condition) during an alternate chemotherapy treatment. Analysis using paired t-tests demonstrated a significant decrease in SAI scores (p = 0.10) immediately following chemotherapy treatments when participants used VR. No significant changes were found in SDS or PFS values. There was a consistent trend toward improved symptoms on all measures 48 h following completion of chemotherapy. Evaluation of the intervention indicated that the women thought the head-mounted device was easy to use, experienced no cybersickness, and 100% would use VR again. PMID:12855087

  8. Virtual Reality environment assisting post stroke hand rehabilitation: case report.

    PubMed

    Tsoupikova, Daria; Stoykov, Nikolay; Kamper, Derek; Vick, Randy

    2013-01-01

    We describe a novel art-empowered Virtual Reality (VR) system designed for hand rehabilitation therapy following stroke. The system was developed by an interdisciplinary team of engineers, art therapists, occupational therapists, and a VR artist to improve patients' motivation and engagement. We describe the system's design, development, and user testing for efficiency, subject satisfaction, and clinical feasibility. We report initial results following use of the system with the first four subjects from the ongoing clinical efficacy trials, as measured by standard clinical tests for upper extremity function. These cases demonstrate that the system is operational and can facilitate therapy for post-stroke patients with upper extremity impairment. PMID:23400202

  9. Three-Dimensional User Interfaces for Immersive Virtual Reality

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1997-01-01

    The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.

  10. Virtual reality applications to automated rendezvous and capture

    NASA Technical Reports Server (NTRS)

    Hale, Joseph; Oneil, Daniel

    1991-01-01

    Virtual Reality (VR) is a rapidly developing Human/Computer Interface (HCI) technology. The evolution of high-speed graphics processors and the development of specialized anthropomorphic user interface devices, which more fully involve the human senses, have enabled VR technology. Recently, the maturity of this technology has reached a level where it can be used as a tool in a variety of applications. This paper provides an overview of VR technology, VR activities at Marshall Space Flight Center (MSFC), and applications of VR to Automated Rendezvous and Capture (AR&C), and identifies areas of VR technology that require further development.

  11. A virtual reality browser for Space Station models

    NASA Technical Reports Server (NTRS)

    Goldsby, Michael; Pandya, Abhilash; Aldridge, Ann; Maida, James

    1993-01-01

    The Graphics Analysis Facility at NASA/JSC has created a visualization and learning tool by merging its database of detailed geometric models with a virtual reality system. The system allows an interactive walk-through of models of the Space Station and other structures, providing detailed realistic stereo images. The user can activate audio messages describing the function and connectivity of selected components within his field of view. This paper presents the issues and trade-offs involved in the implementation of the VR system and discusses its suitability for its intended purposes.

  12. One's Colonies: a virtual reality environment of oriental residences

    NASA Astrophysics Data System (ADS)

    Chi, Catherine

    2013-03-01

    This paper is a statement about my virtual reality environment project, One's Colonies, and a description of the project's creative process. I was inspired by the buildings in my hometown in Taiwan, whose architectural style is very different from that of the United States. By analyzing the unique style of dwellings in Taiwan, I want to demonstrate how differences in geography, weather and culture change the appearance of living space. Through this project I want to express the relationship between architectural style and cultural difference, and how the emotional condition and characteristics of residents are affected by their residences.

  13. Research on distributed virtual reality system in electronic commerce

    NASA Astrophysics Data System (ADS)

    Xue, Qiang; Wang, Jiening; Sun, Jizhou

    2004-03-01

    In this paper, Distributed Virtual Reality (DVR) technology applied in Electronic Commerce (EC) is discussed. DVR provides a new means for people to recognize, analyze and resolve large-scale, complex problems, which has driven its rapid development in EC fields. The technologies of CSCW (Computer Supported Cooperative Work) and middleware are introduced into the development of the EC-DVR system to meet the need for a platform that provides the necessary cooperation and communication services without repeatedly developing basic modules. Finally, the paper gives a platform structure for the EC-DVR system.

  14. Dots and dashes: art, virtual reality, and the telegraph

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia; Chang, Ben

    2009-02-01

    Dots and Dashes is a virtual reality artwork that explores online romance over the telegraph, based on Ella Cheever Thayer's novel Wired Love - a Romance in Dots and Dashes (an Old Story Told in a New Way)1. The uncanny similarities between this story and the world of today's virtual environments provides the springboard for an exploration of a wealth of anxieties and dreams, including the construction of identities in an electronically mediated environment, the shifting boundaries between the natural and machine worlds, and the spiritual dimensions of science and technology. In this paper we examine the parallels between the telegraph networks and our current conceptions of cyberspace, as well as unique social and cultural impacts specific to the telegraph. These include the new opportunities and roles available to women in the telegraph industry and the connection between the telegraph and the Spiritualist movement. We discuss the development of the artwork, its structure and aesthetics, and the technical development of the work.

  15. Virtual Reality of Sound Generated from Vibrating Structures

    NASA Astrophysics Data System (ADS)

    KIM, S. J.; SONG, J. Y.

    2002-11-01

    The advancement of virtual reality (VR) technology in cyberspace is remarkable, but its development has concentrated mainly on the visual part. In this paper, the development of VR technology to produce sound based on exact physics is studied. Our main concern is the sound generated from vibrating structures. This may be useful, for example, in apprehending the sound field characteristics of an aircraft cabin at the design stage. To calculate the sound pressure radiated from the curved surface of a structure, a new integration scheme is developed within the boundary element method. Several example problems are solved to validate the integration scheme: the pressure distributions on a uniformly driven sphere and cylinders are computed and compared with analytic solutions, and the radiation efficiency of a vibrating plate under one-dimensional flow is also calculated. In addition, to realize sound through computer simulation, two concepts, "structure-oriented analysis" and "human-oriented analysis", are proposed. Using these concepts, the virtual sound field of an aircraft cabin is created.

  16. Auditory cues increase the hippocampal response to unimodal virtual reality.

    PubMed

    Andreano, Joseph; Liang, Kevin; Kong, Lingjun; Hubbard, David; Wiederhold, Brenda K; Wiederhold, Mark D

    2009-06-01

    Previous research suggests that the effectiveness of virtual reality exposure therapy should increase as the experience becomes more immersive. However, the neural mechanisms underlying the experience of immersion are not yet well understood. To address this question, neural activity during exposure to two virtual worlds was measured by functional magnetic resonance imaging (fMRI). Two levels of immersion were used: unimodal (video only) and multimodal (video plus audio). The results indicated increased activity in both auditory and visual sensory cortices during multimodal presentation. Additionally, multimodal presentation elicited increased activity in the hippocampus, a region well known to be involved in learning and memory. The implications of this finding for exposure therapy are discussed. PMID:19500000

  17. Virtual reality robotic telesurgery simulations using MEMICA haptic system

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Mavroidis, Constantinos; Bouzit, Mourad; Dolgin, Benjamin; Harm, Deborah L.; Kopchok, George E.; White, Rodney

    2001-01-01

    The authors conceived a haptic mechanism called MEMICA (Remote Mechanical Mirroring using Controlled stiffness and Actuators) that can enable the design of high-dexterity, rapid-response, large-workspace haptic systems. The development of novel MEMICA gloves and virtual reality models is being explored to allow simulation of telesurgery and other applications. The MEMICA gloves are being designed to provide intuitive mirroring of the conditions at a virtual site where a robot simulates the presence of a human operator. The key components of MEMICA are miniature electrically controlled stiffness (ECS) elements and electrically controlled force and stiffness (ECFS) actuators that are based on the use of Electro-Rheological Fluids (ERFs). In this paper the design of the MEMICA system and initial experimental results are presented.

  18. Chavir: Virtual reality simulation for interventions in nuclear installations

    SciTech Connect

    Thevenon, J. B.; Tirel, O.; Lopez, L.; Chodorge, L.; Desbats, P.

    2006-07-01

    Companies involved in the nuclear industry have to prepare for interventions by precisely analyzing the radiological risks and rapidly evaluating the consequences of their operational choices. They also need to consolidate the experiences gained in the field with greater responsiveness and lower costs. This paper brings out the advantages of using virtual reality technology to meet the demands in the industry. The CHAVIR software allows the operators to prepare (and repeat) all the operations they would have to do in a safe virtual world, before performing the actual work inside the facilities. Since the decommissioning or maintenance work is carried out in an environment where there is radiation, the amount of radiation that the operator would be exposed to is calculated and integrated into the simulator. (authors)

  19. Lean on Wii: physical rehabilitation with virtual reality Wii peripherals.

    PubMed

    Anderson, Fraser; Annett, Michelle; Bischof, Walter F

    2010-01-01

    In recent years, a growing number of occupational therapists have integrated video game technologies, such as the Nintendo Wii, into rehabilitation programs. 'Wiihabilitation', or the use of the Wii in rehabilitation, has been successful in increasing patients' motivation and encouraging full body movement. The non-rehabilitative focus of Wii applications, however, presents a number of problems: games are too difficult for patients, they mainly target upper-body gross motor functions, and they lack support for task customization, grading, and quantitative measurements. To overcome these problems, we have designed a low-cost, virtual-reality based system. Our system, Virtual Wiihab, records performance and behavioral measurements, allows for activity customization, and uses auditory, visual, and haptic elements to provide extrinsic feedback and motivation to patients. PMID:20543303

  20. Can immersive virtual reality reduce phantom limb pain?

    PubMed

    Murray, Craig D; Patchick, Emma L; Caillette, Fabrice; Howard, Toby; Pettifer, Stephen

    2006-01-01

    This paper describes the design and implementation of a case-study-based investigation using immersive virtual reality as a treatment for phantom limb pain. The authors' work builds upon prior research which has found that the use of a mirror box (where the amputee sees a mirror image of their remaining anatomical limb in the phenomenal space of their amputated limb) can reduce phantom limb pain and restore voluntary movement to paralyzed phantom limbs for some amputees. The present project involves the transposition of movements made by amputees' anatomical limbs into movements of a virtual limb presented in the phenomenal space of the phantom limb. The three case studies presented here provide qualitative data offering tentative support for the use of this system for phantom pain relief. The authors suggest the need for further research using controlled trials. PMID:16404088

  1. Sensor Spatial Distortion, Visual Latency, and Update Rate Effects on 3D Tracking in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Ellis, S. R.; Adelstein, B. D.; Baumeler, S.; Jense, G. J.; Jacoby, R. H.; Trejo, Leonard (Technical Monitor)

    1998-01-01

    Several common defects that we have sought to minimize in immersive virtual environments are: static sensor spatial distortion, visual latency, and low update rates. Human performance within our environments during large-amplitude 3D tracking was assessed by objective and subjective methods in the presence and absence of these defects. Results show that 1) removal of our relatively small spatial sensor distortion had minor effects on the tracking activity, 2) an Adapted Cooper-Harper controllability scale proved the most sensitive subjective indicator of the degradation of dynamic fidelity caused by increasing latency and decreasing frame rates, and 3) performance, as measured by normalized RMS tracking error or subjective impressions, was more markedly influenced by changing visual latency than by update rate.
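    The objective measure named in record 1, normalized RMS tracking error, can be sketched as follows. This is a minimal illustration, not the paper's exact definition: normalizing by the RMS amplitude of the target trajectory is an assumption made here for the sake of a concrete example.

```python
import math

# Hedged sketch of a normalized RMS tracking error: the RMS deviation of the
# tracked response from the target path, divided by the target's own RMS
# amplitude (the choice of normalizer is an assumption, not from the paper).
def normalized_rms_error(target, response):
    """target, response: equal-length sequences of 1D samples."""
    n = len(target)
    # Mean squared deviation between target and response samples.
    mse = sum((t - r) ** 2 for t, r in zip(target, response)) / n
    # RMS amplitude of the target path, used to normalize the error.
    target_rms = math.sqrt(sum(t ** 2 for t in target) / n)
    return math.sqrt(mse) / target_rms
```

With this normalization, an error of 1.0 means the tracking deviations are as large as the target motion itself, which makes scores comparable across tracking tasks of different amplitudes.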

  2. First Person Experience of Body Transfer in Virtual Reality

    PubMed Central

    Slater, Mel; Spanlang, Bernhard; Sanchez-Vives, Maria V.; Blanke, Olaf

    2010-01-01

    Background Altering the normal association between touch and its visual correlate can result in the illusory perception of a fake limb as part of our own body. Thus, when touch is seen to be applied to a rubber hand while felt synchronously on the corresponding hidden real hand, an illusion of ownership of the rubber hand usually occurs. The illusion has also been demonstrated using visuomotor correlation between the movements of the hidden real hand and the seen fake hand. This type of paradigm has been used with respect to the whole body generating out-of-the-body and body substitution illusions. However, such studies have only ever manipulated a single factor and although they used a form of virtual reality have not exploited the power of immersive virtual reality (IVR) to produce radical transformations in body ownership. Principal Findings Here we show that a first person perspective of a life-sized virtual human female body that appears to substitute the male subjects' own bodies was sufficient to generate a body transfer illusion. This was demonstrated subjectively by questionnaire and physiologically through heart-rate deceleration in response to a threat to the virtual body. This finding is in contrast to earlier experimental studies that assume visuotactile synchrony to be the critical contributory factor in ownership illusions. Our finding was possible because IVR allowed us to use a novel experimental design for this type of problem with three independent binary factors: (i) perspective position (first or third), (ii) synchronous or asynchronous mirror reflections and (iii) synchrony or asynchrony between felt and seen touch. Conclusions The results support the notion that bottom-up perceptual mechanisms can temporarily override top down knowledge resulting in a radical illusion of transfer of body ownership. 
The research also illustrates immersive virtual reality as a powerful tool in the study of body representation and experience, since it supports

  3. Combinatorial Pharmacophore-Based 3D-QSAR Analysis and Virtual Screening of FGFR1 Inhibitors

    PubMed Central

    Zhou, Nannan; Xu, Yuan; Liu, Xian; Wang, Yulan; Peng, Jianlong; Luo, Xiaomin; Zheng, Mingyue; Chen, Kaixian; Jiang, Hualiang

    2015-01-01

    The fibroblast growth factor/fibroblast growth factor receptor (FGF/FGFR) signaling pathway plays crucial roles in cell proliferation, angiogenesis, migration, and survival. Aberration in FGFRs correlates with several malignancies and disorders. FGFRs have proved to be attractive targets for therapeutic intervention in cancer, and there is high interest in finding FGFR inhibitors with novel scaffolds. In this study, a combinatorial three-dimensional quantitative structure-activity relationship (3D-QSAR) model was developed based on previously reported FGFR1 inhibitors with diverse structural skeletons. The model was evaluated for its prediction performance on a diverse test set containing 232 FGFR inhibitors, and it yielded an SD value of 0.75 pIC50 units against measured inhibition affinities and a Pearson's correlation coefficient R2 of 0.53. This result suggests that the combinatorial 3D-QSAR model could be used to search for new FGFR1 hit structures and predict their potential activity. To further evaluate the performance of the model, a decoy set validation was used to measure the efficiency of the model by calculating the enrichment factor (EF). Based on the combinatorial pharmacophore model, a virtual screening against the SPECS database was performed. Nineteen novel active compounds were successfully identified, which provide new chemical starting points for further structural optimization of FGFR1 inhibitors. PMID:26110383
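    The enrichment factor (EF) used in record 3's decoy set validation is conventionally the hit rate among the top-ranked fraction of a screened library divided by the hit rate over the whole library. A minimal sketch of that conventional definition follows; the paper's exact top fraction and ranking protocol are not given here, so those are assumptions.

```python
# Hedged sketch of the conventional enrichment factor used in virtual
# screening: how much better the top of the ranked list recovers actives
# than random selection would.
def enrichment_factor(scores, labels, fraction=0.01):
    """scores: predicted activity (higher = more active); labels: 1 active, 0 decoy."""
    # Rank the library from best to worst predicted score.
    ranked = sorted(zip(scores, labels), key=lambda pair: pair[0], reverse=True)
    n_top = max(1, round(len(ranked) * fraction))
    # Hit rate within the selected top fraction.
    hits_top = sum(label for _, label in ranked[:n_top])
    hit_rate_top = hits_top / n_top
    # Baseline hit rate of the whole library.
    hit_rate_all = sum(labels) / len(labels)
    return hit_rate_top / hit_rate_all
```

An EF of 1.0 means the model does no better than random selection; values well above 1.0 in the top 1-2% of the ranked library are what decoy set validations of pharmacophore models typically look for.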

  4. Finite element visualization in the cave virtual reality environment

    SciTech Connect

    Plaskacz, E.J.; Kuhn, M.A.

    1996-03-01

    Through the use of the post-processing software, Virtual Reality visualization (VRviz), and the Cave Automatic Virtual Environment (CAVE), finite element representations can be viewed as they would be in real life. VRviz is a program written in ANSI C to translate the mathematical results generated by finite element analysis programs into a virtual representation. This virtual representation is projected into the CAVE environment and the results are animated. The animation is fully controllable. A user is able to translate the image, rotate about any axis and scale the image at any time. The user is also able to freeze the animation at any time step and control the image update rate. This allows the user to navigate around, or even inside, the image in order to effectively analyze possible failure points and redesign as necessary. Through the use of the CAVE and the real life image that is being produced by VRviz, engineers are able to save considerable time, money, and effort in the design process.

  5. Virtual Reality: Developing a VR space for Academic activities

    NASA Astrophysics Data System (ADS)

    Kaimaris, D.; Stylianidis, E.; Karanikolas, N.

    2014-05-01

    Virtual reality (VR) is extensively used in various applications in industry, academia and business, and is becoming more and more affordable for end users from a financial point of view. At the same time, more and more applications are developed in academia and higher education, for example in medicine and engineering, and students expect to be well prepared for their professional lives after their educational life cycle. Moreover, VR offers the possibility not only to improve skills but also to understand space. This paper presents the methodology used during a course, namely "Geoinformatics applications" at the School of Spatial Planning and Development (Eng.), Aristotle University of Thessaloniki, to create a virtual School space. The course design focuses on the methods and techniques used to develop the virtual environment. In addition, the project aspires to become increasingly useful for the students and to provide a realistic virtual environment with information of interest not only to the students but also to any citizen interested in the academic life of the School.

  6. Cue reactivity in virtual reality: the role of context.

    PubMed

    Paris, Megan M; Carter, Brian L; Traylor, Amy C; Bordnick, Patrick S; Day, Susan X; Armsworth, Mary W; Cinciripini, Paul M

    2011-07-01

    Cigarette smokers in laboratory experiments readily respond to smoking stimuli with increased craving. As an alternative to traditional cue-reactivity methods (e.g., exposure to cigarette photos), virtual reality (VR) has been shown to be a viable cue presentation method for eliciting and assessing cigarette craving within complex virtual environments. However, it remains poorly understood whether contextual cues from the environment contribute to craving increases in addition to specific cues, such as cigarettes. This study examined the role of contextual cues in a VR environment in evoking craving. Smokers were exposed to a virtual convenience store devoid of any specific cigarette cues, followed by exposure to the same convenience store with specific cigarette cues added. Smokers reported increased craving following exposure to the virtual convenience store without specific cues, and significantly greater craving following the convenience store with cigarette cues added. However, the increased craving recorded after the second convenience store may have been due to pre-exposure to the first. This study offers evidence that an environmental context where cigarette cues are normally present (but are not) elicits significant craving in the absence of specific cigarette cues. This finding suggests that VR may have stronger ecological validity than traditional cue-reactivity exposure methods, by exposing smokers to the full range of cigarette-related environmental stimuli, in addition to specific cigarette cues, that they typically experience in their daily lives. PMID:21349649

  7. 3D modeling of the Strasbourg's Cathedral basements for interdisciplinary research and virtual visits

    NASA Astrophysics Data System (ADS)

    Landes, T.; Kuhnle, G.; Bruna, R.

    2015-08-01

    On the occasion of the millennium celebration of Strasbourg Cathedral, a transdisciplinary research group composed of archaeologists, surveyors, architects, art historians and a stonemason revisited the 1966-1972 excavations under the St. Lawrence Chapel of the Cathedral, which contain remains of Roman and medieval masonry. The 3D modeling of the Chapel was realized by combining conventional surveying techniques for network creation, laser scanning for model creation, and photogrammetric techniques for texturing a few parts. According to the requirements and the end-user of the model, the level of detail and the level of accuracy were adapted and assessed for every floor. The basement was acquired and modeled with more detail and higher accuracy than the other parts. Thanks to this modeling work, archaeologists can confront their assumptions with those of other disciplines by simulating constructions of other worship edifices on the massive stones composing the basement. The virtual reconstructions provided evidence in support of these assumptions and served for communication via virtual visits.

  8. Effect of viewing mode on pathfinding in immersive Virtual Reality.

    PubMed

    White, Paul J; Byagowi, Ahmad; Moussavi, Zahra

    2015-08-01

    The use of Head-Mounted Displays (HMDs) to view Virtual Reality Environments (VREs) has received much attention recently. This paper reports on the difference between humans' navigation in a VRE viewed through an HMD and their navigation in the same VRE viewed on a laptop PC display. A novel Virtual Reality (VR) navigation input device (VRNChair), designed by our team, was paired with an Oculus Rift DK2 HMD. People used the VRNChair to navigate a VRE, and we analyzed their navigational trajectories with and without the HMD to investigate plausible differences in performance due to the display device. We found that people's navigational trajectories were more accurate while wearing the HMD than while viewing an LCD monitor; however, the time to complete a navigation task remained the same. This implies that increased immersion in VR results in an improvement in pathfinding. In addition, the motion sickness caused by using an HMD can be reduced by using an input device such as our VRNChair: paired with an HMD, the VRNChair provides vestibular stimulation as one moves in the VRE, because movements in the VRE are synchronized with movements in the real environment. PMID:26737323

  9. Validation of a Novel Virtual Reality Simulator for Robotic Surgery

    PubMed Central

    Schreuder, Henk W. R.; Persson, Jan E. U.; Wolswijk, Richard G. H.; Ihse, Ingmar; Schijven, Marlies P.; Verheijen, René H. M.

    2014-01-01

    Objective. With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for the use in training of robot-assisted surgery. Methods. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Results. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were “time to complete” and “economy of motion” (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialist starting with robotic surgery. Conclusions. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery. PMID:24600328

  10. Low-Cost, Portable, Multi-Wall Virtual Reality

    NASA Technical Reports Server (NTRS)

    Miller, Samuel A.; Misch, Noah J.; Dalton, Aaron J.

    2005-01-01

    Virtual reality systems make compelling outreach displays, but some such systems, like the CAVE, have design features that make their use for that purpose inconvenient. In the case of the CAVE, the equipment is difficult to disassemble, transport, and reassemble, and typically CAVEs can only be afforded by large-budget research facilities. We implemented a system like the CAVE that costs less than $30,000, weighs about 500 pounds, and fits into a fifteen-passenger van. A team of six people have unpacked, assembled, and calibrated the system in less than two hours. This cost reduction versus similar virtual-reality systems stems from the unique approach we took to stereoscopic projection. We used an assembly of optical chopper wheels and commodity LCD projectors to create true active stereo at less than a fifth of the cost of comparable active-stereo technologies. The screen and frame design also optimized portability; the frame assembles in minutes with only two fasteners, and both it and the screen pack into small bundles for easy and secure shipment.

  11. Game controller modification for fMRI hyperscanning experiments in a cooperative virtual reality environment.

    PubMed

    Trees, Jason; Snider, Joseph; Falahpour, Maryam; Guo, Nick; Lu, Kun; Johnson, Douglas C; Poizner, Howard; Liu, Thomas T

    2014-01-01

    Hyperscanning, an emerging technique in which data from multiple interacting subjects' brains are simultaneously recorded, has become an increasingly popular way to address complex topics, such as "theory of mind." However, most previous fMRI hyperscanning experiments have been limited to abstract social interactions (e.g. phone conversations). Our new method utilizes a virtual reality (VR) environment used for military training, Virtual Battlespace 2 (VBS2), to create realistic avatar-avatar interactions and cooperative tasks. To control the virtual avatar, subjects use an MRI-compatible Playstation 3 game controller, modified by removing all extraneous metal components and replacing any necessary ones with 3D printed plastic models. Control of both scanners' operation is initiated by a VBS2 plugin to sync scanner time to the known time within the VR environment. Our modifications include:
    • Modification of the game controller to be MRI compatible.
    • Design of the VBS2 virtual environment for cooperative interactions.
    • Syncing two MRI machines for simultaneous recording.
    PMID:26150964

  13. Virtual Superheroes: Using Superpowers in Virtual Reality to Encourage Prosocial Behavior

    PubMed Central

    Rosenberg, Robin S.; Baughman, Shawnee L.; Bailenson, Jeremy N.

    2013-01-01

    Background Recent studies have shown that playing prosocial video games leads to greater subsequent prosocial behavior in the real world. However, immersive virtual reality allows people to occupy avatars that are different from them in a perceptually realistic manner. We examine how occupying an avatar with the superhero ability to fly increases helping behavior. Principal Findings Using a two-by-two design, participants were either given the power of flight (their arm movements were tracked to control their flight akin to Superman’s flying ability) or rode as a passenger in a helicopter, and were assigned one of two tasks, either to help find a missing diabetic child in need of insulin or to tour a virtual city. Participants in the “super-flight” conditions helped the experimenter pick up spilled pens after their virtual experience significantly more than those who were virtual passengers in a helicopter. Conclusion The results indicate that having the “superpower” of flight leads to greater helping behavior in the real world, regardless of how participants used that power. A possible mechanism for this result is that having the power of flight primed concepts and prototypes associated with superheroes (e.g., Superman). This research illustrates the potential of using experiences in virtual reality technology to increase prosocial behavior in the physical world. PMID:23383029

  14. VEVI: A Virtual Reality Tool For Robotic Planetary Explorations

    NASA Technical Reports Server (NTRS)

    Piguet, Laurent; Fong, Terry; Hine, Butler; Hontalas, Phil; Nygren, Erik

    1994-01-01

    The Virtual Environment Vehicle Interface (VEVI), developed by the NASA Ames Research Center's Intelligent Mechanisms Group, is a modular operator interface for direct teleoperation and supervisory control of robotic vehicles. Virtual environments enable the efficient display and visualization of complex data. This characteristic allows operators to perceive and control complex systems in a natural fashion, utilizing the highly-evolved human sensory system. VEVI utilizes real-time, interactive, 3D graphics and position/orientation sensors to produce a range of interface modalities, from flat-panel (windowed or stereoscopic) screen displays to head-mounted/head-tracking stereo displays. The interface provides generic video control capability and has been used to control wheeled, legged, air bearing, and underwater vehicles in a variety of different environments. VEVI was designed and implemented to be modular, distributed and easily operated through long-distance communication links, using a communication paradigm called SYNERGY.

  15. Virtual reality and claustrophobia: multiple components therapy involving game editor virtual environments exposure.

    PubMed

    Malbos, E; Mestre, D R; Note, I D; Gellato, C

    2008-12-01

    The effectiveness of a multiple-components therapy for claustrophobia involving virtual reality (VR) was demonstrated through a trial that immersed six claustrophobic patients in multiple context-graded enclosed virtual environments (VE) using affordable VR apparatus and software. The results of the questionnaires and behavior tests showed a significant reduction in fear of enclosed spaces and an improvement in quality of life. These gains were maintained at 6-month follow-up. Presence scores indicated the patients felt immersed and present inside the game-editor VE. PMID:18954278

  16. 3D-ANTLERS: Virtual Reconstruction and Three-Dimensional Measurement

    NASA Astrophysics Data System (ADS)

    Barba, S.; Fiorillo, F.; De Feo, E.

    2013-02-01

    In the ARTEC digital mock-up, for example, individual frames can be selected, already polygonal and geo-referenced at the time of capture; however, automated texturization is not possible, unlike in the low-cost environment, which allows a good graphic definition to be produced. Once the final 3D models were obtained, we proceeded to a geometric and graphic comparison of the results. In order to provide an accuracy requirement and an assessment for the 3D reconstruction, we took into account the following benchmarks: cost, captured points, noise (local and global), shadows and holes, operability, degree of definition, quality and accuracy. Following these empirical studies of the virtual reconstructions, a 3D documentation was codified with a procedural method endorsing the use of terrestrial sensors for the documentation of antlers. The results thus obtained were compared with the standards set by the current provisions (see "Manual de medición" of the Government of Andalusia, Spain); to date, in fact, identification is based on data such as length, volume, colour, texture, openness, tips, structure, etc. Such data, currently captured only with traditional instruments such as a tape measure, would be well represented by a process of virtual reconstruction and cataloguing.

  17. The development, assessment and validation of virtual reality for human anatomy instruction

    NASA Technical Reports Server (NTRS)

    Marshall, Karen Benn

    1996-01-01

    This research project seeks to meet the objective of science training by developing, assessing, validating and utilizing VR as a human anatomy training medium. Current anatomy instruction is primarily in the form of lectures and usage of textbooks. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the flat depictions found in textbooks and on conventional computer screens. Virtual reality allows one to step through the computer screen into a 3-D artificial world. The primary objective of this project is to produce a virtual reality application of the abdominopelvic region of a human cadaver that can be taken back to the classroom. The hypothesis is that an immersive learning environment affords quicker anatomic recognition and orientation and a greater level of retention in human anatomy instruction. The goal is to augment, not replace, traditional modes of instruction.

  18. Virtual reality applied to hepatic surgery simulation: the next revolution.

    PubMed Central

    Marescaux, J; Clément, J M; Tassetti, V; Koehl, C; Cotin, S; Russier, Y; Mutter, D; Delingette, H; Ayache, N

    1998-01-01

    OBJECTIVE: This article describes a preliminary work on virtual reality applied to liver surgery and discusses the repercussions of assisted surgical strategy and surgical simulation on tomorrow's surgery. SUMMARY BACKGROUND DATA: Liver surgery is considered difficult because of the complexity and variability of the organ. Common generic tools for presurgical medical image visualization do not fulfill the requirements for the liver, restricting comprehension of a patient's specific liver anatomy. METHODS: Using data from the National Library of Medicine, a realistic three-dimensional image was created, including the envelope and the four internal arborescences. A computer interface was developed to manipulate the organ and to define surgical resection planes according to internal anatomy. The first step of surgical simulation was implemented, providing the organ with real-time deformation computation. RESULTS: The three-dimensional anatomy of the liver could be clearly visualized. The virtual organ could be manipulated and a resection defined depending on the anatomic relations between the arborescences, the tumor, and the external envelope. The resulting parts could also be visualized and manipulated. The simulation allowed the deformation of a liver model in real time by means of a realistic laparoscopic tool. CONCLUSIONS: Three-dimensional visualization of the organ in relation to the pathology is of great help to appreciate the complex anatomy of the liver. Using virtual reality concepts (navigation, interaction, and immersion), surgical planning, training, and teaching for this complex surgical procedure may be possible. The ability to practice a given gesture repeatedly will revolutionize surgical training, and the combination of surgical planning and simulation will improve the efficiency of intervention, leading to optimal care delivery. PMID:9833800
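Real-time deformation of the kind this simulator provides is commonly obtained with mass-spring-style models solved by iterative constraint relaxation. The sketch below is a generic stand-in for that idea, not the authors' actual method:

```python
# Generic constraint-relaxation sketch for cheap real-time deformation
# (a stand-in for the mass-spring models used in early surgery simulators;
# not the method of Marescaux et al.).

def relax_spring_chain(positions, rest_len, iterations=50):
    """Iteratively nudge a 2D chain of nodes toward a uniform rest length."""
    pts = [list(p) for p in positions]
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            dx = pts[i + 1][0] - pts[i][0]
            dy = pts[i + 1][1] - pts[i][1]
            d = (dx * dx + dy * dy) ** 0.5 or 1e-9   # avoid divide-by-zero
            corr = 0.5 * (d - rest_len) / d           # split correction between nodes
            pts[i][0] += corr * dx; pts[i][1] += corr * dy
            pts[i + 1][0] -= corr * dx; pts[i + 1][1] -= corr * dy
    return pts

# A stretched two-node "tissue" segment relaxes back to its rest length:
pts = relax_spring_chain([(0.0, 0.0), (2.0, 0.0)], rest_len=1.0)
gap = ((pts[1][0] - pts[0][0]) ** 2 + (pts[1][1] - pts[0][1]) ** 2) ** 0.5
```

In a real simulator the same relaxation step runs over a 3D mesh every frame, which is what makes interaction with a laparoscopic tool feel continuous.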

  19. From Vesalius to virtual reality: How embodied cognition facilitates the visualization of anatomy

    NASA Astrophysics Data System (ADS)

    Jang, Susan

    This study examines the facilitative effects of embodiment of a complex internal anatomical structure through three-dimensional ("3-D") interactivity in a virtual reality ("VR") program. Since Shepard and Metzler's influential 1971 study, it has been known that 3-D objects (e.g., multiple-armed cube or external body parts) are visually and motorically embodied in our minds. For example, people take longer to rotate mentally an image of their hand not only when there is a greater degree of rotation, but also when the images are presented in a manner incompatible with their natural body movement (Parsons, 1987a, 1994; Cooper & Shepard, 1975; Sekiyama, 1983). Such findings confirm the notion that our mental images and rotations of those images are in fact confined by the laws of physics and biomechanics, because we perceive, think and reason in an embodied fashion. With the advancement of new technologies, virtual reality programs for medical education now enable users to interact directly in a 3-D environment with internal anatomical structures. Given that such structures are not readily viewable to users and thus not previously susceptible to embodiment, coupled with the VR environment also affording all possible degrees of rotation, how people learn from these programs raises new questions. If we embody external anatomical parts we can see, such as our hands and feet, can we embody internal anatomical parts we cannot see? Does manipulating the anatomical part in virtual space facilitate the user's embodiment of that structure and therefore the ability to visualize the structure mentally? Medical students grouped in yoked-pairs were tasked with mastering the spatial configuration of an internal anatomical structure; only one group was allowed to manipulate the images of this anatomical structure in a 3-D VR environment, whereas the other group could only view the manipulation. The manipulation group outperformed the visual group, suggesting that the interactivity

  20. Developing a Novel Measure of Body Satisfaction Using Virtual Reality

    PubMed Central

    Purvis, Clare K.; Jones, Megan; Bailey, Jakki O.; Bailenson, Jeremy; Taylor, C. Barr

    2015-01-01

    Body image disturbance (BID), considered a key feature in eating disorders, is a pervasive issue among young women. Accurate assessment of BID is critical, but the field is currently limited to self-report assessment methods. In the present study, we build upon existing research, and explore the utility of virtual reality (VR) to elicit and detect changes in BID across various immersive virtual environments. College-aged women with elevated weight and shape concerns (n = 38) and a non-weight and shape concerned control group (n = 40) were randomly exposed to four distinct virtual environments with high or low levels of body salience and social presence (i.e., presence of virtual others). Participants interacted with avatars of thin, normal weight, and overweight body size (BMI of approximately 18, 22, and 27 respectively) in virtual social settings (i.e., beach, party). We measured state-level body satisfaction (state BD) immediately after exposure to each environment. In addition, we measured participants’ minimum interpersonal distance, visual attention, and approach preference toward avatars of each size. Women with higher baseline BID reported significantly higher state BD in all settings compared to controls. Both groups reported significantly higher state BD in a beach with avatars as compared to other environments. In addition, women with elevated BID approached closer to normal weight avatars and looked longer at thin avatars compared to women in the control group. Our findings indicate that VR may serve as a novel tool for measuring state-level BID, with applications for measuring treatment outcomes. Implications for future research and clinical interventions are discussed. PMID:26469860
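The minimum interpersonal distance measure reported above reduces, in its simplest form, to the closest approach between the participant's tracked positions and an avatar. A minimal sketch (the coordinates are invented for illustration):

```python
import math

# Minimal sketch of a minimum-interpersonal-distance measure: the closest
# approach between a participant's tracked path and an avatar's position.
# Coordinates are invented, not data from the study.

def min_interpersonal_distance(participant_path, avatar_pos):
    """Smallest Euclidean distance from any tracked position to the avatar."""
    return min(math.dist(p, avatar_pos) for p in participant_path)

path = [(0.0, 0.0), (1.0, 0.0), (1.5, 0.5)]   # participant positions (meters)
closest = min_interpersonal_distance(path, (2.0, 0.5))
```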

  2. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various visualization and interaction modes.

  3. The Impact of Virtual Reality Programs in Career and Technical Education

    ERIC Educational Resources Information Center

    Catterson, Anna J.

    2013-01-01

    Instructional technology has evolved from blackboards with chalk to in some cases three-dimensional virtual reality environments in which students are interacting and engaging with other students worldwide. The use of this new instructional methodology, known as "virtual reality," has experienced substantial growth in higher education…

  4. Virtual Reality as Treatment for Fear of Flying: A Review of Recent Research

    ERIC Educational Resources Information Center

    Price, Matthew; Anderson, Page; Rothbaum, Barbara O.

    2008-01-01

    Virtual reality exposure has recently emerged as an important tool for exposure therapy in the treatment of fear of flying. There have been numerous empirical studies that have evaluated the effectiveness of virtual reality exposure as compared to other treatments including in vivo exposure, progressive muscle relaxation, cognitive therapy,…

  5. Psychology Student Opinion of Virtual Reality as a Tool to Educate about Schizophrenia

    ERIC Educational Resources Information Center

    Tichon, Jennifer; Loh, Jennifer; King, Robert

    2004-01-01

    Virtual Reality (VR) techniques are increasingly being used in e-health education, training and in trial clinical programs in the treatment of certain types of mental illness. Undergraduate psychology student opinion of the use of Virtual Reality (VR) to teach them about schizophrenia at the University of Queensland, was determined with reference…

  6. Exploring "Magic Cottage": A Virtual Reality Environment for Stimulating Children's Imaginative Writing

    ERIC Educational Resources Information Center

    Patera, Marianne; Draper, Steve; Naef, Martin

    2008-01-01

    This paper presents an exploratory study that created a virtual reality environment (VRE) to stimulate motivation and creativity in imaginative writing at primary school level. The main aim of the study was to investigate if an interactive, semi-immersive virtual reality world could increase motivation and stimulate pupils' imagination in the…

  7. Assessing Learning in VR: Towards Developing a Paradigm. Virtual Reality Roving Vehicles (VRRV) Project.

    ERIC Educational Resources Information Center

    Rose, Howard

    Preliminary research on virtual reality (VR) suggests that this technology could be a powerful tool for education based on its immersive and dynamic attributes. The Virtual Reality Roving Vehicles (VRRV) Project at the University of Washington is exploring these possibilities by taking VR equipment into elementary and secondary schools for…

  8. Virtual-reality-based educational laboratories in fiber optic engineering

    NASA Astrophysics Data System (ADS)

    Hayes, Dana; Turczynski, Craig; Rice, Jonny; Kozhevnikov, Michael

    2014-07-01

    Researchers and educators have observed great potential in virtual reality (VR) technology as an educational tool due to its ability to engage and spark interest in students, thus providing them with a deeper form of knowledge about a subject. The focus of this project is to develop an interactive VR educational module, Laser Diode Characteristics and Coupling to Fibers, to integrate into a fiber optics laboratory course. The developed module features a virtual laboratory populated with realistic models of optical devices in which students can set up and perform an optical experiment dealing with laser diode characteristics and fiber coupling. The module contains three increasingly complex levels for students to navigate, with a short built-in quiz after each level to measure the student's understanding of the subject. Seventeen undergraduate students learned fiber coupling concepts using the designed computer simulation in a non-immersive desktop virtual environment (VE) condition. The analysis of students' responses on the pre- and post-tests shows a statistically significant improvement of post-test scores compared to pre-test scores. In addition, the students' survey responses suggest that they found the module very useful and engaging. The study demonstrated the feasibility of the proposed instructional technology for engineering education, in which the model of instruction and the enabling technology are equally important, for providing a better learning environment and improving students' conceptual understanding compared to other instructional approaches.
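For the fiber-coupling content such a module teaches, a standard textbook quantity is the coupling efficiency between two perfectly aligned Gaussian modes with mismatched waist radii. A small sketch of that formula (illustrative of the concepts only, not code from the module):

```python
# Coupling efficiency between two aligned Gaussian modes of waist radii
# w1 and w2 (standard mode-overlap result). Illustrative of the fiber-
# coupling concepts in the module, not code from the module itself.

def gaussian_mode_coupling(w1, w2):
    """eta = (2*w1*w2 / (w1**2 + w2**2))**2 for perfectly aligned beams."""
    return (2.0 * w1 * w2 / (w1 ** 2 + w2 ** 2)) ** 2

matched = gaussian_mode_coupling(5e-6, 5e-6)      # identical waists couple fully
mismatched = gaussian_mode_coupling(5e-6, 10e-6)  # 2:1 waist mismatch loses light
```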

  9. Heart rate variability (HRV) during virtual reality immersion

    PubMed Central

    Malińska, Marzena; Zużewicz, Krystyna; Bugajska, Joanna; Grabowski, Andrzej

    2015-01-01

    The goal of the study was to assess the effect of hour-long training in handling a virtual environment (sVR) and of watching a stereoscopic 3D movie on the mechanisms of autonomic heart rate (HR) regulation among subjects not predisposed to motion sickness. In order to exclude predispositions to motion sickness, all the participants (n=19) underwent a Coriolis test. During exposure to 3D and sVR, the ECG signal was continuously recorded using the Holter method. For the twelve consecutive 5-min epochs of the ECG signal, an analysis of heart rate variability (HRV) in the time and frequency domains was conducted. After 30 min from the beginning of the training in handling the virtual workstation, a significant increase in LF spectral power was noted. The values of the sympathovagal LF/HF index during sVR indicated a significant increase in sympathetic predominance in four time intervals, namely between the 5th and the 10th minute, between the 15th and the 20th minute, between the 35th and the 40th minute, and between the 55th and the 60th minute of exposure. PMID:26327262
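The LF/HF index used above is computed from the RR-interval series. A rough numpy-only sketch of one common recipe (the authors' exact preprocessing and spectral estimator are not specified here):

```python
import numpy as np

# Rough sketch of an LF/HF sympathovagal index from RR intervals:
# resample the RR series evenly, take a periodogram, and integrate the
# LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands. This is one common recipe,
# not necessarily the estimator used in the study.

def lf_hf_ratio(rr_s, fs=4.0):
    t = np.cumsum(rr_s)                       # beat times (s)
    grid = np.arange(t[0], t[-1], 1.0 / fs)   # evenly spaced sample times
    rr_even = np.interp(grid, t, rr_s)        # resampled RR tachogram
    rr_even = rr_even - rr_even.mean()
    psd = np.abs(np.fft.rfft(rr_even)) ** 2   # simple periodogram
    freqs = np.fft.rfftfreq(rr_even.size, d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf / (hf + 1e-12)

# A ~5-min RR series whose heart period oscillates at 0.1 Hz (inside the
# LF band) should yield a ratio well above 1:
beats = np.arange(375)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * beats * 0.8)
ratio = lf_hf_ratio(rr)
```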

  11. A preliminary study in using virtual reality to train dental students.

    PubMed

    LeBlanc, Vicki R; Urbankova, Alice; Hadavi, Farhad; Lichtenthal, Richard M

    2004-03-01

    This study compared virtual reality simulator-enhanced training with laboratory-only practice on the development of dental technical skills. Sixty-eight students were randomly assigned to practice their skills in either a traditional preclinical dentistry laboratory or in combination with a virtual reality simulator. The results indicate that students who trained with the virtual reality simulator between six and ten hours improved significantly more than did the students in the control group from the first examination of the year to the final examination of the year. These results indicate that the use of virtual reality simulators holds promise for the training of future dentists. Additional research is necessary to determine the ideal implementation of virtual reality simulators into traditional dentistry curricula. PMID:15038639

  12. NanTroSEIZE in 3-D: Creating a Virtual Research Experience in Undergraduate Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Reed, D. L.; Bangs, N. L.; Moore, G. F.; Tobin, H.

    2009-12-01

    Marine research programs, both large and small, have increasingly added a web-based component to facilitate outreach to K-12 and the public, in general. These efforts have included, among other activities, information-rich websites, ship-to-shore communication with scientists during expeditions, blogs at sea, clips on YouTube, and information about daily shipboard activities. Our objective was to leverage a portion of the vast collection of data acquired through the NSF-MARGINS program to create a learning tool with a long lifespan for use in undergraduate geoscience courses. We have developed a web-based virtual expedition, NanTroSEIZE in 3-D, based on a seismic survey associated with the NanTroSEIZE program of NSF-MARGINS and IODP to study the properties of the plate boundary fault system in the upper limit of the seismogenic zone off Japan. The virtual voyage can be used in undergraduate classes at any time, since it is not directly tied to the finite duration of a specific seagoing project. The website combines text, graphics, audio and video to place learning in an experiential framework as students participate on the expedition and carry out research. Students learn about the scientific background of the program, especially the critical role of international collaboration, and meet the chief scientists before joining the sea-going expedition. Students are presented with the principles of 3-D seismic imaging, data processing and interpretation while mapping and identifying the active faults that were the likely sources of devastating earthquakes and tsunamis in Japan in 1944 and 1946. They also learn about IODP drilling that began in 2007 and will extend through much of the next decade. The website is being tested in undergraduate classes in fall 2009 and will be distributed through the NSF-MARGINS website (http://www.nsf-margins.org/) and the MARGINS Mini-lesson section of the Science Education Resource Center (SERC) (http

  13. Brain-computer interface using P300 and virtual reality: a gaming approach for treating ADHD.

    PubMed

    Rohani, Darius Adam; Sorensen, Helge B D; Puthusserypady, Sadasivan

    2014-01-01

    This paper presents a novel brain-computer interface (BCI) system aiming at the rehabilitation of attention-deficit/hyperactivity disorder in children. It uses the P300 potential in a series of feedback games to improve the subjects' attention. We applied a support vector machine (SVM) using temporal and template-based features to detect these P300 responses. In an experimental setup using five subjects, an average error below 30% was achieved. To make it more challenging, the BCI system has been embedded inside an immersive 3D virtual reality (VR) classroom with simulated distractions, created by combining a low-cost infrared camera and an "off-axis perspective projection" algorithm. The system is suited to children, operating with only four electrodes and a non-intrusive VR setting. Given the promising results and the simplicity of the scheme, we hope to encourage future studies to adapt the techniques presented here. PMID:25570771
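One of the feature types mentioned above, template matching, can be sketched as a normalized correlation between an EEG epoch and an average P300 template. The example uses synthetic data and illustrates only the feature, not the authors' SVM classifier or preprocessing:

```python
import numpy as np

# Template-based feature sketch: normalized correlation between an EEG
# epoch and an average P300 template. Synthetic data; illustrates the
# feature only, not the authors' full pipeline.

def template_feature(epoch, template):
    """Zero-mean normalized correlation of an epoch with the template."""
    e = epoch - epoch.mean()
    t = template - template.mean()
    return float(e @ t / (np.linalg.norm(e) * np.linalg.norm(t) + 1e-12))

rng = np.random.default_rng(0)
template = np.exp(-0.5 * ((np.arange(100) - 60) / 10.0) ** 2)  # P300-like bump
target = template + 0.2 * rng.standard_normal(100)     # epoch containing a P300
nontarget = 0.2 * rng.standard_normal(100)             # noise-only epoch
score_target = template_feature(target, template)
score_nontarget = template_feature(nontarget, template)
```

In a full system, features like these scores would be fed to the SVM rather than thresholded directly.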

  14. Analyzing industrial furnace efficiency using comparative visualization in a virtual reality environment.

    SciTech Connect

    Freitag, L.; Urness, T.

    1999-02-10

    We describe an interactive toolkit used to perform comparative analysis of two or more data sets arising from numerical simulations. Several techniques have been incorporated into this toolkit, including (1) successive visualization of individual data sets, (2) data comparison techniques such as computation and visualization of the differences between data sets, and (3) image comparison methods such as scalar field height profiles plotted in a common coordinate system. We describe each technique in detail and show example usage in an industrial application aimed at designing an efficient, low-NOx burner for industrial furnaces. Critical insights are obtained by interactively adjusting color maps, culling data, and manipulating data. New paradigms for scaling small values in the data comparison technique are described. The display device used for this application was the CAVE virtual reality theater, and we describe the user interface to the visualization toolkit and the benefits of immersive 3D visualization for comparative analysis.
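The data comparison technique (2) amounts to differencing two fields on a common grid and summarizing the result to guide the visualization. A minimal numpy sketch (not the toolkit's actual API):

```python
import numpy as np

# Minimal sketch of data-comparison technique (2): pointwise difference of
# two simulation fields on a common grid, plus summary statistics to guide
# color-map scaling. Not the toolkit's actual API; field values are invented.

def compare_fields(a, b):
    """Return the difference field and simple magnitude statistics."""
    diff = a - b
    return diff, {"max_abs": float(np.abs(diff).max()),
                  "rms": float(np.sqrt(np.mean(diff ** 2)))}

field_a = np.array([[1.0, 2.0], [3.0, 4.0]])    # e.g., baseline burner run
field_b = np.array([[1.0, 2.5], [2.0, 4.0]])    # e.g., modified burner run
diff, stats = compare_fields(field_a, field_b)
```

Statistics like `max_abs` and `rms` are what make it possible to rescale small differences so they remain visible, the scaling problem the abstract mentions.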

  15. Virtual reality in planning and operations from research topic to practical issue

    SciTech Connect

    Rindahl, G.; Johnsen, T.; Mark, N. K. F.; Meyer, G.

    2006-07-01

    During the last decade of research and development on advanced visualization systems for the nuclear industry, the available technology has evolved significantly. In the same period, nuclear companies have entered a more competitive environment due to the increasingly open electricity market, resulting in strong demands on cost effective operations. This paper reports on some of the 3D applications developed by Inst. for Energy Technology in this time period, and on the emerging possibilities for practical applications of Virtual and Augmented Reality. Finally the paper proposes that well-considered deployment of recent and on-going technological advances in this field can be a contribution to improving economy and efficiency without compromising safety. (authors)

  16. Heard on The Street: GIS-Guided Immersive 3D Models as an Augmented Reality for Team Collaboration

    NASA Astrophysics Data System (ADS)

    Quinn, B. B.

    2007-12-01

    Grid computing can be configured to run physics simulations for spatially contiguous virtual 3D model spaces. Each cell is run by a single processor core simulating 1/16 square kilometer of surface and can contain up to 15,000 objects. In this work, a model of one urban block was constructed in the commercial 3D online digital world Second Life (http://secondlife.com) to prove the concept that GIS data can guide the build of an accurate in-world model. Second Life simulators support terrain modeling at two-meter grid intervals. Access to the Second Life grid is worldwide if connections to the US-based servers are possible. This immersive 3D model allows visitors to explore the space at will, with physics simulated for object collisions, gravity, and wind forces about 40 times per second. Visitors view this world as renderings by their 3D display card of graphic objects and raster textures that are streamed from the simulator grid to the Second Life client, based on that client's instantaneous field of view. Visitors to immersive 3D models experience a virtual world that engages their innate abilities to relate to the real immersive 3D world in which humans have evolved. These abilities enable far more complex and dynamic 3D environments to be quickly and accurately comprehended by more visitors than most non-immersive 3D environments. Objects of interest at ground surface and below can be walked around, possibly entered, viewed at arm's length, or flown over at 500 meters above. Videos of renderings have been recorded (as machinima) to share a visit as part of public presentations. Key to this experience is that dozens of simultaneous visitors can experience the model at the same time, each exploring it at will and seeing (if not colliding with) one another, like twenty geology students on a virtual outcrop, where each student might fly if they chose to. This work modeled the downtown Berkeley, CA, transit station in the Second Life region "Gualala" near [170, 35, 35
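The grid partitioning described above, one processor core per cell, can be sketched as a simple coordinate-to-cell mapping. The sketch takes the abstract's 1/16 km² per core at face value as a 250 m × 250 m tile; the actual grid geometry may differ:

```python
# Sketch of mapping a world coordinate to its owning simulator cell, taking
# the abstract's 1/16 km^2 per core at face value as a 250 m x 250 m tile.
# Illustrative only; the actual grid geometry may differ.

def sim_cell(x_m, y_m, cell_side_m=250.0):
    """Integer (column, row) of the cell that owns a world coordinate."""
    return int(x_m // cell_side_m), int(y_m // cell_side_m)

cell = sim_cell(610.0, 120.0)   # a point 610 m east, 120 m north of origin
```

A mapping like this is what lets a contiguous virtual space be handed off between processor cores as avatars and objects cross cell boundaries.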

  17. [A new concept in surgery of the digestive tract: surgical procedure assisted by computer, from virtual reality to telemanipulation].

    PubMed

    Marescaux, J; Clément, J M; Vix, M; Russier, Y; Tassetti, V; Mutter, D; Cotin, S; Ayache, N

    1998-02-01

    Surgical simulation increasingly appears to be an essential aspect of tomorrow's surgery. The development of a hepatic surgery simulator is an advanced concept calling for a new writing system that will transform the medical world: virtual reality. Virtual reality extends the perception of our five senses by representing more than the real state of things by means of computer science and robotics. It consists of three concepts: immersion, navigation and interaction. Three reasons have led us to develop this simulator: the first is to provide the surgeon with a comprehensive visualisation of the organ. The second reason is to allow for planning and surgical simulation that could be compared with the detailed flight plan of a commercial jet pilot. The third lies in the fact that virtual reality is an integral part of the concept of computer-assisted surgical procedure. The project consists of a sophisticated simulator which must meet five requirements: a) visual fidelity, b) interactivity, c) physical properties, d) physiological properties, e) sensory input and output. In this report we describe how to obtain a realistic 3D model of the liver from two-dimensional (2D) medical images for anatomical and surgical training. The introduction of a tumor and the consequent planning and virtual resection are also described, as are force feedback and real-time interaction. PMID:9752550
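    The first step the abstract describes, building a 3D organ model from 2D medical images, can be illustrated with a toy example. This is a hedged sketch of the general technique (slice stacking plus intensity thresholding), not the authors' actual pipeline; the function name, intensity window, and slice data are all invented.

```python
# Sketch: stack 2D image slices into a 3D voxel volume and segment an
# organ by intensity thresholding -- the simplest route from 2D medical
# images toward a 3D anatomical model.
import numpy as np

def segment_volume(slices, lo, hi):
    """Stack 2D slices into a (z, y, x) volume; return a boolean mask of
    voxels whose intensity falls inside the [lo, hi] window."""
    volume = np.stack(slices, axis=0)
    return (volume >= lo) & (volume <= hi)

# Two toy 3x3 "slices"; organ tissue assumed to lie in [100, 150].
slices = [np.array([[0, 120, 0], [110, 130, 115], [0, 125, 0]]),
          np.array([[0, 0, 0], [0, 140, 0], [0, 0, 0]])]
mask = segment_volume(slices, 100, 150)
print(int(mask.sum()))  # 6 voxels inside the intensity window
```

    A real simulator would follow this with surface extraction and a deformable tissue model to support the force feedback and virtual resection mentioned above.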

  18. Virtual reality for the psychophysiological assessment of phobic fear: responses during virtual tunnel driving.

    PubMed

    Mühlberger, Andreas; Bülthoff, Heinrich H; Wiedemann, Georg; Pauli, Paul

    2007-09-01

    An overall assessment of phobic fear requires not only a verbal self-report of fear but also an assessment of behavioral and physiological responses. Virtual reality can be used to simulate realistic (phobic) situations and therefore should be useful for inducing emotions in a controlled, standardized way. Verbal and physiological fear reactions were examined in 15 highly tunnel-fearful and 15 matched control participants in 3 virtual driving scenarios: an open environment, a partially open tunnel (gallery), and a closed tunnel. Highly tunnel-fearful participants were characterized by elevated fear responses specifically during tunnel drives as reflected in verbal fear ratings, heart rate reactions, and startle responses. Heart rate and fear ratings differentiated highly tunnel-fearful from control participants with an accuracy of 88% and 93%, respectively. Results indicate that virtual environments are valuable tools for the assessment of fear reactions and should be used in future experimental research. PMID:17845125
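    A discrimination accuracy like the reported 88% (heart rate) and 93% (fear ratings) can be computed with a single-threshold classifier over a physiological measure. The sketch below uses invented numbers, not the study's data; the threshold, scores, and function name are assumptions for illustration.

```python
# Sketch: classify participants as tunnel-fearful (label 1) when a
# physiological score exceeds a threshold, and report the fraction
# classified correctly -- the kind of accuracy figure cited above.

def accuracy(scores, labels, threshold):
    """Fraction of participants whose thresholded score matches their
    group label (1 = fearful, 0 = control)."""
    hits = sum((s > threshold) == bool(y) for s, y in zip(scores, labels))
    return hits / len(labels)

# Hypothetical heart-rate reactions (bpm change) and group labels.
hr = [8.2, 7.5, 9.1, 1.2, 6.8, 0.4, 7.9, 1.8]
group = [1, 1, 1, 0, 1, 0, 1, 0]
print(accuracy(hr, group, threshold=5.0))  # 1.0 on this toy sample
```

    In practice the threshold would be chosen (e.g., by ROC analysis) on the measured responses rather than fixed in advance.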

  19. A virtual reality catchment for data assimilation experiments

    NASA Astrophysics Data System (ADS)

    Schalge, Bernd; Rihani, Jehan; Haese, Barbara; Baroni, Gabriele; Erdal, Daniel; Neuweiler, Insa; Hendricks-Franssen, Harrie-Jan; Geppert, Gernot; Ament, Felix; Kollet, Stefan; Cirpka, Olaf; Saavedra, Pablo; Han, Xujun; Attinger, Sabine; Kunstmann, Harald; Vereecken, Harry; Simmer, Clemens

    2016-04-01

    Current data assimilation (DA) systems often lack the possibility to assimilate measurements across compartments to accurately estimate states and fluxes in subsurface-land surface-atmosphere systems (SLAS). In order to develop a new DA framework able to realize this cross-compartmental assimilation, a comprehensive testing environment is needed. Therefore, a virtual reality (VR) catchment was constructed with the Terrestrial System Modeling Platform (TerrSysMP). This catchment mimics the Neckar catchment in Germany. TerrSysMP employs the atmospheric model COSMO, the land surface model CLM and the hydrological model ParFlow, coupled with the external coupler OASIS. We will show statistical tests that demonstrate the plausibility of the VR. The VR runs in a fully coupled mode (subsurface - land surface - atmosphere), which includes the interactions of subsurface dynamics with the atmosphere, such as the effects of soil moisture on near-surface temperatures, convection patterns and surface heat fluxes. A high-resolution reference run serves as the "truth" from which virtual observations are extracted with observation operators such as virtual rain gauges, synoptic stations and satellite observations (among others). This effectively solves the data scarcity issues otherwise often encountered in DA. Furthermore, an ensemble of model runs at a reduced resolution is performed. This ensemble also serves for open-loop runs to be compared with data assimilation experiments. The runs with this ensemble served to identify the sets of parameters that are most sensitive to changes and have the largest impact on the system. These parameters were the focus of subsequent ensemble simulations and DA experiments. We will show to what extent the VR states can be reconstructed using data assimilation methods with only a limited number of virtual observations available.
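    The core idea of the setup, nudging an ensemble of model states toward virtual observations extracted from a synthetic "truth", can be sketched with a stochastic ensemble Kalman filter update. This is a minimal illustration of the general method, not TerrSysMP's DA code; the state dimensions, observation operator, and error variances are invented.

```python
# Sketch: one EnKF analysis step with perturbed observations. Each
# ensemble member is pulled toward the virtual observation in proportion
# to the ensemble covariance between state and observed quantity.
import numpy as np

def enkf_update(ensemble, obs, obs_var, H, rng):
    """ensemble: (n_members, n_state); H: linear observation operator."""
    n = ensemble.shape[0]
    Hx = ensemble @ H                          # each member's predicted obs
    P_hh = np.var(Hx, ddof=1)                  # obs-space ensemble variance
    P_xh = ((ensemble - ensemble.mean(0)).T @ (Hx - Hx.mean())) / (n - 1)
    K = P_xh / (P_hh + obs_var)                # Kalman gain, shape (n_state,)
    perturbed = obs + rng.normal(0, np.sqrt(obs_var), n)  # perturbed obs
    return ensemble + np.outer(perturbed - Hx, K)

rng = np.random.default_rng(0)
ens = rng.normal(5.0, 1.0, size=(50, 3))       # 50 members, 3 state vars
H = np.array([1.0, 0.0, 0.0])                  # observe the first variable
analysis = enkf_update(ens, obs=6.0, obs_var=0.1, H=H, rng=rng)
print(round(analysis[:, 0].mean(), 1))         # ensemble mean pulled toward the obs
```

    With many virtual observations and a coupled model state, the same update spreads information across compartments through the cross-covariances in `P_xh`, which is the cross-compartmental effect the framework is designed to test.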

  20. Surgical planning for radical prostatectomies using three-dimensional visualization and a virtual reality display system

    NASA Astrophysics Data System (ADS)

    Kay, Paul A.; Robb, Richard A.; King, Bernard F.; Myers, R. P.; Camp, Jon J.

    1995-04-01

    Thousands of radical prostatectomies for prostate cancer are performed each year. Radical prostatectomy is a challenging procedure due to anatomical variability and the adjacency of critical structures, including the external urinary sphincter and the neurovascular bundles that subserve erectile function. Because of this, there are significant risks of urinary incontinence and impotence following the procedure. Preoperative interaction with three-dimensional visualizations of the important anatomical structures might allow the surgeon to understand important individual anatomical relationships of patients. Such understanding might decrease the rate of morbidities, especially for surgeons in training. Patient-specific anatomic data can be obtained from preoperative 3D MRI diagnostic imaging examinations of the prostate gland utilizing endorectal coils and phased-array multicoils. The volumes of the important structures can then be segmented using interactive image-editing tools and displayed using 3-D surface rendering algorithms on standard workstations. Anatomic relationships can be visualized using surface displays, 3-D colorwash, and transparency to allow internal visualization of hidden structures. Preoperatively, a surgeon and radiologist can interactively manipulate the 3-D visualizations. Important anatomical relationships can thus be better visualized and used to plan the surgery. Postoperatively, the 3-D displays can be compared with the actual surgical experience and pathologic data. Patients can then be followed to assess the incidence of morbidities. More advanced approaches to visualizing these anatomical structures in support of surgical planning will be implemented on virtual reality (VR) display systems. Such realistic displays are `immersive,' and allow surgeons to simultaneously see and manipulate the anatomy, to plan the procedure and to rehearse it in a realistic way. Ultimately the VR systems will be implemented in the operating room (OR) to assist the