Sample records for immersive visualization environment

  1. Understanding Immersivity: Image Generation and Transformation Processes in 3D Immersive Environments

    PubMed Central

    Kozhevnikov, Maria; Dhond, Rupali P.

    2012-01-01

    Most research on three-dimensional (3D) visual-spatial processing has been conducted using traditional non-immersive 2D displays. Here we investigated how individuals generate and transform mental images within 3D immersive (3DI) virtual environments, in which the viewers perceive themselves as being surrounded by a 3D world. In Experiment 1, we compared participants’ performance on the Shepard and Metzler (1971) mental rotation (MR) task across the following three types of visual presentation environments: traditional 2D non-immersive (2DNI), 3D non-immersive (3DNI – anaglyphic glasses), and 3DI (head-mounted display with position and head orientation tracking). In Experiment 2, we examined how the use of different backgrounds affected MR processes within the 3DI environment. In Experiment 3, we compared electroencephalogram data recorded while participants were mentally rotating visual-spatial images presented in 3DI vs. 2DNI environments. Overall, the findings of the three experiments suggest that visual-spatial processing is different in immersive and non-immersive environments, and that immersive environments may require different image encoding and transformation strategies than the two other non-immersive environments. Specifically, in a non-immersive environment, participants may utilize a scene-based frame of reference and allocentric encoding, whereas immersive environments may encourage the use of a viewer-centered frame of reference and egocentric encoding. These findings also suggest that MR performed in laboratory conditions using a traditional 2D computer screen may not reflect spatial processing as it would occur in the real world. PMID:22908003

  2. Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.

    PubMed

    Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E

    2007-01-01

    This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.

  3. Inclusion of Immersive Virtual Learning Environments and Visual Control Systems to Support the Learning of Students with Asperger Syndrome

    ERIC Educational Resources Information Center

    Lorenzo, Gonzalo; Pomares, Jorge; Lledo, Asuncion

    2013-01-01

    This paper presents the use of immersive virtual reality systems in the educational intervention with Asperger students. The starting points of this study are features of these students' cognitive style that require an explicit teaching style supported by visual aids and highly structured environments. The proposed immersive virtual reality…

  4. The ALIVE Project: Astronomy Learning in Immersive Virtual Environments

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Sahami, K.; Denn, G.

    2008-06-01

    The Astronomy Learning in Immersive Virtual Environments (ALIVE) project seeks to discover learning modes and optimal teaching strategies using immersive virtual environments (VEs). VEs are computer-generated, three-dimensional environments that can be navigated to provide multiple perspectives. Immersive VEs provide the additional benefit of surrounding a viewer with the simulated reality. ALIVE evaluates the incorporation of an interactive, real-time ``virtual universe'' into formal college astronomy education. In the experiment, pre-course, post-course, and curriculum tests will be used to determine the efficacy of immersive visualizations presented in a digital planetarium versus the same visual simulations in the non-immersive setting of a normal classroom, as well as a control case using traditional classroom multimedia. To normalize for inter-instructor variability, each ALIVE instructor will teach at least one class in each of the three test groups.

  5. Influence of moving visual environment on sit-to-stand kinematics in children and adults.

    PubMed

    Slaboda, Jill C; Barton, Joseph E; Keshner, Emily A

    2009-08-01

    The effect of visual field motion on the sit-to-stand kinematics of adults and children was investigated. Children (8 to 12 years of age) and adults (21 to 49 years of age) were seated in a virtual environment that rotated in the pitch and roll directions. Participants stood up either (1) concurrent with onset of visual motion or (2) after an immersion period in the moving visual environment, and (3) without visual input. Angular velocities of the head with respect to the trunk, and of the trunk with respect to the environment, were calculated, as were the head and trunk centers of mass. Both adults and children reduced head and trunk angular velocity after immersion in the moving visual environment. Unlike adults, children demonstrated significant differences in displacement of the head center of mass during the immersion and concurrent trials when compared to trials without visual input. Results suggest a time-dependent effect of vision on sit-to-stand kinematics in adults, whereas children are influenced by the immediate presence or absence of vision.

  6. Art, science, and immersion: data-driven experiences

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Monroe, Laura; Ford Morie, Jacquelyn; Aguilera, Julieta

    2013-03-01

    This panel and dialog-paper explores the potentials at the intersection of art, science, immersion and highly dimensional, "big" data to create new forms of engagement, insight and cultural forms. We will address questions such as: "What kinds of research questions can be identified at the intersection of art + science + immersive environments that can't be expressed otherwise?" "How is art+science+immersion distinct from state-of-the art visualization?" "What does working with immersive environments and visualization offer that other approaches don't or can't?" "Where does immersion fall short?" We will also explore current trends in the application of immersion for gaming, scientific data, entertainment, simulation, social media and other new forms of big data. We ask what expressive, arts-based approaches can contribute to these forms in the broad cultural landscape of immersive technologies.

  7. OnSight: Multi-platform Visualization of the Surface of Mars

    NASA Astrophysics Data System (ADS)

    Abercrombie, S. P.; Menzies, A.; Winter, A.; Clausen, M.; Duran, B.; Jorritsma, M.; Goddard, C.; Lidawer, A.

    2017-12-01

    A key challenge of planetary geology is to develop an understanding of an environment that humans cannot (yet) visit. Instead, scientists rely on visualizations created from images sent back by robotic explorers, such as the Curiosity Mars rover. OnSight is a multi-platform visualization tool that helps scientists and engineers to visualize the surface of Mars. Terrain visualization allows scientists to understand the scale and geometric relationships of the environment around the Curiosity rover, both for scientific understanding and for tactical consideration in safely operating the rover. OnSight includes a web-based 2D/3D visualization tool, as well as an immersive mixed reality visualization. In addition, OnSight offers a novel feature for communication among the science team. Using the multiuser feature of OnSight, scientists can meet virtually on Mars, to discuss geology in a shared spatial context. Combining web-based visualization with immersive visualization allows OnSight to leverage strengths of both platforms. This project demonstrates how 3D visualization can be adapted to either an immersive environment or a computer screen, and we will discuss advantages and disadvantages of both platforms.

  8. Development of Techniques for Visualization of Scalar and Vector Fields in the Immersive Environment

    NASA Technical Reports Server (NTRS)

    Bidasaria, Hari B.; Wilson, John W.; Nealy, John E.

    2005-01-01

    Visualization of scalar and vector fields in the immersive environment (CAVE - Cave Automated Virtual Environment) is important for its application to radiation shielding research at NASA Langley Research Center. A complete methodology and the underlying software for this purpose have been developed. The developed software has been put to use for the visualization of the Earth's magnetic field, and in particular for the study of the South Atlantic Anomaly. The methodology has also been applied to the visualization of geomagnetically trapped protons and electrons within Earth's magnetosphere.
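
    The geomagnetic visualization described in this record starts from a field model evaluated at each sample point. As a minimal illustration (not the authors' NASA Langley implementation), a centered-dipole approximation of Earth's field can be computed directly; the moment value and function name below are assumptions for the sketch.

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability (T*m/A)
M_EARTH = 8.0e22           # approximate magnitude of Earth's dipole moment (A*m^2)

def dipole_field(x, y, z, m=M_EARTH):
    """Magnetic field (tesla) of a z-aligned dipole at position (x, y, z) in meters.

    B = mu0/(4*pi*r^3) * (3*(m_vec . r_hat)*r_hat - m_vec), with m_vec = (0, 0, m).
    """
    r = math.sqrt(x * x + y * y + z * z)
    rx, ry, rz = x / r, y / r, z / r       # unit vector r_hat
    m_dot_rhat = m * rz                     # m_vec . r_hat (moment along +z)
    k = MU0 / (4 * math.pi * r**3)
    bx = k * 3 * m_dot_rhat * rx
    by = k * 3 * m_dot_rhat * ry
    bz = k * (3 * m_dot_rhat * rz - m)
    return bx, by, bz
```

    At the magnetic equator the field points along -z with magnitude mu0*m/(4*pi*r^3), roughly 3e-5 T near Earth's surface; a real South Atlantic Anomaly study would use a full spherical-harmonic model such as IGRF rather than this single-dipole sketch.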

  9. Eye movements, visual search and scene memory, in an immersive virtual environment.

    PubMed

    Kit, Dmitry; Katz, Leor; Sullivan, Brian; Snyder, Kat; Ballard, Dana; Hayhoe, Mary

    2014-01-01

    Visual memory has been demonstrated to play a role in both visual search and attentional prioritization in natural scenes. However, it has been studied predominantly in experimental paradigms using multiple two-dimensional images. Natural experience, however, entails prolonged immersion in a limited number of three-dimensional environments. The goal of the present experiment was to recreate circumstances comparable to natural visual experience in order to evaluate the role of scene memory in guiding eye movements in a natural environment. Subjects performed a continuous visual-search task within an immersive virtual-reality environment over three days. We found that, similar to two-dimensional contexts, viewers rapidly learn the location of objects in the environment over time, and use spatial memory to guide search. Incidental fixations did not provide obvious benefit to subsequent search, suggesting that semantic contextual cues may often be just as efficient, or that many incidentally fixated items are not held in memory in the absence of a specific task. On the third day of the experience in the environment, previous search items changed in color. These items were fixated upon with increased probability relative to control objects, suggesting that memory-guided prioritization (or Surprise) may be a robust mechanism for attracting gaze to novel features of natural environments, in addition to task factors and simple spatial saliency.

  10. Terrain Modelling for Immersive Visualization for the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, J.; Hartman, F.; Cooper, B.; Maxwell, S.; Yen, J.; Morrison, J.

    2004-01-01

    Immersive environments are being used to support mission operations at the Jet Propulsion Laboratory. This technology contributed to the Mars Pathfinder Mission in planning sorties for the Sojourner rover and is being used for the Mars Exploration Rover (MER) missions. The stereo imagery captured by the rovers is used to create 3D terrain models, which can be viewed from any angle, to provide a powerful and information rich immersive visualization experience. These technologies contributed heavily to both the mission success and the phenomenal level of public outreach achieved by Mars Pathfinder and MER. This paper will review the utilization of terrain modelling for immersive environments in support of MER.

  11. Planning, Implementation and Optimization of Future Space Missions using an Immersive Visualization Environment (IVE) Machine

    NASA Astrophysics Data System (ADS)

    Harris, E.

    Planning, Implementation and Optimization of Future Space Missions using an Immersive Visualization Environment (IVE) Machine. E. N. Harris, Lockheed Martin Space Systems, Denver, CO, and George W. Morgenthaler, U. of Colorado at Boulder. History: A team of 3-D engineering visualization experts at the Lockheed Martin Space Systems Company has developed innovative virtual prototyping simulation solutions for ground processing and real-time visualization of design and planning of aerospace missions over the past 6 years. At the University of Colorado, a team of 3-D visualization experts is developing the science of 3-D visualization and immersive visualization at the newly founded BP Center for Visualization, which began operations in October 2001. (See IAF/IAA-01-13.2.09, "The Use of 3-D Immersive Visualization Environments (IVEs) to Plan Space Missions," G. A. Dorn and G. W. Morgenthaler.) Progressing from Today's 3-D Engineering Simulations to Tomorrow's 3-D IVE Mission Planning, Simulation and Optimization Techniques: 3-D IVEs and visualization simulation tools can be combined for efficient planning and design engineering of future aerospace exploration and commercial missions. This technology is currently being developed and will be demonstrated by Lockheed Martin in the IVE at the BP Center using virtual simulation for clearance checks, collision detection, ergonomics and reachability analyses to develop fabrication and processing flows for spacecraft and launch vehicle ground support operations and to optimize mission architecture and vehicle design subject to realistic constraints. Demonstrations: Immediate aerospace applications to be demonstrated include developing streamlined processing flows for Reusable Space Transportation Systems and Atlas Launch Vehicle operations and Mars Polar Lander visual work instructions.
Long-range goals include future international human and robotic space exploration missions such as the development of a Mars Reconnaissance Orbiter and Lunar Base construction scenarios. Innovative solutions utilizing Immersive Visualization provide the key to streamlining the mission planning and optimizing engineering design phases of future aerospace missions.

  12. KinImmerse: Macromolecular VR for NMR ensembles

    PubMed Central

    Block, Jeremy N; Zielinski, David J; Chen, Vincent B; Davis, Ian W; Vinson, E Claire; Brady, Rachael; Richardson, Jane S; Richardson, David C

    2009-01-01

    Background In molecular applications, virtual reality (VR) and immersive virtual environments have generally been used and valued for the visual and interactive experience – to enhance intuition and communicate excitement – rather than as part of the actual research process. In contrast, this work develops a software infrastructure for research use and illustrates such use on a specific case. Methods The Syzygy open-source toolkit for VR software was used to write the KinImmerse program, which translates the molecular capabilities of the kinemage graphics format into software for display and manipulation in the DiVE (Duke immersive Virtual Environment) or other VR system. KinImmerse is supported by the flexible display construction and editing features in the KiNG kinemage viewer and it implements new forms of user interaction in the DiVE. Results In addition to molecular visualizations and navigation, KinImmerse provides a set of research tools for manipulation, identification, co-centering of multiple models, free-form 3D annotation, and output of results. The molecular research test case analyzes the local neighborhood around an individual atom within an ensemble of nuclear magnetic resonance (NMR) models, enabling immersive visual comparison of the local conformation with the local NMR experimental data, including target curves for residual dipolar couplings (RDCs). Conclusion The promise of KinImmerse for production-level molecular research in the DiVE is shown by the locally co-centered RDC visualization developed there, which gave new insights now being pursued in wider data analysis. PMID:19222844

  13. Eye Movements, Visual Search and Scene Memory, in an Immersive Virtual Environment

    PubMed Central

    Sullivan, Brian; Snyder, Kat; Ballard, Dana; Hayhoe, Mary

    2014-01-01

    Visual memory has been demonstrated to play a role in both visual search and attentional prioritization in natural scenes. However, it has been studied predominantly in experimental paradigms using multiple two-dimensional images. Natural experience, however, entails prolonged immersion in a limited number of three-dimensional environments. The goal of the present experiment was to recreate circumstances comparable to natural visual experience in order to evaluate the role of scene memory in guiding eye movements in a natural environment. Subjects performed a continuous visual-search task within an immersive virtual-reality environment over three days. We found that, similar to two-dimensional contexts, viewers rapidly learn the location of objects in the environment over time, and use spatial memory to guide search. Incidental fixations did not provide obvious benefit to subsequent search, suggesting that semantic contextual cues may often be just as efficient, or that many incidentally fixated items are not held in memory in the absence of a specific task. On the third day of the experience in the environment, previous search items changed in color. These items were fixated upon with increased probability relative to control objects, suggesting that memory-guided prioritization (or Surprise) may be a robust mechanism for attracting gaze to novel features of natural environments, in addition to task factors and simple spatial saliency. PMID:24759905

  14. Immersive Training Systems: Virtual Reality and Education and Training.

    ERIC Educational Resources Information Center

    Psotka, Joseph

    1995-01-01

    Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR, analyzes cognitive variables connected to immersion, how it is generated in synthetic environments and its benefits. Discusses value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…

  15. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

    Immersive virtual reality environments such as the IQ-Station or CAVE (Cave Automated Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, which enables scientists to interact and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments requires software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration and several methods of recording and playback are investigated that include: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.
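
    Fusing an irregular LiDAR point cloud with a hyperspectral raster, as in the 3D image cube this record describes, typically begins by binning the points onto the raster's grid so elevation and spectra share pixel coordinates. A minimal sketch of that step (the grid parameters and function name are illustrative assumptions, not the study's software):

```python
def grid_lidar(points, x0, y0, cell, ncols, nrows):
    """Bin LiDAR returns (x, y, z) into a raster grid of mean elevations.

    (x0, y0) is the upper-left corner of the grid and `cell` is the pixel
    size in the same units.  Returns {(row, col): mean_z}, which can then be
    stacked with a hyperspectral image that shares the same grid.
    """
    sums, counts = {}, {}
    for x, y, z in points:
        col = int((x - x0) / cell)
        row = int((y0 - y) / cell)      # raster rows increase downward
        if 0 <= row < nrows and 0 <= col < ncols:
            key = (row, col)
            sums[key] = sums.get(key, 0.0) + z
            counts[key] = counts.get(key, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}
```

    A production workflow would also handle georeferencing, no-data cells, and return classification, but the core alignment between point cloud and raster reduces to this coordinate-to-cell mapping.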

  16. Simulation Exploration through Immersive Parallel Planes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
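
    The parallel-planes generalization described in this record can be sketched in a few lines: consecutive pairs of dimensions become 2D points on successive planes, and an observation becomes the polyline through those points. The normalization and function name below are illustrative assumptions, not the authors' implementation.

```python
def to_parallel_planes(obs, mins, maxs):
    """Map one multivariate observation onto a series of parallel planes.

    Dimensions are taken in consecutive pairs (d0, d1), (d2, d3), ...;
    each pair is min-max normalized to a point (u, v) in [0, 1]^2 on
    plane k.  The returned (plane, u, v) vertices define the polyline
    drawn for this observation.
    """
    assert len(obs) % 2 == 0, "pad odd-length observations before mapping"
    verts = []
    for k in range(len(obs) // 2):
        i, j = 2 * k, 2 * k + 1
        u = (obs[i] - mins[i]) / (maxs[i] - mins[i])
        v = (obs[j] - mins[j]) / (maxs[j] - mins[j])
        verts.append((k, u, v))
    return verts
```

    Brushing then reduces to a point-in-rectangle test against these (u, v) coordinates on each plane, which is what makes the selection actions cheap enough to run interactively alongside the simulations.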

  17. Simulation Exploration through Immersive Parallel Planes: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas; Bush, Brian W.; Gruchalla, Kenny

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.

  18. Enhancements to VTK enabling Scientific Visualization in Immersive Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick; Jhaveri, Sankhesh; Chaudhary, Aashish

    Modern scientific, engineering and medical computational simulations, as well as experimental and observational data sensing/measuring devices, produce enormous amounts of data. While statistical analysis provides insight into this data, scientific visualization is tactically important for scientific discovery, product design and data analysis. These benefits are impeded, however, when scientific visualization algorithms are implemented from scratch, a time-consuming and redundant process in immersive application development. This process can greatly benefit from leveraging the state-of-the-art open-source Visualization Toolkit (VTK) and its community. Over the past two (almost three) decades, integrating VTK with a virtual reality (VR) environment has been attempted with only varying degrees of success. In this paper, we demonstrate two new approaches to simplify this amalgamation of an immersive interface with visualization rendering from VTK. In addition, we cover several enhancements to VTK that provide near real-time updates and efficient interaction. Finally, we demonstrate the combination of VTK with both the Vrui and OpenVR immersive environments in example applications.

  19. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments, and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain and weather information, and water simulation, and gives students an environment in which to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.

  20. Spherical Panoramas for Astrophysical Data Visualization

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2017-05-01

    Data immersion has advantages in astrophysical visualization. Complex multi-dimensional data and phase spaces can be explored in a seamless and interactive viewing environment. Putting the user in the data is a first step toward immersive data analysis. We present a technique for creating 360° spherical panoramas with astrophysical data. The three-dimensional software package Blender and the Google Spatial Media module are used together to immerse users in data exploration. Several examples employing these methods exhibit how the technique works using different types of astronomical data.
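
    The 360° spherical panoramas this record describes rest on the equirectangular mapping between viewing directions and image pixels, which is the projection Blender renders and the Google Spatial Media module tags. A minimal, tool-independent sketch of that mapping (function name is an assumption):

```python
import math

def direction_to_pixel(x, y, z, width, height):
    """Project a unit view direction onto an equirectangular panorama.

    Longitude (atan2) maps to the horizontal axis and latitude (asin) to
    the vertical axis, so the full sphere fills a 2:1 image: the forward
    direction lands at the image center, the zenith at the top edge.
    """
    lon = math.atan2(y, x)                   # -pi .. pi
    lat = math.asin(max(-1.0, min(1.0, z)))  # -pi/2 .. pi/2
    u = (lon + math.pi) / (2 * math.pi) * width
    v = (math.pi / 2 - lat) / math.pi * height
    return u, v
```

    Inverting this mapping per pixel is how a data cube or simulation frame is resampled into the panorama before the spherical metadata is attached for playback.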

  1. Influence of Immersive Human Scale Architectural Representation on Design Judgment

    NASA Astrophysics Data System (ADS)

    Elder, Rebecca L.

    Unrealistic visual representations of architecture within our existing environments have lost all reference to the human senses. As a design tool, visual and auditory stimuli can be utilized to determine humans' perception of design. This experiment renders varying building inputs within different sites, simulated with corresponding immersive visual and audio sensory cues. Introducing audio has been shown to influence the way a person perceives a space, yet most inhabitants rely strictly on their sense of vision to make design judgments. Though not as apparent, users prefer spaces that have a better quality of sound and comfort. Through a series of questions, we can begin to analyze whether a design is fit for both its acoustic and visual environment.

  2. Manipulating the fidelity of lower extremity visual feedback to identify obstacle negotiation strategies in immersive virtual reality.

    PubMed

    Kim, Aram; Zhou, Zixuan; Kretch, Kari S; Finley, James M

    2017-07-01

    The ability to successfully navigate obstacles in our environment requires integration of visual information about the environment with estimates of our body's state. Previous studies have used partial occlusion of the visual field to explore how information about the body and impending obstacles is integrated to mediate a successful clearance strategy. However, because these manipulations often remove information about both the body and the obstacle, it remains to be seen how information about the lower extremities alone is utilized during obstacle crossing. Here, we used an immersive virtual reality (VR) interface to explore how visual feedback of the lower extremities influences obstacle crossing performance. Participants wore a head-mounted display while walking on a treadmill and were instructed to step over obstacles in a virtual corridor in four different feedback trials. The trials involved: (1) no visual feedback of the lower extremities, (2) an endpoint-only model, (3) a link-segment model, and (4) a volumetric multi-segment model. We found that with the volumetric model, compared to no model, participants improved their success rate, placed their trailing foot before crossing and leading foot after crossing more consistently, and placed their leading foot closer to the obstacle after crossing. This knowledge is critical for the design of obstacle negotiation tasks in immersive virtual environments, as it may provide information about the fidelity necessary to reproduce ecologically valid practice environments.

  3. Scalable metadata environments (MDE): artistically impelled immersive environments for large-scale data exploration

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Margolis, Todd; Prudhomme, Andrew; Schulze, Jürgen P.; Mostafavi, Iman; Lewis, J. P.; Gossmann, Joachim; Singh, Rajvikram

    2014-02-01

    Scalable Metadata Environments (MDEs) are an artistic approach for designing immersive environments for large scale data exploration in which users interact with data by forming multiscale patterns that they alternatively disrupt and reform. Developed and prototyped as part of an art-science research collaboration, we define an MDE as a 4D virtual environment structured by quantitative and qualitative metadata describing multidimensional data collections. Entire data sets (e.g., tens of millions of records) can be visualized and sonified at multiple scales and at different levels of detail so they can be explored interactively in real-time within MDEs. They are designed to reflect similarities and differences in the underlying data or metadata such that patterns can be visually/aurally sorted in an exploratory fashion by an observer who is not familiar with the details of the mapping from data to visual, auditory or dynamic attributes. While many approaches for visual and auditory data mining exist, MDEs are distinct in that they utilize qualitative and quantitative data and metadata to construct multiple interrelated conceptual coordinate systems. These "regions" function as conceptual lattices for scalable auditory and visual representations within virtual environments computationally driven by multi-GPU CUDA-enabled fluid dynamics systems.

  4. Radiological tele-immersion for next generation networks.

    PubMed

    Ai, Z; Dech, F; Rasmussen, M; Silverstein, J C

    2000-01-01

    Since the acquisition of high-resolution three-dimensional patient images has become widespread, medical volumetric datasets (CT or MR) larger than 100 MB and encompassing more than 250 slices are common. It is important to make this patient-specific data quickly available and usable to many specialists at different geographical sites. Web-based systems have been developed to provide volume or surface rendering of medical data over networks with low fidelity, but these cannot adequately handle stereoscopic visualization or huge datasets. State-of-the-art virtual reality techniques and high-speed networks have made it possible to create an environment in which geographically distributed clinicians can immersively share these massive datasets in real time. An object-oriented method for instantaneously importing medical volumetric data into Tele-Immersive environments has been developed at the Virtual Reality in Medicine Laboratory (VRMedLab) at the University of Illinois at Chicago (UIC). This networked-VR setup is based on LIMBO, an application framework or template that provides the basic capabilities of Tele-Immersion. We have developed a modular, general-purpose Tele-Immersion program that automatically combines 3D medical data with the methods for handling the data. For this purpose a DICOM loader for IRIS Performer has been developed. The loader was designed for SGI machines as a shared object, which is executed at LIMBO's runtime. The loader loads not only the selected DICOM dataset, but also methods for rendering, handling, and interacting with the data, bringing networked, real-time, stereoscopic interaction with radiological data to reality. Collaborative, interactive methods currently implemented in the loader include cutting planes and windowing. The Tele-Immersive environment has been tested on the UIC campus over an ATM network. 
We tested the environment with three nodes: one ImmersaDesk at the VRMedLab, one CAVE at the Electronic Visualization Laboratory (EVL) on east campus, and a CT scanner in the UIC Hospital. CT data was pulled directly from the scanner to the Tele-Immersion server in our laboratory, and the data was then synchronously distributed by our Onyx2 Rack server to all the VR setups. Rather than confining medical volume visualization to a single VR device, the Tele-Immersive environment combines teleconferencing, tele-presence, and virtual reality to enable geographically distributed clinicians to intuitively interact with the same medical volumetric models and to point, gesture, converse, and see each other. This environment will bring together clinicians at different geographic locations to participate in Tele-Immersive consultation and collaboration.
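    The loader described above is specific to DICOM, IRIS Performer, and SGI hardware, but the underlying pattern of any volumetric loader, ordering 2D slices along the scan axis and stacking them into a 3D volume, can be sketched generically. The `Slice` type below is a hypothetical stand-in for a parsed DICOM file (whose position would come from the ImagePositionPatient tag), not the authors' API:

    ```python
    # Hedged sketch of a volumetric loader's core step: sort 2D slices by
    # their position along the scan axis and stack them into a z-y-x volume.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Slice:
        z_position: float        # location along the scan axis (mm)
        pixels: List[List[int]]  # 2D pixel array for this slice

    def stack_volume(slices: List[Slice]) -> List[List[List[int]]]:
        """Return the slices' pixel arrays ordered by z position."""
        ordered = sorted(slices, key=lambda s: s.z_position)
        return [s.pixels for s in ordered]
    ```

    A production loader would also handle rescale slope/intercept, non-uniform slice spacing, and orientation, which are omitted here.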

  5. Immersive virtual reality for visualization of abdominal CT

    NASA Astrophysics Data System (ADS)

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A.; Bodenheimer, Robert E.

    2013-03-01

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high-fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure, in which images are viewed as stacks of two-dimensional slices or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.

  6. Immersive Virtual Reality for Visualization of Abdominal CT.

    PubMed

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A; Bodenheimer, Robert E

    2013-03-28

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high-fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure, in which images are viewed as stacks of two-dimensional slices or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.

  7. Alternative Audio Solution to Enhance Immersion in Deployable Synthetic Environments

    DTIC Science & Technology

    2003-09-01

    sense of presence. For example, the musical score of a movie increases the viewers’ emotional involvement in a cinematic feature. The character...photo-realistic way can make mental immersion difficult, because any flaw in the realism will spoil the effect [SHER 03].” One way to overcome spoiling...the visual realism is to reinforce visual clues with those from other modalities. 3. Aural Modality a. General Aural displays can be

  8. Augmented Virtuality: A Real-time Process for Presenting Real-world Visual Sensory Information in an Immersive Virtual Environment for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.

    2017-12-01

    Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, virtual objects are projected into the real world, where researchers can interact with them. There are several limitations to purely VR or AR applications within the context of remote planetary exploration. For example, in a purely VR environment, contents of the planet surface (e.g., rocks, terrain, or other features) must be created offline from a multitude of images using image processing techniques to generate the 3D mesh data that will populate the virtual surface of the planet. This process usually requires a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames lack 3D visual information (i.e., depth information). In this paper, we present a technique to utilize a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real time into the virtual environment. 
Notice the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purposes of taking screenshots.
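    The abstract stresses that plain video overlays lose depth, which is exactly what a stereoscopic feed preserves. The standard pinhole-stereo relation Z = f·B/d recovers depth from the disparity between the two camera views; the sketch below illustrates that relation with hypothetical parameter values and is not the authors' pipeline:

    ```python
    # Illustrative sketch of stereo depth recovery: depth Z = f * B / d,
    # where f is focal length (pixels), B is the camera baseline (meters),
    # and d is the disparity of a matched feature between the two views.
    def depth_from_disparity(disparity_px, focal_px, baseline_m):
        """Return depth in meters for one matched feature."""
        if disparity_px <= 0:
            return float('inf')  # zero disparity: effectively at infinity
        return focal_px * baseline_m / disparity_px
    ```

    For example, with a 500-pixel focal length and a 10 cm baseline, a 50-pixel disparity corresponds to a point 1 m from the cameras.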

  9. Venus Quadrangle Geological Mapping: Use of Geoscience Data Visualization Systems in Mapping and Training

    NASA Technical Reports Server (NTRS)

    Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil

    2008-01-01

    We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].

  10. Learning immersion without getting wet

    NASA Astrophysics Data System (ADS)

    Aguilera, Julieta C.

    2012-03-01

    This paper describes the teaching of an immersive environments class in the spring of 2011. The class had students from undergraduate as well as graduate art-related majors. Their digital backgrounds and interests were also diverse. These variables were channeled into different approaches throughout the semester. Class components included fundamentals of stereoscopic computer graphics to explore spatial depth, 3D modeling and skeleton animation to explore presence, exposure to formats like a stereo projection wall and dome environments to compare field of view across devices, and finally, interaction and tracking to explore issues of embodiment. All these components were supported by theoretical readings discussed in class. Guest artists presented their work in virtual reality, dome environments, and other immersive formats. Museum professionals also introduced students to space science visualizations, which utilize immersive formats. Here I present the assignments and their outcomes, together with insights as to how the creation of immersive environments can be learned through constraints that expose students to situations of embodied cognition.

  11. VERS: a virtual environment for reconstructive surgery planning

    NASA Astrophysics Data System (ADS)

    Montgomery, Kevin N.

    1997-05-01

    The virtual environment for reconstructive surgery (VERS) project at the NASA Ames Biocomputation Center is applying virtual reality technology to aid surgeons in planning surgeries. We are working with a craniofacial surgeon at Stanford to assemble and visualize the bone structure of patients requiring reconstructive surgery as a result of developmental abnormalities or trauma. This project is an extension of our previous work in 3D reconstruction, mesh generation, and immersive visualization. The current VR system, consisting of an SGI Onyx RE2, FakeSpace BOOM and ImmersiveWorkbench, Virtual Technologies CyberGlove, and Ascension Technologies tracker, is currently in development and has already been used to visualize defects preoperatively. In the near future it will be used to plan surgeries more fully and to compute the projected result on soft-tissue structure. This paper presents the work in progress and details the production of a high-performance, collaborative, and networked virtual environment.

  12. Altered Perspectives: Immersive Environments

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Webley, P. W.

    2016-12-01

    Immersive environments provide an exciting experiential technology to visualize the natural world. Given the increasing accessibility of 360° cameras and virtual reality headsets, we are now able to visualize artistic principles and scientific concepts in a fully immersive environment. The technology has become popular for photographers as well as designers, industry, educational groups, and museums. Here we show a sci-art perspective on the use of optics and light in the capture and manipulation of 360° images and video of geologic phenomena and cultural heritage sites in Alaska, England, and France. Additionally, we will generate intentionally altered perspectives to lend a surrealistic quality to the landscapes. Locations include the Catacombs of Paris, the Palace of Versailles, and the Northern Lights over Fairbanks, Alaska. Some 360° view cameras now use small portable dual-lens technology extending beyond the 180° fisheye lens previously used, providing better coverage and image quality. Virtual reality headsets range in level of sophistication and cost, with the most affordable versions using smartphones and Google Cardboard viewers. The equipment used in this presentation includes a Ricoh Theta S spherical imaging camera. Here we will demonstrate the use of 360° imaging, allowing attendees to enter the immersive environment and experience our locations as if visiting in person.

  13. Planning, implementation and optimization of future space missions using an immersive visualization environment (IVE) machine

    NASA Astrophysics Data System (ADS)

    Nathan Harris, E.; Morgenthaler, George W.

    2004-07-01

    Beginning in 1995, a team of 3-D engineering visualization experts assembled at the Lockheed Martin Space Systems Company and began to develop innovative virtual prototyping simulation tools for performing ground processing and real-time visualization of design and planning of aerospace missions. At the University of Colorado, a team of 3-D visualization experts also began developing the science of 3-D and immersive visualization at the newly founded British Petroleum (BP) Center for Visualization, which began operations in October 2001. BP acquired ARCO in 2000 and awarded the flexible 3-D IVE that ARCO had been developing since 1990 to the University of Colorado (CU), the winner of a competition among six universities. CU then hired Dr. G. Dorn, the leader of the ARCO team, as Center Director, along with the other experts, to apply 3-D immersive visualization to aerospace and other university research fields, while continuing research on surface interpretation of seismic data and 3-D volumes. This paper recounts further progress and outlines plans in aerospace applications at Lockheed Martin and CU.

  14. Wireless physiological monitoring and ocular tracking: 3D calibration in a fully-immersive virtual health care environment.

    PubMed

    Zhang, Lelin; Chi, Yu Mike; Edelstein, Eve; Schulze, Jurgen; Gramann, Klaus; Velasquez, Alvaro; Cauwenberghs, Gert; Macagno, Eduardo

    2010-01-01

    Wireless physiological/neurological monitoring in virtual reality (VR) offers a unique opportunity for unobtrusively quantifying human responses to precisely controlled and readily modulated VR representations of health care environments. Here we present such a wireless, light-weight head-mounted system for measuring electrooculogram (EOG) and electroencephalogram (EEG) activity in human subjects interacting with and navigating in the Calit2 StarCAVE, a five-sided immersive 3-D visualization VR environment. The system can be easily expanded to include other measurements, such as cardiac activity and galvanic skin responses. We demonstrate the capacity of the system to track focus of gaze in 3-D and report a novel calibration procedure for estimating eye movements from responses to the presentation of a set of dynamic visual cues in the StarCAVE. We discuss cyber and clinical applications that include a 3-D cursor for visual navigation in VR interactive environments, and the monitoring of neurological and ocular dysfunction in vision/attention disorders.
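    The calibration procedure above estimates eye movements from responses to visual cues presented at known positions. As a much-simplified illustration of the principle only (the actual StarCAVE procedure is three-dimensional and more elaborate), a 1D version fits a linear voltage-to-angle map by ordinary least squares:

    ```python
    # Simplified sketch of EOG gaze calibration: record the EOG voltage while
    # the subject fixates cues at known gaze angles, then fit angle = a*v + b
    # by ordinary least squares (closed form for one predictor).
    def fit_linear(voltages, angles):
        """Return slope a and intercept b of the least-squares line."""
        n = len(voltages)
        mean_v = sum(voltages) / n
        mean_a = sum(angles) / n
        s_vv = sum((v - mean_v) ** 2 for v in voltages)
        s_va = sum((v - mean_v) * (a - mean_a)
                   for v, a in zip(voltages, angles))
        slope = s_va / s_vv
        intercept = mean_a - slope * mean_v
        return slope, intercept
    ```

    Once fitted, the map converts subsequent raw EOG samples into estimated gaze angles; a 3D procedure would fit such maps per axis (or a joint multivariate model) from cues distributed through the virtual space.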

  15. Wind Tunnel Data Fusion and Immersive Visualization: A Case Study

    NASA Technical Reports Server (NTRS)

    Severance, Kurt; Brewster, Paul; Lazos, Barry; Keefe, Daniel

    2001-01-01

    This case study describes the process of fusing the data from several wind tunnel experiments into a single coherent visualization. Each experiment was conducted independently and was designed to explore different flow features around airplane landing gear. In the past, it would have been very difficult to correlate results from the different experiments. However, with a single 3-D visualization representing the fusion of the three experiments, significant insight into the composite flowfield was observed that would have been extremely difficult to obtain by studying its component parts. The results are even more compelling when viewed in an immersive environment.

  16. The use of ambient audio to increase safety and immersion in location-based games

    NASA Astrophysics Data System (ADS)

    Kurczak, John Jason

    The purpose of this thesis is to propose an alternative type of interface for mobile software being used while walking or running. Our work addresses the problem of visual user interfaces for mobile software being potentially unsafe for pedestrians, and not being very immersive when used for location-based games. In addition, location-based games and applications can be difficult to develop when directly interfacing with the sensors used to track the user's location. These problems need to be addressed because portable computing devices are becoming a popular tool for navigation, playing games, and accessing the internet while walking. This poses a safety problem for mobile users, who may be paying too much attention to their device to notice and react to hazards in their environment. The difficulty of developing location-based games and other location-aware applications may significantly hinder the prevalence of applications that explore new interaction techniques for ubiquitous computing. We created the TREC toolkit to address the issues with tracking sensors while developing location-based games and applications. We have developed functional location-based applications with TREC to demonstrate the amount of work that can be saved by using this toolkit. In order to have a safer and more immersive alternative to visual interfaces, we have developed ambient audio interfaces for use with mobile applications. Ambient audio uses continuous streams of sound over headphones to present information to mobile users without distracting them from walking safely. In order to test the effectiveness of ambient audio, we ran a study to compare ambient audio with handheld visual interfaces in a location-based game. We compared players' ability to safely navigate the environment, their sense of immersion in the game, and their performance at the in-game tasks. 
We found that ambient audio was able to significantly increase players' safety and sense of immersion compared to a visual interface, while players performed significantly better at the game tasks when using the visual interface. This makes ambient audio a legitimate alternative to visual interfaces for mobile users when safety and immersion are a priority.

  17. JackIn Head: Immersive Visual Telepresence System with Omnidirectional Wearable Camera.

    PubMed

    Kasahara, Shunichi; Nagai, Shohei; Rekimoto, Jun

    2017-03-01

    Sharing one's own immersive experience over the Internet is one of the ultimate goals of telepresence technology. In this paper, we present JackIn Head, a visual telepresence system featuring an omnidirectional wearable camera with image motion stabilization. Spherical omnidirectional video footage taken around the head of a local user is stabilized and then broadcast to others, allowing remote users to explore the immersive visual environment independently of the local user's head direction. We describe the system design of JackIn Head and report the evaluation results of real-time image stabilization and alleviation of cybersickness. Then, through an exploratory observation study, we investigate how individuals can remotely interact, communicate with, and assist each other with our system. We report our observation and analysis of inter-personal communication, demonstrating the effectiveness of our system in augmenting remote collaboration.
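    For equirectangular panoramic video, the core of yaw stabilization reduces to a circular shift of pixel columns proportional to the wearer's head yaw. The sketch below illustrates that idea only; it is not the JackIn Head implementation, which also compensates pitch and roll and operates on live video:

    ```python
    # Illustrative sketch: cancel head yaw in an equirectangular frame by
    # circularly shifting pixel columns. A yaw of 360 degrees corresponds to
    # a shift of the full image width.
    def stabilize_yaw(frame, yaw_deg):
        """frame: list of rows (each a list of pixels).
        Returns the frame rotated so the given head yaw is cancelled."""
        width = len(frame[0])
        shift = int(round(yaw_deg / 360.0 * width)) % width
        return [row[shift:] + row[:shift] for row in frame]
    ```

    In practice the full 3D head orientation (from an IMU or visual tracking) is applied as an inverse rotation on the sphere before reprojection, but the yaw-only case captures the intuition.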

  18. Object Creation and Human Factors Evaluation for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1998-01-01

    The main objective of this project is to provide test objects for simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, AL. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments, along with visualization and manipulation methods, for the purpose of training and testing. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using a data glove, hand controller, or mouse. These simulated objects are solid or surfaced three-dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.

  19. Immersive cyberspace system

    NASA Technical Reports Server (NTRS)

    Park, Brian V. (Inventor)

    1997-01-01

    An immersive cyberspace system is presented which provides visual, audible, and vibrational inputs to a subject remaining in neutral immersion, and also provides for subject control input. The immersive cyberspace system includes a relaxation chair and a neutral immersion display hood. The relaxation chair supports a subject positioned thereupon, and places the subject in a position that merges a neutral body position (the position a body naturally assumes in zero gravity) with a savasana yoga position. The display hood, which covers the subject's head, is configured to produce light images and sounds. An image projection subsystem provides either external or internal image projection. The display hood includes a projection screen moveably attached to an opaque shroud. A motion base supports the relaxation chair and produces vibrational inputs over a range of about 0-30 Hz. The motion base also produces limited translational and rotational movements of the relaxation chair. These limited translational and rotational movements, when properly coordinated with visual stimuli, constitute motion cues which create sensations of pitch, yaw, and roll movements. Vibration transducers produce vibrational inputs from about 20 Hz to about 150 Hz. An external computer, coupled to various components of the immersive cyberspace system, executes a software program and creates the cyberspace environment. One or more neutral hand posture controllers may be coupled to the external computer system and used to control various aspects of the cyberspace environment, or to enter data during the cyberspace experience.

  20. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g., flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various visualization and interaction modes.
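    The abstract does not specify the water-simulation algorithm; a common realtime choice for web and GPU contexts is a height-field model in which each water column accelerates toward the mean height of its neighbors. The 1D sketch below is offered as an assumption about this class of system, not a description of the one above:

    ```python
    # Hedged sketch of one step of a 1D height-field water simulation.
    # Each column accelerates toward the mean of its neighbors' heights,
    # then integrates its velocity; damping < 1 dissipates energy.
    def step(height, velocity, damping=0.99):
        """Return updated (height, velocity) lists for one time step."""
        n = len(height)
        new_h, new_v = height[:], velocity[:]
        for i in range(n):
            left = height[i - 1] if i > 0 else height[i]
            right = height[i + 1] if i < n - 1 else height[i]
            accel = (left + right) / 2.0 - height[i]
            new_v[i] = (velocity[i] + accel) * damping
            new_h[i] = height[i] + new_v[i]
        return new_h, new_v
    ```

    On a GPU the same update runs per texel over a 2D height texture in a fragment or compute shader, which is what makes this scheme attractive for browser-based simulation.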

  1. Headphone and Head-Mounted Visual Displays for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Ellis, Stephen R.; Wenzel, Elizabeth M.; Trejo, Leonard J. (Technical Monitor)

    1998-01-01

    A realistic auditory environment can contribute to both the overall subjective sense of presence in a virtual display, and to a quantitative metric predicting human performance. Here, the role of audio in a virtual display and the importance of auditory-visual interaction are examined. Conjectures are proposed regarding the effectiveness of audio compared to visual information for creating a sensation of immersion, the frame of reference within a virtual display, and the compensation of visual fidelity by supplying auditory information. Future areas of research are outlined for improving simulations of virtual visual and acoustic spaces. This paper will describe some of the intersensory phenomena that arise during operator interaction within combined visual and auditory virtual environments. Conjectures regarding audio-visual interaction will be proposed.

  2. Realistic realtime illumination of complex environment for immersive systems. A case study: the Parthenon

    NASA Astrophysics Data System (ADS)

    Callieri, M.; Debevec, P.; Pair, J.; Scopigno, R.

    2005-06-01

    Offline rendering techniques have now reached an astonishing level of realism, but at the cost of long computation times. The new generation of programmable graphics hardware, on the other hand, makes it possible to implement in real time some of the visual effects previously available only for cinematographic production. A collaboration between the Visual Computing Lab (ISTI-CNR) and the Institute for Creative Technologies of the University of Southern California has developed a realtime demo that replicates a sequence from the short movie "The Parthenon" presented at Siggraph 2004. The application is designed to run on an immersive reality system, making it possible for a user to perceive the virtual environment with cinematographic visual quality. In this paper we present the principal ideas of the project, discussing design issues and the technical solutions used for the realtime demo.

  3. Vroom: designing an augmented environment for remote collaboration in digital cinema production

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy

    2013-03-01

    As media technologies become increasingly affordable, compact, and inherently networked, new generations of telecollaborative platforms continue to arise which integrate these new affordances. Virtual reality has been primarily concerned with creating simulations of environments that can transport participants to real or imagined spaces that replace the "real world". Meanwhile, Augmented Reality systems have evolved to interleave objects from Virtual Reality environments into the physical landscape. Perhaps now there is a new class of systems that reverse this precept to enhance dynamic media landscapes and immersive physical display environments to enable intuitive data exploration through collaboration. Vroom (Virtual Room) is a next-generation reconfigurable tiled display environment in development at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. Vroom enables freely scalable digital collaboratories, connecting distributed, high-resolution visualization resources for collaborative work in the sciences, engineering, and the arts. Vroom transforms a physical space into an immersive media environment with large-format interactive display surfaces, video teleconferencing, and spatialized audio built on a high-speed optical network backbone. Vroom enables group collaboration for local and remote participants to share knowledge and experiences. Possible applications include remote learning, command and control, storyboarding, post-production editorial review, high-resolution video playback, 3D visualization, screencasting, and image, video, and multimedia file sharing. To support these various scenarios, Vroom features support for multiple user interfaces (optical tracking, touch UI, gesture interface, etc.), support for directional and spatialized audio, gigapixel image interactivity, 4K video streaming, 3D visualization, and telematic production. 
This paper explains the design process that has been utilized to make Vroom an accessible and intuitive immersive environment for remote collaboration specifically for digital cinema production.

  4. Designing EvoRoom: An Immersive Simulation Environment for Collective Inquiry in Secondary Science

    NASA Astrophysics Data System (ADS)

    Lui, Michelle Mei Yee

    This dissertation investigates the design of complex inquiry for co-located students to work as a knowledge community within a mixed-reality learning environment. It presents the design of an immersive simulation called EvoRoom and corresponding collective inquiry activities that allow students to explore concepts around topics of evolution and biodiversity in a Grade 11 Biology course. EvoRoom is a room-sized simulation of a rainforest, modeled after Borneo in Southeast Asia, where several projected displays are stitched together to form a large, animated simulation on each opposing wall of the room. This serves to create an immersive environment in which students work collaboratively as individuals, in small groups and a collective community to investigate science topics using the simulations as an evidentiary base. Researchers and a secondary science teacher co-designed a multi-week curriculum that prepared students with preliminary ideas and expertise, then provided them with guided activities within EvoRoom, supported by tablet-based software as well as larger visualizations of their collective progress. Designs encompassed the broader curriculum, as well as all EvoRoom materials (e.g., projected displays, student tablet interfaces, collective visualizations) and activity sequences. This thesis describes a series of three designs that were developed and enacted iteratively over two and a half years, presenting key features that enhanced students' experiences within the immersive environment, their interactions with peers, and their inquiry outcomes. Primary research questions are concerned with the nature of effective design for such activities and environments, and the kinds of interactions that are seen at the individual, collaborative and whole-class levels. 
The findings fall under three themes: 1) the physicality of the room, 2) the pedagogical script for student observation, reflection and collaboration, and 3) ways of including collective visualizations in the activity. Discrete findings demonstrate how these variables, through their design as inquiry components (i.e., activity, room, scripts and scaffolds on devices, collective visualizations), can mediate students' interactions with one another and with their teacher, and affect the outcomes of their inquiry. A set of design recommendations is drawn from the results of this research to guide future design and research efforts.

  5. Game engines and immersive displays

    NASA Astrophysics Data System (ADS)

    Chang, Benjamin; Destefano, Marc

    2014-02-01

    While virtual reality and digital games share many core technologies, the programming environments, toolkits, and workflows for developing games and VR environments are often distinct. VR toolkits designed for applications in visualization and simulation often have a different feature set or design philosophy than game engines, while popular game engines often lack support for VR hardware. Extending a game engine to support systems such as the CAVE gives developers a unified development environment and the ability to easily port projects, but involves challenges beyond just adding stereo 3D visuals. In this paper we outline the issues involved in adapting a game engine for use with an immersive display system including stereoscopy, tracking, and clustering, and present example implementation details using Unity3D. We discuss application development and workflow approaches including camera management, rendering synchronization, GUI design, and issues specific to Unity3D, and present examples of projects created for a multi-wall, clustered, stereoscopic display.
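The stereo challenge noted above goes beyond toggling a stereoscopic camera: each tracked eye needs its own asymmetric ("off-axis") frustum relative to the fixed physical screen. As a rough illustrative sketch only (not code from the paper; the wall layout, function names and the 6.5 cm eye separation are assumptions), the similar-triangles construction for one wall looks like:

```python
def off_axis_frustum(eye, wall_left, wall_right, wall_bottom, wall_top, near):
    """Frustum extents (l, r, b, t) at the near plane for a tracked eye
    viewing a wall that lies in the z = 0 plane of wall space.
    eye = (x, y, z) with z > 0 (the eye is in front of the wall)."""
    ex, ey, ez = eye
    scale = near / ez  # similar triangles: project wall edges onto near plane
    return ((wall_left - ex) * scale, (wall_right - ex) * scale,
            (wall_bottom - ey) * scale, (wall_top - ey) * scale)

# Two eyes offset by an assumed ~6.5 cm interpupillary distance give the
# stereo pair of frusta for a 3 m x 3 m wall:
ipd = 0.065
head = (0.0, 1.6, 2.0)  # tracked head position in meters (illustrative)
frustum_left = off_axis_frustum((head[0] - ipd / 2, head[1], head[2]),
                                -1.5, 1.5, 0.0, 3.0, 0.1)
frustum_right = off_axis_frustum((head[0] + ipd / 2, head[1], head[2]),
                                 -1.5, 1.5, 0.0, 3.0, 0.1)
```

The resulting extents would feed a projection matrix of the `glFrustum` kind; as the head moves, both frusta are recomputed every frame so the imagery stays registered to the physical wall.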

  6. Immersive Environment Technologies for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Wright, John R.; Hartman, Frank

    2000-01-01

JPL's charter includes the unmanned exploration of the Solar System. One of the tools for exploring other planets is the rover, as exemplified by Sojourner on the Mars Pathfinder mission. The light-speed turnaround time between Earth and the outer planets precludes teleoperated rovers, so autonomous operations are built into the current and upcoming generations of devices. As the level of autonomy increases, the mode of operations shifts from low-level specification of activities to higher-level specification of goals. To support this higher-level activity, the operator must be given an effective understanding of the in-situ environment as well as the tools needed to specify the higher-level goals. Immersive environments provide the sense of presence needed to achieve this. The use of immersive environments at JPL has two main thrusts, both discussed in this talk. One is the generation of 3D models of the in-situ environment, in particular the merging of models from different sensors, different modes (orbital, descent, and lander), and even different missions. The other is the use of various tools to visualize the environment within which the rover will operate, to maximize the operator's understanding. A suite of tools is under development that provides an integrated view into the environment together with a variety of visualization modes, allowing the operator to switch smoothly from one mode to another depending on the information and presentation desired.

  7. PC-Based Virtual Reality for CAD Model Viewing

    ERIC Educational Resources Information Center

    Seth, Abhishek; Smith, Shana S.-F.

    2004-01-01

    Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…

  8. Multivariate Gradient Analysis for Evaluating and Visualizing a Learning System Platform for Computer Programming

    ERIC Educational Resources Information Center

    Mather, Richard

    2015-01-01

    This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses "Ceebot," an animated and immersive game-like development environment. Multivariate…

  9. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. The Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data because of the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  10. Immersive volume rendering of blood vessels

    NASA Astrophysics Data System (ADS)

    Long, Gregory; Kim, Han Suk; Marsden, Alison; Bazilevs, Yuri; Schulze, Jürgen P.

    2012-03-01

In this paper, we present a novel method of visualizing flow in blood vessels. Our approach reads unstructured tetrahedral data, resamples it, and uses slice-based 3D texture volume rendering. Because of the sparse structure of blood vessels, we use an octree to store the resampled data efficiently, discarding empty regions of the volume. We use animation to convey time-series data and a wireframe surface to convey structure, and we use the StarCAVE, a 3D virtual reality environment, to add a fully immersive element to the visualization. Our tool has great value in interdisciplinary work, helping scientists collaborate with clinicians by improving the understanding of blood flow simulations. Full immersion in the flow field allows a more intuitive understanding of the flow phenomena and can be a great help to medical experts for treatment planning.
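The space-saving idea the abstract describes, keeping only the occupied subvolumes of the resampled grid, can be sketched roughly as follows (illustrative only, not the authors' code; the sparse-dictionary input format and the density threshold are assumptions):

```python
def build_octree(vol, x, y, z, size, threshold=0.0):
    """Recursively build an octree over a cubic region of the volume.

    vol: dict mapping (x, y, z) -> density, holding only non-empty voxels.
    Returns None for an entirely empty region (the subtree is discarded),
    a density value for an occupied unit-voxel leaf, or a list of 8 children.
    """
    if size == 1:
        v = vol.get((x, y, z), 0.0)
        return v if v > threshold else None
    half = size // 2
    children = [build_octree(vol, x + dx, y + dy, z + dz, half, threshold)
                for dx in (0, half) for dy in (0, half) for dz in (0, half)]
    # Prune: if every octant is empty, store nothing for this whole region.
    return None if all(c is None for c in children) else children

# Two occupied voxels in a 4x4x4 volume: only the octants containing
# them survive; the other six subtrees collapse to None.
tree = build_octree({(0, 0, 0): 1.0, (3, 3, 3): 0.5}, 0, 0, 0, 4)
```

For vessel data, where most of the bounding volume is empty, memory then scales with the number of occupied voxels rather than with the full grid.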

  11. Using Auditory Cues to Perceptually Extract Visual Data in Collaborative, Immersive Big-Data Display Systems

    NASA Astrophysics Data System (ADS)

    Lee, Wendy

The advent of multisensory display systems, such as virtual and augmented reality, has fostered a new relationship between humans and space. Not only can these systems mimic real-world environments, they can create a new space typology made solely of data. In these spaces, two-dimensional information is displayed in three dimensions, requiring human senses to be used to understand virtual, attention-based elements. Studies in the field of big data have predominantly focused on visual representation and extraction of information, with little focus on sound. The goal of this research is to evaluate the most efficient methods of perceptually extracting visual data using auditory stimuli in immersive environments. Using Rensselaer's CRAIVE-Lab, a virtual reality space with 360-degree panoramic visuals and an array of 128 loudspeakers, participants were asked questions based on complex visual displays accompanied by a variety of auditory cues, ranging from sine tones to camera shutter sounds. Analysis of the speed and accuracy of participant responses revealed that auditory cues that were easier to localize and were positively perceived were best for data extraction and could help create more user-friendly systems in the future.

  12. Wayfinding and Glaucoma: A Virtual Reality Experiment.

    PubMed

    Daga, Fábio B; Macagno, Eduardo; Stevenson, Cory; Elhosseiny, Ahmed; Diniz-Filho, Alberto; Boer, Erwin R; Schulze, Jürgen; Medeiros, Felipe A

    2017-07-01

Wayfinding, the process of determining and following a route between an origin and a destination, is an integral part of everyday tasks. The purpose of this study was to investigate the impact of glaucomatous visual field loss on wayfinding behavior using an immersive virtual reality (VR) environment. This cross-sectional study included 31 glaucoma patients and 20 healthy subjects without evidence of overall cognitive impairment. Wayfinding experiments were modeled after the Morris water maze navigation task and conducted in an immersive VR environment. Two rooms were built, varying only in the complexity of the visual scene, in order to promote allocentric-based (room A, with multiple visual cues) versus egocentric-based (room B, with a single visual cue) spatial representations of the environment. Wayfinding tasks in each room consisted of revisiting previously visible targets that subsequently became invisible. For room A, glaucoma patients spent on average 35.0 seconds to perform the wayfinding task, whereas healthy subjects spent an average of 24.4 seconds (P = 0.001). For room B, no statistically significant difference was seen in the average time to complete the task (26.2 seconds versus 23.4 seconds, respectively; P = 0.514). For room A, each 1-dB worsening in binocular mean sensitivity was associated with a 3.4% increase in time to complete the task (P = 0.001). Glaucoma patients performed significantly worse on allocentric-based wayfinding tasks conducted in a VR environment, suggesting that visual field loss may affect the construction of spatial cognitive maps relevant to successful wayfinding. VR environments may represent a useful approach for assessing functional vision endpoints in clinical trials of emerging therapies in ophthalmology.

  13. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2011-03-01

Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to developments in a number of leading-edge technologies and their synergistic combination and convergence. These technologies include ubiquitous, cloud, and petascale computing; ultra-high-bandwidth networks and pervasive wireless communication; knowledge-based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes these frontier and emerging simulation technologies and their role in future virtual product creation and learning/training environments. These will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and will stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.

  14. Immersive Virtual Moon Scene System Based on Panoramic Camera Data of Chang'E-3

    NASA Astrophysics Data System (ADS)

    Gao, X.; Liu, J.; Mu, L.; Yan, W.; Zeng, X.; Zhang, X.; Li, C.

    2014-12-01

The system "Immersive Virtual Moon Scene" presents a virtual lunar surface in an immersive environment. Utilizing stereo 360-degree imagery from the panoramic camera of the Yutu rover, the system enables the operator to visualize the terrain and the celestial background from the rover's point of view in 3D. To avoid image distortion, the stereo 360-degree panorama, stitched from 112 images, is projected onto the inside surface of a sphere according to the panorama orientation coordinates and camera parameters to build the virtual scene. Because stars are visible from the Moon at any time, we render the Sun, planets and stars on the sphere as the background, according to the time and the rover's location, based on the Hipparcos catalogue. Immersed in the stereo virtual environment created by this image-based rendering technique, the operator can zoom and pan to interact with the virtual Moon scene and mark interesting objects. The hardware of the immersive virtual Moon system comprises four high-lumen projectors and a large curved screen 31 meters long and 5.5 meters high. This system, which takes all available panoramic camera data to create an immersive environment in which the operator can interact with the scene and mark interesting objects, contributed heavily to the establishment of science mission goals in the Chang'E-3 mission. After the Chang'E-3 mission, the laboratory housing this system will be open to the public. In addition, stereo animations of lunar terrain based on Chang'E-1 and Chang'E-2 data will be shown to the public on the large screen in the laboratory. Based on lunar exploration data, we will create more immersive virtual Moon scenes and animations to help the public learn more about the Moon in the future.
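Projecting a stitched panorama onto the inside of a sphere amounts to mapping each panorama pixel to a viewing direction from the rover. A minimal sketch of that mapping, assuming a standard equirectangular layout (this is not the mission software; the function name and conventions are illustrative):

```python
import math

def panorama_to_direction(u, v, width, height):
    """Map an equirectangular panorama pixel (u: column, v: row) to a unit
    viewing direction, as when texturing the inside surface of a sphere.
    Row 0 is straight up (+y); the center column looks along +z."""
    lon = (u / width) * 2.0 * math.pi - math.pi    # longitude: -pi .. pi
    lat = math.pi / 2.0 - (v / height) * math.pi   # latitude: +pi/2 .. -pi/2
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

# The center pixel of a 1024x512 panorama looks straight ahead:
forward = panorama_to_direction(512, 256, 1024, 512)
```

Celestial objects would then be drawn on the same sphere by converting their catalogue coordinates, for the given time and rover location, into the same direction space.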

  15. Stereoscopic applications for design visualization

    NASA Astrophysics Data System (ADS)

    Gilson, Kevin J.

    2007-02-01

Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinckerhoff (PB) is a transportation consulting firm that has used digital visualization tools from their inception and has helped pioneer their application to large-scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system, and shutter-glass systems for smaller groups. PB uses commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo, creating more immersive and spatially realistic presentations of the proposed designs. This paper presents the basic display tools and applications, and the 3D modeling techniques PB uses to produce interactive stereoscopic content, and discusses several architectural and engineering design visualizations we have produced.

  16. Postural Hypo-Reactivity in Autism Is Contingent on Development and Visual Environment: A Fully Immersive Virtual Reality Study

    ERIC Educational Resources Information Center

    Greffou, Selma; Bertone, Armando; Hahler, Eva-Maria; Hanssens, Jean-Marie; Mottron, Laurent; Faubert, Jocelyn

    2012-01-01

    Although atypical motor behaviors have been associated with autism, investigations regarding their possible origins are scarce. This study assessed the visual and vestibular components involved in atypical postural reactivity in autism. Postural reactivity and stability were measured for younger (12-15 years) and older (16-33 years) autistic…

  17. Immersive Interaction, Manipulation and Analysis of Large 3D Datasets for Planetary and Earth Sciences

    NASA Astrophysics Data System (ADS)

    Pariser, O.; Calef, F.; Manning, E. M.; Ardulov, V.

    2017-12-01

We present the implementation and study of several use cases of virtual reality (VR) for immersive display, interaction and analysis of large, complex 3D datasets acquired by instruments across several Earth, planetary and solar space robotics missions. First, we describe the architecture of the common application framework developed to load data, interface with VR display devices, and program input controllers in various computing environments. We contrast tethered and portable VR technologies and highlight the advantages of each. We then present experimental immersive-analytics visual constructs that augment 3D datasets with 2D ones, such as images and statistical and abstract data. We conclude with a comparative analysis against traditional visualization applications and share the feedback provided by our users: scientists and engineers.

  18. Visual Perspectives within Educational Computer Games: Effects on Presence and Flow within Virtual Immersive Learning Environments

    ERIC Educational Resources Information Center

    Scoresby, Jon; Shelton, Brett E.

    2011-01-01

    The mis-categorizing of cognitive states involved in learning within virtual environments has complicated instructional technology research. Further, most educational computer game research does not account for how learning activity is influenced by factors of game content and differences in viewing perspectives. This study is a qualitative…

  19. CAVE2: a hybrid reality environment for immersive simulation and information analysis

    NASA Astrophysics Data System (ADS)

    Febretti, Alessandro; Nishimoto, Arthur; Thigpen, Terrance; Talandis, Jonas; Long, Lance; Pirtle, J. D.; Peterka, Tom; Verlo, Alan; Brown, Maxine; Plepys, Dana; Sandin, Dan; Renambot, Luc; Johnson, Andrew; Leigh, Jason

    2013-03-01

Hybrid Reality Environments represent a new kind of visualization space that blurs the line between virtual environments and high-resolution tiled display walls. This paper outlines the design and implementation of the CAVE2(TM) Hybrid Reality Environment. CAVE2 is the world's first near-seamless flat-panel-based, surround-screen immersive system. Unique to CAVE2 is that it enables users to view 2D and 3D information simultaneously, providing more flexibility for mixed-media applications. CAVE2 is a cylindrical system 24 feet in diameter and 8 feet tall, consisting of 72 near-seamless, off-axis-optimized passive stereo LCD panels that create an approximately 320-degree panoramic environment for displaying information at 37 megapixels (in stereoscopic 3D) or 74 megapixels (in 2D), at a horizontal visual acuity of 20/20. Custom LCD panels with shifted polarizers were built so that the images in the top and bottom rows of LCDs are optimized for vertical off-center viewing, allowing viewers to come closer to the displays while minimizing ghosting. CAVE2 is designed to support multiple operating modes. In the fully immersive mode, the entire room can be dedicated to one virtual simulation. In 2D mode, the room can operate like a traditional tiled display wall, enabling users to work with large numbers of documents at the same time. In the hybrid mode, a mixture of 2D and 3D applications can be supported simultaneously. The ability to treat immersive work spaces in this hybrid way has not been achieved before, and it leverages the special abilities of CAVE2 to enable researchers to interact seamlessly with large collections of 2D and 3D data. To realize this hybrid ability, we merged the Scalable Adaptive Graphics Environment (SAGE), a system for supporting 2D tiled displays, with Omegalib, a virtual reality middleware supporting OpenGL, OpenSceneGraph and VTK applications.

  20. Image-guided surgery.

    PubMed

    Wagner, A; Ploder, O; Enislidis, G; Truppe, M; Ewers, R

    1996-04-01

    Interventional video tomography (IVT), a new imaging modality, achieves virtual visualization of anatomic structures in three dimensions for intraoperative stereotactic navigation. Partial immersion into a virtual data space, which is orthotopically coregistered to the surgical field, enhances, by means of a see-through head-mounted display (HMD), the surgeon's visual perception and technique by providing visual access to nonvisual data of anatomy, physiology, and function. The presented cases document the potential of augmented reality environments in maxillofacial surgery.

  1. Molecular Dynamics Visualization (MDV): Stereoscopic 3D Display of Biomolecular Structure and Interactions Using the Unity Game Engine.

    PubMed

    Wiebrands, Michael; Malajczuk, Chris J; Woods, Andrew J; Rohl, Andrew L; Mancera, Ricardo L

    2018-06-21

Molecular graphics systems are visualization tools which, upon integration into a 3D immersive environment, provide a unique virtual reality experience for research and teaching of biomolecular structure, function and interactions. We have developed a molecular structure and dynamics application, the Molecular Dynamics Visualization tool, that uses the Unity game engine combined with large-scale, multi-user, stereoscopic visualization systems to deliver an immersive display experience, particularly with a large cylindrical projection display. The application is structured to separate the biomolecular modeling and visualization systems. The biomolecular model loading and analysis system was developed as a stand-alone C# library and provides the foundation for the custom visualization system built in Unity. All visual models displayed within the tool are generated using Unity-based procedural mesh building routines. A 3D user interface allows seamless dynamic interaction with the model while it is viewed in 3D space. Biomolecular structure analysis and display capabilities are exemplified with a range of complex systems involving cell membranes, protein folding and lipid droplets.
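Procedural mesh building of the kind described, done in C#/Unity in the tool itself, boils down to emitting vertex and triangle-index lists. A minimal illustrative sketch (in Python, with assumed names; not the authors' code) for a UV sphere such as a molecular viewer might generate per atom:

```python
import math

def uv_sphere(radius, rings, segments):
    """Build vertex and triangle-index lists for a sphere of the given radius.

    Vertices are laid out ring by ring from the north pole to the south pole;
    each quad between adjacent rings is split into two triangles, the same
    data a game engine mesh object consumes."""
    verts = []
    for r in range(rings + 1):
        lat = math.pi * r / rings              # 0 (north pole) .. pi (south)
        for s in range(segments + 1):          # duplicate seam column for UVs
            lon = 2.0 * math.pi * s / segments
            verts.append((radius * math.sin(lat) * math.cos(lon),
                          radius * math.cos(lat),
                          radius * math.sin(lat) * math.sin(lon)))
    tris = []
    for r in range(rings):
        for s in range(segments):
            a = r * (segments + 1) + s         # quad corner on ring r
            b = a + segments + 1               # corresponding corner on ring r+1
            tris.append((a, b, a + 1))
            tris.append((a + 1, b, b + 1))
    return verts, tris

verts, tris = uv_sphere(1.0, 4, 8)
```

In Unity the equivalent step would assign such lists to a `Mesh`'s vertex and triangle arrays; the sketch only shows the geometry generation itself.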

  2. Effect of visual distortion on postural balance in a full immersion stereoscopic environment

    NASA Astrophysics Data System (ADS)

    Faubert, Jocelyn; Allard, Remy

    2004-05-01

This study attempted to determine the influence of non-linear visual movements on our capacity to maintain postural control. An 8x8x8-foot CAVE immersive virtual environment was used. Body sway recordings were obtained for both head and lower back (lumbar 2-3) positions. The subjects were presented with visual stimuli for periods of 62.5 seconds and were asked to stand still on one foot while viewing stimuli consisting of multiplied sine waves generating an undulating movement of a textured surface (waves moving in a checkerboard pattern). Three wave amplitudes were tested: 4 feet, 2 feet, and 1 foot. Two viewing conditions were used: observers looking 36 inches in front of their feet, and observers looking at a distance near the horizon. The results were compiled using an instability index, and the data showed a profound and consistent effect of visual disturbance on postural balance, in particular for the x (side-to-side) movement. We have demonstrated that non-linear visual distortions, similar to those generated by progressive ophthalmic lenses of the kind used for presbyopia correction, can generate significant postural instability. This instability is particularly evident for side-to-side body movement and is most evident in the near viewing condition.
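The "multiplied sine waves" stimulus can be understood as the product of two orthogonal sine waves, which produces the undulating checkerboard pattern described. A minimal sketch of such a surface-height function, with assumed wavelength and period parameters (not the authors' values; all names are illustrative):

```python
import math

def surface_height(x, z, t, amplitude, wavelength=2.0, period=10.0):
    """Height of an undulating textured surface at position (x, z), time t.

    Two orthogonal sine waves are multiplied, so the displacement peaks in a
    checkerboard-like grid of cells that drifts over time; `amplitude` bounds
    the displacement, as in the study's 4-, 2- and 1-foot conditions."""
    k = 2.0 * math.pi / wavelength   # spatial frequency
    w = 2.0 * math.pi / period       # temporal frequency
    return amplitude * math.sin(k * x + w * t) * math.sin(k * z + w * t)

# At the center of a cell the two factors are both 1, so the full
# amplitude is reached:
peak = surface_height(0.5, 0.5, 0.0, 4.0)
```

Sampling this function over a grid of vertices each frame would animate the textured surface used as the visual disturbance.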

  3. Immersive 3D Visualization of Astronomical Data

    NASA Astrophysics Data System (ADS)

    Schaaff, A.; Berthier, J.; Da Rocha, J.; Deparis, N.; Derriere, S.; Gaultier, P.; Houpin, R.; Normand, J.; Ocvirk, P.

    2015-09-01

Immersive 3D visualization, or virtual reality in our study, was previously dedicated to specific uses (research, flight simulators, etc.), and the investment in infrastructure and its cost were reserved to large laboratories or companies. Recently we have seen the development of immersive 3D headsets intended for wide distribution, for example the Oculus Rift and the Sony Morpheus projects. The usual reaction is to say that these tools are primarily intended for games, since it is easy to imagine a player in a virtual environment and the added value over conventional 2D screens. Yet it is likely that there are many applications in the professional field if these tools become common. Introducing this technology into existing applications or new developments makes sense only if the interest is properly evaluated. The use in astronomy is clear for education: it is easy to imagine mobile, lightweight planetariums, or reproducing poorly accessible environments (e.g., large instruments). In contrast, in professional astronomy the use is probably less obvious, and studies are required to determine the most appropriate applications and to assess the contributions compared with other display modes.

  4. NASA Virtual Glovebox: An Immersive Virtual Desktop Environment for Training Astronauts in Life Science Experiments

    NASA Technical Reports Server (NTRS)

    Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard

    2003-01-01

The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real-time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.

  5. Multi-arm multilateral haptics-based immersive tele-robotic system (HITS) for improvised explosive device disposal

    NASA Astrophysics Data System (ADS)

    Erickson, David; Lacheray, Hervé; Lai, Gilbert; Haddadi, Amir

    2014-06-01

    This paper presents the latest advancements of the Haptics-based Immersive Tele-robotic System (HITS) project, a next generation Improvised Explosive Device (IED) disposal (IEDD) robotic interface containing an immersive telepresence environment for a remotely-controlled three-articulated-robotic-arm system. While the haptic feedback enhances the operator's perception of the remote environment, a third teleoperated dexterous arm, equipped with multiple vision sensors and cameras, provides stereo vision with proper visual cues, and a 3D photo-realistic model of the potential IED. This decentralized system combines various capabilities including stable and scaled motion, singularity avoidance, cross-coupled hybrid control, active collision detection and avoidance, compliance control and constrained motion to provide a safe and intuitive control environment for the operators. Experimental results and validation of the current system are presented through various essential IEDD tasks. This project demonstrates that a two-armed anthropomorphic Explosive Ordnance Disposal (EOD) robot interface can achieve complex neutralization techniques against realistic IEDs without the operator approaching at any time.

  6. Stereoscopic display of 3D models for design visualization

    NASA Astrophysics Data System (ADS)

    Gilson, Kevin J.

    2006-02-01

Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinckerhoff (PB) is a transportation consulting firm that has used digital visualization tools from their inception and has helped pioneer their application to large-scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system, and shutter-glass systems for smaller groups. PB uses commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo, creating more immersive and spatially realistic presentations of the proposed designs. This paper presents the basic display tools and applications, and the 3D modeling techniques PB uses to produce interactive stereoscopic content, and discusses several architectural and engineering design visualizations we have produced.

  7. Postural and Spatial Orientation Driven by Virtual Reality

    PubMed Central

    Keshner, Emily A.; Kenyon, Robert V.

    2009-01-01

    Orientation in space is a perceptual variable intimately related to postural orientation that relies on visual and vestibular signals to correctly identify our position relative to vertical. We have combined a virtual environment with motion of a posture platform to produce visual-vestibular conditions that allow us to explore how motion of the visual environment may affect perception of vertical and, consequently, affect postural stabilizing responses. In order to involve a higher level perceptual process, we needed to create a visual environment that was immersive. We did this by developing visual scenes that possess contextual information using color, texture, and 3-dimensional structures. Update latency of the visual scene was close to physiological latencies of the vestibulo-ocular reflex. Using this system we found that even when healthy young adults stand and walk on a stable support surface, they are unable to ignore wide field of view visual motion and they adapt their postural orientation to the parameters of the visual motion. Balance training within our environment elicited measurable rehabilitation outcomes. Thus we believe that virtual environments can serve as a clinical tool for evaluation and training of movement in situations that closely reflect conditions found in the physical world. PMID:19592796

  8. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D

    PubMed Central

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron

    2017-01-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with the concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. PMID:28814063

  9. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D.

    PubMed

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron; Gümüs, Zeynep H

    2017-08-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with the concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. © The Authors 2017. Published by Oxford University Press.
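The 3D extensions of 2D layout algorithms that iCAVE describes can be illustrated with an off-the-shelf analogue: NetworkX's force-directed (spring) layout accepts a `dim` parameter, so the same algorithm that places nodes in a plane can place them in a volume. The toy graph below is purely illustrative and is not iCAVE's code or data:

```python
import networkx as nx

# Toy interaction network (edges are illustrative, not a real dataset)
G = nx.Graph([("A", "B"), ("B", "C"), ("C", "A"), ("C", "D"), ("D", "E")])

# The classic 2D force-directed layout, computed in three dimensions
pos3d = nx.spring_layout(G, dim=3, seed=42)

for node in sorted(pos3d):
    x, y, z = pos3d[node]
    print(f"{node}: ({x:+.2f}, {y:+.2f}, {z:+.2f})")
```

The resulting XYZ coordinates could then be handed to any 3D, stereoscopic, or CAVE renderer.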

  10. Multisensory Integration in the Virtual Hand Illusion with Active Movement

    PubMed Central

    Satoh, Satoru; Hachimura, Kozaburo

    2016-01-01

    Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. A virtual xylophone playing system which can interactively provide synchronous visual, tactile, and auditory stimulation was constructed. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality. PMID:27847822

  11. The interplays among technology and content, immersant and VE

    NASA Astrophysics Data System (ADS)

    Song, Meehae; Gromala, Diane; Shaw, Chris; Barnes, Steven J.

    2010-01-01

    The research program aims to explore and examine the fine balance necessary for maintaining the interplays between technology and the immersant, including identifying qualities that contribute to creating and maintaining a sense of "presence" and "immersion" in an immersive virtual reality (IVR) experience. Building upon and extending previous work, we compare sitting meditation with walking meditation in a virtual environment (VE). The Virtual Meditative Walk, a new work-in-progress, integrates VR and biofeedback technologies with a self-directed, uni-directional treadmill. As immersants learn how to meditate while walking, robust, real-time biofeedback technology continuously measures breathing, skin conductance and heart rate. The physiological states of the immersant will in turn affect the audio and stereoscopic visual media through shutter glasses. We plan to test the potential benefits and limitations of this physically active form of meditation with data from a sitting form of meditation. A mixed-methods approach to testing user outcomes parallels the knowledge bases of the collaborative team: a physician, computer scientists and artists.

  12. Envisioning the future of home care: applications of immersive virtual reality.

    PubMed

    Brennan, Patricia Flatley; Arnott Smith, Catherine; Ponto, Kevin; Radwin, Robert; Kreutz, Kendra

    2013-01-01

    Accelerating the design of technologies to support health in the home requires (1) a better understanding of how the household context shapes consumer health behaviors and (2) the opportunity for engineers, designers, and health professionals to systematically study the home environment. We developed the Living Environments Laboratory (LEL) with a fully immersive, six-sided virtual reality CAVE to enable recreation of a broad range of household environments. We have successfully developed a virtual apartment, including a kitchen, living space, and bathroom. Over 2000 people have visited the LEL CAVE. Participants use an electronic wand to activate common household affordances such as opening a refrigerator door or lifting a cup. Challenges currently being explored include creating natural gestures to interface with virtual objects, developing robust, simple procedures to capture actual living environments and render them in a 3D visualization, and devising systematic, stable terminologies to characterize home environments.

  13. Chemistry in Second Life

    PubMed Central

    Lang, Andrew SID; Bradley, Jean-Claude

    2009-01-01

    This review will focus on the current level of chemistry research, education, and visualization possible within the multi-user virtual environment of Second Life. We discuss how Second Life has been used as a platform for the interactive and collaborative visualization of data from molecules and proteins to spectra and experimental data. We then review how these visualizations can be scripted for immersive educational activities and real-life collaborative research. We also discuss the benefits of the social networking affordances of Second Life for both chemists and chemistry students. PMID:19852781

  14. Chemistry in second life.

    PubMed

    Lang, Andrew S I D; Bradley, Jean-Claude

    2009-10-23

    This review will focus on the current level of chemistry research, education, and visualization possible within the multi-user virtual environment of Second Life. We discuss how Second Life has been used as a platform for the interactive and collaborative visualization of data from molecules and proteins to spectra and experimental data. We then review how these visualizations can be scripted for immersive educational activities and real-life collaborative research. We also discuss the benefits of the social networking affordances of Second Life for both chemists and chemistry students.

  15. The IQ-wall and IQ-station -- harnessing our collective intelligence to realize the potential of ultra-resolution and immersive visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric A. Wernert; William R. Sherman; Chris Eller

    2012-03-01

    We present a pair of open-recipe, affordably-priced, easy-to-integrate, and easy-to-use visualization systems. The IQ-wall is an ultra-resolution tiled display wall that scales up to 24 screens with a single PC. The IQ-station is a semi-immersive display system that utilizes commodity stereoscopic displays, lower-cost tracking systems, and touch overlays. These systems have been designed to support a wide range of research, education, creative activities, and information presentations. They were designed to work equally well as stand-alone installations or as part of a larger distributed visualization ecosystem. We detail the hardware and software components of these systems, describe our deployments and experiences in a variety of research lab and university environments, and share our insights for effective support and community development.

  16. How incorporation of scents could enhance immersive virtual experiences

    PubMed Central

    Ischer, Matthieu; Baron, Naëm; Mermoud, Christophe; Cayeux, Isabelle; Porcherot, Christelle; Sander, David; Delplanque, Sylvain

    2014-01-01

    Under normal everyday conditions, the senses all work together to create the experiences that fill a typical person's life. Unfortunately for the behavioral and cognitive researchers who investigate such experiences, standard laboratory tests are usually conducted in a nondescript room in front of a computer screen, far from replicating the complexity of real-world experiences. Recently, immersive virtual reality (IVR) environments have become a promising method for immersing people in an almost real environment that involves more senses. IVR environments share many similarities with the complexity of the real world while allowing experimenters to constrain experimental parameters to obtain empirical data. This can eventually lead to better treatment options and/or new mechanistic hypotheses. The idea that increasing sensory modalities improves the realism of IVR environments has been empirically supported, but the senses used did not usually include olfaction. In this technology report, we present an odor delivery system applied to a state-of-the-art IVR technology. The platform provides a three-dimensional, immersive, and fully interactive visualization environment called “Brain and Behavioral Laboratory—Immersive System” (BBL-IS). The solution we propose can reliably deliver various complex scents during different virtual scenarios, at a precise time and location and without contamination of the environment. The main features of this platform are: (i) limited cross-contamination between odorant streams with fast odor delivery (< 500 ms), (ii) ease of use and control, and (iii) the possibility of synchronizing odorant delivery with pictures, videos, or sounds. We also address how this unique technology could be used to investigate typical research questions in olfaction (e.g., emotional elicitation, memory encoding, or attentional capture by scents). PMID:25101017

  17. The big picture: effects of surround on immersion and size perception.

    PubMed

    Baranowski, Andreas M; Hecht, Heiko

    2014-01-01

    Despite the fear of the entertainment industry that illegal downloads of films might ruin their business, going to the movies continues to be a popular leisure activity. One reason why people prefer to watch movies in cinemas may be the surround of the movie screen or its physically huge size. To disentangle the factors that might contribute to the size impression, we tested several measures of subjective size and immersion in different viewing environments. For this purpose we built a model cinema that provided visual angle information comparable with that of a real cinema. Subjects watched identical movie clips in a real cinema, a model cinema, and on a display monitor in isolation. Whereas the isolated display monitor was inferior, the addition of a contextual model improved the viewing immersion to the extent that it was comparable with the movie theater experience, provided the viewing angle remained the same. In a further study we built an identical but even smaller model cinema to unconfound visual angle and viewing distance. Both model cinemas produced similar results. There was a trend for the larger screen to be more immersive; however, viewing angle did not play a role in how the movie was evaluated.

  18. Locomotive Recalibration and Prism Adaptation of Children and Teens in Immersive Virtual Environments.

    PubMed

    Adams, Haley; Narasimham, Gayathri; Rieser, John; Creem-Regehr, Sarah; Stefanucci, Jeanine; Bodenheimer, Bobby

    2018-04-01

    As virtual reality expands in popularity, an increasingly diverse audience is gaining exposure to immersive virtual environments (IVEs). A significant body of research has demonstrated how perception and action work in such environments, but most of this work has been done studying adults. Less is known about how physical and cognitive development affect perception and action in IVEs, particularly as applied to preteen and teenage children. Accordingly, in the current study we assess how preteens (children aged 8-12 years) and teenagers (children aged 15-18 years) respond to mismatches between their motor behavior and the visual information presented by an IVE. Over two experiments, we evaluate how these individuals recalibrate their actions across functionally distinct systems of movement. The first experiment analyzed forward walking recalibration after exposure to an IVE with either increased or decreased visual flow. Visual flow during normal bipedal locomotion was manipulated to be either twice or half as fast as the physical gait. The second experiment leveraged a prism throwing adaptation paradigm to test the effect of recalibration on throwing movement. In the first experiment, our results show no differences across age groups, although subjects generally experienced a post-exposure effect of shortened distance estimation after experiencing visually faster flow and longer distance estimation after experiencing visually slower flow. In the second experiment, subjects generally showed the typical prism adaptation behavior of a throwing after-effect error. The error lasted longer for preteens than for older children. Our results have implications for the design of virtual systems with children as a target audience.
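The visual-flow manipulation in the first experiment amounts to scaling the virtual camera's translation relative to physical gait. A sketch of that gain (the 2.0 and 0.5 values come from the abstract; the function name is hypothetical):

```python
def virtual_translation(physical_step, flow_gain):
    """Scale a physical step vector (meters) into virtual camera motion.

    flow_gain=2.0 reproduces the "visually faster" condition (virtual
    flow twice the physical gait); flow_gain=0.5 the "slower" one."""
    return [flow_gain * component for component in physical_step]

faster = virtual_translation([0.0, 0.0, 0.7], 2.0)  # a 0.7 m stride, doubled
slower = virtual_translation([0.0, 0.0, 0.7], 0.5)  # the same stride, halved
```

Applying this gain each frame yields continuous optic flow that is faster or slower than the walker's true movement.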

  19. Evaluating an immersive virtual environment prototyping and simulation system

    NASA Astrophysics Data System (ADS)

    Nemire, Kenneth

    1997-05-01

    An immersive virtual environment (IVE) modeling and simulation tool is being developed for designing advanced weapon and training systems. One unique feature of the tool is that the design, and not just visualization of the design, is accomplished with the IVE tool. Acceptance of IVE tools requires comparisons with current commercial applications. In this pilot study, expert users of a popular desktop 3D graphics application performed identical modeling and simulation tasks using both the desktop and IVE applications. The IVE tool consisted of a head-mounted display, 3D spatialized sound, spatial trackers on head and hands, instrumented gloves, and a simulated speech recognition system. The results are preliminary because performance from only four users has been examined. When using the IVE system, users completed the tasks to criteria in less time than when using the desktop application. Subjective ratings of the visual displays in each system were similar. Ratings for the desktop controls were higher than for the IVE controls. Ratings of immersion and user enjoyment were higher for the IVE than for the desktop application. These results are particularly remarkable because participants had used the desktop application regularly for three to five years and the prototype IVE tool for only three to six hours.

  20. Sensor Spatial Distortion, Visual Latency, and Update Rate Effects on 3D Tracking in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Ellis, S. R.; Adelstein, B. D.; Baumeler, S.; Jense, G. J.; Jacoby, R. H.; Trejo, Leonard (Technical Monitor)

    1998-01-01

    Several common defects that we have sought to minimize in immersive virtual environments are: static sensor spatial distortion, visual latency, and low update rates. Human performance within our environments during large-amplitude 3D tracking was assessed by objective and subjective methods in the presence and absence of these defects. Results show that 1) removal of our relatively small spatial sensor distortion had minor effects on the tracking activity, 2) an Adapted Cooper-Harper controllability scale proved the most sensitive subjective indicator of the degradation of dynamic fidelity caused by increasing latency and decreasing frame rates, and 3) performance, as measured by normalized RMS tracking error or subjective impressions, was more markedly influenced by changing visual latency than by update rate.
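The objective measure above, normalized RMS tracking error, can be sketched as the RMS of the target/response difference scaled by the RMS amplitude of the target. This is one common normalization; the study's exact definition may differ:

```python
import numpy as np

def normalized_rms_error(target, response):
    """RMS of the tracking error, normalized by the target's RMS amplitude."""
    target = np.asarray(target, dtype=float)
    response = np.asarray(response, dtype=float)
    rms_error = np.sqrt(np.mean((response - target) ** 2))
    return rms_error / np.sqrt(np.mean(target ** 2))

# Synthetic 1D tracking task: a sinusoidal target, with responses that
# lag it by a phase offset (a crude stand-in for visual latency)
t = np.linspace(0.0, 2.0 * np.pi, 500)
target = np.sin(t)
lagged = np.sin(t - 0.2)
more_lagged = np.sin(t - 0.4)
```

With this definition, perfect tracking scores 0, and larger latency-like lags score worse.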

  1. Adoption of the Creative Process According to the Immersive Method

    ERIC Educational Resources Information Center

    Vuk, Sonja; Tacol, Tonka; Vogrinc, Janez

    2015-01-01

    The immersive method is a new concept of visual education that is better suited to the needs of students in contemporary post-industrial society. The features of the immersive method are: (1) it emerges from interaction with visual culture; (2) it encourages understanding of contemporary art (as an integral part of visual culture); and (3) it…

  2. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods in which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  3. The Use of Virtual Reality in Psychology: A Case Study in Visual Perception

    PubMed Central

    Wilson, Christopher J.; Soranzo, Alessandro

    2015-01-01

    Recent proliferation of available virtual reality (VR) tools has seen increased use in psychological research. This is due to a number of advantages afforded over traditional experimental apparatus such as tighter control of the environment and the possibility of creating more ecologically valid stimulus presentation and response protocols. At the same time, higher levels of immersion and visual fidelity afforded by VR do not necessarily evoke presence or elicit a “realistic” psychological response. The current paper reviews some current uses for VR environments in psychological research and discusses some ongoing questions for researchers. Finally, we focus on the area of visual perception, where both the advantages and challenges of VR are particularly salient. PMID:26339281

  4. The effect of visual and interaction fidelity on spatial cognition in immersive virtual environments.

    PubMed

    Mania, Katerina; Wooldridge, Dave; Coxon, Matthew; Robinson, Andrew

    2006-01-01

    Accuracy of memory performance per se is an imperfect reflection of the cognitive activity (awareness states) that underlies performance in memory tasks. The aim of this research is to investigate the effect of varied visual and interaction fidelity of immersive virtual environments on memory awareness states. A between groups experiment was carried out to explore the effect of rendering quality on location-based recognition memory for objects and associated states of awareness. The experimental space, consisting of two interconnected rooms, was rendered either flat-shaded or using radiosity rendering. The computer graphics simulations were displayed on a stereo head-tracked Head Mounted Display. Participants completed a recognition memory task after exposure to the experimental space and reported one of four states of awareness following object recognition. These reflected the level of visual mental imagery involved during retrieval, the familiarity of the recollection, and also included guesses. Experimental results revealed variations in the distribution of participants' awareness states across conditions while memory performance failed to reveal any. Interestingly, results revealed a higher proportion of recollections associated with mental imagery in the flat-shaded condition. These findings comply with similar effects revealed in two earlier studies summarized here, which demonstrated that the less "naturalistic" interaction interface or interface of low interaction fidelity provoked a higher proportion of recognitions based on visual mental images.

  5. Immersive Earth Science: Data Visualization in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Skolnik, S.; Ramirez-Linan, R.

    2017-12-01

    Utilizing next-generation technology, Navteca's exploration of 3D and volumetric temporal data in Virtual Reality (VR) takes advantage of immersive user experiences where stakeholders are literally inside the data. No longer restricted by the edges of a screen, VR provides an innovative way of viewing spatially distributed 2D and 3D data that leverages a 360° field of view and positional-tracking input, allowing users to see and experience data differently. These concepts are relevant to many sectors, industries, and fields of study, as real-time collaboration in VR can enhance understanding and mission outcomes through visualizations of temporally-aware 3D, meteorological, and other volumetric datasets. The ability to view data that is traditionally "difficult" to visualize, such as subsurface features or air columns, is a particularly compelling use of the technology. Various development iterations have resulted in Navteca's proof of concept that imports and renders volumetric point-cloud data in the virtual reality environment by interfacing PC-based VR hardware to a back-end server and popular GIS software. The integration of the geo-located data in VR and the subsequent display of changeable basemaps, overlaid datasets, and the ability to zoom, navigate, and select specific areas show the potential for immersive VR to revolutionize the way Earth data is viewed, analyzed, and communicated.

  6. Stereoscopic displays for virtual reality in the car manufacturing industry: application to design review and ergonomic studies

    NASA Astrophysics Data System (ADS)

    Moreau, Guillaume; Fuchs, Philippe

    2002-05-01

    In the car manufacturing industry the trend is to drastically reduce the time-to-market by increasing the use of the Digital Mock-up instead of physical prototypes. Design review and ergonomic studies are specific tasks because they involve qualitative or even subjective judgements. In this paper, we present IMAVE (IMmersion Adapted to a VEhicle), a system designed for immersive styling review, gap visualization, and simple ergonomic studies. We show that stereoscopic displays are necessary and must fulfill several constraints due to the proximity and size of the car dashboard. The duration of the work sessions forces us to eliminate all vertical parallax, and a 1:1 scale is obviously required for valid immersion. Two demonstrators were realized, allowing us to draw on a large set of testers (over 100). More than 80% of the testers saw an immediate use for the IMAVE system. We discuss the good and bad marks awarded to the system. Future work includes using several rear-projected stereo screens for door and central-console visualization, but without the parallax presently visible in some CAVE-like environments.

  7. Analysis of brain activity and response during monoscopic and stereoscopic visualization

    NASA Astrophysics Data System (ADS)

    Calore, Enrico; Folgieri, Raffaella; Gadia, Davide; Marini, Daniele

    2012-03-01

    Stereoscopic visualization in cinematography and Virtual Reality (VR) creates an illusion of depth by means of two bidimensional images corresponding to different views of a scene. This perceptual trick is used to enhance the emotional response and the sense of presence and immersivity of the observers. An interesting question is if and how it is possible to measure and analyze the level of emotional involvement and attention of the observers during a stereoscopic visualization of a movie or of a virtual environment. These research aims represent a challenge, due to the large number of sensorial, physiological, and cognitive stimuli involved. In this paper we begin this research by analyzing possible differences in the brain activity of subjects during the viewing of monoscopic or stereoscopic contents. To this aim, we have performed some preliminary experiments collecting electroencephalographic (EEG) data from a group of users wearing a Brain-Computer Interface (BCI) during the viewing of stereoscopic and monoscopic short movies in a VR immersive installation.
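One standard way to quantify condition differences in such EEG recordings is band power from a Welch spectral estimate, e.g. in the alpha band (8-12 Hz). The synthetic signals below are stand-ins for illustration, not the authors' data or analysis:

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, f_lo, f_hi):
    """Mean Welch-PSD power of `signal` between f_lo and f_hi (Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].mean()

fs = 256                               # Hz, a common EEG sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic "EEG": a 10 Hz alpha rhythm plus noise, at two amplitudes
strong_alpha = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
weak_alpha = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
```

Comparing `band_power(...)` per condition (and per subject) is the kind of statistic such monoscopic-vs-stereoscopic comparisons typically rest on.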

  8. Coupled auralization and virtual video for immersive multimedia displays

    NASA Astrophysics Data System (ADS)

    Henderson, Paul D.; Torres, Rendell R.; Shimizu, Yasushi; Radke, Richard; Lonsway, Brian

    2003-04-01

    The implementation of maximally-immersive interactive multimedia in exhibit spaces requires not only the presentation of realistic visual imagery but also the creation of a perceptually accurate aural experience. While conventional implementations treat audio and video problems as essentially independent, this research seeks to couple the visual sensory information with dynamic auralization in order to enhance perceptual accuracy. An implemented system has been developed for integrating accurate auralizations with virtual video techniques for both interactive presentation and multi-way communication. The current system utilizes a multi-channel loudspeaker array and real-time signal processing techniques for synthesizing the direct sound, early reflections, and reverberant field excited by a moving sound source whose path may be interactively defined in real-time or derived from coupled video tracking data. In this implementation, any virtual acoustic environment may be synthesized and presented in a perceptually-accurate fashion to many participants over a large listening and viewing area. Subject tests support the hypothesis that the cross-modal coupling of aural and visual displays significantly affects perceptual localization accuracy.
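The synthesis pipeline described above (direct sound, early reflections, reverberant field) can be sketched as building a room impulse response and convolving the dry source with it. The delays, gains, and exponential-noise tail below are illustrative assumptions, not the authors' system:

```python
import numpy as np

def toy_room_ir(fs=16000, length_s=0.5,
                reflections=((0.012, 0.5), (0.023, 0.35))):
    """Toy impulse response: direct path at t=0, discrete early
    reflections, then decaying noise standing in for late reverb."""
    n = int(fs * length_s)
    ir = np.zeros(n)
    ir[0] = 1.0                                # direct sound
    for delay_s, gain in reflections:          # early reflections
        ir[int(delay_s * fs)] += gain
    rng = np.random.default_rng(1)
    start = int(0.03 * fs)                     # tail begins after 30 ms
    decay = np.exp(-6.9 * np.arange(n - start) / (n - start))  # ~60 dB drop
    ir[start:] += rng.normal(0.0, 0.05, n - start) * decay
    return ir

ir = toy_room_ir()
dry = np.zeros(16000)
dry[0] = 1.0                                   # unit impulse as "source"
wet = np.convolve(dry, ir)                     # auralized output
```

A real-time system would instead run partitioned convolution per loudspeaker channel and update the reflection pattern as the tracked source moves.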

  9. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

    The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional data parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life, and more recently VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.), in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D Game Engine, coded using C#, JavaScript, and the Unity Scripting Language. This visualization tool can be used through a standard web browser or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects of various shapes, colors, sizes, and XYZ positions, encoding various dimensions of the parameter space, which can be assigned interactively. Multiple users can navigate through this data space simultaneously, either with their own independent vantage points or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking on a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added.
We expect to make this visualization tool freely available to the academic community within a few months, on an experimental (beta testing) basis.
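The encoding strategy this abstract describes (three spatial axes plus shape, color, and size for further dimensions) reduces to a per-record mapping like the following sketch; the field names and palettes are hypothetical, not the tool's actual schema:

```python
SHAPES = ["sphere", "cube", "cone"]          # hypothetical glyph palette
COLORS = ["red", "green", "blue", "yellow"]

def encode_point(row):
    """Map one high-dimensional record onto renderable visual channels."""
    return {
        "position": (row["x"], row["y"], row["z"]),        # 3 spatial dims
        "shape": SHAPES[int(row["class"]) % len(SHAPES)],   # categorical dim
        "color": COLORS[int(row["subclass"]) % len(COLORS)],
        "size": 0.5 + row["magnitude"] / 10.0,              # continuous dim
    }

glyph = encode_point({"x": 1.0, "y": 2.0, "z": 3.0,
                      "class": 4, "subclass": 1, "magnitude": 5.0})
```

Each glyph description would then be instantiated as an interactive object in the 3D scene, with its linked metadata attached for click-through.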

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric A. Wernert; William R. Sherman; Patrick O'Leary

    Immersive visualization makes use of the medium of virtual reality (VR) - it is a subset of virtual reality focused on the application of VR technologies to scientific and information visualization. As the name implies, there is a particular focus on the physically immersive aspect of VR that more fully engages the perceptual and kinesthetic capabilities of the scientist with the goal of producing greater insight. The immersive visualization community is uniquely positioned to address the analysis needs of the wide spectrum of domain scientists who are becoming increasingly overwhelmed by data. The outputs of computational science simulations and high-resolutionmore » sensors are creating a data deluge. Data is coming in faster than it can be analyzed, and there are countless opportunities for discovery that are missed as the data speeds by. By more fully utilizing the scientists visual and other sensory systems, and by offering a more natural user interface with which to interact with computer-generated representations, immersive visualization offers great promise in taming this data torrent. However, increasing the adoption of immersive visualization in scientific research communities can only happen by simultaneously lowering the engagement threshold while raising the measurable benefits of adoption. Scientists time spent immersed with their data will thus be rewarded with higher productivity, deeper insight, and improved creativity. Immersive visualization ties together technologies and methodologies from a variety of related but frequently disjoint areas, including hardware, software and human-computer interaction (HCI) disciplines. In many ways, hardware is a solved problem. There are well established technologies including large walk-in systems such as the CAVE{trademark} and head-based systems such as the Wide-5{trademark}. 
    The advent of new consumer-level technologies now enables an entirely new generation of immersive displays, with smaller footprints and costs, widening the potential consumer base. While one would be hard-pressed to call software a solved problem, we now understand considerably more about best practices for designing and developing sustainable, scalable software systems, and we have useful software examples that illuminate the way to even better implementations. As with any research endeavour, HCI will always be exploring new topics in interface design, but we now have a sizable knowledge base of the strengths and weaknesses of the human perceptual systems and we know how to design effective interfaces for immersive systems. So, in a research landscape with a clear need for better visualization and analysis tools, a methodology in immersive visualization that has been shown to effectively address some of those needs, and vastly improved supporting technologies and knowledge of hardware, software, and HCI, why hasn't immersive visualization 'caught on' more with scientists? What can we do as a community of immersive visualization researchers and practitioners to facilitate greater adoption by scientific communities so as to make the transition from 'the promise of virtual reality' to 'the reality of virtual reality'?

  11. Research on Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Lobeck, William E.

    2002-01-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  12. Research on Intelligent Synthesis Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.; Loftin, R. Bowen

    2002-12-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  13. Manifold compositions, music visualization, and scientific sonification in an immersive virtual-reality environment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaper, H. G.

    1998-01-05

    An interdisciplinary project encompassing sound synthesis, music composition, sonification, and visualization of music is facilitated by the high-performance computing capabilities and the virtual-reality environments available at Argonne National Laboratory. The paper describes the main features of the project's centerpiece, DIASS (Digital Instrument for Additive Sound Synthesis); "A.N.L.-folds", an equivalence class of compositions produced with DIASS; and the application of DIASS in two experiments in the sonification of complex scientific data. Some of the larger issues connected with this project, such as the changing ways in which both scientists and composers perform their tasks, are briefly discussed.

  14. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved with the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual reality system capable of immersing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation, and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user's “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system has been encouraging.

  15. Hybrid Reality Lab Capabilities - Video 2

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2016-01-01

    Our Hybrid Reality and Advanced Operations Lab is developing incredibly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the video/glass information. In other words, Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects.
    The ability to physically touch and interact with digitally created objects whose shape, size, and location match those of their physical counterparts in the virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video accompanying this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.

  16. Virtual environment display for a 3D audio room simulation

    NASA Technical Reports Server (NTRS)

    Chapin, William L.; Foster, Scott H.

    1992-01-01

    The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, is discussed through the development of four iterative configurations.

  17. High Resolution Visualization Applied to Future Heavy Airlift Concept Development and Evaluation

    NASA Technical Reports Server (NTRS)

    FordCook, A. B.; King, T.

    2012-01-01

    This paper explores the use of high resolution 3D visualization tools for exploring the feasibility and advantages of future military cargo airlift concepts and evaluating compatibility with existing and future payload requirements. Realistic 3D graphic representations of future airlifters are immersed in rich, supporting environments to demonstrate concepts of operations to key personnel for evaluation, feedback, and development of critical joint support. Accurate concept visualizations are reviewed by commanders, platform developers, loadmasters, soldiers, scientists, engineers, and principal decision makers at various stages of development. The insight gained through the review of these physically and operationally realistic visualizations is essential to refining design concepts to meet competing requirements in a fiscally constrained defense finance environment. In addition, highly accurate 3D geometric models of existing and evolving large military vehicles are loaded into existing and proposed aircraft cargo bays. In this virtual aircraft test-loading environment, materiel developers, engineers, managers, and soldiers can realistically evaluate the compatibility of current and next-generation airlifters with proposed cargo.

  18. Visualization Center Dedicated

    NASA Image and Video Library

    2003-10-17

    The dedication ceremony of the University of Southern Mississippi Center of Higher Learning (CHL) High-Performance Visualization Center at SSC was held Oct. 17. The center's RAVE II 3-D visualization system, available to both on- and off-site scientists, turns data into a fully immersive environment for the user. Cutting the ribbon are, from left, Rear Adm. Thomas Donaldson, commander of the Naval Meteorology and Oceanography Command; Jim Meredith, former director of the CHL; USM President Dr. Shelby Thames; Lt. Gov. Amy Tuck; Dr. Peter Ranelli, director of the CHL; Dewey Herring, chairman of the policy board for the CHL; and former Sen. Cecil Burge.

  19. Visualization Center Dedicated

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The dedication ceremony of the University of Southern Mississippi Center of Higher Learning (CHL) High-Performance Visualization Center at SSC was held Oct. 17. The center's RAVE II 3-D visualization system, available to both on- and off-site scientists, turns data into a fully immersive environment for the user. Cutting the ribbon are, from left, Rear Adm. Thomas Donaldson, commander of the Naval Meteorology and Oceanography Command; Jim Meredith, former director of the CHL; USM President Dr. Shelby Thames; Lt. Gov. Amy Tuck; Dr. Peter Ranelli, director of the CHL; Dewey Herring, chairman of the policy board for the CHL; and former Sen. Cecil Burge.

  20. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges

    PubMed Central

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET, researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can indeed replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants’ behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants’ height) can be easily obtained with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother). PMID:26157414

  1. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges.

    PubMed

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET, researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can indeed replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).

  2. The Flostation - an Immersive Cyberspace System

    NASA Technical Reports Server (NTRS)

    Park, Brian

    2006-01-01

    A flostation is a computer-controlled apparatus that, along with one or more computers and other computer-controlled equipment, is part of an immersive cyberspace system. The system is said to be immersive in two senses of the word: (1) it supports the body in a modified form of the neutral posture experienced in zero gravity, and (2) it is equipped with computer-controlled display equipment that helps to give the occupant of the chair a feeling of immersion in an environment that the system is designed to simulate. Neutral immersion was conceived during the Gemini program as a means of training astronauts for working in a zero-gravity environment. Current derivatives include neutral-buoyancy tanks and the KC-135 airplane, each of which mimics the effects of zero gravity. While these have performed well in simulating the shorter-duration flights typical of the space program to date, a training device that can take astronauts to the next level will be needed for simulating longer-duration flights such as that of the International Space Station. The flostation is expected to satisfy this need. The flostation could also be adapted and replicated for use in commercial ventures ranging from home entertainment to medical treatment. The use of neutral immersion in the flostation enables the occupant to recline in an optimal posture of rest and meditation. This posture combines savasana (known to practitioners of yoga) and a modified form of the neutral posture assumed by astronauts in outer space. As the occupant relaxes, awareness of the physical body is reduced. The neutral body posture, which can be maintained for hours without discomfort, is extended to the eyes, ears, and hands. The occupant can be surrounded with a full-field-of-view visual display and nearphone sound, and can be stimulated with full-body vibration and motion cueing.
Once fully immersed, the occupant can use neutral hand controllers (that is, hand-posture sensors) to control various aspects of the simulated environment.

  3. Low-cost telepresence for collaborative virtual environments.

    PubMed

    Rhee, Seon-Min; Ziegler, Remo; Park, Jiyoung; Naef, Martin; Gross, Markus; Kim, Myoung-Hee

    2007-01-01

    We present a novel low-cost method for visual communication and telepresence in a CAVE-like environment, relying on 2D stereo-based video avatars. The system combines a selection of proven efficient algorithms and approximations in a unique way, resulting in a convincing stereoscopic real-time representation of a remote user acquired in a spatially immersive display. The system was designed to extend existing projection systems with acquisition capabilities requiring minimal hardware modifications and cost. The system uses infrared-based image segmentation to enable concurrent acquisition and projection in an immersive environment without a static background. The system consists of two color cameras and two additional b/w cameras used for segmentation in the near-IR spectrum. There is no need for special optics as the mask and color image are merged using image-warping based on a depth estimation. The resulting stereo image stream is compressed, streamed across a network, and displayed as a frame-sequential stereo texture on a billboard in the remote virtual environment.

  4. Recent Advances in Immersive Visualization of Ocean Data: Virtual Reality Through the Web on Your Laptop Computer

    NASA Astrophysics Data System (ADS)

    Hermann, A. J.; Moore, C.; Soreide, N. N.

    2002-12-01

    Ocean circulation is irrefutably three dimensional, and powerful new measurement technologies and numerical models promise to expand our three-dimensional knowledge of the dynamics further each year. Yet, most ocean data and model output is still viewed using two-dimensional maps. Immersive visualization techniques allow the investigator to view their data as a three dimensional world of surfaces and vectors which evolves through time. The experience is not unlike holding a part of the ocean basin in one's hand, turning and examining it from different angles. While immersive, three dimensional visualization has been possible for at least a decade, the technology was until recently inaccessible (both physically and financially) for most researchers. It is not yet fully appreciated by practicing oceanographers how new, inexpensive computing hardware and software (e.g. graphics cards and controllers designed for the huge PC gaming market) can be employed for immersive, three dimensional, color visualization of their increasingly huge datasets and model output. In fact, the latest developments allow immersive visualization through web servers, giving scientists the ability to "fly through" three-dimensional data stored half a world away. Here we explore what additional insight is gained through immersive visualization, describe how scientists of very modest means can easily avail themselves of the latest technology, and demonstrate its implementation on a web server for Pacific Ocean model output.

  5. Novel Safranin-Tinted Candida rugosa Lipase Nanoconjugates Reagent for Visualizing Latent Fingerprints on Stainless Steel Knives Immersed in a Natural Outdoor Pond.

    PubMed

    Azman, Aida Rasyidah; Mahat, Naji Arafat; Abdul Wahab, Roswanira; Abdul Razak, Fazira Ilyana; Hamzah, Hafezul Helmi

    2018-05-25

    Waterways are popular locations for the disposal of criminal evidence because the recovery of latent fingerprints from such evidence is difficult. Currently, latent fingerprints on immersed objects are often visualized with small particle reagent, which contains carcinogenic and hazardous compounds. This study proposes an eco-friendly reagent, safranin-tinted Candida rugosa lipase (triacylglycerol ester hydrolysis, EC 3.1.1.3) with functionalized carbon nanotubes (CRL-MWCNTS/GA/SAF), as an alternative to the small particle reagent. The CRL-MWCNTS/GA/SAF reagent was compared with the small particle reagent for visualizing groomed, full fingerprints deposited on stainless steel knives that were immersed in a natural outdoor pond for 30 days. The quality of fingerprints visualized using the new reagent was similar (modified Centre for Applied Science and Technology grade: 4; p > 0.05) to that of the small particle reagent, even after 15 days of immersion. Despite the slight decrease in quality of fingerprints visualized using the CRL-MWCNTS/GA/SAF over the last three immersion periods, the fingerprints remained forensically identifiable (modified Centre for Applied Science and Technology grade: 3). The possible chemical interactions that enabled successful visualization are also discussed. Thus, this novel reagent may provide a relatively greener alternative for the visualization of latent fingerprints on immersed non-porous objects.

  6. Eye-tracking novice and expert geologist groups in the field and laboratory

    NASA Astrophysics Data System (ADS)

    Cottrell, R. D.; Evans, K. M.; Jacobs, R. A.; May, B. B.; Pelz, J. B.; Rosen, M. R.; Tarduno, J. A.; Voronov, J.

    2010-12-01

    We are using an Active Vision approach to learn how novices and expert geologists acquire visual information in the field. The Active Vision approach emphasizes that visual perception is an active process wherein new information is acquired about a particular environment through exploratory eye movements. Eye movements are not only influenced by physical stimuli, but are also strongly influenced by high-level perceptual and cognitive processes. Eye-tracking data were collected on ten novices (undergraduate geology students) and three experts during a 10-day field trip across California focused on neotectonics. In addition, high-resolution panoramic images were captured at each key locality for use in a semi-immersive laboratory environment. Examples of each data type will be presented. The number of observers will be increased in subsequent field trips, but expert/novice differences are already apparent in the first set of individual eye-tracking records, including gaze time, gaze pattern, and object recognition. We will review efforts to quantify these patterns, and the development of semi-immersive environments to display geologic scenes. The research is a collaborative effort between Earth scientists, cognitive scientists, and imaging scientists at the University of Rochester and the Rochester Institute of Technology, with funding from the National Science Foundation.

  7. Virtual reality: a reality for future military pilotage?

    NASA Astrophysics Data System (ADS)

    McIntire, John P.; Martinsen, Gary L.; Marasco, Peter L.; Havig, Paul R.

    2009-05-01

    Virtual reality (VR) systems provide exciting new ways to interact with information and with the world. The visual VR environment can be synthetic (computer generated) or be an indirect view of the real world using sensors and displays. With the potential opportunities of a VR system, the question arises about what benefits or detriments a military pilot might incur by operating in such an environment. Immersive and compelling VR displays could be accomplished with an HMD (e.g., imagery on the visor), large area collimated displays, or by putting the imagery on an opaque canopy. But what issues arise when, instead of viewing the world directly, a pilot views a "virtual" image of the world? Is 20/20 visual acuity in a VR system good enough? To deliver this acuity over the entire visual field would require over 43 megapixels (MP) of display surface for an HMD or about 150 MP for an immersive CAVE system, either of which presents a serious challenge with current technology. Additionally, the same number of sensor pixels would be required to drive the displays to this resolution (and formidable network architectures required to relay this information), or massive computer clusters are necessary to create an entirely computer-generated virtual reality with this resolution. Can we presently implement such a system? What other visual requirements or engineering issues should be considered? With the evolving technology, there are many technological issues and human factors considerations that need to be addressed before a pilot is placed within a virtual cockpit.
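    The megapixel figures above follow from simple arithmetic: 20/20 acuity corresponds to resolving roughly one arcminute, so the required pixel count is just the covered field of view converted to arcminutes in each dimension. A minimal sketch (the field-of-view values below are assumptions chosen to illustrate the calculation, not numbers stated in the record):

```python
def pixels_required(h_fov_deg, v_fov_deg, arcmin_per_pixel=1.0):
    """Pixels needed to deliver a given acuity over a field of view.

    20/20 acuity corresponds to resolving ~1 arcminute, so each degree
    of field of view needs about 60 pixels in each dimension.
    """
    px_h = h_fov_deg * 60 / arcmin_per_pixel
    px_v = v_fov_deg * 60 / arcmin_per_pixel
    return px_h * px_v

# A display spanning an assumed ~150° x 80° field needs about 43 MP
# at 20/20 acuity, consistent with the figure quoted for an HMD.
print(pixels_required(150, 80) / 1e6)   # → 43.2
```

    Covering the full surround of an immersive CAVE pushes the total severalfold higher, which is why the record quotes roughly 150 MP for that case.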

  8. Sounds of silence: How to animate virtual worlds with sound

    NASA Technical Reports Server (NTRS)

    Astheimer, Peter

    1993-01-01

    Sounds are an integral and sometimes annoying part of our daily life. Virtual worlds which imitate natural environments gain a lot of authenticity from fast, high-quality visualization combined with sound effects. Sounds help to significantly increase the degree of immersion for human dwellers in imaginary worlds. The virtual reality toolkit of IGD (Institute for Computer Graphics) features a broad range of standard visual and advanced real-time audio components which interpret an object-oriented definition of the scene. The virtual reality system 'Virtual Design', realized with the toolkit, enables the designer of virtual worlds to create a true audiovisual environment. Several examples on video demonstrate the usage of the audio features in Virtual Design.

  9. Combining photorealistic immersive geovisualization and high-resolution geospatial data to enhance human-scale viewshed modelling

    NASA Astrophysics Data System (ADS)

    Tabrizian, P.; Petrasova, A.; Baran, P.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2017-12-01

    Viewshed modelling (the process of defining, parsing, and analyzing the structure of landscape visual space within GIS) has been commonly used in applications ranging from landscape planning and ecosystem services assessment to geography and archaeology. However, less effort has been made to understand whether and to what extent these objective analyses predict the actual on-the-ground perception of a human observer. Moreover, viewshed modelling at the human-scale level requires the incorporation of fine-grained landscape structure (e.g., vegetation) and patterns (e.g., land cover) that are typically omitted from visibility calculations or unrealistically simulated, leading to significant error in predicting visual attributes. This poster illustrates how photorealistic Immersive Virtual Environments and high-resolution geospatial data can be used to integrate objective and subjective assessments of visual characteristics at the human-scale level. We performed viewshed modelling for a systematically sampled set of viewpoints (N=340) across an urban park using open-source GIS (GRASS GIS). For each point a binary viewshed was computed on a 3D surface model derived from high-density leaf-off LIDAR (QL2) points. The viewshed map was combined with high-resolution land cover (0.5 m) derived through fusion of orthoimagery, lidar vegetation, and vector data. Geostatistics and landscape structure analysis were performed to compute topological and compositional metrics for visual scale (e.g., openness), complexity (pattern, shape, and object diversity), and naturalness. Based on the viewshed model output, a sample of 24 viewpoints representing the variation of visual characteristics was selected and geolocated. For each location, 360° imagery was captured using a DSLR camera mounted on a GIGA PAN robot.
    We programmed a virtual reality application through which human subjects (N=100) immersively experienced a random selection of the captured environments via a head-mounted display (Oculus Rift CV1), and rated each location on perceived openness, naturalness, and complexity. Regression models were used to correlate model outputs with participants' responses. The results indicated strong, significant correlations for openness and naturalness, and a moderate correlation for complexity.
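    The binary viewshed at the core of this workflow reduces to a line-of-sight test: a cell is visible if the elevation angle from the observer's eye to that cell is at least the maximum angle encountered at any nearer cell along the ray. A toy 1D sketch of that test follows (GRASS GIS performs the raster equivalent over every ray of the surface model; the function name, profile values, and eye height here are illustrative only, not from the study):

```python
def visible(profile, viewer_h=1.7):
    """Binary visibility of each cell along a 1D elevation profile,
    seen from an eye placed viewer_h above the first cell.

    A cell is visible if its elevation angle (here, slope) from the eye
    is at least the maximum slope seen at any nearer cell.
    """
    eye = profile[0] + viewer_h
    vis = [True]                  # the viewer's own cell is visible
    max_slope = float("-inf")
    for d, z in enumerate(profile[1:], start=1):
        slope = (z - eye) / d     # rise over horizontal distance
        vis.append(slope >= max_slope)
        max_slope = max(max_slope, slope)
    return vis

# A 5 m ridge at distance 2 hides the flat ground behind it.
print(visible([0, 0, 5, 0, 0]))   # → [True, True, True, False, False]
```

    A 2D raster viewshed repeats this test along rays from the viewpoint to every cell, which is why it benefits from the high-resolution LIDAR surface described above.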

  10. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive, virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools as well as less common ones such as the Unity3d game engine. Game engines provide capabilities not only to explore data but to build and interact with dynamic models collaboratively. These virtual worlds can be used to display and illustrate data in ways that are often more understandable and plausible to both stakeholders and policy makers than traditional maps. Within this context we will present funded research that has been developed utilizing virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competencies. The results will emphasize the importance of visualization and interactive virtual environments and the link with emerging research disciplines within Visual Analytics.

  11. Motion parallax in immersive cylindrical display systems

    NASA Astrophysics Data System (ADS)

    Filliard, N.; Reymond, G.; Kemeny, A.; Berthoz, A.

    2012-03-01

    Motion parallax is a crucial visual cue produced by translations of the observer for the perception of depth and self-motion. Tracking the observer's viewpoint has therefore become inevitable in immersive virtual reality (VR) systems (cylindrical screens, CAVEs, head-mounted displays) used, e.g., in the automotive industry (style reviews, architecture design, ergonomics studies) or in scientific studies of visual perception. The perception of a stable and rigid world requires that this visual cue be coherent with other extra-retinal (e.g., vestibular, kinesthetic) cues signaling ego-motion. Although world stability is never questioned in the real world, rendering a head-coupled viewpoint in VR can lead to an illusory perception of unstable environments, unless a non-unity scale factor is applied to recorded head movements. Besides, cylindrical screens are usually used with static observers due to image distortions when rendering images for viewpoints different from a sweet spot. We developed a technique to compensate for these non-linear visual distortions in real time, in an industrial VR setup based on a cylindrical screen projection system. Additionally, to evaluate the amount of discrepancy between visual and extra-retinal cues tolerated without perceptual distortions, a "motion parallax gain" between the velocity of the observer's head and that of the virtual camera was introduced in this system. The influence of this artificial gain was measured on the gait stability of free-standing participants. Results indicate that gains below unity significantly alter postural control. Conversely, the influence of higher gains remains limited, suggesting a certain tolerance of observers to these conditions. Parallax gain amplification is therefore proposed as a possible solution to provide a wider exploration of space to users of immersive virtual reality systems.
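    The "motion parallax gain" described above amounts to scaling the tracked head displacement before it drives the rendering camera: a gain of 1 reproduces real head motion, values below 1 attenuate parallax, and values above 1 amplify it. A minimal sketch of that mapping (the function name and reference-point formulation are illustrative assumptions, not taken from the authors' system):

```python
def camera_position(head_pos, ref_pos, gain):
    """Apply a motion parallax gain to tracked head motion.

    The head's displacement from a reference point (e.g., the screen's
    sweet spot) is scaled by `gain` before positioning the virtual
    camera. gain = 1.0 reproduces the real viewpoint exactly.
    """
    return tuple(r + gain * (h - r) for h, r in zip(head_pos, ref_pos))

# With gain 2.0, a 10 cm head translation moves the camera 20 cm.
print(camera_position((0.1, 0.0, 0.0), (0.0, 0.0, 0.0), 2.0))
```

    Applied per frame, the same scaling of positions scales head and camera velocities by the same factor, which is the gain between velocities that the study manipulates.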

  12. Art-Science-Technology collaboration through immersive, interactive 3D visualization

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2014-12-01

    At the W. M. Keck Center for Active Visualization in Earth Sciences (KeckCAVES), a group of geoscientists and computer scientists collaborates to develop and use interactive, immersive, 3D visualization technology to view, manipulate, and interpret data for scientific research. The visual impact of immersion in a CAVE environment can be extremely compelling, and from the outset KeckCAVES scientists have collaborated with artists to bring this technology to creative works, including theater and dance performance, installations, and gamification. The first full-fledged collaboration designed and produced a performance called "Collapse: Suddenly falling down", choreographed by Della Davidson, which investigated the human and cultural response to natural and man-made disasters. Scientific data (lidar scans of disaster sites, such as landslides and mine collapses) were fully integrated into the performance by the Sideshow Physical Theatre. This presentation discusses both the technological and creative characteristics of, and the lessons learned from, the collaboration. Many parallels between the artistic and scientific processes emerged. We observed that both artists and scientists set out to investigate a topic, solve a problem, or answer a question. Refining that question or problem is an essential part of both the creative and scientific workflow. Both artists and scientists seek understanding (in this case, understanding of natural disasters). Differences also emerged: the group noted that the scientists sought clarity (including but not limited to quantitative measurements) as a means to understanding, while the artists embraced ambiguity, also as a means to understanding. Subsequent art-science-technology collaborations have responded to evolving visualization technology and include gamification as a means to explore data and the use of augmented reality for informal learning in museum settings.

  13. Immersive Virtual Reality Therapy with Myoelectric Control for Treatment-resistant Phantom Limb Pain: Case Report.

    PubMed

    Chau, Brian; Phelan, Ivan; Ta, Phillip; Humbert, Sarah; Hata, Justin; Tran, Duc

    2017-01-01

    Objective: Phantom limb pain is a condition frequently experienced after amputation. One treatment for phantom limb pain is traditional mirror therapy, yet some patients do not respond to this intervention, and immersive virtual reality mirror therapy offers some potential advantages. We report the case of a patient with severe phantom limb pain following an upper limb amputation and successful treatment with therapy in a custom virtual reality environment. Methods: An interactive 3-D kitchen environment was developed based on the principles of mirror therapy to allow for control of virtual hands while wearing a motion-tracked, head-mounted virtual reality display. The patient used myoelectric control of a virtual hand as well as motion-tracking control in this setting for five therapy sessions. Pain scale measurements and subjective feedback were elicited at each session. Results: Analysis of the measured pain scales showed statistically significant decreases per session [Visual Analog Scale, Short Form McGill Pain Questionnaire, and Wong-Baker FACES pain scores decreased by 55 percent (p=0.0143), 60 percent (p=0.023), and 90 percent (p=0.0024), respectively]. Significant subjective pain relief persisting between sessions was also reported, as well as marked immersion within the virtual environments. At six-week follow-up, the patient noted a continued decrease in phantom limb pain symptoms. Conclusions: Currently available immersive virtual reality technology with myoelectric and motion-tracking control may represent a possible therapy option for treatment-resistant phantom limb pain.

  14. Immersive Virtual Reality Therapy with Myoelectric Control for Treatment-resistant Phantom Limb Pain: Case Report

    PubMed Central

    Phelan, Ivan; Ta, Phillip; Humbert, Sarah; Hata, Justin; Tran, Duc

    2017-01-01

    Objective: Phantom limb pain is a condition frequently experienced after amputation. One treatment for phantom limb pain is traditional mirror therapy, yet some patients do not respond to this intervention, and immersive virtual reality mirror therapy offers some potential advantages. We report the case of a patient with severe phantom limb pain following an upper limb amputation and successful treatment with therapy in a custom virtual reality environment. Methods: An interactive 3-D kitchen environment was developed based on the principles of mirror therapy to allow for control of virtual hands while wearing a motion-tracked, head-mounted virtual reality display. The patient used myoelectric control of a virtual hand as well as motion-tracking control in this setting for five therapy sessions. Pain scale measurements and subjective feedback were elicited at each session. Results: Analysis of the measured pain scales showed statistically significant decreases per session [Visual Analog Scale, Short Form McGill Pain Questionnaire, and Wong-Baker FACES pain scores decreased by 55 percent (p=0.0143), 60 percent (p=0.023), and 90 percent (p=0.0024), respectively]. Significant subjective pain relief persisting between sessions was also reported, as well as marked immersion within the virtual environments. At six-week follow-up, the patient noted a continued decrease in phantom limb pain symptoms. Conclusions: Currently available immersive virtual reality technology with myoelectric and motion-tracking control may represent a possible therapy option for treatment-resistant phantom limb pain. PMID:29616149

  15. Subjective experiences of watching stereoscopic Avatar and U2 3D in a cinema

    NASA Astrophysics Data System (ADS)

    Pölönen, Monika; Salmimaa, Marja; Takatalo, Jari; Häkkinen, Jukka

    2012-01-01

    A stereoscopic 3-D version of the film Avatar was shown to 85 people who subsequently answered questions related to sickness, visual strain, stereoscopic image quality, and sense of presence. Viewing Avatar for 165 min induced some symptoms of visual strain and sickness, but symptom levels remained low. A comparison between Avatar and previously published results for the film U2 3D showed that sickness and visual strain levels were similar despite the difference in the films' runtimes. The genre of the film had a significant effect on the viewers' opinions and sense of presence. Avatar, which has been described as a combination of the action, adventure, and sci-fi genres, was experienced as more immersive and engaging than the music documentary U2 3D. However, participants in both studies were immersed, focused, and absorbed in watching the stereoscopic 3-D (S3-D) film and were pleased with the film environments. The results also showed that previous stereoscopic 3-D experience significantly reduced the amount of reported eye strain and the number of complaints about the weight of the viewing glasses.

  16. Real-time recording and classification of eye movements in an immersive virtual environment.

    PubMed

    Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry; Hayhoe, Mary

    2013-10-10

    Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the QuickTime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data export, visual inspection, and validation of calculated gaze movements.
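The fixation/saccade identification the primer describes is commonly built on a velocity threshold over angular gaze speed. The sketch below is a hedged illustration of that basic scheme, not the primer's published code; the threshold value, 1-D gaze representation, and sample data are assumptions for brevity.

```python
# Illustrative velocity-threshold classifier: label each inter-sample interval
# as a fixation or a saccade from the angular gaze velocity. Real pipelines
# work on 3-D gaze vectors and add pursuit detection; this is a 1-D sketch.

def classify_gaze(angles_deg, timestamps_s, saccade_thresh_deg_s=100.0):
    """Return one 'fixation'/'saccade' label per inter-sample interval.

    angles_deg: gaze direction per sample, in degrees (1-D for brevity).
    timestamps_s: matching sample times, in seconds.
    """
    labels = []
    for i in range(1, len(angles_deg)):
        dt = timestamps_s[i] - timestamps_s[i - 1]
        velocity = abs(angles_deg[i] - angles_deg[i - 1]) / dt
        labels.append("saccade" if velocity > saccade_thresh_deg_s else "fixation")
    return labels

# A 10 deg jump within ~8 ms far exceeds 100 deg/s and is labelled a saccade;
# the small 0.1 deg drifts on either side fall well below the threshold.
labels = classify_gaze([0.0, 0.1, 10.0, 10.1], [0.000, 0.008, 0.016, 0.024])
```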

  17. Real-time recording and classification of eye movements in an immersive virtual environment

    PubMed Central

    Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry; Hayhoe, Mary

    2013-01-01

    Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the QuickTime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data export, visual inspection, and validation of calculated gaze movements. PMID:24113087

  18. Visualization of reservoir simulation data with an immersive virtual reality system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, B.K.

    1996-10-01

    This paper discusses an investigation into the use of an immersive virtual reality (VR) system to visualize reservoir simulation output data. The hardware and software configurations of the test-immersive VR system are described and compared to a nonimmersive VR system and to an existing workstation screen-based visualization system. The structure of 3D reservoir simulation data and the actions to be performed on the data within the VR system are discussed. The subjective results of the investigation are then presented, followed by a discussion of possible future work.

  19. Immersive Visual Analytics for Transformative Neutron Scattering Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Daniel, Jamison R; Drouhard, Margaret

    The ORNL Spallation Neutron Source (SNS) provides the most intense pulsed neutron beams in the world for scientific research and development across a broad range of disciplines. SNS experiments produce large volumes of complex data that are analyzed by scientists with varying degrees of experience using 3D visualization and analysis systems. However, it is notoriously difficult to achieve proficiency with 3D visualizations. Because 3D representations are key to understanding the neutron scattering data, scientists are unable to analyze their data in a timely fashion, resulting in inefficient use of the limited and expensive SNS beam time. We believe a more intuitive interface for exploring neutron scattering data can be created by combining immersive virtual reality technology with high performance data analytics and human interaction. In this paper, we present our initial investigations of immersive visualization concepts as well as our vision for an immersive visual analytics framework that could lower the barriers to 3D exploratory data analysis of neutron scattering data at the SNS.

  20. Designing Experiential Modes: A Key Focus for Immersive Learning Environments

    ERIC Educational Resources Information Center

    Appelman, Robert

    2005-01-01

    A student sitting in a class and listening to an instructor talk is experiencing a particular mode of instruction sensed through visual and audio channels. She is aware that she is in the center of a classroom and also in close proximity to other students. Occasionally they gesture to the instructor at the front of the room, who stops talking when…

  1. High sensitivity to multisensory conflicts in agoraphobia exhibited by virtual reality.

    PubMed

    Viaud-Delmon, Isabelle; Warusfel, Olivier; Seguelas, Angeline; Rio, Emmanuel; Jouvent, Roland

    2006-10-01

    The primary aim of this study was to evaluate the effect of auditory feedback in a VR system planned for clinical use and to address the different factors that should be taken into account in building a bimodal virtual environment (VE). We conducted an experiment in which we assessed the spatial performance of agoraphobic patients and normal subjects in two kinds of VEs, visual alone (Vis) and auditory-visual (AVis), during separate sessions. Subjects were equipped with a head-mounted display coupled with an electromagnetic sensor system and immersed in a virtual town. Their task was to locate different landmarks and become familiar with the town. In the AVis condition subjects were equipped with the head-mounted display and headphones, which delivered a soundscape updated in real time according to their movements in the virtual town. While general performance remained comparable across the conditions, the reported feeling of immersion was more compelling in the AVis environment. However, patients exhibited more cybersickness symptoms in this condition. The results of this study point to the multisensory integration deficit of agoraphobic patients and underline the need for further research on multimodal VR systems for clinical use.

  2. Single-shot water-immersion microscopy platform for qualitative visualization and quantitative phase imaging of biosamples

    NASA Astrophysics Data System (ADS)

    Picazo-Bueno, José Ángel; Cojoc, Dan; Torre, Vincent; Micó, Vicente

    2017-07-01

    We present the combination of single-shot water-immersion digital holographic microscopy with broadband illumination for the simultaneous visualization of coherent and incoherent images, using microbeads and different biosamples.

  3. Foreign language learning in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Chang, Benjamin; Sheldon, Lee; Si, Mei; Hand, Anton

    2012-03-01

    Virtual reality has long been used for training simulations in fields from medicine to welding to vehicular operation, but simulations involving more complex cognitive skills present new design challenges. Foreign language learning, for example, is increasingly vital in the global economy, but computer-assisted education is still in its early stages. Immersive virtual reality is a promising avenue for language learning as a way of dynamically creating believable scenes for conversational training and role-play simulation. Visual immersion alone, however, only provides a starting point. We suggest that the addition of social interactions and motivated engagement through narrative gameplay can lead to truly effective language learning in virtual environments. In this paper, we describe the development of a novel application for teaching Mandarin using CAVE-like VR, physical props, human actors and intelligent virtual agents, all within a semester-long multiplayer mystery game. Students travel (virtually) to China on a class field trip, which soon becomes complicated with intrigue and mystery surrounding the lost manuscript of an early Chinese literary classic. Virtual reality environments such as the Forbidden City and a Beijing teahouse provide the setting for learning language, cultural traditions, and social customs, as well as the discovery of clues through conversation in Mandarin with characters in the game.

  4. The Worldviews Network: Transformative Global Change Education in Immersive Environments

    NASA Astrophysics Data System (ADS)

    Hamilton, H.; Yu, K. C.; Gardiner, N.; McConville, D.; Connolly, R.; Irving, L. S.

    2011-12-01

    Our modern age is defined by an astounding capacity to generate scientific information. From DNA to dark matter, human ingenuity and technologies create an endless stream of data about ourselves and the world of which we are a part. Yet we largely founder in transforming information into understanding, and understanding into rational action for our society as a whole. Earth and biodiversity scientists are especially frustrated by this impasse because the data they gather often point to a clash between Earth's capacity to sustain life and the decisions that humans make to garner the planet's resources. Immersive virtual environments offer an underexplored link in the translation of scientific data into public understanding, dialogue, and action. The Worldviews Network is a collaboration of scientists, artists, and educators focused on developing best practices for the use of immersive environments for science-based ecological literacy education. A central tenet of the Worldviews Network is that there are multiple ways to know and experience the world, so we are developing scientifically accurate, geographically relevant, and culturally appropriate programming to promote ecological literacy within informal science education programs across the United States. The goal of Worldviews Network is to offer transformative learning experiences, in which participants are guided on a process integrating immersive visual explorations, critical reflection and dialogue, and design-oriented approaches to action - or more simply, seeing, knowing, and doing. Our methods center on live presentations, interactive scientific visualizations, and sustainability dialogues hosted at informal science institutions. Our approach uses datasets from the life, Earth, and space sciences to illuminate the complex conditions that support life on earth and the ways in which ecological systems interact. 
We are leveraging scientific data from federal agencies, non-governmental organizations, and our own research to develop a library of immersive visualization stories and templates that explore ecological relationships across time at cosmic, global, and bioregional scales, with learning goals aligned to climate and earth science literacy principles. These experiential narratives are used to increase participants' awareness of global change issues as well as to engage them in dialogues and design processes focused on steps they can take within their own communities to systemically address these interconnected challenges. More than 600 digital planetariums in the U.S. collectively represent a pioneering opportunity for distributing Earth systems messages over large geographic areas. By placing the viewer, and Earth itself, within the context of the rest of the universe, digital planetariums can uniquely provide essential transcalar perspectives on the complex interdependencies of Earth's interacting physical and biological systems. The Worldviews Network is creating innovative, data-driven approaches for engaging the American public in dialogues about human-induced global changes.

  5. Learning Relative Motion Concepts in Immersive and Non-Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria

    2013-01-01

    The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop…

  6. GROTTO visualization for decision support

    NASA Astrophysics Data System (ADS)

    Lanzagorta, Marco O.; Kuo, Eddy; Uhlmann, Jeffrey K.

    1998-08-01

    In this paper we describe the GROTTO visualization projects being carried out at the Naval Research Laboratory. GROTTO is a CAVE-like system, that is, a surround-screen, surround-sound, immersive virtual reality device. We have explored GROTTO visualization in a variety of scientific areas including oceanography, meteorology, chemistry, biochemistry, computational fluid dynamics, and the space sciences. Research has emphasized the application of GROTTO visualization to military, land- and sea-based command and control. Examples include the visualization of ocean current models for the simulation and study of mine drifting and, within our computational steering project, the effects of electromagnetic radiation on missile defense satellites. We discuss plans to apply this technology to decision support applications involving the deployment of autonomous vehicles into contaminated battlefield environments, firefighter control, and hostage rescue operations.

  7. Immersive visualization of rail simulation data.

    DOT National Transportation Integrated Search

    2016-01-01

    The prime objective of this project was to create scientific, immersive visualizations of a Rail-simulation. This project is a part of a larger initiative that consists of three distinct parts. The first step consists of performing a finite element a...

  8. Real-time, interactive, visually updated simulator system for telepresence

    NASA Technical Reports Server (NTRS)

    Schebor, Frederick S.; Turney, Jerry L.; Marzwell, Neville I.

    1991-01-01

    Time delays and limited sensory feedback in remote telerobotic systems tend to disorient teleoperators and dramatically decrease operator performance. To remove the effects of time delays, key components of a prototype forward simulation subsystem, the Global-Local Environment Telerobotic Simulator (GLETS), which buffers the operator from the remote task, were designed and developed. GLETS totally immerses an operator in a real-time, interactive, simulated, visually updated artificial environment of the remote telerobotic site. Using GLETS, the operator will, in effect, enter a telerobotic virtual reality and can easily form a gestalt of the virtual 'local site' that matches the operator's normal interactions with the remote site. In addition to its use in space-based telerobotics, GLETS, owing to its extendable architecture, can also be used in other teleoperational environments such as toxic material handling, construction, and undersea exploration.

  9. Immersive 3D geovisualisation in higher education

    NASA Astrophysics Data System (ADS)

    Philips, Andrea; Walz, Ariane; Bergner, Andreas; Graeff, Thomas; Heistermann, Maik; Kienzler, Sarah; Korup, Oliver; Lipp, Torsten; Schwanghart, Wolfgang; Zeilinger, Gerold

    2014-05-01

    Through geovisualisation we explore spatial data, analyse it with respect to specific questions, synthesise results, and present and communicate them to a specific audience (MacEachren & Kraak 1997). After centuries of paper maps, the means to represent and visualise our physical environment and its abstract qualities have changed dramatically since the 1990s, and so have the methods for using geovisualisation in teaching. Whereas some might still consider the traditional classroom the ideal setting for teaching and learning geographic relationships and their mapping, we used a 3D CAVE (computer-animated virtual environment) as the environment for a problem-oriented learning project called "GEOSimulator". Focussing on this project, we empirically investigated whether a technological advance such as the CAVE makes 3D visualisation, including 3D geovisualisation, an important tool not only for businesses (Abulrub et al. 2012) and the public (Wissen et al. 2008), but also for educational purposes, for which it had hardly been used yet. The 3D CAVE is a three-sided visualisation platform that allows immersive and stereoscopic visualisation of observed and simulated spatial data. We examined the benefits of immersive 3D visualisation for geographic research and education and synthesised three fundamental technology-based visual aspects: first, the conception and comprehension of space and location need not be generated, but are instantaneously and intuitively present through stereoscopy. Second, optical immersion into virtual reality strengthens this spatial perception, which is particularly important for complex 3D geometries. And third, a significant benefit is interactivity, which is enhanced through immersion and allows multi-discursive and dynamic data exploration and knowledge transfer.
Based on our problem-oriented learning project, which concentrates on a case study of flood risk management on the Wilde Weisseritz in Germany, a river that contributed significantly to the hundred-year flood in Dresden in 2002, we empirically evaluated the usefulness of this immersive 3D technology for learning success. The results show that immersive 3D geovisualisation has educational and content-related advantages over 2D geovisualisation through the benefits mentioned above. This innovative way of geovisualisation is thus not only entertaining and motivating for students, but can also be constructive for research studies by, for instance, facilitating the study of complex environments or decision-making processes.

  10. Comparison of path visualizations and cognitive measures relative to travel technique in a virtual environment.

    PubMed

    Zanbaka, Catherine A; Lok, Benjamin C; Babu, Sabarish V; Ulinski, Amy C; Hodges, Larry F

    2005-01-01

    We describe a between-subjects experiment that compared four different methods of travel and their effect on cognition and on the paths taken in an immersive virtual environment (IVE). Participants answered a set of questions, based on Crook's condensation of Bloom's taxonomy, that assessed their cognition of the IVE with respect to knowledge, understanding and application, and higher mental processes. Participants also drew a sketch map of the IVE and the objects within it. The users' sense of presence was measured using the Steed-Usoh-Slater Presence Questionnaire. The participants' position and head orientation were automatically logged during their exposure to the virtual environment. These logs were later used to create visualizations of the paths taken. Path analysis, such as exploring the overlaid path visualizations and dwell data, revealed further differences among the travel techniques. Our results suggest that, for applications where problem solving and the evaluation of information are important or where the opportunity to train is minimal, a large tracked space that allows the participant to walk around the virtual environment provides benefits over common virtual travel techniques.
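The post-hoc path analysis described above starts from logged position samples. The following is an illustrative sketch of two such derived measures, total path length and dwell time near an object; the (timestamp, x, z) log format, function names, and radius are assumptions, not the study's actual pipeline.

```python
# Sketch of path analysis over logged ground-plane positions: total distance
# walked and time spent dwelling near an object of interest.
import math

def path_length(samples):
    """samples: list of (t, x, z); returns metres walked in the ground plane."""
    return sum(math.hypot(x2 - x1, z2 - z1)
               for (_, x1, z1), (_, x2, z2) in zip(samples, samples[1:]))

def dwell_time(samples, obj_xz, radius):
    """Seconds spent within `radius` of an object at ground position obj_xz.

    Each interval is attributed to its starting sample's position (a simple
    piecewise-constant approximation of the trajectory).
    """
    total = 0.0
    for (t1, x, z), (t2, _, _) in zip(samples, samples[1:]):
        if math.hypot(x - obj_xz[0], z - obj_xz[1]) <= radius:
            total += t2 - t1
    return total

# Toy log: one sample per second, an L-shaped 2 m walk.
log = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 1.0, 1.0)]
```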

  11. Immersive viewing engine

    NASA Astrophysics Data System (ADS)

    Schonlau, William J.

    2006-05-01

    An immersive viewing engine providing basic telepresence functionality for a variety of application types is presented. Augmented reality, teleoperation, and virtual reality applications all benefit from the use of head-mounted display devices that present imagery appropriate to the user's head orientation at full frame rates. Our primary application is the viewing of remote environments, as with a camera-equipped teleoperated vehicle. The conventional approach, where imagery from a narrow-field camera onboard the vehicle is presented to the user on a small rectangular screen, is contrasted with an immersive viewing system in which a cylindrical or spherical format image is received from a panoramic camera on the vehicle, resampled in response to the sensed user head orientation, and presented via a wide-field eyewear display approaching 180 degrees of horizontal field. Of primary interest is the user's enhanced ability to perceive and understand image content, even when image resolution parameters are poor, due to the innate visual integration and 3-D model generation capabilities of the human visual system. A mathematical model for tracking user head position and resampling the panoramic image to attain distortion-free viewing of the region appropriate to the user's current head pose is presented, and consideration is given to providing the user with stereo viewing generated from depth map information derived using stereo-from-motion algorithms.
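The core of resampling a panoramic image for the user's head pose is mapping a view direction into the panorama's pixel grid. A minimal sketch for the cylindrical case follows; the geometry, function name, and field-of-view parameter are assumptions for illustration, not the paper's published model, and interpolation is omitted.

```python
# Sketch: map a head yaw/pitch to (column, row) pixel coordinates in a
# full-360-degree cylindrical panorama, the lookup at the heart of
# head-coupled resampling. Assumed geometry, not the paper's exact model.
import math

def cylinder_pixel(yaw_deg, pitch_deg, img_w, img_h, vfov_deg=90.0):
    """Return float (column, row) for a view direction.

    Columns span 360 degrees of yaw; rows span vfov_deg of pitch, centred
    on the horizon. In a cylindrical projection the vertical image position
    follows tan(pitch).
    """
    col = (yaw_deg % 360.0) / 360.0 * img_w
    half = math.tan(math.radians(vfov_deg / 2.0))
    row = (1.0 - math.tan(math.radians(pitch_deg)) / half) / 2.0 * img_h
    return col, row

# Looking 90 degrees left at the horizon lands a quarter of the way across
# the image, at its vertical centre.
c, r = cylinder_pixel(90.0, 0.0, 3600, 1000)
```

A renderer would evaluate this mapping per output pixel (offsetting yaw/pitch by each pixel's direction within the display's field of view) and sample the panorama with interpolation.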

  12. Visual influence on path integration in darkness indicates a multimodal representation of large-scale space

    PubMed Central

    Tcheang, Lili; Bülthoff, Heinrich H.; Burgess, Neil

    2011-01-01

    Our ability to return to the start of a route recently performed in darkness is thought to reflect path integration of motion-related information. Here we provide evidence that motion-related interoceptive representations (proprioceptive, vestibular, and motor efference copy) combine with visual representations to form a single multimodal representation guiding navigation. We used immersive virtual reality to decouple visual input from motion-related interoception by manipulating the rotation or translation gain of the visual projection. First, participants walked an outbound path with both visual and interoceptive input, and returned to the start in darkness, demonstrating the influences of both visual and interoceptive information in a virtual reality environment. Next, participants adapted to visual rotation gains in the virtual environment, and then performed the path integration task entirely in darkness. Our findings were accurately predicted by a quantitative model in which visual and interoceptive inputs combine into a single multimodal representation guiding navigation, and are incompatible with a model of separate visual and interoceptive influences on action (in which path integration in darkness must rely solely on interoceptive representations). Overall, our findings suggest that a combined multimodal representation guides large-scale navigation, consistent with a role for visual imagery or a cognitive map. PMID:21199934

  13. The Hologram in My Hand: How Effective is Interactive Exploration of 3D Visualizations in Immersive Tangible Augmented Reality?

    PubMed

    Bach, Benjamin; Sicat, Ronell; Beyer, Johanna; Cordeil, Maxime; Pfister, Hanspeter

    2018-01-01

    We report on a controlled user study comparing three visualization environments for common 3D exploration. Our environments differ in how they exploit natural human perception and interaction capabilities. We compare an augmented-reality head-mounted display (Microsoft HoloLens), a handheld tablet, and a desktop setup. The novel head-mounted HoloLens display projects stereoscopic images of virtual content into a user's real world and allows for interaction in situ at the spatial position of the 3D hologram. The tablet is able to interact with 3D content through touch, spatial positioning, and tangible markers; however, 3D content is still presented on a 2D surface. Our hypothesis is that visualization environments that better match human perceptual and interaction capabilities to the task at hand improve understanding of 3D visualizations. To better understand the space of display and interaction modalities in visualization environments, we first propose a classification based on three dimensions: perception, interaction, and the spatial and cognitive proximity of the two. Each technique in our study is located at a different position along these three dimensions. We asked 15 participants to perform four tasks, each task having different levels of difficulty for both spatial perception and degrees of freedom for interaction. Our results show that each of the tested environments is more effective for certain tasks, but that generally the desktop environment is still the fastest and most precise in almost all cases.

  14. How virtual reality works: illusions of vision in "real" and virtual environments

    NASA Astrophysics Data System (ADS)

    Stark, Lawrence W.

    1995-04-01

    Visual illusions abound in normal vision--illusions of clarity and completeness, of continuity in time and space, of presence and vivacity--and are part and parcel of the visual world in which we live. These illusions are discussed in terms of the human visual system, with its high-resolution fovea, moved from point to point in the visual scene by rapid saccadic eye movements (EMs). This sampling of visual information is supplemented by a low-resolution, wide peripheral field of view, especially sensitive to motion. Cognitive-spatial models controlling perception, imagery, and 'seeing' also control the EMs that shift the fovea in the Scanpath mode. These illusions provide for presence, the sense of being within an environment. They equally well lead to 'Telepresence,' the sense of being within a virtual display, especially if the operator is intensely interacting through an eye-hand and head-eye human-machine interface that provides congruent visual and motor frames of reference. Interaction, immersion, and interest compel telepresence; intuitive functioning and engineered information flows can optimize human adaptation to the artificial new world of virtual reality, as virtual reality expands into entertainment, simulation, telerobotics, scientific visualization, and other professional work.

  15. The Effects of Vision-Related Aspects on Noise Perception of Wind Turbines in Quiet Areas

    PubMed Central

    Maffei, Luigi; Iachini, Tina; Masullo, Massimiliano; Aletta, Francesco; Sorrentino, Francesco; Senese, Vincenzo Paolo; Ruotolo, Francesco

    2013-01-01

    Preserving the soundscape and geographic extent of quiet areas is a great challenge against the spread of environmental noise. The E.U. Environmental Noise Directive underlines the need to preserve quiet areas as a new aim for the management of noise in European countries. At the same time, due to their low population density, rural areas with suitable wind are considered appropriate locations for installing wind farms. However, although wind farms are presented as environmentally friendly projects, these plants are often viewed as visual and audible intruders that spoil the landscape and generate noise. Even though the correlations are still unclear, the visual impact of wind farms can increase with their size and their coherence with the rural/quiet environment. In this paper, using an immersive virtual reality technique, some visual and acoustical aspects of the impact of a wind farm on a sample of subjects were assessed and analyzed. The subjects were immersed in a virtual scenario representing a typical rural outdoor setting, which they experienced at different distances from the wind turbines. The influence of the number and colour of the wind turbines on global, visual, and auditory judgments was investigated. The main results showed that the number of wind turbines has a weak effect on individual visual reactions, while their colour influences both visual and auditory reactions, although in different ways. PMID:23624578

  17. Assessing a VR-based learning environment for anatomy education.

    PubMed

    Hoffman, H; Murray, M; Hettinger, L; Viirre, E

    1998-01-01

    The purpose of the research proposed herein is to develop an empirical, methodological tool for the assessment of visual depth perception in virtual environments (VEs). Our goal is to develop and employ a behaviorally based method for assessing the impact of VE design features on the perception of visual depth, as indexed by the performance of fundamental perceptual-motor activities. Specifically, in this experiment we will assess the effect of two dimensions of VE system design--(1) viewing condition or "level of immersion", and (2) layout/design of the VE--on the performance of an engaging, game-like task. The characteristics of the task to be employed are as follows--(1) it places no demands on cognition in the form of problem solving, retrieval of previously learned information, or other analytic activity, in order to assure that (2) variations in task performance can be exclusively attributed to the extent to which the experimental factors influence visual depth perception. Subjects' performance will be assessed in terms of the speed and accuracy of task performance, as well as underlying dimensions of performance such as workload, fatigue, and physiological well-being (i.e., cybersickness). The results of this experiment will provide important information on the effect of VE immersion and other VE design issues on human perception and performance. Further development, refinement, and validation of this behaviorally based methodology will be pursued to provide user-centered design criteria for the design and use of VE systems.

  18. Corridor One: An Integrated Distance Visualization Environment for SSI+ASCI Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher R. Johnson, Charles D. Hansen

    2001-10-29

    The goal of Corridor One: An Integrated Distance Visualization Environment for ASCI and SSI Applications was to combine the forces of six leading-edge laboratories working in the areas of visualization, distributed computing, and high-performance networking (Argonne National Laboratory, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, University of Illinois, University of Utah, and Princeton University) to develop and deploy the most advanced integrated distance visualization environment for large-scale scientific visualization and demonstrate it on applications relevant to the DOE SSI and ASCI programs. The Corridor One team brought world-class expertise in parallel rendering, deep image-based rendering, immersive environment technology, large-format multi-projector wall-based displays, volume and surface visualization algorithms, collaboration tools and streaming media technology, network protocols for image transmission, high-performance networking, quality-of-service technology, and distributed computing middleware. Our strategy was to build on the very successful teams that produced the I-WAY, "Computational Grids," and CAVE technology, and to add these to the teams that had developed the fastest parallel visualization systems and the most widely used networking infrastructure for multicast and distributed media. Unfortunately, just as we were getting going on the Corridor One project, DOE cut the program after the first year. As such, our final report consists of our progress during year one of the grant.

  19. Human Fear Conditioning Conducted in Full Immersion 3-Dimensional Virtual Reality

    PubMed Central

    Huff, Nicole C.; Zielinski, David J.; Fecteau, Matthew E.; Brady, Rachael; LaBar, Kevin S.

    2010-01-01

    Fear conditioning is a widely used paradigm in non-human animal research to investigate the neural mechanisms underlying fear and anxiety. A major challenge in conducting conditioning studies in humans is the ability to strongly manipulate or simulate the environmental contexts that are associated with conditioned emotional behaviors. In this regard, virtual reality (VR) technology is a promising tool. Yet adapting this technology to meet experimental constraints requires special accommodations. Here we address the methodological issues involved in conducting fear conditioning in a fully immersive six-sided VR environment and present fear conditioning data. In the real world, traumatic events occur in complex environments made up of many cues that engage all of our sensory modalities. For example, the cues that form the environmental configuration include not only visual elements but also aural, olfactory, and even tactile ones. In rodent studies of fear conditioning, animals are fully immersed in a context rich with novel visual, tactile, and olfactory cues. However, standard laboratory tests of fear conditioning in humans are typically conducted in a nondescript room in front of a flat 2D computer screen and do not replicate the complexity of real-world experiences. On the other hand, a major limitation of clinical studies aimed at reducing (extinguishing) fear and preventing relapse in anxiety disorders is that treatment occurs after participants have acquired a fear in an uncontrolled and largely unknown context. The experimenters are thus left without information about the duration of exposure, the true nature of the stimulus, and the associated background cues in the environment [1]. In the absence of this information it can be difficult to truly extinguish a fear that is both cue- and context-dependent.
Virtual reality environments address these issues by providing the complexity of the real world while allowing experimenters to constrain fear conditioning and extinction parameters, yielding empirical data that can suggest better treatment options and/or analyze mechanistic hypotheses. To test the hypothesis that fear conditioning may be richly encoded and context-specific when conducted in a fully immersive environment, we developed distinct 3D virtual reality contexts in which participants experienced fear conditioning to virtual snakes or spiders. Auditory cues co-occurred with the conditioned stimulus (CS) to further evoke orienting responses and a feeling of "presence" in subjects [2]. Skin conductance response served as the dependent measure of fear acquisition, memory retention, and extinction. PMID:20736913

  20. Defense Science Board 2006 Summer Study on 21st Century Strategic Technology Vectors, Volume 2: Critical Capabilities and Enabling Technologies

    DTIC Science & Technology

    2007-02-01

    Indexed fragments: neurosciences, particularly those analytic elements that create models to assist in understanding individual and ... precision geo-location ... cause-effect models (environment, infrastructure, socio-cultural, DIME, PMESII) ... storytelling, gisting and advanced visualization (TRL 2-5) ... high-fidelity, socio-culturally relevant immersive games, training and mission ...

  1. The quality of visual information about the lower extremities influences visuomotor coordination during virtual obstacle negotiation.

    PubMed

    Kim, Aram; Kretch, Kari S; Zhou, Zixuan; Finley, James M

    2018-05-09

    Successful negotiation of obstacles during walking relies on the integration of visual information about the environment with ongoing locomotor commands. When information about the body and environment is removed through occlusion of the lower visual field, individuals increase downward head pitch angle, reduce foot placement precision, and increase safety margins during crossing. However, whether these effects are mediated by loss of visual information about the lower extremities, the obstacle, or both has remained unclear. Here, we used a fully immersive virtual obstacle negotiation task to investigate how visual information about the lower extremities is integrated with information about the environment to facilitate skillful obstacle negotiation. Participants stepped over virtual obstacles while walking on a treadmill with one of three types of visual feedback about the lower extremities: no feedback, end-point feedback, or a link-segment model. We found that the absence of visual information about the lower extremities led to an increase in the variability of leading foot placement after crossing. The presence of a visual representation of the lower extremities promoted greater downward head pitch angle during the approach to, and subsequent crossing of, an obstacle. In addition, greater downward head pitch was associated with closer placement of the trailing foot to the obstacle, farther placement of the leading foot after the obstacle, and higher trailing foot clearance. These results demonstrate that the fidelity of visual information about the lower extremities influences both feed-forward and feedback aspects of visuomotor coordination during obstacle negotiation.

  2. Research on spatial features of streets under the influence of immersion communication technology brought by new media

    NASA Astrophysics Data System (ADS)

    Xu, Hua-wei; Feng, Chen

    2017-04-01

    The rapid development of new media has increased the complexity of information interaction in urban street space. Under the influence of immersive communication, the streetscape has come to form a special scene of 'media convergence', which poses a major challenge for maintaining order in the urban streetscape. The spatial visual communication research method, which breaks the limitations of traditional aesthetic space research, can provide a brand-new perspective on this phenomenon. This study aims to analyze and summarize the communication characteristics of new media and their context, which will be helpful for understanding the social meaning behind changes in the order of the street's spatial and physical environment.

  3. Design and development of physics simulations in the field of oscillations and waves suitable for k-12 and undergraduate instruction using video game technology

    NASA Astrophysics Data System (ADS)

    Tomesh, Trevor; Price, Colin

    2011-03-01

    Using UnrealScript, the scripting language of the Unreal Tournament 2004 engine, demonstrations in the field of oscillations and waves were designed and developed. Variations on Euler's method and the Runge-Kutta method were used to numerically solve the equations of motion for seven different physical systems, which were visually represented in the immersive environment of Unreal Tournament 2004. Data from each system were written to an output file, plotted, and analyzed. The overarching goal of this research is to design and develop useful teaching tools for the K-12 and undergraduate classroom that, presented in the form of a video game, are immersive, engaging, and educational.
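The numerical scheme the abstract mentions, Runge-Kutta integration of oscillator equations of motion, can be sketched generically; this is a classical fourth-order step applied to a simple harmonic oscillator, with illustrative parameter values (the paper's own systems and UnrealScript code are not reproduced here):

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h/6 * (a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Simple harmonic oscillator x'' = -omega^2 x, written as a first-order system.
omega = 2.0
f = lambda t, y: [y[1], -omega**2 * y[0]]

t, y, h = 0.0, [1.0, 0.0], 0.01   # x(0) = 1, v(0) = 0
for _ in range(100):               # integrate out to t = 1
    y = rk4_step(f, t, y, h)
    t += h

print(round(y[0], 4), round(math.cos(omega * t), 4))  # numeric vs analytic x(t)
```

For this system the analytic solution is x(t) = cos(omega t), so the printed pair should agree to the displayed precision; the same stepper drives any of the seven systems once f is swapped out.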

  4. Research on evaluation techniques for immersive multimedia

    NASA Astrophysics Data System (ADS)

    Hashim, Aslinda M.; Romli, Fakaruddin Fahmi; Zainal Osman, Zosipha

    2013-03-01

    Immersive Multimedia today covers a tremendous range of uses, in areas such as healthcare/surgery, the military, architecture, art, entertainment, education, business, media, sport, rehabilitation/treatment, and training. Moreover, since the significance of Immersive Multimedia lies in directly meeting the needs of end-users, clients, and customers for a diversity of features and purposes, it is the assembly of multiple elements that drives effective Immersive Multimedia system design, making evaluation techniques crucial for Immersive Multimedia environments. A brief general idea of the virtual environment (VE) context and the 'realism' concept that underlie Immersive Multimedia environments is first provided. This is followed by a concise summary of the elements of the VE assessment technique applied in Immersive Multimedia system design, which outlines the classification space for evaluation techniques for Immersive Multimedia environments and gives an overview of the types of results reported. A particular focus is placed on the implications of these evaluation techniques in relation to the elements of the VE assessment technique, which is the primary purpose of this research. The paper concludes with an extensive overview of the recommendations emanating from the research.

  5. The effect on lower spine muscle activation of walking on a narrow beam in virtual reality.

    PubMed

    Antley, Angus; Slater, Mel

    2011-02-01

    To what extent do people behave in immersive virtual environments as they would in similar situations in a physical environment? There are many ways to address this question, ranging from questionnaires and behavioral studies to the use of physiological measures. Here, we compare the onsets of muscle activity using surface electromyography (EMG) while participants were walking under three different conditions: on a normal floor surface, on a narrow ribbon along the floor, and on a narrow platform raised off the floor. The same situation was rendered in an immersive virtual environment (IVE) Cave-like system, and 12 participants did the three types of walking in a counter-balanced within-groups design. The mean number of EMG activity onsets per unit time followed the same pattern in the virtual environment as in the physical environment: significantly higher for walking on the platform compared to walking on the floor. Even though participants knew that they were in fact really walking at floor level in the virtual environment condition, the visual illusion of walking on a raised platform was sufficient to influence their behavior in a measurable way. This opens the door for this technique to be used in gait- and posture-related scenarios, including rehabilitation.
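Counting EMG activity onsets, as in the measure above, amounts to detecting rising threshold crossings in a rectified signal envelope. A minimal sketch, with an illustrative threshold and toy data rather than the study's actual processing pipeline:

```python
def count_onsets(envelope, threshold):
    """Count rising threshold crossings in a rectified EMG envelope."""
    onsets = 0
    above = False
    for sample in envelope:
        if sample >= threshold and not above:
            onsets += 1       # a new burst of muscle activity begins here
            above = True
        elif sample < threshold:
            above = False     # back below threshold: ready for the next burst
    return onsets

# Toy envelope: two bursts of activity separated by rest.
signal = [0.1, 0.2, 0.9, 1.1, 0.8, 0.2, 0.1, 0.7, 1.3, 0.9, 0.2]
print(count_onsets(signal, threshold=0.5))  # 2 bursts -> 2 onsets
```

Dividing such counts by recording duration gives the onsets-per-unit-time measure compared across the floor, ribbon, and platform conditions.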

  6. Using CLIPS to represent knowledge in a VR simulation

    NASA Technical Reports Server (NTRS)

    Engelberg, Mark L.

    1994-01-01

    Virtual reality (VR) is an exciting use of advanced hardware and software technologies to achieve an immersive simulation. Until recently, the majority of virtual environments were merely 'fly-throughs' in which a user could freely explore a 3-dimensional world or a visualized dataset. Now that the underlying technologies are reaching a level of maturity, programmers are seeking ways to increase the complexity and interactivity of immersive simulations. In most cases, interactivity in a virtual environment can be specified in the form 'whenever such-and-such happens to object X, it reacts in the following manner.' CLIPS and COOL provide a simple and elegant framework for representing this knowledge base in an efficient manner that can be extended incrementally. The complexity of a detailed simulation becomes more manageable when the control flow is governed by CLIPS' rule-based inference engine rather than by traditional procedural mechanisms. Examples in this paper illustrate an effective way to represent VR information in CLIPS and to tie this knowledge base to the input and output C routines of a typical virtual environment.
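The 'whenever such-and-such happens to object X, it reacts' pattern described above is, in spirit, event-driven rule dispatch. A minimal Python analogue follows; it mimics the shape of the idea only, and the names and decorator mechanism are illustrative assumptions, not CLIPS or COOL constructs:

```python
# Tiny rule-based dispatcher: register reaction rules per event type,
# then fire all matching rules when an event happens to an object.

rules = {}

def on(event):
    """Register the decorated function as a reaction rule for this event type."""
    def register(fn):
        rules.setdefault(event, []).append(fn)
        return fn
    return register

def fire(event, obj):
    """Run every rule registered for this event against the object."""
    return [rule(obj) for rule in rules.get(event, [])]

@on("touched")
def highlight(obj):
    return f"{obj} glows"

@on("touched")
def play_sound(obj):
    return f"{obj} chimes"

print(fire("touched", "door"))  # both rules react to the same event
```

A real CLIPS inference engine pattern-matches rules against a working memory of facts; this sketch only captures the incremental, declarative flavor of adding reactions without touching procedural control flow.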

  7. Immersive Molecular Visualization with Omnidirectional Stereoscopic Ray Tracing and Remote Rendering

    PubMed Central

    Stone, John E.; Sherman, William R.; Schulten, Klaus

    2016-01-01

    Immersive molecular visualization provides the viewer with intuitive perception of complex structures and spatial relationships that are of critical interest to structural biologists. The recent availability of commodity head-mounted displays (HMDs) provides a compelling opportunity for widespread adoption of immersive visualization by molecular scientists, but HMDs pose additional challenges due to the need for low-latency, high-frame-rate rendering. State-of-the-art molecular dynamics simulations produce terabytes of data that can be impractical to transfer from remote supercomputers, necessitating routine use of remote visualization. Hardware-accelerated video encoding has profoundly increased frame rates and image resolution for remote visualization; however, round-trip network latencies would cause simulator sickness when using HMDs. We present a novel two-phase rendering approach that overcomes network latencies by combining omnidirectional stereoscopic progressive ray tracing with high-performance rasterization, and its implementation within VMD, a widely used molecular visualization and analysis tool. The new rendering approach enables immersive molecular visualization with rendering techniques such as shadows, ambient occlusion lighting, depth of field, and high-quality transparency, which are particularly helpful for the study of large biomolecular complexes. We describe ray tracing algorithms that are used to optimize interactivity and quality, and we report key performance metrics of the system. The new techniques can also benefit many other application domains. PMID:27747138

  8. Immersive Visualization of the Solid Earth

    NASA Astrophysics Data System (ADS)

    Kreylos, O.; Kellogg, L. H.

    2017-12-01

    Immersive visualization using virtual reality (VR) display technology offers unique benefits for the visual analysis of complex three-dimensional data, such as tomographic images of the mantle, and higher-dimensional data, such as computational geodynamics models of mantle convection or even planetary dynamos. Unlike "traditional" visualization, which has to project 3D scalar data or vectors onto a 2D screen for display, VR can display 3D data in a pseudo-holographic (head-tracked stereoscopic) form, and therefore does not suffer the distortions of relative positions, sizes, distances, and angles that are inherent in 2D projection and interfere with interpretation. As a result, researchers can apply their spatial reasoning skills to 3D data in the same way they can to real objects or environments, as well as to complex objects like vector fields. 3D Visualizer is an application for visualizing 3D volumetric data, such as results from mantle convection simulations or seismic tomography reconstructions, using VR display technology with a strong focus on interactive exploration. Unlike other visualization software, 3D Visualizer does not present static visualizations, such as a set of cross-sections at pre-selected positions and orientations, but instead lets users ask questions of their data, for example by dragging a cross-section through the data's domain with their hands and seeing data mapped onto that cross-section in real time, or by touching a point inside the data domain and immediately seeing an isosurface connecting all points having the same data value as the touched point. Combined with tools allowing 3D measurements of positions, distances, and angles, and with annotation tools that allow free-hand sketching directly in 3D data space, the outcome of using 3D Visualizer is not primarily a set of pictures, but derived data to be used for subsequent analysis.
3D Visualizer works best in virtual reality, either in high-end facility-scale environments such as CAVEs, or using commodity low-cost virtual reality headsets such as HTC's Vive. The recent emergence of high-quality commodity VR means that researchers can buy a complete VR system off the shelf, install it and the 3D Visualizer software themselves, and start using it for data analysis immediately.

  9. ConfocalVR: Immersive Visualization Applied to Confocal Microscopy.

    PubMed

    Stefani, Caroline; Lacy-Hulbert, Adam; Skillman, Thomas

    2018-06-24

    ConfocalVR is a virtual reality (VR) application created to improve the ability of researchers to study the complexity of cell architecture. Confocal microscopes take pictures of fluorescently labeled proteins or molecules at different focal planes to create a stack of 2D images throughout the specimen. Current software applications reconstruct the 3D image and render it as a 2D projection onto a computer screen, where users need to rotate the image to expose the full 3D structure. This process is mentally taxing, breaks down if the rotation stops, and does not take advantage of the eye's full field of view. ConfocalVR exploits consumer-grade VR systems to fully immerse the user in the 3D cellular image. In this virtual environment the user can: 1) adjust image viewing parameters without leaving the virtual space, 2) reach out and grab the image to quickly rotate and scale it to focus on key features, and 3) interact with other users in a shared virtual space, enabling real-time collaborative exploration and discussion. We found that immersive VR technology allows the user to rapidly understand cellular architecture and protein or molecule distribution. We note that it is impossible to understand the value of immersive visualization without experiencing it first-hand, so we encourage readers to get access to a VR system, download this software, and evaluate it for themselves. The ConfocalVR software is available for download at http://www.confocalvr.com and is free for nonprofits.
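The reconstruction step described above, assembling a z-stack of 2D confocal slices into a 3D volume that a renderer can display, can be sketched with NumPy. The array shapes and normalization are illustrative assumptions, not ConfocalVR's actual pipeline:

```python
import numpy as np

# Fake z-stack: 8 focal-plane images of 64x64 pixels each, standing in for
# the per-plane images a confocal microscope produces.
rng = np.random.default_rng(0)
slices = [rng.random((64, 64)) for _ in range(8)]

# Stack the 2D images along a new z-axis to form the 3D volume.
volume = np.stack(slices, axis=0)          # shape (z, y, x) = (8, 64, 64)

# Normalize intensities to [0, 1] so a renderer can map them to opacity/color.
volume = (volume - volume.min()) / (volume.max() - volume.min())

print(volume.shape, float(volume.min()), float(volume.max()))
```

In a VR viewer, each voxel of such a volume would then be sampled by a volume renderer rather than flattened into a single 2D projection.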

  10. Sculpting 3D worlds with music: advanced texturing techniques

    NASA Astrophysics Data System (ADS)

    Greuel, Christian; Bolas, Mark T.; Bolas, Niko; McDowall, Ian E.

    1996-04-01

    Sound within the virtual environment is often considered secondary to the graphics. In a typical scenario, either audio cues are locally associated with specific 3D objects or a general aural ambiance is supplied to alleviate the sterility of an artificial experience. This paper discusses a completely different approach, in which cues are extracted from live or recorded music in order to create geometry and control object behaviors within a computer-generated environment. Advanced texturing techniques used to generate complex stereoscopic images are also discussed. By analyzing music for standard audio characteristics such as rhythm and frequency, information is extracted and repackaged for processing. With the Soundsculpt Toolkit, this data is mapped onto individual objects within the virtual environment, along with one or more predetermined behaviors. Mapping decisions are implemented with a user-definable schedule and are based on the aesthetic requirements of directors and designers. This provides for visually active, immersive environments in which virtual objects behave in real-time correlation with the music. The resulting music-driven virtual reality opens up several possibilities for new types of artistic and entertainment experiences, such as fully immersive 3D 'music videos' and interactive landscapes for live performance.
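The analysis step described above, extracting frequency content from music to drive object behavior, can be sketched with a discrete Fourier transform. The naive DFT scan and the tone parameters here are illustrative; this is not the Soundsculpt Toolkit's API:

```python
import math

def dominant_frequency(samples, rate):
    """Return the strongest frequency (Hz) via a naive DFT magnitude scan."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * rate / n

# A pure 440 Hz tone sampled at 8 kHz; a virtual object could, for instance,
# scale or recolor itself in proportion to this extracted value.
rate, n = 8000, 800
tone = [math.sin(2 * math.pi * 440 * i / rate) for i in range(n)]
print(dominant_frequency(tone, rate))
```

A production system would use an FFT over short windows (plus beat tracking for rhythm), but the mapping idea is the same: extract a scalar feature per frame and bind it to an object property on a schedule.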

  11. Towards photorealistic and immersive virtual-reality environments for simulated prosthetic vision: integrating recent breakthroughs in consumer hardware and software.

    PubMed

    Zapf, Marc P; Matteucci, Paul B; Lovell, Nigel H; Zheng, Steven; Suaning, Gregg J

    2014-01-01

    Simulated prosthetic vision (SPV) in normally sighted subjects is an established way of investigating the prospective efficacy of visual prosthesis designs in visually guided tasks such as mobility. To perform meaningful SPV mobility studies in computer-based environments, a credible representation of both the virtual scene to navigate and the experienced artificial vision has to be established. It is therefore prudent to make optimal use of existing hardware and software solutions when establishing a testing framework. The authors aimed to improve the realism and immersion of SPV by integrating state-of-the-art yet low-cost consumer technology. The feasibility of body motion tracking to control movement in photorealistic virtual environments was evaluated in a pilot study. Five subjects were recruited and performed an obstacle avoidance and wayfinding task using either keyboard and mouse, gamepad, or Kinect motion tracking. Walking speed and collisions were analyzed as basic measures of task performance. Kinect motion tracking resulted in lower performance compared to the classical input methods, yet results were more uniform across vision conditions. The chosen framework was successfully applied in a basic virtual task and is suited to realistically simulating real-world scenes under SPV in mobility research. Classical input peripherals remain a feasible and effective way of controlling virtual movement. Motion tracking, despite its limitations and early state of implementation, is intuitive and can eliminate between-subject differences due to familiarity with established input methods.

  12. A Demonstration of ‘Broken’ Visual Space

    PubMed Central

    Gilson, Stuart

    2012-01-01

    It has long been assumed that there is a distorted mapping between real and ‘perceived’ space, based on demonstrations of systematic errors in judgements of slant, curvature, direction and separation. Here, we have applied a direct test to the notion of a coherent visual space. In an immersive virtual environment, participants judged the relative distance of two squares displayed in separate intervals. On some trials, the virtual scene expanded by a factor of four between intervals although, in line with recent results, participants did not report any noticeable change in the scene. We found that there was no consistent depth ordering of objects that can explain the distance matches participants made in this environment (e.g. A>B>D yet also A

  13. Immersive Visual Data Analysis For Geoscience Using Commodity VR Hardware

    NASA Astrophysics Data System (ADS)

    Kreylos, O.; Kellogg, L. H.

    2017-12-01

    Immersive visualization using virtual reality (VR) display technology offers tremendous benefits for the visual analysis of complex three-dimensional data like those commonly obtained from geophysical and geological observations and models. Unlike "traditional" visualization, which has to project 3D data onto a 2D screen for display, VR can sidestep this projection and display 3D data directly, in a pseudo-holographic (head-tracked stereoscopic) form, and therefore does not suffer the distortions of relative positions, sizes, distances, and angles that are inherent in 2D projection. As a result, researchers can apply their spatial reasoning skills to virtual data in the same way they can to real objects or environments. The UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES, http://keckcaves.org) has been developing VR methods for data analysis since 2005, but the high cost of VR displays had been preventing large-scale deployment and adoption of KeckCAVES technology. The recent emergence of high-quality commodity VR, spearheaded by the Oculus Rift and HTC Vive, has fundamentally changed the field. With KeckCAVES' foundational VR operating system, Vrui, now running natively on the HTC Vive, all KeckCAVES visualization software, including 3D Visualizer, LiDAR Viewer, Crusta, Nanotech Construction Kit, and ProtoShop, is now available to small labs, single researchers, and even home users. LiDAR Viewer and Crusta have been used for rapid response to geologic events including earthquakes and landslides, to visualize the impacts of sea-level rise, to investigate reconstructed paleoceanographic masses, and for exploration of the surface of Mars. The Nanotech Construction Kit is being used to explore the phases of carbon in Earth's deep interior, while ProtoShop can be used to construct and investigate protein structures.

  14. Scientific work environments in the next decade

    NASA Technical Reports Server (NTRS)

    Gomez, Julian E.

    1989-01-01

The applications of contemporary computer graphics to scientific visualization are described, with emphasis on the nonintuitive problems. A radically different approach is proposed, centered on the idea of the scientist being inside the simulation display space rather than observing it on a screen. Interaction is performed with nonstandard input devices to preserve the feeling of being immersed in the three-dimensional display space. Construction of such a system could begin now with currently available technology.

  15. Educational Uses of Virtual Reality Technology.

    DTIC Science & Technology

    1998-01-01

technology. It is affordable in that a basic level of technology can be achieved on most existing personal computers at either no cost or some minimal... actually present in a virtual environment is termed "presence" and is an artifact of being visually immersed in the computer-generated virtual world... [OCR table fragment: VR-in-education programs, including VREL at East Carolina University (teachers, 1996 onward) and the University of Illinois National Center for Supercomputing Applications]

  16. Spatial Learning and Wayfinding in an Immersive Environment: The Digital Fulldome.

    PubMed

    Hedge, Craig; Weaver, Ruth; Schnall, Simone

    2017-05-01

    Previous work has examined whether immersive technologies can benefit learning in virtual environments, but the potential benefits of technology in this context are confounded by individual differences such as spatial ability. We assessed spatial knowledge acquisition in male and female participants using a technology not previously examined empirically: the digital fulldome. Our primary aim was to examine whether performance on a test of survey knowledge was better in a fulldome (N = 28, 12 males) relative to a large, flat screen display (N = 27, 13 males). Regression analysis showed that, compared to a flat screen display, males showed higher levels of performance on a test of survey knowledge after learning in the fulldome, but no benefit occurred for females. Furthermore, performance correlated with spatial visualization ability in male participants, but not in female participants. Thus, the digital fulldome is a potentially useful learning aid, capable of accommodating multiple users, but individual differences and use of strategy need to be considered.

  17. An Overview of Virtual Acoustic Simulation of Aircraft Flyover Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.

    2013-01-01

Methods for testing human subject response to aircraft flyover noise have greatly advanced in recent years as a result of advances in simulation technology. Capabilities have been developed which now allow subjects to be immersed both visually and aurally in a three-dimensional, virtual environment. While suitable for displaying recorded aircraft noise, the true potential is found when synthesizing aircraft flyover noise because it allows the flexibility and freedom to study sounds from aircraft not yet flown. A virtual acoustic simulation method is described which is built upon prediction-based source noise synthesis, engineering-based propagation modeling, and empirically-based receiver modeling. This source-path-receiver paradigm allows complete control over all aspects of flyover auralization. With this capability, it is now possible to assess human response to flyover noise by systematically evaluating source noise reductions within the context of a system-level simulation. Examples of auralized flyover noise and movie clips representative of an immersive aircraft flyover environment are included in the presentation.

  18. Social Interaction Development through Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2014-01-01

    The purpose of this pilot study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity…

  19. Developing effective serious games: the effect of background sound on visual fidelity perception with varying texture resolution.

    PubMed

    Rojas, David; Kapralos, Bill; Cristancho, Sayra; Collins, Karen; Hogue, Andrew; Conati, Cristina; Dubrowski, Adam

    2012-01-01

    Despite the benefits associated with virtual learning environments and serious games, there are open, fundamental issues regarding simulation fidelity and multi-modal cue interaction and their effect on immersion, transfer of knowledge, and retention. Here we describe the results of a study that examined the effect of ambient (background) sound on the perception of visual fidelity (defined with respect to texture resolution). Results suggest that the perception of visual fidelity is dependent on ambient sound and more specifically, white noise can have detrimental effects on our perception of high quality visuals. The results of this study will guide future studies that will ultimately aid in developing an understanding of the role that fidelity, and multi-modal interactions play with respect to knowledge transfer and retention for users of virtual simulations and serious games.

  20. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases.

    PubMed

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-04-15

Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii with its peripherals (nunchuk and balance board), head-mounted displays, and joysticks allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes in virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.

  1. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases

    PubMed Central

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-01-01

Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii with its peripherals (nunchuk and balance board), head-mounted displays, and joysticks allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes in virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies. PMID:25206907

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drouhard, Margaret MEG G; Steed, Chad A; Hahn, Steven E

In this paper, we propose strategies and objectives for immersive data visualization with applications in materials science using the Oculus Rift virtual reality headset. We provide background on currently available analysis tools for neutron scattering data and other large-scale materials science projects. In the context of the current challenges facing scientists, we discuss immersive virtual reality visualization as a potentially powerful solution. We introduce a prototype immersive visualization system, developed in conjunction with materials scientists at the Spallation Neutron Source, which we have used to explore large crystal structures and neutron scattering data. Finally, we offer our perspective on the greatest challenges that must be addressed to build effective and intuitive virtual reality analysis tools that will be useful for scientists in a wide range of fields.

  3. Spatial awareness in immersive virtual environments revealed in open-loop walking

    NASA Astrophysics Data System (ADS)

    Turano, Kathleen A.; Chaudhury, Sidhartha

    2005-03-01

People are able to walk without vision to previously viewed targets in the real world. This ability to update one's position in space has been attributed to a path integration system that uses internally generated self-motion signals together with the perceived object-to-self distance of the target. In a previous study using an immersive virtual environment (VE), we found that many subjects were unable to walk without vision to a previously viewed target located 4 m away. Their walking paths were influenced by the room structure, which varied from trial to trial. In this study we investigated whether the phenomenon is specific to a VE by testing subjects in the real world and in a VE. The real world was viewed with field-restricting goggles and via cameras using the same head-mounted display as in the VE. The results showed that only in the VE were walking paths influenced by the room structure. Women were more affected than men, and the effect decreased over trials and after subjects performed the task in the real world. The results also showed that a brief (<0.5 s) exposure to the visual scene during self-motion was sufficient to reduce the influence of the room structure on walking paths. The results are consistent with the idea that without visual experience within the VE, the path integration system is unable to effectively update one's spatial position. As a result, people rely on other cues to define their position in space. Women, unlike men, choose to use visual cues about environmental structure to reorient.

  4. The role of vestibular and support-tactile-proprioceptive inputs in visual-manual tracking

    NASA Astrophysics Data System (ADS)

    Kornilova, Ludmila; Naumov, Ivan; Glukhikh, Dmitriy; Khabarova, Ekaterina; Pavlova, Aleksandra; Ekimovskiy, Georgiy; Sagalovitch, Viktor; Smirnov, Yuriy; Kozlovskaya, Inesa

Sensorimotor disorders in weightlessness are caused by changes in the functioning of gravity-dependent systems, first of all the vestibular and support systems. The question arises: what are the role and specific contribution of support afferentation in the development of the observed disorders? To determine the role and effects of vestibular, support, tactile and proprioceptive afferentation on characteristics of visual-manual tracking (VMT), we conducted a comparative analysis of data obtained after prolonged spaceflight and in a model of weightlessness, horizontal "dry" immersion. Altogether we examined 16 Russian cosmonauts before and after prolonged spaceflights (129-215 days) and 30 subjects who stayed in an immersion bath for 5-7 days, evaluating the state of the vestibular function (VF) using video-oculography and characteristics of visual-manual tracking using electrooculography and a joystick with biological visual feedback. Evaluation of the VF showed that both after immersion and after prolonged spaceflight there was a significant decrease of the static torsional otolith-cervical-ocular reflex (OCOR) and a simultaneous significant increase of the dynamic vestibular-cervical-ocular reactions (VCOR), with a negative correlation between parameters of the otolith and canal reactions, as well as significant changes in the accuracy of perception of the subjective visual vertical, which correlated with changes in OCOR. Analysis of the VMT showed that significant disorders of visual tracking (VT) occurred from the beginning of the immersion up to days 3-4, while in cosmonauts similar but much more pronounced oculomotor disorders and significant changes from baseline were observed up to day R+9 postflight. Significant changes of manual tracking (MT) were revealed only for gain and occurred on days 1 and 3 in immersion, while after spaceflight such changes were observed up to day R+5 postflight. 
We found correlations between characteristics of VT and MT and between characteristics of VF and VT, and no correlation between VF and MT. Removal of support and minimization of proprioceptive afferentation had a greater impact on the accuracy of VT than on the accuracy of MT. Hand tracking accuracy was higher than eye tracking accuracy for all subjects. Hand motor coordination was more robust to changes in support-proprioceptive afferentation than visual tracking. The changes observed during and after immersion are similar to, but less pronounced than, those observed in cosmonauts after prolonged spaceflight. Keywords: visual-manual tracking, vestibular function, weightlessness, immersion.

  5. On the Usability and Usefulness of 3d (geo)visualizations - a Focus on Virtual Reality Environments

    NASA Astrophysics Data System (ADS)

    Çöltekin, A.; Lokka, I.; Zahner, M.

    2016-06-01

Whether and when we should show data in 3D is an ongoing debate in communities conducting visualization research. A strong opposition exists in the information visualization (Infovis) community, where seemingly unnecessary or unwarranted use of 3D, e.g., in plots, bar or pie charts, is heavily criticized. The scientific visualization (Scivis) community, on the other hand, is more supportive of the use of 3D, as it allows `seeing' invisible phenomena, or designing and printing objects that are used in, e.g., surgery and educational settings. Geographic visualization (Geovis) stands between the Infovis and Scivis communities. In geographic information science, most visuo-spatial analyses have been sufficiently conducted in 2D or 2.5D, including analyses related to terrain and much of the urban phenomena. On the other hand, there has always been a strong interest in 3D, with similar motivations as in the Scivis community. Among the many types of 3D visualizations, one that is exploited both for visual analysis and visualization is the highly realistic (geo)virtual environment. Such environments may be engaging and memorable for viewers because they offer highly immersive experiences. However, it is not yet well established whether we should opt to show data in 3D, and if yes: a) what type of 3D we should use, b) for what task types, and c) for whom. In this paper, we identify some of the central arguments for and against the use of 3D visualizations around these three considerations in a concise interdisciplinary literature review.

  6. Ergonomic approaches to designing educational materials for immersive multi-projection system

    NASA Astrophysics Data System (ADS)

    Shibata, Takashi; Lee, JaeLin; Inoue, Tetsuri

    2014-02-01

Rapid advances in computer and display technologies have made it possible to present high-quality virtual reality (VR) environments. To use such virtual environments effectively, research is needed into how users perceive and react to them with respect to particular human factors. We created a VR simulation of sea fish for science education and conducted an experiment to examine how observers perceive the size and depth of an object within their reach, and evaluated their visual fatigue. We chose a multi-projection system for presenting the educational VR simulation, because this system can provide actual-size objects and produce stereo images located close to the observer. The results of the experiment show that estimation of size and depth was relatively accurate when subjects used physical actions to assess them. Presenting images within the observer's reach is suggested to be useful for education in VR environments. Evaluation of visual fatigue shows that the level of symptoms from viewing stereo images with a large disparity in the VR environment was low over a short time.

  7. High End Visualization of Geophysical Datasets Using Immersive Technology: The SIO Visualization Center.

    NASA Astrophysics Data System (ADS)

    Newman, R. L.

    2002-12-01

How many images can you display at one time with PowerPoint without getting "postage stamps"? Do you have fantastic datasets that you cannot view because your computer is too slow/small? Do you assume a few 2-D images of a 3-D picture are sufficient? High-end visualization centers can minimize and often eliminate these problems. The new visualization center [http://siovizcenter.ucsd.edu] at Scripps Institution of Oceanography [SIO] immerses users into a virtual world by projecting 3-D images onto a Panoram GVR-120E wall-sized floor-to-ceiling curved screen [7' x 23'] that has 3.2 mega-pixels of resolution. The Infinite Reality graphics subsystem is driven by a single-pipe SGI Onyx 3400 with a system bandwidth of 44 Gbps. The Onyx is powered by 16 MIPS R12K processors and 16 GB of addressable memory. The system is also equipped with transmitters and LCD shutter glasses which permit stereographic 3-D viewing of high-resolution images. This center is ideal for groups of up to 60 people who can simultaneously view these large-format images. A wide range of hardware and software is available, giving the users a totally immersive working environment in which to display, analyze, and discuss large datasets. The system enables simultaneous display of video and audio streams from sources such as SGI megadesktop and stereo megadesktop, S-VHS video, DVD video, and video from a Macintosh or PC. For instance, one-third of the screen might be displaying S-VHS video from a remotely-operated-vehicle [ROV], while the remaining portion of the screen might be used for an interactive 3-D flight over the same parcel of seafloor. The video and audio combinations using this system are numerous, allowing users to combine and explore data and images in innovative ways, greatly enhancing scientists' ability to visualize, understand and collaborate on complex datasets. 
In the near future, with the rapid growth in networking speeds in the US, it will be possible for Earth Sciences departments to collaborate effectively while limiting the amount of physical travel required. This includes porting visualization content to the popular, low-cost Geowall visualization systems, and providing web-based access to databanks filled with stock geoscience visualizations.

  8. Scientific Visualization Made Easy for the Scientist

    NASA Astrophysics Data System (ADS)

    Westerhoff, M.; Henderson, B.

    2002-12-01

amira® is an application program used in creating 3D visualizations and geometric models of 3D image data sets from various application areas, e.g. medicine, biology, biochemistry, chemistry, physics, and engineering. It has demonstrated significant adoption in the marketplace since becoming commercially available in 2000. The rapid adoption has expanded the features being requested by the user base and broadened the scope of the amira product offering. The amira product offering includes amira Standard; amiraDev™, used to extend the product capabilities by users; amiraMol™, used for molecular visualization; amiraDeconv™, used to improve the quality of image data; and amiraVR™, used in immersive VR environments. amira allows the user to construct a visualization tailored to his or her needs without requiring any programming knowledge. It also allows 3D objects to be represented as grids suitable for numerical simulations, notably as triangular surfaces and volumetric tetrahedral grids. The amira application also provides methods to generate such grids from voxel data representing an image volume, and it includes a general-purpose interactive 3D viewer. amiraDev provides an application-programming interface (API) that allows the user to add new components through C++ programming. amira supports many import formats, including a 'raw' format allowing immediate access to native uniform data sets. amira uses the power and speed of the OpenGL® and Open Inventor™ graphics libraries and 3D graphics accelerators to provide access to over 145 modules for processing, probing, analyzing, and visualizing data. The amiraMol™ extension adds powerful tools for molecular visualization to the existing amira platform, including support for standard molecular file formats and tools for visualization and analysis of static molecules as well as molecular trajectories (time series). amiraDeconv adds tools for the deconvolution of 3D microscopic images. 
Deconvolution is the process of increasing image quality and resolution by computationally compensating for artifacts of the recording process. amiraDeconv supports 3D wide-field microscopy as well as 3D confocal microscopy. It offers both non-blind and blind image deconvolution algorithms. Non-blind deconvolution uses an individually measured point spread function, while blind algorithms work on the basis of only a few recording parameters (such as numerical aperture or zoom factor). amiraVR is a specialized and extended version of the amira visualization system dedicated to use in immersive installations, such as large-screen stereoscopic projections, CAVE® or Holobench® systems. Among other features, it supports multi-threaded multi-pipe rendering, head-tracking, advanced 3D interaction concepts, and 3D menus allowing interaction with any amira object in the same way as on the desktop. With its unique set of features, amiraVR is both a VR (Virtual Reality) ready application for scientific and medical visualization in immersive environments and a development platform that allows building VR applications.

  9. Can walking motions improve visually induced rotational self-motion illusions in virtual reality?

    PubMed

    Riecke, Bernhard E; Freiberg, Jacob B; Grechkin, Timofey Y

    2015-02-04

    Illusions of self-motion (vection) can provide compelling sensations of moving through virtual environments without the need for complex motion simulators or large tracked physical walking spaces. Here we explore the interaction between biomechanical cues (stepping along a rotating circular treadmill) and visual cues (viewing simulated self-rotation) for providing stationary users a compelling sensation of rotational self-motion (circular vection). When tested individually, biomechanical and visual cues were similarly effective in eliciting self-motion illusions. However, in combination they yielded significantly more intense self-motion illusions. These findings provide the first compelling evidence that walking motions can be used to significantly enhance visually induced rotational self-motion perception in virtual environments (and vice versa) without having to provide for physical self-motion or motion platforms. This is noteworthy, as linear treadmills have been found to actually impair visually induced translational self-motion perception (Ash, Palmisano, Apthorp, & Allison, 2013). Given the predominant focus on linear walking interfaces for virtual-reality locomotion, our findings suggest that investigating circular and curvilinear walking interfaces offers a promising direction for future research and development and can help to enhance self-motion illusions, presence and immersion in virtual-reality systems. © 2015 ARVO.

  10. Design of an immersive simulator for assisted power wheelchair driving.

    PubMed

    Devigne, Louise; Babel, Marie; Nouviale, Florian; Narayanan, Vishnu K; Pasteau, Francois; Gallien, Philippe

    2017-07-01

    Driving a power wheelchair is a difficult and complex visual-cognitive task. As a result, some people with visual and/or cognitive disabilities cannot access the benefits of a power wheelchair because their impairments prevent them from driving safely. In order to improve their access to mobility, we have previously designed a semi-autonomous assistive wheelchair system which progressively corrects the trajectory as the user manually drives the wheelchair and smoothly avoids obstacles. Developing and testing such systems for wheelchair driving assistance requires a significant amount of material resources and clinician time. With Virtual Reality technology, prototypes can be developed and tested in a risk-free and highly flexible Virtual Environment before equipping and testing a physical prototype. Additionally, users can "virtually" test and train more easily during the development process. In this paper, we introduce a power wheelchair driving simulator allowing the user to navigate with a standard wheelchair in an immersive 3D Virtual Environment. The simulation framework is designed to be flexible so that we can use different control inputs. In order to validate the framework, we first performed tests on the simulator with able-bodied participants during which the user's Quality of Experience (QoE) was assessed through a set of questionnaires. Results show that the simulator is a promising tool for future works as it generates a good sense of presence and requires rather low cognitive effort from users.

  11. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution

    PubMed Central

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

Graphical virtual environments are currently far from accessible to blind users as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually utilizes similar skills as when using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1, task 1) and surroundings (Experiment 1, task 2) and walk through them; these tasks were accomplished with a 95% and 97% success rate, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. 
They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like and their potential for complex training and suggested many future environments they wished to experience. PMID:26882473

  12. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution.

    PubMed

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

Graphical virtual environments are currently far from accessible to blind users as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually utilizes similar skills as when using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1, task 1) and surroundings (Experiment 1, task 2) and walk through them; these tasks were accomplished with a 95% and 97% success rate, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. 
They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like and their potential for complex training and suggested many future environments they wished to experience.

  13. Scientific Visualization & Modeling for Earth Systems Science Education

    NASA Technical Reports Server (NTRS)

    Chaudhury, S. Raj; Rodriguez, Waldo J.

    2003-01-01

Providing research experiences for undergraduate students in Earth Systems Science (ESS) poses several challenges at smaller academic institutions that may lack dedicated resources for this area of study. This paper describes the development of an innovative model that involves students with majors in diverse scientific disciplines in authentic ESS research. In studying global climate change, experts typically apply scientific visualization techniques to remote sensing data collected by satellites. In particular, many problems related to environmental phenomena can be quantitatively addressed with datasets from scientific endeavours such as the Earth Radiation Budget Experiment (ERBE). Working with data products stored at NASA's Distributed Active Archive Centers, visualization software specifically designed for students, and an advanced, immersive Virtual Reality (VR) environment, students engage in guided research projects during a structured 6-week summer program. Over its 5-year span, this program has afforded the opportunity for students majoring in biology, chemistry, mathematics, computer science, physics, engineering and science education to work collaboratively in teams on research projects that emphasize the use of scientific visualization in studying the environment. Recently, a hands-on component has been added through science student partnerships with schoolteachers in data collection and reporting for the GLOBE Program (Global Learning and Observations to Benefit the Environment).

  14. An Examination of the Effects of Collaborative Scientific Visualization via Model-Based Reasoning on Science, Technology, Engineering, and Mathematics (STEM) Learning within an Immersive 3D World

    ERIC Educational Resources Information Center

    Soleimani, Ali

    2013-01-01

    Immersive 3D worlds can be designed to effectively engage students in peer-to-peer collaborative learning activities, supported by scientific visualization, to help with understanding complex concepts associated with learning science, technology, engineering, and mathematics (STEM). Previous research studies have shown STEM learning benefits…

  15. Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd

    2005-01-01

    Modern NASA planetary exploration missions employ complex systems of hardware and software, managed by large teams of engineers and scientists, in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover (MER) mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo-pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo-pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.

  16. The Virtual Pelvic Floor, a tele-immersive educational environment.

    PubMed Central

    Pearl, R. K.; Evenhouse, R.; Rasmussen, M.; Dech, F.; Silverstein, J. C.; Prokasy, S.; Panko, W. B.

    1999-01-01

    This paper describes the development of the Virtual Pelvic Floor, a new method of teaching the complex anatomy of the pelvic region utilizing virtual reality and advanced networking technology. Virtual reality technology allows improved visualization of three-dimensional structures over conventional media because it supports stereo vision, viewer-centered perspective, large angles of view, and interactivity. Two or more ImmersaDesk systems, drafting table format virtual reality displays, are networked together providing an environment where teacher and students share a high quality three-dimensional anatomical model, and are able to converse, see each other, and to point in three dimensions to indicate areas of interest. This project was realized by the teamwork of surgeons, medical artists and sculptors, computer scientists, and computer visualization experts. It demonstrates the future of virtual reality for surgical education and applications for the Next Generation Internet. PMID:10566378

  17. Long-Term Audience Impacts of Live Fulldome Planetarium Lectures for Earth Science and Global Change Education

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Champlin, D. M.; Goldsworth, D. A.; Raynolds, R. G.; Dechesne, M.

    2011-09-01

    Digital Earth visualization technologies, from ArcGIS to Google Earth, have allowed for the integration of complex, disparate data sets to produce visually rich and compelling three-dimensional models of sub-surface and surface resource distribution patterns. The rendering of these models allows the public to quickly understand complicated geospatial relationships that would otherwise take much longer to explain using traditional media. At the Denver Museum of Nature & Science (DMNS), we have used such visualization technologies, including real-time virtual reality software running in the immersive digital "fulldome" Gates Planetarium, to impact the community through topical policy presentations. DMNS public lectures have covered regional issues like water resources, as well as global topics such as earthquakes, tsunamis, and resource depletion. The Gates Planetarium allows an audience to have an immersive experience, similar to the virtual reality "CAVE" environments found in academia, that would otherwise not be available to the general public. Public lectures in the dome allow audiences of over 100 people to comprehend dynamically changing geospatial datasets in an exciting and engaging fashion. Surveys and interviews show that these talks are effective in heightening visitor interest in the subjects weeks or months after the presentation. Many visitors take additional steps to learn more, while one was so inspired that she actively worked to bring the same programming to her children's school. These preliminary findings suggest that fulldome real-time visualizations can have a substantial long-term impact on an audience's engagement and interest in science topics.

  18. Generation IV Nuclear Energy Systems Construction Cost Reductions through the Use of Virtual Environments - Task 4 Report: Virtual Mockup Maintenance Task Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Shaw; Anthony Baratta; Vaughn Whisker

    2005-02-28

    This Task 4 report covers part of a three-year DOE NERI-sponsored effort evaluating immersive virtual reality (CAVE) technology for design review, construction planning, and maintenance planning and training for next-generation nuclear power plants. The program covers development of full-scale virtual mockups generated from 3D CAD data and presented in a CAVE visualization facility. This report focuses on using full-scale virtual mockups for nuclear power plant training applications.

  19. The Efficacy of an Immersive 3D Virtual versus 2D Web Environment in Intercultural Sensitivity Acquisition

    ERIC Educational Resources Information Center

    Coffey, Amy Jo; Kamhawi, Rasha; Fishwick, Paul; Henderson, Julie

    2017-01-01

    Relatively few studies have empirically tested computer-based immersive virtual environments' efficacy in teaching or enhancing pro-social attitudes, such as intercultural sensitivity. This channel study experiment was conducted (N = 159) to compare what effects, if any, an immersive 3D virtual environment would have upon subjects' intercultural…

  20. Virtual hydrology observatory: an immersive visualization of hydrology modeling

    NASA Astrophysics Data System (ADS)

    Su, Simon; Cruz-Neira, Carolina; Habib, Emad; Gerndt, Andreas

    2009-02-01

    The Virtual Hydrology Observatory will provide students with the ability to observe an integrated hydrology simulation through an instructional interface, using either a desktop-based or an immersive virtual reality setup. The goal of the virtual hydrology observatory application is to facilitate the introduction of field experience and observational skills into hydrology courses through innovative virtual techniques that mimic activities during actual field visits. The simulation part of the application is developed from the integrated atmospheric forecast model, Weather Research and Forecasting (WRF), and the hydrology model, Gridded Surface/Subsurface Hydrologic Analysis (GSSHA). The outputs from the WRF and GSSHA models are then used to generate the final visualization components of the Virtual Hydrology Observatory. The visualization data processing techniques provided by VTK include 2D Delaunay triangulation and data optimization. Once all the visualization components are generated, they are integrated with the simulation data using the VRFlowVis and VR Juggler software toolkits. VR Juggler is used primarily to provide the Virtual Hydrology Observatory application with a fully immersive, real-time 3D interaction experience, while VRFlowVis provides the integration framework for the hydrologic simulation data, graphical objects, and user interaction. A six-sided CAVE-like system is used to run the Virtual Hydrology Observatory, providing students with a fully immersive experience.
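    The abstract names 2D Delaunay triangulation as one of the VTK processing steps used to mesh the simulation output. The sketch below uses SciPy as a stand-in for VTK's triangulation filter (in the actual pipeline, a VTK class such as vtkDelaunay2D would play this role); the sample coordinates are invented.

```python
# Triangulating scattered 2D sample points so per-point scalar data
# (e.g. surface runoff) can be rendered as a continuous surface.
import numpy as np
from scipy.spatial import Delaunay

# Scattered 2D sample points, e.g. gauge locations on a watershed surface
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
tri = Delaunay(points)

# Each simplex is a triangle (a triple of point indices) covering the domain.
triangles = tri.simplices
```

    For the square-plus-center layout above, the triangulation yields four triangles fanning out from the central point.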

  1. Authentic Astronomical Discovery in Planetariums: Data-Driven Immersive Lectures

    NASA Astrophysics Data System (ADS)

    Wyatt, Ryan Jason

    2018-01-01

    Planetariums are akin to “branch offices” for astronomy in major cities and other locations around the globe. With immersive, fulldome video technology, modern digital planetariums offer the opportunity to integrate authentic astronomical data into both pre-recorded shows and live lectures. At the California Academy of Sciences Morrison Planetarium, we host the monthly Benjamin Dean Astronomy Lecture Series, which features researchers describing their cutting-edge work to well-informed lay audiences. The Academy’s visualization studio and engineering teams work with researchers to visualize their data in both pre-rendered and real-time formats, and these visualizations are integrated into a variety of programs—including lectures! The assets are then made available to any other planetariums with similar software to support their programming. A lecturer can thus give the same immersive presentation to audiences in a variety of planetariums. The Academy has also collaborated with Chicago’s Adler Planetarium to bring Kavli Fulldome Lecture Series to San Francisco, and the two theaters have also linked together in live “domecasts” to share real-time content with audiences in both cities. These lecture series and other, similar projects suggest a bright future for astronomers to bring their research to the public in an immersive and visually compelling format.

  2. Saliency in VR: How Do People Explore Virtual Environments?

    PubMed

    Sitzmann, Vincent; Serrano, Ana; Pavel, Amy; Agrawala, Maneesh; Gutierrez, Diego; Masia, Belen; Wetzstein, Gordon

    2018-04-01

    Understanding how people explore immersive virtual environments is crucial for many applications, such as designing virtual reality (VR) content, developing new compression algorithms, or learning computational models of saliency or visual attention. Whereas a body of recent work has focused on modeling saliency in desktop viewing conditions, VR is very different from these conditions in that viewing behavior is governed by stereoscopic vision and by the complex interaction of head orientation, gaze, and other kinematic constraints. To further our understanding of viewing behavior and saliency in VR, we capture and analyze gaze and head orientation data of 169 users exploring stereoscopic, static omni-directional panoramas, for a total of 1980 head and gaze trajectories for three different viewing conditions. We provide a thorough analysis of our data, which leads to several important insights, such as the existence of a particular fixation bias, which we then use to adapt existing saliency predictors to immersive VR conditions. In addition, we explore other applications of our data and analysis, including automatic alignment of VR video cuts, panorama thumbnails, panorama video synopsis, and saliency-based compression.
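    As a hedged illustration of the preprocessing such an analysis requires (not the authors' code), the snippet below maps a head or gaze direction, given as yaw and pitch angles, onto pixel coordinates of an equirectangular panorama, the first step toward accumulating fixation maps. The angle conventions (yaw 0 at the image center, pitch 0 at the horizon) are assumptions.

```python
# Map a viewing direction in degrees onto an equirectangular panorama.

def direction_to_pixel(yaw_deg, pitch_deg, width, height):
    """yaw in [-180, 180), pitch in [-90, 90] -> (x, y) pixel coordinates."""
    x = (yaw_deg + 180.0) / 360.0 * width    # longitude -> column
    y = (90.0 - pitch_deg) / 180.0 * height  # latitude -> row (0 = zenith)
    return int(x) % width, min(int(y), height - 1)

# Looking straight ahead lands at the image center; looking up moves
# toward row 0, where an equirectangular projection places the zenith.
cx, cy = direction_to_pixel(0.0, 0.0, 1000, 500)
```

    Accumulating one count per recorded sample at these pixel locations, then smoothing, yields the kind of fixation map to which a fixation bias can be fitted.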

  3. The Human Retrosplenial Cortex and Thalamus Code Head Direction in a Global Reference Frame.

    PubMed

    Shine, Jonathan P; Valdés-Herrera, José P; Hegarty, Mary; Wolbers, Thomas

    2016-06-15

    Spatial navigation is a multisensory process involving integration of visual and body-based cues. In rodents, head direction (HD) cells, which are most abundant in the thalamus, integrate these cues to code facing direction. Human fMRI studies examining HD coding in virtual environments (VE) have reported effects in retrosplenial complex and (pre-)subiculum, but not the thalamus. Furthermore, HD coding appeared insensitive to global landmarks. These tasks, however, provided only visual cues for orientation, and attending to global landmarks did not benefit task performance. In the present study, participants explored a VE comprising four separate locales, surrounded by four global landmarks. To provide body-based cues, participants wore a head-mounted display so that physical rotations changed facing direction in the VE. During subsequent MRI scanning, subjects saw stationary views of the environment and judged whether their orientation was the same as in the preceding trial. Parameter estimates extracted from retrosplenial cortex and the thalamus revealed significantly reduced BOLD responses when HD was repeated. Moreover, consistent with rodent findings, the signal did not continue to adapt over repetitions of the same HD. These results were supported by a whole-brain analysis showing additional repetition suppression in the precuneus. Together, our findings suggest that: (1) consistent with the rodent literature, the human thalamus may integrate visual and body-based orientation cues; (2) global reference frame cues can be used to integrate HD across separate individual locales; and (3) immersive training procedures providing full body-based cues may help to elucidate the neural mechanisms supporting spatial navigation. In rodents, head direction (HD) cells signal facing direction in the environment via increased firing when the animal assumes a certain orientation. Distinct brain regions, the retrosplenial cortex (RSC) and thalamus, code for visual and vestibular cues of orientation, respectively. Putative HD signals have been observed in human RSC but not the thalamus, potentially because body-based cues were not provided. Here, participants encoded HD in a novel virtual environment while wearing a head-mounted display to provide body-based cues for orientation. In subsequent fMRI scanning, we found evidence of an HD signal in RSC, thalamus, and precuneus. These findings harmonize rodent and human data, and suggest that immersive training procedures provide a viable way to examine the neural basis of navigation.

  4. Color stability of ceramic brackets immersed in potentially staining solutions

    PubMed Central

    Guignone, Bruna Coser; Silva, Ludimila Karsbergen; Soares, Rodrigo Villamarim; Akaki, Emilio; Goiato, Marcelo Coelho; Pithon, Matheus Melo; Oliveira, Dauro Douglas

    2015-01-01

    OBJECTIVE: To assess the color stability of five types of ceramic brackets after immersion in potentially staining solutions. METHODS: Ninety brackets were divided into 5 groups (n = 18) according to brackets commercial brands and the solutions in which they were immersed (coffee, red wine, coke and artificial saliva). The brackets assessed were Transcend (3M/Unitek, Monrovia, CA, USA), Radiance (American Orthodontics, Sheboygan, WI, USA), Mystique (GAC International Inc., Bohemia, NY, USA) and Luxi II (Rocky Mountain Orthodontics, Denver, CO, USA). Chromatic changes were analyzed with the aid of a reflectance spectrophotometer and by visual inspection at five specific time intervals. Assessment periods were as received from the manufacturer (T0), 24 hours (T1), 72 hours (T2), as well as 7 days (T3) and 14 days (T4) of immersion in the aforementioned solutions. Results were submitted to statistical analysis with ANOVA and Bonferroni correction, as well as to a multivariate profile analysis for independent and paired samples with significance level set at 5%. RESULTS: The duration of the immersion period influenced color alteration of all tested brackets, even though these changes could not always be visually observed. Different behaviors were observed for each immersion solution; however, brackets immersed in one solution progressed similarly despite minor variations. CONCLUSIONS: Staining became more intense over time and all brackets underwent color alterations when immersed in the aforementioned solutions. PMID:26352842
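    The abstract reports chromatic changes measured with a reflectance spectrophotometer but does not state the metric; a standard choice in such color-stability studies (an assumption here, not taken from the abstract) is the CIE76 color difference ΔE*ab between CIELAB readings at two time points. A minimal sketch with invented readings:

```python
# CIE76 color difference between two CIELAB measurements.
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two (L*, a*, b*) readings."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Hypothetical bracket readings at baseline (T0) and after immersion (T4)
baseline = (78.0, 1.5, 4.0)
stained = (74.0, 2.5, 7.0)
de = delta_e_cie76(baseline, stained)
```

    A larger ΔE*ab indicates a greater color shift; whether a given value is visually perceptible depends on the threshold adopted by the study.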

  6. Building Virtual Mars

    NASA Astrophysics Data System (ADS)

    Abercrombie, S. P.; Menzies, A.; Goddard, C.

    2017-12-01

    Virtual and augmented reality enable scientists to visualize environments that are very difficult, or even impossible to visit, such as the surface of Mars. A useful immersive visualization begins with a high quality reconstruction of the environment under study. This presentation will discuss a photogrammetry pipeline developed at the Jet Propulsion Laboratory to reconstruct 3D models of the surface of Mars using stereo images sent back to Earth by the Curiosity Mars rover. The resulting models are used to support a virtual reality tool (OnSight) that allows scientists and engineers to visualize the surface of Mars as if they were standing on the red planet. Images of Mars present challenges to existing scene reconstruction solutions. Surface images of Mars are sparse with minimal overlap, and are often taken from extremely different viewpoints. In addition, the specialized cameras used by Mars rovers are significantly different than consumer cameras, and GPS localization data is not available on Mars. This presentation will discuss scene reconstruction with an emphasis on coping with limited input data, and on creating models suitable for rendering in virtual reality at high frame rate.
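    The models start from stereo image pairs downlinked by the rover. As a hedged aside, the snippet below shows the textbook triangulation relationship underlying any stereo reconstruction, not JPL's Stereo-pipeline itself: for a rectified stereo pair, depth follows from focal length, baseline, and disparity. The numeric values are invented.

```python
# Depth from disparity for a rectified stereo camera pair: Z = f * B / d.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """focal length in pixels, baseline in meters, disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Invented numbers loosely in the range of a rover mast camera:
z = depth_from_disparity(focal_px=1200.0, baseline_m=0.3, disparity_px=24.0)
```

    Applying this per matched pixel pair turns a disparity map into the point cloud from which a terrain mesh is built.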

  7. A Critical Review of the Use of Virtual Reality in Construction Engineering Education and Training.

    PubMed

    Wang, Peng; Wu, Peng; Wang, Jun; Chi, Hung-Lin; Wang, Xiangyu

    2018-06-08

    Virtual Reality (VR) has been rapidly recognized and implemented in construction engineering education and training (CEET) in recent years due to its benefits of providing an engaging and immersive environment. The objective of this review is to critically collect and analyze the VR applications in CEET, covering all VR-related journal papers published from 1997 to 2017. The review follows a systematic three-stage analysis of VR technologies, applications, and future directions. It is found that the VR technologies adopted for CEET have evolved over time, from desktop-based VR, immersive VR, and 3D game-based VR, to Building Information Modelling (BIM)-enabled VR. A sibling technology, Augmented Reality (AR), has also emerged in CEET adoptions in recent years. These technologies have been applied in architecture and design visualization, construction health and safety training, equipment and operational task training, as well as structural analysis. Future research directions, including the integration of VR with emerging education paradigms and visualization technologies, are also provided. The findings are useful for both researchers and educators seeking to effectively integrate VR into their education and training programs to improve training performance.

  8. Virtualized Traffic: reconstructing traffic flows from discrete spatiotemporal data.

    PubMed

    Sewall, Jason; van den Berg, Jur; Lin, Ming C; Manocha, Dinesh

    2011-01-01

    We present a novel concept, Virtualized Traffic, to reconstruct and visualize continuous traffic flows from discrete spatiotemporal data provided by traffic sensors or generated artificially to enhance a sense of immersion in a dynamic virtual world. Given the positions of each car at two recorded locations on a highway and the corresponding time instances, our approach can reconstruct the traffic flows (i.e., the dynamic motions of multiple cars over time) between the two locations along the highway for immersive visualization of virtual cities or other environments. Our algorithm is applicable to high-density traffic on highways with an arbitrary number of lanes and takes into account the geometric, kinematic, and dynamic constraints on the cars. Our method reconstructs the car motion that automatically minimizes the number of lane changes, respects safety distance to other cars, and computes the acceleration necessary to obtain a smooth traffic flow subject to the given constraints. Furthermore, our framework can process a continuous stream of input data in real time, enabling the users to view virtualized traffic events in a virtual world as they occur. We demonstrate our reconstruction technique with both synthetic and real-world input.
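    A minimal sketch of the longitudinal core of the problem the abstract describes (with invented numbers; the published method additionally handles lane changes, safety distances, and interactions between cars): given a car's position at two sensors, the elapsed time, and its entry speed, fit a constant acceleration that reproduces the observation.

```python
# Fit the constant acceleration consistent with two sensor observations.

def fit_acceleration(x0, x1, t0, t1, v0):
    """Solve x1 - x0 = v0*dt + 0.5*a*dt**2 for a."""
    dt = t1 - t0
    return 2.0 * ((x1 - x0) - v0 * dt) / dt ** 2

def position(x0, v0, a, t):
    """Position under constant acceleration, t seconds after entry."""
    return x0 + v0 * t + 0.5 * a * t * t

# A car enters at 20 m/s and covers 250 m in 10 s, so it must have sped up.
a = fit_acceleration(0.0, 250.0, 0.0, 10.0, 20.0)
```

    Sampling `position` between the two sensor times yields a smooth trajectory to animate between the observed endpoints; the full method chooses among such candidate motions subject to its lane-change and safety constraints.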

  9. IQ-Station: A Low Cost Portable Immersive Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric Whiting; Patrick O'Leary; William Sherman

    2010-11-01

    The emergence of inexpensive 3D TVs, affordable input and rendering hardware, and open-source software has created a yeasty atmosphere for the development of low-cost immersive environments (IE). A low-cost IE system, or IQ-station, fashioned from commercial off-the-shelf (COTS) technology, coupled with a targeted immersive application, can be a viable laboratory instrument for enhancing scientific workflow for exploration and analysis. The use of an IQ-station in a laboratory setting also has the potential of quickening the adoption of more sophisticated immersive environments as critical enablers in modern scientific and engineering workflows. Prior work in immersive environments generally required either a head-mounted display (HMD) system or a large projector-based implementation, both of which have limitations in terms of cost, usability, or space requirements. The solution presented here provides an alternative platform offering a reasonable immersive experience that addresses those limitations. Our work brings together the needed hardware and software to create a fully integrated immersive display and interface system that can be readily deployed in laboratories and common workspaces. By doing so, it is now feasible for immersive technologies to be included in researchers' day-to-day workflows. The IQ-station sets the stage for much wider adoption of immersive environments outside the small communities of virtual reality centers.

  10. The Selimiye Mosque of Edirne, Turkey - AN Immersive and Interactive Virtual Reality Experience Using Htc Vive

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A. P.

    2017-05-01

    Recent advances in contemporary Virtual Reality (VR) technologies are going to have a significant impact on everyday life. Through VR it is possible to virtually explore a computer-generated environment as a different reality, and to immerse oneself in the past or in a virtual museum without leaving the current real-life situation. For the ultimate VR experience, the user should see only the virtual world. Currently, the user must wear a VR headset which fits around the head and over the eyes to visually separate themselves from the physical world. Via the headset, images are fed to the eyes through two small lenses. Cultural heritage monuments are ideally suited both for thorough multi-dimensional geometric documentation and for realistic interactive visualisation in immersive VR applications. Additionally, the game industry offers tools for interactive visualisation of objects to motivate users to virtually visit objects and places. In this paper the generation of a virtual 3D model of the Selimiye mosque in the city of Edirne, Turkey, and its processing for data integration into the game engine Unity is presented. The project has been carried out as a co-operation between BİMTAŞ, a company of the Greater Municipality of Istanbul, Turkey, and the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg, Germany, to demonstrate an immersive and interactive visualisation using the new VR system HTC Vive. The workflow from data acquisition to VR visualisation, including the necessary programming for navigation, is described. Furthermore, the possible use of such a VR visualisation for a CH monument, including simultaneous multi-user environments, is discussed in this contribution.

  11. A standardized set of 3-D objects for virtual reality research and applications.

    PubMed

    Peeters, David

    2018-06-01

    The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.

  12. Using Interactive Visualization to Analyze Solid Earth Data and Geodynamics Models

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.; Kreylos, O.; Billen, M. I.; Hamann, B.; Jadamec, M. A.; Rundle, J. B.; van Aalsburg, J.; Yikilmaz, M. B.

    2008-12-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. Major projects such as EarthScope and GeoEarthScope are producing the data needed to characterize the structure and kinematics of Earth's surface and interior at unprecedented resolution. At the same time, high-performance computing enables high-precision and fine-detail simulation of geodynamics processes, complementing the observational data. To facilitate interpretation and analysis of these datasets, to evaluate models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. VR has traditionally been used primarily as a presentation tool allowing active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for accelerated scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. Our approach to VR takes advantage of the specialized skills of geoscientists who are trained to interpret geological and geophysical data generated from field observations. Interactive tools allow the scientist to explore and interpret geodynamic models, tomographic models, and topographic observations, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulations or field observations. The use of VR technology enables us to improve our interpretation of crust and mantle structure and of geodynamical processes. Mapping tools based on computer visualization allow virtual "field studies" in inaccessible regions, and an interactive tool allows us to construct digital fault models for use in numerical models.
    Using the interactive tools on a high-end platform such as an immersive virtual reality room known as a Cave Automatic Virtual Environment (CAVE) enables the scientist to stand inside a three-dimensional dataset while taking measurements. The CAVE involves three or more projection surfaces arranged as walls in a room. Stereo projectors combined with a motion-tracking system recreate the immersive experience of carrying out research in the field. This high-end system provides significant advantages for scientists working with complex volumetric data. The interactive tools also work on low-cost platforms that provide stereo views and the potential for interactivity, such as a Geowall or a 3D-enabled TV. The Geowall is also a well-established tool for education and, in combination with the tools we have developed, enables the rapid transfer of research data and new knowledge to the classroom. The interactive visualization tools can also be used on a desktop or laptop, with or without stereo capability. Further information about the Virtual Reality User Interface (VRUI), the 3DVisualizer, the virtual mapping tools, and the LIDAR viewer can be found on the KeckCAVES website, www.keckcaves.org.

  13. Immersive Environments in ADL

    DTIC Science & Technology

    2009-08-20

    Slide presentation: "Immersive Environments in ADL," Mr. Peter Smith, Team Lead, Immersive Learning Technologies, ADL (peter.smith.ctr@adlnet.gov, +1.407.384.5572), 08/20/2009. The remainder of this record is Report Documentation Page boilerplate (Form Approved OMB No. 0704-0188); no abstract is available.

  14. Eye Movement Analysis and Cognitive Assessment. The Use of Comparative Visual Search Tasks in a Non-immersive VR Application.

    PubMed

    Rosa, Pedro J; Gamito, Pedro; Oliveira, Jorge; Morais, Diogo; Pavlovic, Matthew; Smyth, Olivia; Maia, Inês; Gomes, Tiago

    2017-03-23

    An adequate behavioral response depends on attentional and mnesic processes. When these basic cognitive functions are impaired, the use of non-immersive Virtual Reality Applications (VRAs) can be a reliable technique for assessing the level of impairment. However, most non-immersive VRAs use indirect measures to make inferences about visual attention and mnesic processes (e.g., time to task completion, error rate). Our aim was to examine whether eye movement analysis through eye tracking (ET) can be a reliable method to probe more effectively where and how attention is deployed, and how it is linked with visual working memory, during comparative visual search tasks (CVSTs) in non-immersive VRAs. The eye movements of 50 healthy participants were continuously recorded while CVSTs, selected from a set of cognitive tasks in the Systemic Lisbon Battery (SLB), a VRA designed to assess cognitive impairments, were randomly presented. The total fixation duration, the number of visits in the areas of interest and in the interstimulus space, and the total execution time differed significantly as a function of Mini Mental State Examination (MMSE) scores. The present study demonstrates that CVSTs in the SLB, when combined with ET, can be a reliable and unobtrusive method for assessing cognitive abilities in healthy individuals, opening it to potential use in clinical samples.
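    As a hedged illustration of the ET measures named in the abstract (total fixation duration and number of visits to areas of interest), the sketch below computes both from a list of fixations. The data format and the rectangular AOI convention are invented for illustration, not taken from the SLB.

```python
# Compute total fixation duration inside an area of interest (AOI) and the
# number of distinct visits (entries into the AOI) from a fixation sequence.

def aoi_metrics(fixations, aoi):
    """fixations: list of (x, y, duration_ms); aoi: (xmin, ymin, xmax, ymax).
    Returns (total_duration_ms, n_visits)."""
    xmin, ymin, xmax, ymax = aoi
    total, visits, inside_prev = 0, 0, False
    for x, y, dur in fixations:
        inside = xmin <= x <= xmax and ymin <= y <= ymax
        if inside:
            total += dur
            if not inside_prev:
                visits += 1  # a new entry into the AOI
        inside_prev = inside
    return total, visits

# Four fixations: two inside the AOI, one outside, then one back inside.
fix = [(10, 10, 200), (50, 50, 300), (400, 300, 250), (60, 40, 150)]
total_ms, n_visits = aoi_metrics(fix, (0, 0, 100, 100))
```

    For the sample sequence, the AOI collects 650 ms of fixation time over two separate visits, the kind of per-AOI summary the study relates to MMSE scores.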

  15. Neural mechanisms of limb position estimation in the primate brain.

    PubMed

    Shi, Ying; Buneo, Christopher A

    2011-01-01

    Understanding the neural mechanisms of limb position estimation is important both for comprehending the neural control of goal-directed arm movements and for developing neuroprosthetic systems designed to replace lost limb function. Here we examined the role of area 5 of the posterior parietal cortex in estimating limb position based on visual and somatic (proprioceptive, efference copy) signals. Single-unit recordings were obtained as monkeys reached to visual targets presented in a semi-immersive virtual reality environment. On half of the trials, animals were required to maintain their limb position at these targets while receiving both visual and non-visual feedback of their arm position; on the other trials, visual feedback was withheld. When examined individually, many area 5 neurons were tuned to the position of the limb in the workspace, but very few neurons modulated their firing rates based on the presence or absence of visual feedback. At the population level, however, decoding of limb position was somewhat more accurate when visual feedback was provided. These findings support a role for area 5 in limb position estimation but also suggest that visual signals regarding limb position are only weakly represented in this area, and only at the population level.

  16. Reducing the Schizophrenia Stigma: A New Approach Based on Augmented Reality

    PubMed Central

    Silva, Rafael D. de C.; Albuquerque, Saulo G. C.; Muniz, Artur de V.; Filho, Pedro P. Rebouças; Ribeiro, Sidarta

    2017-01-01

    Schizophrenia is a chronic mental disease that usually manifests psychotic symptoms and affects an individual's functionality. The stigma related to this disease is a serious obstacle to an adequate approach to its treatment. Stigma can, for example, delay the start of treatment, and it creates difficulties in interpersonal and professional relationships. This work proposes a new tool based on augmented reality to reduce the stigma related to schizophrenia. The tool simulates the psychotic symptoms typical of schizophrenia, along with changes in sense perception, in order to create an immersive experience that reproduces the pathological experiences of a patient with schizophrenia. Integration into the proposed environment occurs through immersion glasses and an embedded camera, and audio and visual effects can be applied in real time. To validate the proposed environment, medical students experienced the virtual environment and then answered three questionnaires assessing (i) stigmas related to schizophrenia, (ii) the efficiency and effectiveness of the tool, and (iii) stigma after the simulation. The analysis of the questionnaires showed that the proposed model is a robust and quite realistic tool, and thus very promising for reducing the stigma associated with schizophrenia by instilling in the observer, whether a family member or not, a greater comprehension of a person during a schizophrenic outbreak. PMID:29317860

  17. Defense applications of the CAVE (CAVE automatic virtual environment)

    NASA Astrophysics Data System (ADS)

    Isabelle, Scott K.; Gilkey, Robert H.; Kenyon, Robert V.; Valentino, George; Flach, John M.; Spenny, Curtis H.; Anderson, Timothy R.

    1997-07-01

    The CAVE is a multi-person, room-sized, high-resolution, 3D video and auditory environment that can be used to present highly immersive virtual environment experiences. This paper describes the CAVE technology and the capability of the CAVE system as originally developed at the Electronic Visualization Laboratory of the University of Illinois at Chicago and as more recently implemented by Wright State University (WSU) in the Armstrong Laboratory at Wright-Patterson Air Force Base (WPAFB). One planned use of the WSU/WPAFB CAVE is research addressing the appropriate design of display and control interfaces for controlling uninhabited aerial vehicles. The WSU/WPAFB CAVE has a number of features that make it well suited to this work: (1) 360-degree surround, plus floor, high-resolution visual displays; (2) virtual spatialized audio; (3) the ability to integrate real and virtual objects; and (4) rapid and flexible reconfiguration. However, even though the CAVE is likely to have broad utility for military applications, it does have certain limitations that may make it less well suited to applications that require 'natural' haptic feedback, vestibular stimulation, or an ability to interact with close detailed objects.

  18. Immersive realities: articulating the shift from VR to mobile AR through artistic practice

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy; Berry, Rodney; DeFanti, Thomas A.

    2012-03-01

    Our contemporary imaginings of technological engagement with digital environments have transitioned from flying through Virtual Reality to mobile interactions with the physical world through personal media devices. Experiences technologically mediated through social interactivity within physical environments are now preferred over isolated environments such as CAVEs or HMDs. Examples of this trend can be seen in early tele-collaborative artworks, which strove to use advanced networking to join multiple participants in shared virtual environments. Recent developments in mobile AR allow untethered access to such shared realities in places far removed from labs and home entertainment environments, and without the bulky and expensive technologies attached to our bodies that accompany most VR. This paper addresses the emerging trend favoring socially immersive artworks via mobile Augmented Reality over sensorially immersive Virtual Reality installations. With particular focus on AR as a mobile, locative technology, we discuss how concepts of immersion and interactivity are evolving with this new medium. Immersion in the context of mobile AR can be redefined to describe socially interactive experiences. Having distinctly different sensory, spatial, and situational properties, mobile AR offers a new form for remixing elements of traditional virtual reality with physically based social experiences. This type of immersion offers a wide array of potential for mobile AR art forms. We are beginning to see examples of how artists can use mobile AR to create socially immersive and interactive experiences.

  19. The influence of visual and vestibular orientation cues in a clock reading task.

    PubMed

    Davidenko, Nicolas; Cheong, Yeram; Waterman, Amanda; Smith, Jacob; Anderson, Barrett; Harmon, Sarah

    2018-05-23

    We investigated how performance in the real-life perceptual task of analog clock reading is influenced by the clock's orientation with respect to egocentric, gravitational, and visual-environmental reference frames. In Experiment 1, we designed a simple clock-reading task and found that observers' reaction time to correctly tell the time depends systematically on the clock's orientation. In Experiment 2, we dissociated egocentric from environmental reference frames by having participants sit upright or lie sideways while performing the task. We found that both reference frames substantially contribute to response times in this task. In Experiment 3, we placed upright or rotated participants in an upright or rotated immersive virtual environment, which allowed us to further dissociate vestibular from visual cues to the environmental reference frame. We found evidence of environmental reference frame effects only when visual and vestibular cues were aligned. We discuss the implications for the design of remote and head-mounted displays.

  20. The influence of an immersive virtual environment on the segmental organization of postural stabilizing responses.

    PubMed

    Keshner, E A; Kenyon, R V

    2000-01-01

    We examined the effect of a 3-dimensional stereoscopic scene on segmental stabilization. Eight subjects participated in static sway and locomotion experiments with a visual scene that moved sinusoidally or at constant velocity about the pitch or roll axes. Segmental displacements, Fast Fourier Transforms, and Root Mean Square values were calculated. In both pitch and roll, subjects exhibited greater magnitudes of motion in head and trunk than ankle. Smaller amplitudes and frequent phase reversals suggested control of the ankle by segmental proprioceptive inputs and ground reaction forces rather than by the visual-vestibular signals. Postural controllers may set limits of motion at each body segment rather than be governed solely by a perception of the visual vertical. Two locomotor strategies were also exhibited, implying that some subjects could override the effect of the roll axis optic flow field. Our results demonstrate task dependent differences that argue against using static postural responses to moving visual fields when assessing more dynamic tasks.
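    The Root Mean Square and Fast Fourier Transform analyses described above can be sketched generically. Below is a minimal Python illustration, using an assumed 0.1 Hz sway signal (matching the stimulus frequency in the abstract) rather than the study's data, and a naive DFT in place of a library FFT:

    ```python
    import math

    def rms(signal):
        """Root Mean Square amplitude of a sampled signal."""
        return math.sqrt(sum(s * s for s in signal) / len(signal))

    def dominant_frequency(signal, fs):
        """Naive DFT: return the frequency (Hz) of the largest-magnitude
        bin, ignoring the DC component. fs is the sampling rate."""
        n = len(signal)
        best_k, best_mag = 1, 0.0
        for k in range(1, n // 2):
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            mag = math.hypot(re, im)
            if mag > best_mag:
                best_k, best_mag = k, mag
        return best_k * fs / n

    fs = 10.0                                 # samples per second (assumed)
    t = [i / fs for i in range(100)]          # 10 s of data
    sway = [0.5 * math.sin(2 * math.pi * 0.1 * ti) for ti in t]  # 0.1 Hz sway
    # dominant_frequency(sway, fs) -> 0.1; rms(sway) ≈ 0.5 / sqrt(2)
    ```

    Power at the stimulus frequency, as used in the study, corresponds to the magnitude of the DFT bin nearest 0.1 Hz.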

  1. Declarative Knowledge Acquisition in Immersive Virtual Learning Environments

    ERIC Educational Resources Information Center

    Webster, Rustin

    2016-01-01

    The author investigated the interaction effect of immersive virtual reality (VR) in the classroom. The objective of the project was to develop and provide a low-cost, scalable, and portable VR system containing purposely designed and developed immersive virtual learning environments for the US Army. The purpose of the mixed design experiment was…

  2. Active Learning through the Use of Virtual Environments

    ERIC Educational Resources Information Center

    Mayrose, James

    2012-01-01

    Immersive Virtual Reality (VR) has seen explosive growth over the last decade. Immersive VR attempts to give users the sensation of being fully immersed in a synthetic environment by providing them with 3D hardware, and allowing them to interact with objects in virtual worlds. The technology is extremely effective for learning and exploration, and…

  3. "To Improve Language, You Have to Mix": Teachers' Perceptions of Language Learning in an Overseas Immersion Environment

    ERIC Educational Resources Information Center

    Roskvist, Annelies; Harvey, Sharon; Corder, Deborah; Stacey, Karen

    2014-01-01

    The overseas immersion environment has long been considered a superior context for language learning, supposedly providing unlimited exposure to target language (TL) input and countless opportunities for authentic interaction with expert users. This article focuses on immersion programmes (IPs) for in-service language teachers--a relatively…

  4. Validation of an immersive virtual reality system for training near and far space neglect in individuals with stroke: a pilot study.

    PubMed

    Yasuda, Kazuhiro; Muroi, Daisuke; Ohira, Masahiro; Iwata, Hiroyasu

    2017-10-01

    Unilateral spatial neglect (USN) is defined as an impaired ability to attend to and perceive one side of space, and when present, it interferes seriously with daily life. These symptoms can exist for near and far spaces combined or independently, so it is important to provide effective intervention for both near and far space neglect. The purpose of this pilot study was to propose an immersive virtual reality (VR) rehabilitation program, using a head-mounted display, that can train both near and far space neglect, and to validate the immediate effect of the VR program on both. Ten USN patients underwent the VR program in a pre-post design with no control group. Within the virtual environment, we developed visual searching and reaching tasks using an immersive VR system. Behavioral inattention test (BIT) scores obtained pre- and immediately post-VR program were compared, and revealed that far space neglect, but not near space neglect, improved promptly after the VR program. This effect for far space neglect was observed in the cancellation task but not in the line bisection task. The results of the present pilot study suggest positive effects of the immersive VR program for far space neglect. However, further studies with rigorous designs are needed to validate its clinical effectiveness.

  5. Study on Collaborative Object Manipulation in Virtual Environment

    NASA Astrophysics Data System (ADS)

    Mayangsari, Maria Niken; Yong-Moo, Kwon

    This paper presents a comparative study of network collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared through several experiments. The user tests of our system cover several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN; 2) comparison between non-haptic and haptic collaborative interaction over the Internet; and 3) analysis of collaborative interaction between non-immersive and immersive display environments.

  6. The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment

    PubMed Central

    Cooper, Natalia; Milella, Ferdinando; Pinto, Carlo; Cant, Iain; White, Mark; Meyer, Georg

    2018-01-01

    Objective and subjective measures of performance in virtual reality environments increase as more sensory cues are delivered and as simulation fidelity increases. Some cues (colour or sound) are easier to present than others (object weight, vestibular cues) so that substitute cues can be used to enhance informational content in a simulation at the expense of simulation fidelity. This study evaluates how substituting cues in one modality by alternative cues in another modality affects subjective and objective performance measures in a highly immersive virtual reality environment. Participants performed a wheel change in a virtual reality (VR) environment. Auditory, haptic and visual cues, signalling critical events in the simulation, were manipulated in a factorial design. Subjective ratings were recorded via questionnaires. The time taken to complete the task was used as an objective performance measure. The results show that participants performed best and felt an increased sense of immersion and involvement, collectively referred to as ‘presence’, when substitute multimodal sensory feedback was provided. Significant main effects of audio and tactile cues on task performance and on participants' subjective ratings were found. A significant negative relationship was found between the objective (overall completion times) and subjective (ratings of presence) performance measures. We conclude that increasing informational content, even if it disrupts fidelity, enhances performance and user’s overall experience. On this basis we advocate the use of substitute cues in VR environments as an efficient method to enhance performance and user experience. PMID:29390023


  8. Memory and visual search in naturalistic 2D and 3D environments

    PubMed Central

    Li, Chia-Ling; Aivar, M. Pilar; Kit, Dmitry M.; Tong, Matthew H.; Hayhoe, Mary M.

    2016-01-01

    The role of memory in guiding attention allocation in daily behaviors is not well understood. In experiments with two-dimensional (2D) images, there is mixed evidence about the importance of memory. Because the stimulus context in laboratory experiments and daily behaviors differs extensively, we investigated the role of memory in visual search, in both two-dimensional (2D) and three-dimensional (3D) environments. A 3D immersive virtual apartment composed of two rooms was created, and a parallel 2D visual search experiment composed of snapshots from the 3D environment was developed. Eye movements were tracked in both experiments. Repeated searches for geometric objects were performed to assess the role of spatial memory. Subsequently, subjects searched for realistic context objects to test for incidental learning. Our results show that subjects learned the room-target associations in 3D but less so in 2D. Gaze was increasingly restricted to relevant regions of the room with experience in both settings. Search for local contextual objects, however, was not facilitated by early experience. Incidental fixations to context objects do not necessarily benefit search performance. Together, these results demonstrate that memory for global aspects of the environment guides search by restricting allocation of attention to likely regions, whereas task relevance determines what is learned from the active search experience. Behaviors in 2D and 3D environments are comparable, although there is greater use of memory in 3D. PMID:27299769

  9. 3D Immersive Visualization with Astrophysical Data

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2017-01-01

    We present the refinement of a new 3D immersion technique for astrophysical data visualization. Methodology to create 360-degree spherical panoramas is reviewed. The 3D software package Blender, coupled with Python and the Google Spatial Media module, is used to create the final data products. Data can be viewed interactively on a mobile phone or tablet, or in a web browser. The technique can apply to different kinds of astronomical data, including 3D stellar and galaxy catalogs, images, and planetary maps.
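    At the core of any 360-degree spherical panorama is the equirectangular projection, which maps a 3D view direction to pixel coordinates in a 2:1 panorama image. A minimal sketch of that mapping, assuming a +z-forward, +x-right, +y-up camera convention (the actual Blender/Spatial Media pipeline handles this internally):

    ```python
    import math

    def direction_to_equirect(x, y, z, width, height):
        """Map a 3-D view direction to pixel coordinates in an
        equirectangular (360 x 180 degree) panorama.
        Convention assumed here: +z forward, +x right, +y up."""
        lon = math.atan2(x, z)                  # -pi..pi, 0 = straight ahead
        lat = math.atan2(y, math.hypot(x, z))   # -pi/2..pi/2
        u = (lon / (2 * math.pi) + 0.5) * width
        v = (0.5 - lat / math.pi) * height
        return u, v

    # Looking straight ahead lands at the image centre of a 4096x2048 panorama:
    u, v = direction_to_equirect(0, 0, 1, 4096, 2048)
    # u == 2048.0, v == 1024.0
    ```

    Rendering a panorama amounts to running this mapping in reverse for every output pixel and sampling the scene along the resulting direction, which is what an equirectangular camera in Blender does.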

  10. CliniSpace: a multiperson 3D online immersive training environment accessible through a browser.

    PubMed

    Dev, Parvati; Heinrichs, W LeRoy; Youngblood, Patricia

    2011-01-01

    Immersive online medical environments, with dynamic virtual patients, have been shown to be effective for scenario-based learning (1). However, ease of use and ease of access have been barriers to their use. We used feedback from prior evaluation of these projects to design and develop CliniSpace. To improve usability, we retained the richness of prior virtual environments but modified the user interface. To improve access, we used a Software-as-a-Service (SaaS) approach to present a richly immersive 3D environment within a web browser.

  11. Language-driven anticipatory eye movements in virtual reality.

    PubMed

    Eichert, Nicole; Peeters, David; Hagoort, Peter

    2018-06-01

    Predictive language processing is often studied by measuring eye movements as participants look at objects on a computer screen while they listen to spoken sentences. This variant of the visual-world paradigm has revealed that information encountered by a listener at a spoken verb can give rise to anticipatory eye movements to a target object, which is taken to indicate that people predict upcoming words. The ecological validity of such findings remains questionable, however, because these computer experiments used two-dimensional stimuli that were mere abstractions of real-world objects. Here we present a visual-world paradigm study in a three-dimensional (3-D) immersive virtual reality environment. Despite significant changes in the stimulus materials and the different mode of stimulus presentation, language-mediated anticipatory eye movements were still observed. These findings thus indicate that people do predict upcoming words during language comprehension in a more naturalistic setting where natural depth cues are preserved. Moreover, the results confirm the feasibility of using eyetracking in rich and multimodal 3-D virtual environments.

  12. Virtual reality simulation in neurosurgery: technologies and evolution.

    PubMed

    Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H

    2013-01-01

    Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.

  13. Voxel-based Immersive Environments

    DTIC Science & Technology

    2000-05-31

    3D accelerated hardware. While this method lends itself well to modern hardware, the quality of the resulting images was low due to the coarse sampling...pipes. We will use MPEG video compression when sending video over a T1 line, whereas for a 56K bit Internet connection, we can use one of the more...sent over the communication line. The ultimate goal is to send the immersive environment over the 56K bps Internet. Since we need to send audio and
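    A rough bandwidth budget illustrates the T1-versus-56K trade-off the excerpt alludes to. All figures below are illustrative assumptions, not values from the report:

    ```python
    # Back-of-the-envelope budget for streaming an immersive scene over a
    # fixed-rate link. Frame rate and overhead fraction are assumed.

    def max_frame_bytes(link_bps, fps, overhead=0.2):
        """Largest average compressed frame (bytes) a link can sustain,
        reserving a fraction of bandwidth for audio/protocol overhead."""
        usable_bps = link_bps * (1 - overhead)
        return usable_bps / 8 / fps

    t1 = max_frame_bytes(1_544_000, 15)   # T1 line, ~1.544 Mbps
    modem = max_frame_bytes(56_000, 15)   # 56K modem
    # Each frame must be roughly 1544/56 ≈ 28x smaller on the modem link,
    # which is why aggressive compression (e.g., MPEG) becomes essential.
    ```

    The ratio depends only on the two link rates, not on the assumed frame rate or overhead, since those cancel out.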

  14. Journey to the centre of the cell: Virtual reality immersion into scientific data.

    PubMed

    Johnston, Angus P R; Rae, James; Ariotti, Nicholas; Bailey, Benjamin; Lilja, Andrew; Webb, Robyn; Ferguson, Charles; Maher, Sheryl; Davis, Thomas P; Webb, Richard I; McGhee, John; Parton, Robert G

    2018-02-01

    Visualization of scientific data is crucial not only for scientific discovery but also to communicate science and medicine to both experts and a general audience. Until recently, we have been limited to visualizing the three-dimensional (3D) world of biology in 2 dimensions. Renderings of 3D cells are still traditionally displayed using two-dimensional (2D) media, such as on a computer screen or paper. However, the advent of consumer grade virtual reality (VR) headsets such as Oculus Rift and HTC Vive means it is now possible to visualize and interact with scientific data in a 3D virtual world. In addition, new microscopic methods provide an unprecedented opportunity to obtain new 3D data sets. In this perspective article, we highlight how we have used cutting edge imaging techniques to build a 3D virtual model of a cell from serial block-face scanning electron microscope (SBEM) imaging data. This model allows scientists, students and members of the public to explore and interact with a "real" cell. Early testing of this immersive environment indicates a significant improvement in students' understanding of cellular processes and points to a new future of learning and public engagement. In addition, we speculate that VR can become a new tool for researchers studying cellular architecture and processes by populating VR models with molecular data.

  15. Visual motion combined with base of support width reveals variable field dependency in healthy young adults.

    PubMed

    Streepey, Jefferson W; Kenyon, Robert V; Keshner, Emily A

    2007-01-01

    We previously reported responses to induced postural instability in young healthy individuals viewing visual motion with a narrow (25 degrees in both directions) and wide (90 degrees and 55 degrees in the horizontal and vertical directions) field of view (FOV) as they stood on different sized blocks. Visual motion was achieved using an immersive virtual environment that moved realistically with head motion (natural motion) and translated sinusoidally at 0.1 Hz in the fore-aft direction (augmented motion). We observed that a subset of the subjects (steppers) could not maintain continuous stance on the smallest block when the virtual environment was in motion. We completed a posteriori analyses on the postural responses of the steppers and non-steppers that may inform us about the mechanisms underlying these differences in stability. We found that when viewing augmented motion with a wide FOV, there was a greater effect on the head and whole body center of mass and ankle angle root mean square (RMS) values of the steppers than of the non-steppers. FFT analyses revealed greater power at the frequency of the visual stimulus in the steppers compared to the non-steppers. Whole body COM time lags relative to the augmented visual scene revealed that the time-delay between the scene and the COM was significantly increased in the steppers. The increased responsiveness to visual information suggests a greater visual field-dependency of the steppers and suggests that the thresholds for shifting from a reliance on visual information to somatosensory information can differ even within a healthy population.

  16. Immersion as an embodied cognition shift: aesthetic experience and spatial situated cognition.

    PubMed

    Trentini, Bruno

    2015-09-01

    The main hypothesis of situated cognition concerns the origin of mental processes: the environment is thought to be the source of all cognitive processes. Immersion, however, enables a dual perception of space, allowing one to perceive both the routine environment and a new way of seeing the world. We aim to provide further insight into the transition from on-line to off-line cognition by showing that the aesthetic experience of immersive art arises from the awareness that one's cognition depends on the environment. Although this specific cognition is not independent of the general environment, it abstracts individuals from their idiosyncratic environment. Immersive art may therefore induce cognitive processes that are borderline cases of situated cognition. Aesthetic experience regarding spatial cognition will be described using an approach of embodied aesthetics, that is, an approach connecting the phenomenology of perception with the cognitive sciences. No experiments are contemplated as of now. The experience of immersive art makes individuals aware that their perceptual processes can adapt to the environment. Thus the self-experience typical of aesthetic experience may be the cornerstone of off-line cognition.

  17. A Proposed Treatment for Visual Field Loss caused by Traumatic Brain Injury using Interactive Visuotactile Virtual Environment

    NASA Astrophysics Data System (ADS)

    Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella

    In this paper, we propose a novel approach that uses interactive virtual environment technology in Vision Restoration Therapy for visual field loss caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds the promise of expanding the scope of existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures and can last up to six months before even modest improvements are seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of the visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion tracking system and an eye tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand, eye, and body movements and coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.

  18. The Integrated Virtual Environment Rehabilitation Treadmill System

    PubMed Central

    Feasel, Jeff; Whitton, Mary C.; Kassler, Laura; Brooks, Frederick P.; Lewek, Michael D.

    2015-01-01

    Slow gait speed and interlimb asymmetry are prevalent in a variety of disorders. Current approaches to locomotor retraining emphasize the need for appropriate feedback during intensive, task-specific practice. This paper describes the design and feasibility testing of the integrated virtual environment rehabilitation treadmill (IVERT) system intended to provide real-time, intuitive feedback regarding gait speed and asymmetry during training. The IVERT system integrates an instrumented, split-belt treadmill with a front-projection, immersive virtual environment. The novel adaptive control system uses only ground reaction force data from the treadmill to continuously update the speeds of the two treadmill belts independently, as well as to control the speed and heading in the virtual environment in real time. Feedback regarding gait asymmetry is presented 1) visually as walking a curved trajectory through the virtual environment and 2) proprioceptively in the form of different belt speeds on the split-belt treadmill. A feasibility study involving five individuals with asymmetric gait found that these individuals could effectively control the speed of locomotion and perceive gait asymmetry during the training session. Although minimal changes in overground gait symmetry were observed immediately following a single training session, further studies should be done to determine the IVERT’s potential as a tool for rehabilitation of asymmetric gait by providing patients with congruent visual and proprioceptive feedback. PMID:21652279
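    The abstract does not give IVERT's actual control law, only that ground reaction force data alone drives the belt speeds and the virtual heading. The following is a hypothetical sketch of one way such a mapping could work; the structure and gains are invented for illustration:

    ```python
    # Hypothetical sketch in the spirit of IVERT: ground-reaction-force (GRF)
    # impulse asymmetry between the legs nudges the two belt speeds apart and
    # steers the virtual heading so that asymmetric gait "walks a curve".
    # The real IVERT controller is not specified in the abstract.

    def update_control(left_impulse, right_impulse, base_speed,
                       speed_gain=0.5, heading_gain=30.0):
        """Map per-step GRF impulses (N*s) to (left belt m/s, right belt m/s,
        virtual heading rate deg/s). Gains are illustrative assumptions."""
        total = left_impulse + right_impulse
        if total == 0:
            return base_speed, base_speed, 0.0
        asym = (right_impulse - left_impulse) / total   # -1..1, 0 = symmetric
        left_speed = base_speed * (1 + speed_gain * asym)
        right_speed = base_speed * (1 - speed_gain * asym)
        heading_rate = heading_gain * asym              # curved path feedback
        return left_speed, right_speed, heading_rate

    # Symmetric gait: equal belt speeds, straight heading.
    # update_control(100, 100, 1.0) -> (1.0, 1.0, 0.0)
    # Right-dominant gait (asym = 0.2): belts (1.1, 0.9), heading 6.0 deg/s
    ```

    The key design idea preserved from the abstract is that asymmetry is fed back both proprioceptively (different belt speeds) and visually (a curved trajectory through the virtual environment).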

  19. Visual Immersion for Cultural Understanding and Multimodal Literacy

    ERIC Educational Resources Information Center

    Smilan, Cathy

    2017-01-01

    When considering inclusive art curriculum that accommodates all learners, including English language learners, two distinct yet inseparable issues come to mind. The first is that English language learner students can use visual language and visual literacy skills inherent in visual arts curriculum to scaffold learning in and through the arts.…

  20. Exploring 4D Flow Data in an Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Butkiewicz, T.

    2017-12-01

    Ocean models help us to understand and predict a wide range of intricate physical processes which comprise the atmospheric and oceanic systems of the Earth. Because these models output an abundance of complex time-varying three-dimensional (i.e., 4D) data, effectively conveying the myriad information from a given model poses a significant visualization challenge. The majority of the research effort into this problem has concentrated around synthesizing and examining methods for representing the data itself; by comparison, relatively few studies have looked into the potential merits of various viewing conditions and virtual environments. We seek to improve our understanding of the benefits offered by current consumer-grade virtual reality (VR) systems through an immersive, interactive 4D flow visualization system. Our dataset is a Regional Ocean Modeling System (ROMS) model representing a 12-hour tidal cycle of the currents within New Hampshire's Great Bay estuary. The model data was loaded into a custom VR particle system application using the OpenVR software library and the HTC Vive hardware, which tracks a headset and two six-degree-of-freedom (6DOF) controllers within a 5m-by-5m area. The resulting visualization system allows the user to coexist in the same virtual space as the data, enabling rapid and intuitive analysis of the flow model through natural interactions with the dataset and within the virtual environment. Whereas a traditional computer screen typically requires the user to reposition a virtual camera in the scene to obtain the desired view of the data, in virtual reality the user can simply move their head to the desired viewpoint, completely eliminating the mental context switches from data exploration/analysis to view adjustment and back. 
The tracked controllers become tools to quickly manipulate (reposition, reorient, and rescale) the dataset and to interrogate it by, e.g., releasing dye particles into the flow field, probing scalar velocities, placing a cutting plane through a region of interest, etc. It is hypothesized that the advantages afforded by head-tracked viewing and 6DOF interaction devices will lead to faster and more efficient examination of 4D flow data. A human factors study is currently being prepared to empirically evaluate this method of visualization and interaction.
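    The core of such a dye-release tool is advecting particles through a time-varying velocity field. The following is a deliberately minimal stand-in (nearest-cell spatial lookup, linear interpolation in time, forward-Euler stepping); the actual application samples a 3D ROMS grid and renders in VR:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def sample_velocity(fields, x, y, t):
    """fields: list of 2D grids of (u, v) tuples, one per time step.
    Nearest-cell lookup in space, linear interpolation in time."""
    i0 = min(int(t), len(fields) - 2)
    f = t - i0
    xi, yi = int(round(x)), int(round(y))
    u0, v0 = fields[i0][yi][xi]
    u1, v1 = fields[i0 + 1][yi][xi]
    return lerp(u0, u1, f), lerp(v0, v1, f)

def advect(fields, x, y, t, dt=0.1):
    """One forward-Euler step of a dye particle through the flow."""
    u, v = sample_velocity(fields, x, y, t)
    return x + u * dt, y + v * dt
```

    A production system would use trilinear (or higher-order) interpolation and a Runge-Kutta integrator, but the structure is the same: sample the stored time steps, blend, step.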

  1. Virtual reality in anxiety disorders: the past and the future.

    PubMed

    Gorini, Alessandra; Riva, Giuseppe

    2008-02-01

    One of the most effective treatments of anxiety is exposure therapy: a person is exposed to specific feared situations or objects that trigger anxiety. This exposure process may be done through actual exposure, with visualization, by imagination or using virtual reality (VR), that provides users with computer simulated environments with and within which they can interact. VR is made possible by the capability of computers to synthesize a 3D graphical environment from numerical data. Furthermore, because input devices sense the subject's reactions and motions, the computer can modify the synthetic environment accordingly, creating the illusion of interacting with, and thus being immersed within the environment. Starting from 1995, different experimental studies have been conducted in order to investigate the effect of VR exposure in the treatment of subclinical fears and anxiety disorders. This review will discuss their outcome and provide guidelines for the use of VR exposure for the treatment of anxious patients.

  2. The Evolution of Constructivist Learning Environments: Immersion in Distributed, Virtual Worlds.

    ERIC Educational Resources Information Center

    Dede, Chris

    1995-01-01

    Discusses the evolution of constructivist learning environments and examines the collaboration of simulated software models, virtual environments, and evolving mental models via immersion in artificial realities. A sidebar gives a realistic example of a student navigating through cyberspace. (JMV)

  3. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  4. An Active Vision Approach to Understanding and Improving Visual Training in the Geosciences

    NASA Astrophysics Data System (ADS)

    Voronov, J.; Tarduno, J. A.; Jacobs, R. A.; Pelz, J. B.; Rosen, M. R.

    2009-12-01

    Experience in the field is a fundamental aspect of geologic training, and its effectiveness is largely unchallenged because of anecdotal evidence of its success among expert geologists. However, there have been only a few quantitative studies based on large data collection efforts that investigate how Earth scientists learn in the field. In a recent collaboration between Earth scientists, cognitive scientists, and imaging science experts at the University of Rochester and the Rochester Institute of Technology, we are conducting such a study. Within cognitive science, one school of thought, referred to as the Active Vision approach, emphasizes that visual perception is an active process requiring us to move our eyes to acquire new information about our environment. The Active Vision approach identifies the perceptual skills that experts possess and that novices will need to acquire to achieve expert performance. We describe data collection efforts using portable eye-trackers to assess how novice and expert geologists acquire visual knowledge in the field. We also discuss our efforts to collect images for use in a semi-immersive classroom environment, useful for further testing of novices and experts using eye-tracking technologies.

  5. 'Putting it on the table': direct-manipulative interaction and multi-user display technologies for semi-immersive environments and augmented reality applications.

    PubMed

    Encarnação, L Miguel; Bimber, Oliver

    2002-01-01

    Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches promised to provide valuable means for the involved interactive data analysis, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical employment. This paper addresses two of the shortcomings of such technology: Intuitive interaction with multi-dimensional data in immersive and semi-immersive environments as well as stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.

  6. Virtual reality and telerobotics applications of an Address Recalculation Pipeline

    NASA Technical Reports Server (NTRS)

    Regan, Matthew; Pose, Ronald

    1994-01-01

    The technology described in this paper was designed to reduce latency in user interactions within immersive virtual reality environments. It is also ideally suited to telerobotic applications such as interaction with remote robotic manipulators in space or in deep-sea operations. In such circumstances, the significant latency in response to user actions caused by communication delays, and the disturbing jerkiness caused by low and unpredictable frame rates of compressed video feedback or computationally limited virtual worlds, can be masked by our techniques. The user is provided with highly responsive visual feedback independent of the communication or computational delays involved in providing physical video feedback or in rendering virtual world images. Virtual and physical environments can be combined seamlessly using these techniques.

  7. Corrosion Behavior of Low-C Medium-Mn Steel in Simulated Marine Immersion and Splash Zone Environment

    NASA Astrophysics Data System (ADS)

    Zhang, Dazheng; Gao, Xiuhua; Su, Guanqiao; Du, Linxiu; Liu, Zhenguang; Hu, Jun

    2017-05-01

    The corrosion behavior of low-C medium-Mn steel in a simulated marine immersion and splash zone environment was studied by a static immersion corrosion experiment and a wet-dry cyclic corrosion experiment, respectively. Corrosion rate, corrosion products, surface morphology, cross-sectional morphology, elemental distribution, potentiodynamic polarization curves, and electrochemical impedance spectra were used to elucidate the corrosion behavior of the steel. The results show that the corrosion rate in the immersion zone is much lower than that in the splash zone owing to its relatively mild environment. Manganese compounds are detected in the corrosion products only in the splash zone environment, and they can deteriorate the protective effect of the rust layer. With increasing exposure time, the corrosion products gradually transform from loose and porous into dense and thick rust in both environments. In the splash zone environment, however, the alloying element Mn becomes significantly enriched in the rust layer, which decreases the corrosion resistance of the steel.
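    For context, corrosion rates from static immersion experiments are conventionally computed from specimen mass loss; the standard weight-loss formula (cf. ASTM G31) is shown below. The paper does not state which exact variant it used, so this is illustrative:

```python
def corrosion_rate_mm_per_year(mass_loss_g, area_cm2, hours, density_g_cm3):
    """Weight-loss corrosion rate: CR [mm/y] = 87600 * W / (A * t * rho),
    with mass loss W in grams, exposed area A in cm^2, exposure time t
    in hours, and material density rho in g/cm^3."""
    return 87600.0 * mass_loss_g / (area_cm2 * hours * density_g_cm3)
```

    For example, 0.1 g lost from 10 cm² of steel (density 7.85 g/cm³) over a 30-day (720 h) immersion corresponds to roughly 0.155 mm/y.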

  8. Level of Immersion in Virtual Environments Impacts the Ability to Assess and Teach Social Skills in Autism Spectrum Disorder

    PubMed Central

    Bugnariu, Nicoleta L.

    2016-01-01

    Abstract Virtual environments (VEs) may be useful for delivering social skills interventions to individuals with autism spectrum disorder (ASD). Immersive VEs provide opportunities for individuals with ASD to learn and practice skills in a controlled replicable setting. However, not all VEs are delivered using the same technology, and the level of immersion differs across settings. We group studies into low-, moderate-, and high-immersion categories by examining five aspects of immersion. In doing so, we draw conclusions regarding the influence of this technical manipulation on the efficacy of VEs as a tool for assessing and teaching social skills. We also highlight ways in which future studies can advance our understanding of how manipulating aspects of immersion may impact intervention success. PMID:26919157

  9. Enhancing radiological volumes with symbolic anatomy using image fusion and collaborative virtual reality.

    PubMed

    Silverstein, Jonathan C; Dech, Fred; Kouchoukos, Philip L

    2004-01-01

    Radiological volumes are typically reviewed by surgeons using cross-sections and iso-surface reconstructions. Applications that combine collaborative stereo volume visualization with symbolic anatomic information and data fusions would expand surgeons' capabilities in interpretation of data and in planning treatment. Such an application has not been seen clinically. We are developing methods to systematically combine symbolic anatomy (term hierarchies and iso-surface atlases) with patient data using data fusion. We describe our progress toward integrating these methods into our collaborative virtual reality application. The fully combined application will be a feature-rich stereo collaborative volume visualization environment for use by surgeons in which DICOM datasets will self-report underlying anatomy with visual feedback. Using hierarchical navigation of SNOMED-CT anatomic terms integrated with our existing Tele-immersive DICOM-based volumetric rendering application, we will display polygonal representations of anatomic systems on the fly from menus that query a database. The methods and tools involved in this application development are SNOMED-CT, DICOM, VISIBLE HUMAN, volumetric fusion and C++ on a Tele-immersive platform. This application will allow us to identify structures and display polygonal representations from atlas data overlaid with the volume rendering. First, atlas data is automatically translated, rotated, and scaled to the patient data during loading using a public domain volumetric fusion algorithm. This generates a modified symbolic representation of the underlying canonical anatomy. Then, through the use of collision detection or intersection testing of various transparent polygonal representations, the polygonal structures are highlighted into the volumetric representation while the SNOMED names are displayed. Thus, structural names and polygonal models are associated with the visualized DICOM data. 
This novel juxtaposition of information promises to expand surgeons' abilities to interpret images and plan treatment.
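    The translate/rotate/scale registration step can be illustrated with a stripped-down similarity alignment of corresponding landmark sets. This is a sketch only: rotation is omitted for brevity, and the actual application uses a public-domain volumetric fusion algorithm rather than landmarks:

```python
def centroid(points):
    """Mean position of a set of 3D points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def rms_spread(points, c):
    """Root-mean-square distance of the points from centroid c."""
    return (sum(sum((p[i] - c[i]) ** 2 for i in range(3)) for p in points)
            / len(points)) ** 0.5

def align(atlas, patient):
    """Translate and uniformly scale atlas landmarks onto patient
    landmarks by matching centroids and RMS spreads."""
    ca, cp = centroid(atlas), centroid(patient)
    s = rms_spread(patient, cp) / rms_spread(atlas, ca)
    return [tuple(cp[i] + s * (p[i] - ca[i]) for i in range(3)) for p in atlas]
```

    With corresponding landmarks, the scaled-and-shifted atlas points land on the patient data, after which polygonal atlas structures can be overlaid on the volume rendering.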

  10. Immersive Environments for Mission Operations: Beyond Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Wright, J.; Hartman, F.; Cooper, B.

    1998-01-01

    Immersive environments are just beginning to be used to support mission operations at the Jet Propulsion Laboratory. This technology contributed to the Mars Pathfinder Mission in planning sorties for the Sojourner rover.

  11. Comparing perceived auditory width to the visual image of a performing ensemble in contrasting bi-modal environments

    PubMed Central

    Valente, Daniel L.; Braasch, Jonas; Myrbeck, Shane A.

    2012-01-01

    Despite many studies investigating auditory spatial impressions in rooms, few have addressed the impact of simultaneous visual cues on localization and the perception of spaciousness. The current research presents an immersive audiovisual environment in which participants were instructed to make auditory width judgments in dynamic bi-modal settings. The results of these psychophysical tests suggest the importance of congruent audio visual presentation to the ecological interpretation of an auditory scene. Supporting data were accumulated in five rooms of ascending volumes and varying reverberation times. Participants were given an audiovisual matching test in which they were instructed to pan the auditory width of a performing ensemble to a varying set of audio and visual cues in rooms. Results show that both auditory and visual factors affect the collected responses and that the two sensory modalities coincide in distinct interactions. The greatest differences between the panned audio stimuli given a fixed visual width were found in the physical space with the largest volume and the greatest source distance. These results suggest, in this specific instance, a predominance of auditory cues in the spatial analysis of the bi-modal scene. PMID:22280585

  12. Digital Immersive Virtual Environments and Instructional Computing

    ERIC Educational Resources Information Center

    Blascovich, Jim; Beall, Andrew C.

    2010-01-01

    This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…

  13. Immersive Planetarium Visualizations for Teaching Solar System Moon Concepts to Undergraduates

    ERIC Educational Resources Information Center

    Yu, Ka Chun; Sahami, Kamran; Denn, Grant; Sahami, Victoria; Sessions, Larry C.

    2016-01-01

    Digital video fulldome has long been heralded as a revolutionary educational technology; yet the discipline-based astronomy education research literature showing planetarium effectiveness has been sparse. In order to help understand to what extent immersion impacts learning and the effect of the "narrative journey" model of presentation,…

  14. Employing immersive virtual environments for innovative experiments in health care communication.

    PubMed

    Persky, Susan

    2011-03-01

    This report reviews the literature for studies that employ immersive virtual environment technology to conduct experimental studies of health care communication. Advantages and challenges of using these tools for research in this area are also discussed. A literature search was conducted using the Scopus database. Results were hand-searched to identify the body of studies, conducted since 1995, related to the report objective. The review identified four relevant studies stemming from two unique projects. One project focused on the impact of a clinician's characteristics and behavior on health care communication; the other focused on the characteristics of the patient. Both projects illustrate key methodological advantages conferred by immersive virtual environments, including the ability to maintain simultaneously high experimental control and realism, the ability to manipulate variables in new ways, and unique behavioral measurement opportunities. Though implementation challenges exist for immersive virtual environment-based research methods, given the technology's unique capabilities the benefits can outweigh the costs in many instances. Immersive virtual environments may therefore prove an important addition to the array of tools available for advancing our understanding of communication in health care. Published by Elsevier Ireland Ltd.

  15. Exploring the Relationship Between Distributed Training, Integrated Learning Environments, and Immersive Training Environments

    DTIC Science & Technology

    2007-01-01

    educating and training (O’Keefe IV & McIntyre III, 2006). Topics vary widely from standard educational topics such as teaching kids physics, mechanics...Winn, W., & Yu, R. (1997). The Impact of Three Dimensional Immersive Virtual Environments on Modern Pedagogy : Global Change, VR and Learning

  16. Failure behavior of plasma-sprayed HAp coating on commercially pure titanium substrate in simulated body fluid (SBF) under bending load.

    PubMed

    Laonapakul, Teerawat; Rakngarm Nimkerdphol, Achariya; Otsuka, Yuichi; Mutoh, Yoshiharu

    2012-11-01

    Four-point bending tests with acoustic emission (AE) monitoring were conducted to evaluate the failure behavior of a plasma-sprayed hydroxyapatite (HAp) top coat on commercially pure titanium (cp-Ti) plate with and without a mixed HAp/Ti bond coat. The effect of immersion in simulated body fluid (SBF) on the failure behavior of the coated specimen was also investigated by immersing the specimen in SBF. The AE patterns obtained from the bending test of the HAp coating specimens after a week of immersion in SBF clearly showed an earlier onset of delamination and spallation of the coating layer compared to those without immersion in SBF. It was also found that the bond coat improved the failure resistance of the HAp coating specimen compared to that without the bond coat. Four-point bend fatigue tests under ambient and SBF environments were also conducted with AE monitoring during the entire fatigue test to investigate the influence of the SBF environment on the fatigue failure behavior of the HAp coating specimen with the mixed HAp/Ti bond coat. The specimens tested at a stress amplitude of 120 MPa under both ambient and SBF environments survived up to 10⁷ cycles without spallation of the HAp coating layer. The specimens tested under the SBF environment, and those tested under the ambient environment after immersion in SBF, showed shorter fatigue lives compared to those tested under the ambient environment without SBF immersion. Micro-cracks nucleated in the coating layer in the early stage of fatigue life, propagated into the cp-Ti substrate in the intermediate stage, and propagated unstably to failure in the final stage. XRD analysis showed that dissolution of the co-existing phases and precipitation of the HAp phase took place during immersion in SBF. During this process, the co-existing phases disappeared from the coating layer and the HAp phase fully occupied the coating layer. 
The degradation of bending strength and fatigue life of the HAp coating specimens tested under the SBF environment would be induced by dissolution of the co-existing phases from the coating layer during immersion in SBF. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Testing geoscience data visualization systems for geological mapping and training

    NASA Astrophysics Data System (ADS)

    Head, J. W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Senthil Kumar, P.

    2008-09-01

    Traditional methods of planetary geological mapping have relied on photographic hard copy and light-table tracing and mapping. In the last several decades this has given way to the availability and analysis of multiple digital data sets, and to programs and platforms that permit the viewing and manipulation of multiple annotated layers of relevant information. This has revolutionized the ability to incorporate important new data into the planetary mapping process at all scales. Information on these developments and approaches can be obtained at http://astrogeology.usgs.gov/Technology/. The process is aided by Geographic Information Systems (GIS) (see http://astrogeology.usgs.gov/Technology/) and excellent analysis packages (such as ArcGIS) that permit co-registration, rapid viewing, and analysis of multiple data sets on desktop displays (see http://astrogeology.usgs.gov/Projects/webgis/). We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations, and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. 
We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment", or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks. There is still much to learn and understand, however, about how varying degrees of immersive display affect task performance. For example, in using a 1280x1024 desktop monitor to explore an image, the mapper wastes a lot of time in image zooming/panning to balance the analysis-driven need for both detail and context. Therefore, we have spent a considerable amount of time exploring higher-resolution media, such as a 3840x2400 IBM Bertha display or a tiled wall with multiple projectors. We have found, through over a year of weekly meetings and assessment, that they definitely improve the efficiency of analysis and mapping. Here we outline briefly the nature of the major systems and our initial assessment of these in the 1:5M Scale NASA-USGS Venus Geological Mapping Program (http://astrogeology.usgs.gov/Projects/PlanetaryMapping/MapStatus/VenusStatus/Venus_Status.html). 1. Immersive Virtual Reality (Cave): ADVISER System Description: Our Cave system is an 8'x8'x8' cube with four projection surfaces (three walls and the floor). Four Linux machines (identical in performance to the desktop machine) provide data for the Cave. Users utilize a handheld 3D-tracked input device to navigate. Our 3D input device has a joystick and is simple to use. To navigate, the user simply points in the direction he/she wants to fly and pushes the joystick forward or backward to move relative to that direction. The user can push the joystick to the left and right to rotate his/her position in the virtual world. 
A collision detection algorithm is used to prevent the user from going underneath the surface. We have developed ADVISER (ADvanced VIsualization for Solar system Exploration) [1,2] as a tool for taking planetary geologists virtually "into the field" in the IVR Cave environment in support of several scientific themes and have assessed its application to geological mapping of Venus. ADVISER aims to create a field experience by integrating multiple data sources and presenting them as a unified environment to the scientist. Additionally, we have developed a virtual field kit, tailored to supporting research tasks dictated by scientific and mapping themes. Technically, ADVISER renders high-resolution topographic and image datasets (8192x8192 samples) in stereo at interactive frame-rates (25+ frames-per-second). The system is based on a state-of-the-art terrain rendering system and is highly interactive; for example, vertical exaggeration, lighting geometry, image contrast, and contour lines can be modified by the user in real time. High-resolution image data can be overlaid on the terrain and other data can be rendered in this context. A detailed description and case studies of ADVISER are available.
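    The point-and-fly navigation with terrain collision described above can be sketched as follows. This is a simplified illustration, not ADVISER's implementation: the function names and the eye-height clearance are assumptions, and the real system queries a high-resolution terrain mesh rather than a height function:

```python
def fly_step(pos, pointing_dir, joystick, speed=2.0, dt=1/60):
    """Move the viewpoint along the handheld controller's pointing
    direction; joystick in [-1, 1] selects forward/backward motion."""
    return tuple(p + joystick * speed * dt * d
                 for p, d in zip(pos, pointing_dir))

def clamp_above_terrain(pos, terrain_height, clearance=1.7):
    """Simple collision response: never let the viewpoint drop below
    the terrain surface (plus an eye-height clearance)."""
    x, y, z = pos
    floor = terrain_height(x, y) + clearance
    return (x, y, max(z, floor))
```

    Each frame, the tracked pointing direction and joystick deflection produce a motion step, and the clamp prevents the user from flying underneath the surface.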

  18. Virtual environment display for a 3D audio room simulation

    NASA Astrophysics Data System (ADS)

    Chapin, William L.; Foster, Scott

    1992-06-01

    Recent developments in virtual 3D audio and synthetic aural environments have produced a complex acoustical room simulation. The acoustical simulation models a room with walls, ceiling, and floor of selected sound reflecting/absorbing characteristics and unlimited independent localizable sound sources. This non-visual acoustic simulation, implemented with four audio ConvolvotronsTM by Crystal River Engineering and coupled to the listener with a Polhemus IsotrakTM tracking the listener's head position and orientation, with stereo headphones returning binaural sound, is quite compelling to most listeners with eyes closed. This immersive effect should be reinforced when properly integrated into a full, multi-sensory virtual environment presentation. This paper discusses the design of an interactive, visual virtual environment, complementing the acoustic model and specified to: 1) allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; 2) reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; 3) enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and 4) serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, is discussed through the development of four iterative configurations. The installed system implements a head-coupled, wide-angle, stereo-optic tracker/viewer and multi-computer simulation control. The portable demonstration system implements a head-mounted wide-angle stereo-optic display, separate head and pointer electro-magnetic position trackers, a heterogeneous parallel graphics processing system, and object-oriented C++ program code.
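    Convolvotron-style spatialization renders each source by convolving it with direction-dependent head-related impulse responses, one per ear. A plain-Python sketch of that binaural core (illustrative only, far too slow for real-time use; the HRIRs here are placeholders):

```python
def convolve(signal, impulse_response):
    """Direct-form FIR convolution of a sample sequence with an
    impulse response; output length is len(signal)+len(ir)-1."""
    n, m = len(signal), len(impulse_response)
    out = [0.0] * (n + m - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

def binauralize(mono, hrir_left, hrir_right):
    """Spatialize a mono source by convolving it with the left-ear
    and right-ear head-related impulse responses for its direction."""
    return convolve(mono, hrir_left), convolve(mono, hrir_right)
```

    As the head tracker reports a new orientation, the system swaps in the HRIR pair for the source's new direction relative to the head, which is what keeps the sources localizable while the listener moves.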

  19. Learning Relative Motion Concepts in Immersive and Non-immersive Virtual Environments

    NASA Astrophysics Data System (ADS)

    Kozhevnikov, Michael; Gurlitt, Johannes; Kozhevnikov, Maria

    2013-12-01

    The focus of the current study is to understand which unique features of an immersive virtual reality environment have the potential to improve learning relative motion concepts. Thirty-seven undergraduate students learned relative motion concepts using computer simulation either in immersive virtual environment (IVE) or non-immersive desktop virtual environment (DVE) conditions. Our results show that after the simulation activities, both IVE and DVE groups exhibited a significant shift toward a scientific understanding in their conceptual models and epistemological beliefs about the nature of relative motion, and also a significant improvement on relative motion problem-solving tests. In addition, we analyzed students' performance on one-dimensional and two-dimensional questions in the relative motion problem-solving test separately and found that after training in the simulation, the IVE group performed significantly better than the DVE group on solving two-dimensional relative motion problems. We suggest that egocentric encoding of the scene in IVE (where the learner constitutes a part of a scene they are immersed in), as compared to allocentric encoding on a computer screen in DVE (where the learner is looking at the scene from "outside"), is more beneficial than DVE for studying more complex (two-dimensional) relative motion problems. Overall, our findings suggest that such aspects of virtual realities as immersivity, first-hand experience, and the possibility of changing different frames of reference can facilitate understanding abstract scientific phenomena and help in displacing intuitive misconceptions with more accurate mental models.
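    The physics exercised in both the one- and two-dimensional problems reduces to transforming velocities between reference frames; a minimal illustration:

```python
def relative_velocity(v_object, v_observer):
    """Velocity of an object as seen from a moving frame of reference:
    v_rel = v_object - v_observer, applied componentwise."""
    return tuple(a - b for a, b in zip(v_object, v_observer))
```

    For instance, a boat moving at (3, 4) m/s observed from a frame moving at (3, 0) m/s appears to move at (0, 4) m/s; the two-dimensional case is exactly where the IVE group outperformed the DVE group.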

  20. Using biofeedback while immersed in a stressful videogame increases the effectiveness of stress management skills in soldiers.

    PubMed

    Bouchard, Stéphane; Bernier, François; Boivin, Eric; Morin, Brian; Robillard, Geneviève

    2012-01-01

    This study assessed the efficacy of using visual and auditory biofeedback while immersed in a three-dimensional videogame to practice a stress management skill (tactical breathing). All 41 participants were soldiers who had previously received basic stress management training and first aid training in combat. On the first day, they received a 15-minute refresher briefing and were randomly assigned to either: (a) no additional stress management training (SMT) for three days, or (b) 30-minute sessions (one per day for three days) of biofeedback-assisted SMT while immersed in a horror/first-person shooter game. The training was performed in a dark and enclosed environment using a 50-inch television with active stereoscopic display and loudspeakers. On the last day, all participants underwent a live simulated ambush with an improvised explosive device, in which they had to provide first aid to a wounded soldier. Stress levels were measured with salivary cortisol collected upon waking and before and after the live simulation. Stress was also measured with heart rate at baseline, during an apprehension phase, and during the live simulation. Repeated-measures ANOVAs and ANCOVAs confirmed that practicing SMT was effective in reducing stress. Results are discussed in terms of the advantages of the proposed program for military personnel and the need to practice SMT.

  1. Immersive visualization for navigation and control of the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Hartman, Frank R.; Cooper, Brian; Maxwell, Scott; Wright, John; Yen, Jeng

    2004-01-01

    The Rover Sequencing and Visualization Program (RSVP) is a suite of tools for sequencing of planetary rovers, which are subject to significant light time delay and thus are unsuitable for teleoperation.

  2. Full Immersive Virtual Environment Cave[TM] in Chemistry Education

    ERIC Educational Resources Information Center

    Limniou, M.; Roberts, D.; Papadopoulos, N.

    2008-01-01

    By comparing two-dimensional (2D) chemical animations designed for a computer desktop with three-dimensional (3D) chemical animations designed for the fully immersive virtual reality environment CAVE[TM], we studied how virtual reality environments could raise students' interest and motivation for learning. By using 3ds max[TM], we can visualize…

  3. Using Virtual Reality to Help Students with Social Interaction Skills

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2015-01-01

    The purpose of this study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity to…

  4. A Quantitative Visual Mapping and Visualization Approach for Deep Ocean Floor Research

    NASA Astrophysics Data System (ADS)

    Hansteen, T. H.; Kwasnitschka, T.

    2013-12-01

    Geological fieldwork on the sea floor is still impaired by our inability to resolve features at sub-meter resolution in a quantifiable reference frame and over an area large enough to reveal the context of local observations. In order to overcome these issues, we have developed an integrated workflow of visual mapping techniques leading to georeferenced data sets, which we examine using state-of-the-art visualization technology to recreate an effective working style of field geology. We demonstrate a microbathymetrical workflow, which is based on photogrammetric reconstruction of ROV imagery referenced to the acoustic vehicle track. The advantage over established acoustical systems lies in the true three-dimensionality of the data, as opposed to the perspective projection from above produced by downward-looking mapping methods. A full-color texture mosaic derived from the imagery allows studies at resolutions beyond the resolved geometry (usually one order of magnitude below the image resolution), while color gives additional clues, which can only be partly resolved in acoustic backscatter. The creation of a three-dimensional model changes the working style from the temporal domain of a video recording back to the spatial domain of a map. We examine these datasets using a custom-developed immersive virtual visualization environment. The ARENA (Artificial Research Environment for Networked Analysis) features a (lower) hemispherical screen six meters in diameter, accommodating up to four scientists at once, thus providing the ability to browse data interactively among a group of researchers. This environment facilitates (1) the development of spatial understanding analogous to on-land outcrop studies, (2) quantitative observations of seafloor morphology and physical parameters of its deposits, and (3) more effective formulation and communication of working hypotheses.

  5. Evaluation of knowledge transfer in an immersive virtual learning environment for the transportation community.

    DOT National Transportation Integrated Search

    2014-05-01

    Immersive Virtual Learning Environments (IVLEs) are extensively used in training, but few rigorous scientific investigations regarding the transfer of learning have been conducted. Measurement of learning transfer through evaluative methods is key ...

  6. Interactive 3D visualization of structural changes in the brain of a person with corticobasal syndrome

    PubMed Central

    Hänel, Claudia; Pieperhoff, Peter; Hentschel, Bernd; Amunts, Katrin; Kuhlen, Torsten

    2014-01-01

    The visualization of the progression of brain tissue loss in neurodegenerative diseases like corticobasal syndrome (CBS) not only provides information about the localization and distribution of the volume loss, but also helps to understand the course and the causes of this neurodegenerative disorder. The visualization of such medical imaging data is often based on 2D sections, because they show both internal and external structures in one image. Spatial information, however, is lost. 3D visualization of imaging data can solve this problem, but it faces the difficulty that more internally located structures may be occluded by structures near the surface. Here, we present an application with two designs for the 3D visualization of the human brain to address these challenges. In the first design, brain anatomy is displayed semi-transparently; it is supplemented by an anatomical section and cortical areas for spatial orientation, and the volumetric data of volume loss. The second design is guided by the principle of importance-driven volume rendering: a direct line-of-sight to the relevant structures in the deeper parts of the brain is provided by cutting out a frustum-like piece of brain tissue. The application was developed to run both in standard desktop environments and in immersive virtual reality environments with stereoscopic viewing for improved depth perception. We conclude that the presented application facilitates the perception of the extent of brain degeneration with respect to its localization and affected regions. PMID:24847243

  7. Using RSVP for analyzing state and previous activities for the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Cooper, Brian K.; Hartman, Frank; Maxwell, Scott; Wright, John; Yen, Jeng

    2004-01-01

    Current developments in immersive environments for mission planning include several tools which make up a system for performing and rehearsing missions. This system, known as the Rover Sequencing and Visualization Program (RSVP), includes tools for planning long-range sorties for highly autonomous rovers, tools for planning operations with robotic arms, and advanced tools for visualizing telemetry from remote spacecraft and landers. One of the keys to successful planning of rover activities is knowing what the rover has accomplished to date and understanding the current rover state. RSVP builds on the lessons learned and the heritage of the Mars Pathfinder mission. This paper will discuss the tools and methodologies present in the RSVP suite for examining rover state, reviewing previous activities, visually comparing telemetered results to rehearsed results, and reviewing science and engineering imagery. In addition, we will present how this tool suite was used on the Mars Exploration Rovers (MER) project to explore the surface of Mars.

  8. Evaluating visual and auditory contributions to the cognitive restoration effect.

    PubMed

    Emfield, Adam G; Neider, Mark B

    2014-01-01

    It has been suggested that certain real-world environments can have a restorative effect on an individual, as expressed in changes in cognitive performance and mood. Much of this research builds on Attention Restoration Theory (ART), which suggests that environments with certain characteristics induce cognitive restoration via variations in attentional demands. Specifically, natural environments that require little top-down processing have a positive effect on cognitive performance, while city-like environments show no effect. We characterized the cognitive restoration effect further by examining (1) whether natural visual stimuli, such as blue spaces, were more likely to provide a restorative effect than urban visual stimuli, (2) whether increasing immersion with environment-related sound produces a similar or superior effect, (3) whether this effect extends to other cognitive tasks, such as the functional field of view (FFOV), and (4) whether we could better understand this effect by providing controls beyond previous works. We had 202 participants complete a cognitive task battery, consisting of a reverse digit span task, the attention network task, and the FFOV task, prior to and immediately after a restoration period. In the restoration period, participants were assigned to one of seven conditions in which they listened to natural or urban sounds, watched images of natural or urban environments, or a combination of both. Additionally, some participants were in a control group with exposure to neither picture nor sound. While we found some indication of practice effects, there were no differential effects of restoration observed in any of our cognitive tasks, regardless of condition. We did, however, find evidence that our nature images and sounds were more relaxing than their urban counterparts. Overall, our findings suggest that acute exposure to relaxing pictorial and auditory stimuli is insufficient to induce improvements in cognitive performance.

  9. Highly immersive virtual reality laparoscopy simulation: development and future aspects.

    PubMed

    Huber, Tobias; Wunderling, Tom; Paschold, Markus; Lang, Hauke; Kneist, Werner; Hansen, Christian

    2018-02-01

    Virtual reality (VR) applications with head-mounted displays (HMDs) have had an impact on information and multimedia technologies. The current work aimed to describe the process of developing a highly immersive VR simulation for laparoscopic surgery. We combined a VR laparoscopy simulator (LapSim) and a VR-HMD to create a user-friendly VR simulation scenario. Continuous clinical feedback was an essential aspect of the development process. We created an artificial VR (AVR) scenario by integrating the simulator video output with VR game components of figures and equipment in an operating room. We also created a highly immersive VR surrounding (IVR) by integrating the simulator video output with a [Formula: see text] video of a standard laparoscopy scenario in the department's operating room. Clinical feedback led to optimization of the visualization, synchronization, and resolution of the virtual operating rooms (in both the IVR and the AVR). Preliminary testing results revealed that individuals experienced a high degree of exhilaration and presence, with rare events of motion sickness. The technical performance showed no significant difference compared to that achieved with the standard LapSim. Our results provided a proof of concept for the technical feasibility of a custom highly immersive VR-HMD setup. Future technical research is needed to improve the visualization, immersion, and capability of interacting within the virtual scenario.

  10. Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project

    NASA Astrophysics Data System (ADS)

    Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.

    2016-12-01

    Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allow developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons"—a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. 
We are supported by NSF DUE 1323419 and a Google Geo Curriculum Award.

  11. Perception of approaching and retreating floor-projected shapes in a large, immersive, multimedia learning environment.

    PubMed

    Dolgov, Igor; Birchfield, David A; McBeath, Michael K; Thornburg, Harvey; Todd, Christopher G

    2009-04-01

    Perception of floor-projected moving geometric shapes was examined in the context of the Situated Multimedia Arts Learning Laboratory (SMALLab), an immersive, mixed-reality learning environment. As predicted, the projected destinations of shapes that retreated in depth (proximal origin) were judged significantly less accurately than those that approached (distal origin). Participants maintained similar magnitudes of error throughout the session, and no effect of practice was observed. Shape perception in an immersive multimedia environment is thus comparable to that in the real world. One may conclude that systematic exploration of basic psychological phenomena in novel mediated environments is integral to an understanding of human behavior in novel human-computer interaction architectures.

  12. Systematic distortions of perceptual stability investigated using immersive virtual reality

    PubMed Central

    Tcheang, Lili; Gilson, Stuart J.; Glennerster, Andrew

    2010-01-01

    Using an immersive virtual reality system, we measured the ability of observers to detect the rotation of an object when its movement was yoked to the observer's own translation. Most subjects had a large bias such that a static object appeared to rotate away from them as they moved. Thresholds for detecting target rotation were similar to those for an equivalent speed discrimination task carried out by static observers, suggesting that visual discrimination is the predominant limiting factor in detecting target rotation. Adding a stable visual reference frame almost eliminated the bias. Varying the viewing distance of the target had little effect, consistent with observers under-estimating distance walked. However, accuracy of walking to a briefly presented visual target was high and not consistent with an under-estimation of distance walked. We discuss implications for theories of a task-independent representation of visual space. PMID:15845248

  13. Haptics-based immersive telerobotic system for improvised explosive device disposal: Are two hands better than one?

    NASA Astrophysics Data System (ADS)

    Erickson, David; Lacheray, Hervé; Lambert, Jason Michel; Mantegh, Iraj; Crymble, Derry; Daly, John; Zhao, Yan

    2012-06-01

    State-of-the-art explosive ordnance disposal robots have not, in general, adopted recent advances in control technology and man-machine interfaces, and lag many years behind academia. This paper describes the Haptics-based Immersive Telerobotic System project, which investigates an immersive telepresence environment incorporating advanced vehicle control systems, augmented immersive sensory feedback, dynamic 3D visual information, and haptic feedback for explosive ordnance disposal operators. The project aim is to provide operators a more sophisticated interface and expanded sensory input to perform the complex tasks needed to defeat improvised explosive devices successfully. The introduction of haptics and immersive telepresence has the potential to shift the way telepresence systems work for explosive ordnance disposal tasks, or more widely for first-responder scenarios involving remote unmanned ground vehicles.

  14. An Immersive VR System for Sports Education

    NASA Astrophysics Data System (ADS)

    Song, Peng; Xu, Shuhong; Fong, Wee Teck; Chin, Ching Ling; Chua, Gim Guan; Huang, Zhiyong

    The development of new technologies has undoubtedly promoted the advances of modern education, among which Virtual Reality (VR) technologies have made education more visually accessible for students. However, classroom education has been the focus of VR applications, whereas little research has been done on promoting sports education using VR technologies. In this paper, an immersive VR system is designed and implemented to create a more intuitive and visual way of teaching tennis. A scalable system architecture is proposed in addition to the hardware setup layout, which can be used for various immersive interactive applications such as architecture walkthroughs, military training simulations, other sports game simulations, interactive theaters, and telepresent exhibitions. A realistic interaction experience is achieved through accurate and robust hybrid tracking technology, while the virtual human opponent is animated in real time using shader-based skin deformation. Potential future extensions are also discussed to improve the teaching/learning experience.

  15. Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds.

    PubMed

    Wright, W Geoffrey

    2014-01-01

    Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether it be to speed recovery, reduce training time, or enhance immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed.

  16. Assessment of chemicals released in the marine environment by dielectric elastomers useful as active elements in wave energy harvesters.

    PubMed

    Zaltariov, Mirela-Fernanda; Bele, Adrian; Vasiliu, Lavinia; Gradinaru, Luiza; Vornicu, Nicoleta; Racles, Carmen; Cazacu, Maria

    2018-01-05

    A series of elastomers, either natural or synthetic (some of them commercial, others prepared in the laboratory), suitable for use as active elements in devices for wave energy harvesting, were evaluated concerning their behavior and effects on the marine environment. To this aim, the elastomer films, initially evaluated regarding their aspect, structure, surface wettability, and tolerance to microorganism growth, were immersed in synthetic seawater (SSW) for six months to assess the compounds released. The changes that occurred both in the elastomers and in the salt water in which they were immersed were analyzed. Water samples taken at set time intervals were analyzed using a sequence of sensitive spectral techniques able to detect and identify organic compounds: UV-vis, IR, and, in relevant cases, 1H NMR and electrospray ionization mass spectrometry (ESI-MS). After six months, the samples were also investigated from the point of view of aspect, presence of metal traces, pH, and biological activity. The changes in aspect, structure, and morphology of the dielectric films at the end of the dipping period were also evaluated by visual inspection, IR spectroscopy using the spectral subtraction method, and the SEM-EDX technique. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Effects of catechin-enriched ion beverage intake on thermoregulatory function in a hot environment.

    PubMed

    Nishimura, Rumiko; Nishimura, Naoki; Iwase, Satoshi; Takeshita, Masao; Katashima, Mitsuhiro; Katsuragi, Yoshihisa; Sato, Motohiko

    2018-04-23

    We examined the effect of intake of a catechin-enriched ion beverage (Cat-I) on the thermoregulatory response in a hot environment. Eight healthy men were exposed to a hot environment for 90 min at an ambient temperature of 35 °C (relative humidity: 75%) combined with lower leg water immersion at 40 °C. At that time, either Cat-I, an ion beverage (Ion), or mineral water (Placebo) was consumed at three points: (1) at the start of lower leg immersion, (2) at 30 min after immersion, and (3) at 60 min after immersion. In all conditions, tympanic temperature (Tty) increased gradually during lower leg water immersion. However, the rate of increase of Tty tended to be suppressed after 30 min. The effect of drinking Cat-I had a limited detection period of approximately 60-70 min, and the rate of sweating was clearly increased with Cat-I compared with Ion and Placebo. Cat-I also tended to decrease the body temperature threshold at which sweating was induced compared with Ion or Placebo. These findings suggest that Cat-I efficiently suppressed the increase of body temperature in a hot environment.

  18. Interactive Visualization to Advance Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn

    2008-04-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual “field studies” in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations.

  19. Immersive Environments - A Connectivist Approach

    NASA Astrophysics Data System (ADS)

    Loureiro, Ana; Bettencourt, Teresa

    We are conducting a research project with the aim of achieving better and more efficient ways to facilitate teaching and learning in Higher Level Education. We have chosen virtual environments, with particular emphasis on the Second Life® platform augmented by web 2.0 tools, to develop the study. The Second Life® environment has some interesting characteristics that captured our attention: it is immersive; it is a real-world simulator; it is a social network; it allows real-time communication, cooperation, collaboration and interaction; and it is a safe and controlled environment. We specifically chose tools from web 2.0 that enable a sharing and collaborative way of learning. Through understanding the characteristics of this learning environment, we believe that immersive learning, along with other virtual tools, can be integrated into today's pedagogical practices.

  20. Hybrid 2-D and 3-D Immersive and Interactive User Interface for Scientific Data Visualization

    DTIC Science & Technology

    2017-08-01

    visualization, 3-D interactive visualization, scientific visualization, virtual reality, real-time ray tracing ...scientists to employ in the real world. Other than user-friendly software and hardware setup, scientists also need to be able to perform their usual...and scientific visualization communities mostly have different research priorities. For the VR community, the ability to support real-time user...

  1. Coercive Narratives, Motivation and Role Playing in Virtual Worlds

    DTIC Science & Technology

    2002-01-01

    resource for making immersive virtual environments highly engaging. Interaction also appeals to our natural desire to discover. Reading a book contains...participation in an open-ended Virtual Environment (VE). I intend to take advantage of participants' natural tendency to prefer interaction when possible...I hope this work will expand the potential of experience within virtual worlds. Keywords: Immersive Environments, Virtual Environments

  2. Do you see what I hear: experiments in multi-channel sound and 3D visualization for network monitoring?

    NASA Astrophysics Data System (ADS)

    Ballora, Mark; Hall, David L.

    2010-04-01

    Detection of intrusions is a continuing problem in network security. Due to the large volumes of data recorded in Web server logs, analysis is typically forensic, taking place only after a problem has occurred. This paper describes a novel method of representing Web log information through multi-channel sound, while simultaneously visualizing network activity using a 3-D immersive environment. We are exploring the detection of intrusion signatures and patterns, utilizing human aural and visual pattern recognition ability to detect intrusions as they occur. IP addresses and return codes are mapped to an informative and unobtrusive listening environment to act as a situational sound track of Web traffic. Web log data is parsed and formatted using Python, then read as a data array by the synthesis language SuperCollider [1], which renders it as a sonification. This can be done either for the study of pre-existing data sets or in monitoring Web traffic in real time. Components rendered aurally include IP address, geographical information, and server Return Codes. Users can interact with the data, speeding or slowing the speed of representation (for pre-existing data sets) or "mixing" sound components to optimize intelligibility for tracking suspicious activity.
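    The parsing-and-formatting step described above (Python turning Web-log records into a data array for the synthesis engine) might look like the following minimal sketch. The Common Log Format layout, the field names, and the octet-to-pitch mapping are illustrative assumptions, not details taken from the paper.

    ```python
    import re

    # Minimal Common Log Format parser (an assumed layout; the paper does not
    # specify its exact log format or field names).
    LOG_PATTERN = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
    )

    def parse_log_line(line):
        """Extract the fields the sonification maps to sound: IP and return code."""
        m = LOG_PATTERN.match(line)
        if m is None:
            return None
        return {"ip": m.group("ip"), "status": int(m.group("status"))}

    def to_event_array(lines):
        """Format parsed records as a flat event array a synthesis engine could read."""
        events = []
        for line in lines:
            rec = parse_log_line(line)
            if rec is not None:
                # Map the final IP octet to a pitch index and the HTTP status
                # to a timbre class (a hypothetical mapping for illustration).
                octet = int(rec["ip"].rsplit(".", 1)[-1])
                events.append((octet, rec["status"]))
        return events

    sample = ['192.168.1.42 - - [10/Oct/2010:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326']
    print(to_event_array(sample))  # [(42, 200)]
    ```

    In the pipeline the abstract describes, such an array would then be read by SuperCollider and rendered as sound, either from archived logs or from a live traffic feed.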

  3. Cognitive Aspects of Collaboration in 3d Virtual Environments

    NASA Astrophysics Data System (ADS)

    Juřík, V.; Herman, L.; Kubíček, P.; Stachoň, Z.; Šašinka, Č.

    2016-06-01

    Human-computer interaction has entered the 3D era. The most important models representing spatial information — maps — are being transferred into 3D versions according to the specific content to be displayed. Virtual worlds (VWs) have become a promising area of interest because of the possibility of dynamically modifying content and of multi-user cooperation when solving tasks regardless of physical presence. They can be used for sharing and elaborating information via virtual images or avatars. The attractiveness of VWs is also emphasized by the possibility of measuring operators' actions and complex strategies. Collaboration in 3D environments is a crucial issue in many areas where visualizations are important for group cooperation. Within the specific 3D user interface, the operators' ability to manipulate the displayed content is explored with regard to such phenomena as situation awareness, cognitive workload, and human error. For this purpose, VWs offer a great number of tools for measuring operators' responses, such as recording virtual movement or spots of interest in the visual field. The study focuses on the methodological issues of measuring the usability of 3D VWs and comparing them with the existing principles of 2D maps. We explore operators' strategies to reach and interpret information with regard to the specific type of visualization and different levels of immersion.

  4. Comparison of grasping movements made by healthy subjects in a 3-dimensional immersive virtual versus physical environment.

    PubMed

    Magdalon, Eliane C; Michaelsen, Stella M; Quevedo, Antonio A; Levin, Mindy F

    2011-09-01

    Virtual reality (VR) technology is being used with increasing frequency as a training medium for motor rehabilitation. However, before addressing training effectiveness in virtual environments (VEs), it is necessary to identify whether movements made in such environments are kinematically similar to those made in physical environments (PEs), and the effect of the provision of haptic feedback on these movement patterns. These questions are important since reach-to-grasp movements may be inaccurate when visual or haptic feedback is altered or absent. Our goal was to compare the kinematics of reaching and grasping movements to three objects performed in an immersive three-dimensional (3D) VE with haptic feedback (cyberglove/grasp system) viewed through a head-mounted display to those made in an equivalent PE. We also compared movements in the PE made with and without wearing the cyberglove/grasp haptic feedback system. Ten healthy subjects (8 women, 62.1 ± 8.8 years) reached and grasped objects requiring 3 different grasp types (can, diameter 65.6 mm, cylindrical grasp; screwdriver, diameter 31.6 mm, power grasp; pen, diameter 7.5 mm, precision grasp) in the PE and visually similar virtual objects in the VE. Temporal and spatial arm and trunk kinematics were analyzed. Movements were slower and grip apertures were wider when wearing the glove in both the PE and the VE compared to movements made in the PE without the glove. When wearing the glove, subjects used similar reaching trajectories in both environments, preserved the coordination between reaching and grasping, and scaled grip aperture to object size for the larger object (cylindrical grasp). However, in the VE compared to the PE, movements were slower and had longer deceleration times, elbow extension was greater when reaching to the smallest object, and apertures were wider for the power and precision grip tasks.
Overall, the differences in spatial and temporal kinematics of movements between environments were greater than those due only to wearing the cyberglove/grasp system. Differences in movement kinematics due to the viewing environment were likely due to a lack of prior experience with the virtual environment, an uncertainty of object location and the restricted field-of-view when wearing the head-mounted display. The results can be used to inform the design and disposition of objects within 3D VEs for the study of the control of prehension and for upper limb rehabilitation. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. Evaluation of historical museum interior lighting system using fully immersive virtual luminous environment

    NASA Astrophysics Data System (ADS)

    Navvab, Mojtaba; Bisegna, Fabio; Gugliermetti, Franco

    2013-05-01

    Saint Rocco Museum, a historical building in Venice, Italy, is used as a case study to explore the performance of its lighting system and the impact of visible light on viewing large-size artworks. The transition from three-dimensional architectural rendering to three-dimensional virtual luminance mapping and visualization within a virtual environment is described as an integrated optical method applied toward preserving the cultural heritage of the space. Lighting simulation programs represent color as RGB triplets in a device-dependent color space such as ITU-R BT.709. A prerequisite for this is a 3D model, which can be created within this computer-aided virtual environment. The on-site measured surface luminance, chromaticity, and spectral data were used as input to an established real-time indirect-illumination and physically based algorithm to produce the best approximation of RGB for generating images of the objects. Converting RGB to and from spectra is a major undertaking, since infinitely many spectra produce the same colors defined by RGB in the program. The ability to simulate light intensity, candlepower, and spectral power distributions provides the opportunity to examine the impact of color inter-reflections on historical paintings. VR offers an effective technique to quantify the impact of visible light on human visual performance under a precisely controlled representation of the light spectrum, experienced in 3D within a virtual environment as well as through historical visual archives. The system can easily be expanded to include other measurements and stimuli.
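    The abstract above turns on mapping measured spectral data into device-dependent RGB. A minimal sketch of that pipeline follows: integrate a spectral power distribution (SPD) against CIE 1931 color-matching functions to obtain XYZ, then apply the standard XYZ-to-linear-sRGB matrix (BT.709 primaries). The three color-matching-function samples are coarse values at 100 nm spacing, far too sparse for real colorimetry; only the conversion matrix is authoritative.

```python
# Sketch: SPD -> CIE XYZ -> linear sRGB, the basic mapping lighting
# simulators use to approximate measured spectra as RGB triplets.
# CMF holds coarse CIE 1931 (xbar, ybar, zbar) samples; a real
# implementation would use the full 5 nm tables.

WAVELENGTHS = (450, 550, 650)        # nm, illustrative sampling grid
CMF = {
    450: (0.3362, 0.0380, 1.7721),
    550: (0.4334, 0.9950, 0.0087),
    650: (0.2835, 0.1067, 0.0000),
}

def spectrum_to_xyz(spd, step=100.0):
    """Riemann sum of SPD x CMF over the sampling grid -> CIE XYZ."""
    x = sum(spd[w] * CMF[w][0] for w in WAVELENGTHS) * step
    y = sum(spd[w] * CMF[w][1] for w in WAVELENGTHS) * step
    z = sum(spd[w] * CMF[w][2] for w in WAVELENGTHS) * step
    return x, y, z

def xyz_to_linear_srgb(x, y, z):
    """Standard XYZ -> linear sRGB matrix (BT.709 primaries, D65 white)."""
    r = 3.2406 * x - 1.5372 * y - 0.4986 * z
    g = -0.9689 * x + 1.8758 * y + 0.0415 * z
    b = 0.0557 * x - 0.2040 * y + 1.0570 * z
    return r, g, b
```

    The reverse direction (RGB to spectrum) is the ill-posed step the abstract mentions: the matrix above is many-to-one, so any RGB triplet corresponds to infinitely many metameric spectra.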

  6. Immersive Media Environments for Special Education: Developing Agency in Communication for Youth with Autism

    ERIC Educational Resources Information Center

    Tolentino, Lisa

    2013-01-01

    This dissertation describes the development of a state-of-the-art immersive media environment and its potential to motivate high school youth with autism to vocally express themselves. Due to the limited availability of media environments in public education settings, studies on the use of such systems in special education contexts are rare. A…

  7. The effects of immersiveness on physiology.

    PubMed

    Wiederhold, B K; Davis, R; Wiederhold, M D

    1998-01-01

    The effects of varying levels of immersion in virtual reality environments on participants' heart rate, respiration rate, peripheral skin temperature, and skin resistance levels were examined. Subjective reports of presence were also noted. Participants were presented with a virtual environment of an airplane flight, both as seen on a two-dimensional computer screen and as seen within a head-mounted display. Subjects were randomly assigned to different orders of conditions, but all subjects received both conditions. Differences between the non-phobics' and the phobics' physiological responses when placed in a virtual environment related to the phobia were noted, as were changes in physiology based on degree of immersion.

  8. Generating Contextual Descriptions of Virtual Reality (VR) Spaces

    NASA Astrophysics Data System (ADS)

    Olson, D. M.; Zaman, C. H.; Sutherland, A.

    2017-12-01

    Virtual reality holds great potential for science communication, education, and research. However, interfaces for manipulating data and environments in virtual worlds are limited and idiosyncratic. Furthermore, speech and vision are the primary modalities by which humans collect information about the world, but the linking of visual and natural language domains is a relatively new pursuit in computer vision. Machine learning techniques have been shown to be effective at image and speech classification, as well as at describing images with language (Karpathy 2016), but have not yet been used to describe potential actions. We propose a technique for creating a library of possible context-specific actions associated with 3D objects in immersive virtual worlds based on a novel dataset generated natively in virtual reality containing speech, image, gaze, and acceleration data. We will discuss the design and execution of a user study in virtual reality that enabled the collection and the development of this dataset. We will also discuss the development of a hybrid machine learning algorithm linking vision data with environmental affordances in natural language. Our findings demonstrate that it is possible to develop a model which can generate interpretable verbal descriptions of possible actions associated with recognized 3D objects within immersive VR environments. This suggests promising applications for more intuitive user interfaces through voice interaction within 3D environments. It also demonstrates the potential to apply vast bodies of embodied and semantic knowledge to enrich user interaction within VR environments. This technology would allow for applications such as expert knowledge annotation of 3D environments, complex verbal data querying and object manipulation in virtual spaces, and computer-generated, dynamic 3D object affordances and functionality during simulations.
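    The core proposal above is a library of context-specific actions keyed by recognized 3D objects. A toy sketch of the data-structure side of such a library follows; every object label, context, and action string here is hypothetical, and the paper's approach learns these mappings from speech, image, gaze, and acceleration data rather than hand-coding them as below.

```python
# Toy sketch of an "affordance library": a recognized object label
# plus the current context keys into verbal action descriptions.
# All entries are invented placeholders for illustration only.

AFFORDANCES = {
    ("valve", "lab"): ["turn the valve clockwise to close it"],
    ("valve", "field"): ["inspect the valve for leaks"],
    ("sample_box", "lab"): ["open the box", "remove a core sample"],
}

def describe_actions(obj_label, context):
    """Return possible action descriptions, or a fallback sentence."""
    return AFFORDANCES.get(
        (obj_label, context),
        ["no known actions for {} in {}".format(obj_label, context)],
    )
```

    In the paper's setting, the lookup table would be replaced by a learned model, but the interface (object + context in, natural-language affordances out) is the same.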

  9. Best of Both Worlds

    ERIC Educational Resources Information Center

    Ramaswami, Rama

    2009-01-01

    Educators know that people learn best by doing. When students are doing something rather than reading or learning about it, they learn better. Immersive environments help students retain more information and speed up their learning. There's an enhancement in the way they learn. Immersive environments--which utilize technologies like simulations,…

  10. Proof of concept : examining characteristics of roadway infrastructure in various 3D visualization modes.

    DOT National Transportation Integrated Search

    2015-02-01

    Utilizing enhanced visualization in transportation planning and design gained popularity in the last decade. This work aimed at demonstrating the concept of utilizing a highly immersive, virtual reality simulation engine for creating dynamic, inter...

  11. Can nature make us more caring? Effects of immersion in nature on intrinsic aspirations and generosity.

    PubMed

    Weinstein, Netta; Przybylski, Andrew K; Ryan, Richard M

    2009-10-01

    Four studies examined the effects of nature on valuing intrinsic and extrinsic aspirations. Intrinsic aspirations reflected prosocial and other-focused value orientations, and extrinsic aspirations predicted self-focused value orientations. Participants immersed in natural environments reported higher valuing of intrinsic aspirations and lower valuing of extrinsic aspirations, whereas those immersed in non-natural environments reported increased valuing of extrinsic aspirations and no change of intrinsic aspirations. Three studies explored experiences of nature relatedness and autonomy as underlying mechanisms of these effects, showing that nature immersion elicited these processes whereas non-nature immersion thwarted them and that they in turn predicted higher intrinsic and lower extrinsic aspirations. Studies 3 and 4 also extended the paradigm by testing these effects on generous decision making indicative of valuing intrinsic versus extrinsic aspirations.

  12. Understanding contributing factors to verbal coercion while studying abroad.

    PubMed

    Marcantonio, Tiffany; Angelone, D J; Joppa, Meredith

    2018-02-06

    Verbal coercion (VC) is a common sexual assault (SA) tactic in which both men and women can be victims or perpetrators. College study abroad students report engagement in casual sex, alcohol consumption, and immersion in a sexualized environment (e.g., an environment that supports or encourages sexual activity), factors highly associated with SA in general. The purpose of this study was to examine casual sex, alcohol use, and sexualized environments in relation to VC victimization (VCV) and perpetration (VCP) while abroad. Study abroad students (N = 130) completed questionnaires on alcohol use, casual sex, immersion in a sexualized environment, and VC experiences. Participants were more likely to report both VCV and VCP while abroad if they immersed themselves in a sexualized environment; identifying as male was associated with VCP. Results can inform intervention by providing directors with specific constructs to discuss in pre-departure meetings, such as the influence of the environment on VC.

  13. Evaluating visual discomfort in stereoscopic projection-based CAVE system with a close viewing distance

    NASA Astrophysics Data System (ADS)

    Song, Weitao; Weng, Dongdong; Feng, Dan; Li, Yuqian; Liu, Yue; Wang, Yongtian

    2015-05-01

    As one of the most popular immersive Virtual Reality (VR) systems, the stereoscopic cave automatic virtual environment (CAVE) typically consists of four to six 3 m-by-3 m sides of a room made of rear-projected screens. While many endeavors have been made to reduce the size of projection-based CAVE systems, the issue of asthenopia caused by lengthy exposure to stereoscopic images in such a CAVE at a close viewing distance has seldom been tackled. In this paper, we propose a lightweight approach that utilizes a convex eyepiece to reduce the visual discomfort induced by stereoscopic vision. An empirical experiment was conducted to examine the feasibility of the convex eyepiece over a large depth of field (DOF) at close viewing distance, both objectively and subjectively. The results show the positive effects of the convex eyepiece on the relief of eyestrain.
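    A thin-lens calculation shows why a convex eyepiece can help at close viewing distance: with the screen inside the lens's focal length, the eyepiece forms a virtual image farther away than the screen, lowering the accommodation demand in diopters. The sketch below uses the Gaussian thin-lens equation 1/do + 1/di = 1/f with illustrative numbers, not values from the paper.

```python
# Sketch: thin-lens model of a convex eyepiece in front of a near
# screen. A negative image distance means a virtual image on the
# same side as the object (i.e., apparently farther from the eye).

def virtual_image_distance(d_object_m, focal_m):
    """Image distance from 1/do + 1/di = 1/f; negative => virtual image."""
    return 1.0 / (1.0 / focal_m - 1.0 / d_object_m)

def accommodation_demand(d_view_m):
    """Accommodation demand in diopters for a target at d_view_m."""
    return 1.0 / d_view_m
```

    For example, a screen 0.5 m away viewed through a +1 D-style lens (f = 1.0 m) yields a virtual image 1.0 m away, halving the accommodation demand from 2 D to 1 D.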

  14. Eye Movements of Patients with Tunnel Vision while Walking

    PubMed Central

    Vargas-Martín, Fernando; Peli, Eli

    2006-01-01

    Purpose To determine how severe peripheral field loss (PFL) affects the dispersion of eye movements relative to the head, while walking in real environments. This information should help to better define the visual field and clearance requirements for head-mounted mobility visual aids. Methods Eye positions relative to the head were recorded in five retinitis pigmentosa patients with less than 15° of visual field and three normally-sighted people, each walking in varied environments for more than 30 minutes. The eye position recorder was made portable by modifying a head-mounted ISCAN system. Custom data processing was implemented to reject unreliable data. Sample standard deviations of eye position (dispersion) were compared across subject groups and environments. Results PFL patients exhibited narrower horizontal eye position dispersions than normally-sighted subjects (9.4° vs. 14.2°, p < 0.0001) and PFL patients’ vertical dispersions were smaller when walking indoors than outdoors (8.2° vs. 10.3°, p = 0.048). Conclusions When walking, the PFL patients did not increase their scanning eye movements to compensate for missing peripheral vision information. Their horizontal scanning was actually reduced, possibly because saccadic amplitude is limited by a lack of peripheral stimulation. The results suggest that a field-of-view as wide as 40° may be needed for closed (immersive) head-mounted mobility aids, while a much narrower display, perhaps as narrow as 20°, might be sufficient with an open design. PMID:17122116
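    The "dispersion" reported above is simply the sample standard deviation of eye position relative to the head. A minimal sketch, using synthetic position traces in place of the recorded gaze data:

```python
# Sketch: dispersion of eye position = sample (n-1) standard
# deviation of the horizontal position trace, in degrees.
from math import sqrt

def sample_sd(positions_deg):
    """Sample standard deviation of a list of eye positions (degrees)."""
    n = len(positions_deg)
    mean = sum(positions_deg) / n
    return sqrt(sum((x - mean) ** 2 for x in positions_deg) / (n - 1))
```

    Comparing this statistic across subject groups (PFL patients vs. normally sighted) and environments (indoors vs. outdoors) reproduces the kind of contrast reported in the abstract, e.g. 9.4° vs. 14.2° horizontal dispersion.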

  15. Eye movements of patients with tunnel vision while walking.

    PubMed

    Vargas-Martín, Fernando; Peli, Eli

    2006-12-01

    To determine how severe peripheral field loss (PFL) affects the dispersion of eye movements relative to the head in patients walking in real environments. This information should help to define the visual field and clearance requirements for head-mounted mobility visual aids. Eye positions relative to the head were recorded in five patients with retinitis pigmentosa who had less than 15 degrees of visual field and in three normally sighted people, each walking in varied environments for more than 30 minutes. The eye-position recorder was made portable by modifying a head-mounted system (ISCAN, Burlington, MA). Custom data processing was implemented to reject unreliable data. Sample standard deviations of eye position (dispersion) were compared across subject groups and environments. The patients with PFL exhibited narrower horizontal eye-position dispersions than did the normally sighted subjects (9.4 degrees vs. 14.2 degrees, P < 0.0001), and the vertical dispersions of patients with PFL were smaller when they were walking indoors than when walking outdoors (8.2 degrees vs. 10.3 degrees; P = 0.048). When walking, the patients with PFL did not increase their scanning eye movements to compensate for missing peripheral vision information. Their horizontal scanning was actually reduced, possibly because of lack of peripheral stimulation. The results suggest that a field of view as wide as 40 degrees may be needed for closed (immersive) head-mounted mobility aids, whereas a much narrower display, perhaps as narrow as 20 degrees, may be sufficient with an open design.

  16. Modulation of Excitability in the Temporoparietal Junction Relieves Virtual Reality Sickness.

    PubMed

    Takeuchi, Naoyuki; Mori, Takayuki; Suzukamo, Yoshimi; Izumi, Shin-Ichi

    2018-06-01

    Virtual reality (VR) immersion often provokes subjective discomfort and postural instability, so-called VR sickness. The neural mechanism of VR sickness is speculated to involve visual-vestibular information mismatch and/or postural instability. However, approaches to relieving VR sickness through modulation of brain activity remain poorly understood. Using transcranial direct current stimulation (tDCS), we aimed to investigate whether VR sickness could be relieved by modulating cortical excitability in the temporoparietal junction (TPJ), which is known to be involved in processing both vestibular and visual information. Twenty healthy subjects received tDCS over the right TPJ before VR immersion. The order of the three types of tDCS (anodal, cathodal, and sham) was counterbalanced across subjects. We evaluated subjective symptoms, heart rate, and center of pressure at baseline, after tDCS, and after VR immersion. VR immersion using a head-mounted display provoked subjective discomfort and postural instability. However, anodal tDCS over the right TPJ ameliorated the subjective disorientation symptoms and postural instability induced by VR immersion compared with the sham condition. This amelioration of VR sickness might result from relief of the sensory conflict and/or facilitation of vestibular function. Our result not only has potential clinical implications for neuromodulation approaches to VR sickness but also implies a causal role of the TPJ in VR sickness.

  17. Eye height scaling of absolute size in immersive and nonimmersive displays

    NASA Technical Reports Server (NTRS)

    Dixon, M. W.; Wraga, M.; Proffitt, D. R.; Williams, G. C.; Kaiser, M. K. (Principal Investigator)

    2000-01-01

    Eye-height (EH) scaling of absolute height was investigated in three experiments. In Experiment 1, standing observers viewed cubes in an immersive virtual environment. Observers' center of projection was placed at actual EH and at 0.7 times actual EH. Observers' size judgments revealed that the EH manipulation was 76.8% effective. In Experiment 2, seated observers viewed the same cubes on an interactive desktop display; however, no effect of EH was found in response to the simulated EH manipulation. Experiment 3 tested standing observers in the immersive environment with the field of view reduced to match that of the desktop. Comparable to Experiment 1, the effect of EH was 77%. These results suggest that EH scaling is not generally used when people view an interactive desktop display because the altitude of the center of projection is indeterminate. EH scaling is spontaneously evoked, however, in immersive environments.

  18. Vertical ground reaction force in stationary running in water and on land: A study with a wide range of cadences.

    PubMed

    de Brito Fontana, Heiliane; Ruschel, Caroline; Dell'Antonio, Elisa; Haupenthal, Alessandro; Pereira, Gustavo Soares; Roesler, Helio

    2018-04-01

    The aim of this study was to analyze the effect of cadence, immersion level, and body density on the vertical component (Fymax) of the ground reaction force (GRF) during stationary running (SR). In a controlled laboratory study, thirty-two subjects ran at a wide range of cadences (85-210 steps/min) in water, immersed to the hip and to the chest, and on dry land. Fymax was measured by a waterproof force measurement system and predicted from a statistical model including cadence, immersion ratio, and body density. The effect of cadence was shown to depend on the environment: while Fymax increases linearly with increasing cadence on land, in water it reaches a plateau at both hip and chest immersion. All factors analyzed (cadence, immersion level, and body density) affected Fymax significantly, with immersion (aquatic vs. land environment) showing the greatest effect. In water, different cadences may lead to bigger changes in Fymax than those obtained by moving subjects from hip to chest immersion. A regression model able to predict 69% of Fymax variability in water was proposed and validated. Cadence, immersion, and body density affect Fymax in a significant and non-independent way. Besides providing a model of potential use in the prescription of stationary running in water, our analysis offers insights into the different responses of GRF to changes in exercise parameters between land and aquatic environments. Copyright © 2018 Elsevier B.V. All rights reserved.
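    The validated regression model above predicts the vertical GRF component from cadence, immersion ratio, and body density. The paper's coefficients are not reproduced here, so the sketch below only shows the fitting machinery: ordinary least squares on a linear model (an assumed form) via the normal equations, fitted to synthetic placeholder rows.

```python
# Sketch: OLS fit of Fymax ~ b0 + b1*cadence + b2*immersion_ratio
# + b3*body_density. Pure-stdlib normal-equations solver; the data
# passed in would be the measured (predictors, Fymax) rows.

def solve(A, b):
    """Gaussian elimination with partial pivoting for an n x n system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Least squares via normal equations: (X^T X) beta = X^T y."""
    Xi = [[1.0] + list(row) for row in X]      # prepend intercept column
    p = len(Xi[0])
    XtX = [[sum(r[i] * r[j] for r in Xi) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xi, y)) for i in range(p)]
    return solve(XtX, Xty)
```

    With the paper's data, the returned coefficient vector would define the prescription model; the abstract reports it explains 69% of Fymax variability in water.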

  19. A Virtual World for Collaboration: The AETZone

    ERIC Educational Resources Information Center

    Cheney, Amelia W.; Sanders, Robert L.; Matzen, Nita J.; Bronack, Stephen C.; Riedl, Richard E.; Tashner, John H.

    2009-01-01

    Participation in learning communities, and the construction of knowledge in communities of practice, are important considerations in the use of 3D immersive worlds. This article describes the creation of this type of learning environment in AETZone, an immersive virtual environment in use within graduate programs at Appalachian State University…

  20. Immersive environment technologies for planetary exploration with applications for mixed reality

    NASA Technical Reports Server (NTRS)

    Wright, J.; Hartman, F.; Cooper, B.

    2002-01-01

    Immersive environments are successfully being used to support mission operations at JPL. This technology contributed to the Mars Pathfinder Mission in planning sorties for the Sojourner rover. Results and operational experiences with these tools are being incorporated into the development of the second generation of mission planning tools.

  1. Considerations for the future development of virtual technology as a rehabilitation tool

    PubMed Central

    Kenyon, Robert V; Leigh, Jason; Keshner, Emily A

    2004-01-01

    Background Virtual environments (VEs) are a powerful tool for various forms of rehabilitation. Coupling VEs with high-speed networking [Tele-Immersion] that approaches speeds of 100 Gb/s can greatly expand their influence in rehabilitation. These new networks will permit the various peripherals attached to computers on the network to be connected and to respond as quickly as if connected to a local PC. This innovation may soon allow the development of previously unheard-of networked rehabilitation systems. Rapid advances in this technology need to be coupled with an understanding of how human behavior is affected when immersed in the VE. Methods This paper discusses the various forms of VE currently available for rehabilitation. The characteristics of these new networks are described, and we examine how such networks might be used to extend the rehabilitation clinic to remote areas. In addition, we present data from an immersive dynamic virtual environment united with motion of a posture platform to record biomechanical and physiological responses to combined visual, vestibular, and proprioceptive inputs. A 6 degree-of-freedom force plate provides measurements of moments exerted on the base of support. Kinematic data from the head, trunk, and lower limb were collected using 3D video motion analysis. Results Our data suggest that when there is a confluence of meaningful inputs, none of the visual, vestibular, or proprioceptive inputs is suppressed in healthy adults; the postural response is modulated by all existing sensory signals in a non-additive fashion. Individual perception of the sensory structure appears to be a significant component of the response to these protocols and underlies much of the observed response variability. Conclusion The ability to provide new technology for rehabilitation services is emerging as an important option for clinicians and patients.
The use of data mining software would help analyze the incoming data to provide both the patient and the therapist with an evaluation of the current treatment and the modifications needed for future therapies. Quantification of individual perceptual styles in the VE will support the development of individualized treatment programs. The virtual environment can be a valuable tool for therapeutic interventions that require adaptation to complex, multimodal environments. PMID:15679951

  2. Along the Virtuality Continuum - Two Showcases on how xR Technologies Transform Geoscience Research and Education

    NASA Astrophysics Data System (ADS)

    Klippel, A.; Zhao, J.; Masrur, A.; Wallgruen, J. O.; La Femina, P. C.

    2017-12-01

    We present work along the virtuality continuum, showcasing both AR and VR environments for geoscience applications and research. The AR/VR project focuses on one of the most prominent landmarks on the Penn State campus, which is at the same time a representation of the geology of Pennsylvania: the Penn State Obelisk, a 32-foot-high, 51-ton monument composed of 281 rocks collected from across Pennsylvania. While information about its origins and composition is scattered across articles and some web databases, we compiled all the available data from the web and archives and curated them as the basis for an immersive xR experience. Tabular data were amended with xR data such as 360° photos, videos, and 3D models (e.g., of the Obelisk). Our xR (both AR and VR) prototype provides an immersive analytical environment that supports interactive data visualization and virtual navigation in a natural environment (a campus model of today and of 1896, the year of the Obelisk's installation). This work-in-progress project can provide an interactive immersive learning platform (specifically for K-12 and introductory-level geoscience students) in which the learning process is enhanced through seamless navigation between 3D data space and physical space. The second, VR-focused application creates and empirically evaluates virtual reality (VR) experiences for geoscience research, specifically an interactive volcano experience based on LiDAR and image data of Iceland's Thrihnukar volcano. The prototype addresses the lack of content and tools for immersive virtual reality (iVR) in geoscience education and research and how to make it easier to integrate iVR into research and classroom experiences. It makes use of environmentally sensed data such that interaction and linked content can be integrated into a single experience. We discuss our workflows as well as methods and authoring tools for iVR analysis and the creation of virtual experiences.
These methods and tools aim to enhance the utility of geospatial data from repositories such as OpenTopography.org by unlocking treasure troves of geospatial data for VR applications. Their enhanced accessibility in education and research, for the geosciences and beyond, will benefit geoscientists and educators who cannot be expected to be VR and 3D application experts.

  3. 3D Immersive Visualization: An Educational Tool in Geosciences

    NASA Astrophysics Data System (ADS)

    Pérez-Campos, N.; Cárdenas-Soto, M.; Juárez-Casas, M.; Castrejón-Pineda, R.

    2007-05-01

    3D immersive visualization is an innovative tool currently used in various disciplines, such as medicine, architecture, engineering, and video games. Recently, the Universidad Nacional Autónoma de México (UNAM) mounted a visualization theater (Ixtli) with leading-edge technology for academic and research purposes that require immersive 3D tools for a better understanding of the concepts involved. The Division of Engineering in Earth Sciences of the School of Engineering, UNAM, is running a project focused on the visualization of geoscience data. Its objective is to incorporate educational material into geoscience courses in order to support and improve the teaching-learning process, especially for topics well known to be difficult for students. As part of the project, professors and students are trained in visualization techniques; their data are then adapted and visualized in Ixtli as part of a class or a seminar, where all attendants can interact, not only with each other but also with the object under study. As part of our results, we present specific examples used in basic geophysics courses, such as interpreted seismic cubes, seismic-wave propagation models, and structural models from bathymetric, gravimetric, and seismological data, as well as examples from ongoing applied projects, such as a modeled SH upward wave, the occurrence of an earthquake cluster in 1999 in the Popocatepetl volcano, and a risk atlas of Delegación Alvaro Obregón in Mexico City. All these examples, plus those to come, constitute a library for students and professors willing to explore another dimension of the teaching-learning process. Furthermore, this experience can be enhanced by rich discussions and interactions via videoconference with other universities and researchers.

  4. Training presence: the importance of virtual reality experience on the "sense of being there".

    PubMed

    Gamito, Pedro; Oliveira, Jorge; Morais, Diogo; Baptista, André; Santos, Nuno; Soares, Fábio; Saraiva, Tomaz; Rosa, Pedro

    2010-01-01

    The nature and origin of presence are still unclear. Although presence can be characterized, from a neurophysiological perspective, as a process resulting from synchrony between the cognitive and perceptive systems, the multitude of associated processes reduces the chances of brain-mapping presence. Our study was therefore designed to understand the possible role of VR experience on presence in a virtual environment. Sixteen participants (M = 28.39 years; SD = 13.44) of both genders without computer experience were selected. The study design consisted of two assessments (initial and final), in which participants were evaluated with the BFI, PQ, ITQ, QC, MCSDS-SF, STAI, and visual attention and behavioral measures after playing a first-person shooter (FPS) game. To manipulate the level of VR experience, participants were trained on a different FPS during 12 weekly sessions of 30 minutes. Results revealed significant differences between the first and final assessments for presence (F(1,15) = 11.583; MSE = 775.538; p < .01) and immersion scores (F(1,15) = 6.234; MSE = 204.962; p < .05), indicating higher levels of presence and immersion in the final assessment. No statistically significant results were obtained for cybersickness or the behavioral measures. In summary, our results showed that training and the resulting higher computer experience levels can increase immersion and presence.

  5. Using Biofeedback while Immersed in a Stressful Videogame Increases the Effectiveness of Stress Management Skills in Soldiers

    PubMed Central

    Bouchard, Stéphane; Bernier, François; Boivin, Éric; Morin, Brian; Robillard, Geneviève

    2012-01-01

    This study assessed the efficacy of using visual and auditory biofeedback while immersed in a three-dimensional videogame to practice a stress management skill (tactical breathing). All 41 participants were soldiers who had previously received basic stress management training and first aid training in combat. On the first day, they received a 15-minute refresher briefing and were randomly assigned to either: (a) no additional stress management training (SMT) for three days, or (b) 30-minute sessions (one per day for three days) of biofeedback-assisted SMT while immersed in a horror/first-person shooter game. The training was performed in a dark and enclosed environment using a 50-inch television with active stereoscopic display and loudspeakers. On the last day, all participants underwent a live simulated ambush with an improvised explosive device, in which they had to provide first aid to a wounded soldier. Stress levels were measured with salivary cortisol collected upon waking, and before and after the live simulation. Stress was also measured with heart rate at baseline, during an apprehension phase, and during the live simulation. Repeated-measures ANOVAs and ANCOVAs confirmed that practicing SMT was effective in reducing stress. Results are discussed in terms of the advantages of the proposed program for military personnel and the need to practice SMT. PMID:22558370

  6. Immersive STEM: From Fulldome to VR Technologies

    NASA Astrophysics Data System (ADS)

    Wyatt, R. J.

    2015-12-01

    For more than 15 years, fulldome video technology has transformed planetariums worldwide, using data-driven visualizations to support science storytelling. Fulldome video shares significant technical infrastructure with emerging VR headset technologies, and these personalized VR experiences open an existing library of content to new audiences and new experiences, as well as affording new opportunities for fulldome producers to explore. At the California Academy of Sciences, we are translating assets from our planetarium shows into immersive experiences for a variety of VR headsets. We have adapted scenes from our four award-winning features, Fragile Planet (2008), Life: A Cosmic Story (2010), Earthquake: Evidence of a Restless Planet (2012), and Habitat Earth (2015), to place viewers inside a virtual planetarium viewing the shows. Similarly, we have released two Creative Commons mini-shows on various VR outlets. This presentation will also highlight content the Academy will make available from our upcoming 2016 planetarium show about asteroids, comets, and solar system origins, some of which has been formatted for a full four-pi-steradian perspective. The shared immersive environment of digital planetariums offers significant opportunities for education and affective engagement of STEM-hungry audiences, including students, families, and adults. With the advent of VR technologies, we can leverage the experience of fulldome producers and planetarium professionals to create personalized home experiences that allow new ways to experience their content.

  7. Virtual Reality in Neurointervention.

    PubMed

    Ong, Chin Siang; Deib, Gerard; Yesantharao, Pooja; Qiao, Ye; Pakpoor, Jina; Hibino, Narutoshi; Hui, Ferdinand; Garcia, Juan R

    2018-06-01

    Virtual reality (VR) allows users to experience realistic, immersive 3D virtual environments with the depth perception and binocular field of view of real 3D settings. Newer VR technology has now allowed for interaction with 3D objects within these virtual environments through the use of VR controllers. This technical note describes our preliminary experience with VR as an adjunct tool to traditional angiographic imaging in the preprocedural workup of a patient with a complex pseudoaneurysm. Angiographic MRI data was imported and segmented to create 3D meshes of bilateral carotid vasculature. The 3D meshes were then projected into VR space, allowing the operator to inspect the carotid vasculature using a 3D VR headset as well as interact with the pseudoaneurysm (handling, rotation, magnification, and sectioning) using two VR controllers. 3D segmentation of a complex pseudoaneurysm in the distal cervical segment of the right internal carotid artery was successfully performed and projected into VR. Conventional and VR visualization modes were equally effective in identifying and classifying the pathology. VR visualization allowed the operators to manipulate the dataset to achieve a greater understanding of the anatomy of the parent vessel, the angioarchitecture of the pseudoaneurysm, and the surface contours of all visualized structures. This preliminary study demonstrates the feasibility of utilizing VR for preprocedural evaluation in patients with anatomically complex neurovascular disorders. This novel visualization approach may serve as a valuable adjunct tool in deciding patient-specific treatment plans and selection of devices prior to intervention.

  8. "The Future Is Old": Immersive Learning with Generation Y Engineering Students

    ERIC Educational Resources Information Center

    Blashki, Katherine; Nichol, Sophie; Jia, Dawei; Prompramote, Supawan

    2007-01-01

    This paper explores the application of four elements deemed essential to immersive learning: immersion, engagement, risk/creativity, and agency. The authors discuss the implementation of these four elements within two very different classroom environments, one secondary and one tertiary, to illustrate the importance of students' active…

  9. Our Young Cultural Ambassadors: Montessori Peacemakers for a Modern World

    ERIC Educational Resources Information Center

    Carver-Akers, Kateri; Markatos-Soriano, Kristine

    2007-01-01

    This article describes the Language Center Montessori School in Chapel Hill, North Carolina, where students are learning in a language-immersion Montessori environment. The school offers a choice to parents--Spanish immersion or French immersion--but Montessori comes with both. The school's motivation for promoting bilingualism is to improve…

  10. Reclassification Patterns among Latino English Learner Students in Bilingual, Dual Immersion, and English Immersion Classrooms

    ERIC Educational Resources Information Center

    Umansky, Ilana M.; Reardon, Sean F.

    2014-01-01

    Schools are under increasing pressure to reclassify their English learner (EL) students to "fluent English proficient" status as quickly as possible. This article examines timing to reclassification among Latino ELs in four distinct linguistic instructional environments: English immersion, transitional bilingual, maintenance bilingual,…

  11. Simultsonic: A Simulation Tool for Ultrasonic Inspection

    NASA Astrophysics Data System (ADS)

    Krishnamurthy, Adarsh; Karthikeyan, Soumya; Krishnamurthy, C. V.; Balasubramaniam, Krishnan

    2006-03-01

    A simulation program, SIMULTSONIC, is under development at CNDE to help determine and optimize ultrasonic probe locations for the inspection of complex components. SIMULTSONIC provides an initial ray-trace-based assessment, followed by a displacement- or pressure-field-based assessment, for user-specified probe positions and a user-selected component. Both immersion and contact modes of inspection are available in SIMULTSONIC. The code, written in Visual C++ for the Microsoft Windows environment, provides an interactive user interface. In this paper, the application of SIMULTSONIC to the inspection of very thin-walled pipes (with 450 µm wall thickness) is described. A ray-trace-based assessment was done using SIMULTSONIC to determine the standoff distance and the angle of oblique incidence for an immersion-mode focused transducer. A 3-cycle Hanning window pulse was chosen for the simulations. Experiments were carried out to validate the simulations. The A-scans and associated B-scan images obtained through simulation show good correlation with experimental results, both in the arrival time of the signal and in the signal amplitudes. The scope of SIMULTSONIC to deal with parametrically represented surfaces will also be discussed.
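
    The choice of oblique-incidence angle in immersion testing is governed by Snell's law at the water/metal interface. A back-of-envelope sketch follows, using nominal sound speeds for water and steel; the paper's actual pipe material and angles are not specified here, so the numbers are purely illustrative:

```python
import math

def refracted_angle_deg(theta_i_deg, v_incident, v_refracted):
    """Snell's law for ultrasound: sin(t_r) / sin(t_i) = v_r / v_i."""
    s = (v_refracted / v_incident) * math.sin(math.radians(theta_i_deg))
    if s >= 1.0:
        return None  # beyond the critical angle: no refracted bulk wave
    return math.degrees(math.asin(s))

v_water = 1480.0  # m/s, longitudinal speed in water (nominal)
v_shear = 3230.0  # m/s, shear speed in steel (nominal; assumed material)

# First critical angle in water for the steel longitudinal wave (v_L ~ 5900 m/s):
# beyond it, only the shear wave propagates into the part.
theta_c1 = math.degrees(math.asin(v_water / 5900.0))
angle_in_pipe = refracted_angle_deg(19.0, v_water, v_shear)
```

    A ray-trace tool like the one described repeats this refraction calculation at every interface along each ray to predict arrival times, which is what the A-scan comparison validates.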

  12. Simultaneous neural and movement recording in large-scale immersive virtual environments.

    PubMed

    Snider, Joseph; Plank, Markus; Lee, Dongpyo; Poizner, Howard

    2013-10-01

    Virtual reality (VR) allows precise control and manipulation of rich, dynamic stimuli that, when coupled with on-line motion capture and neural monitoring, can provide a powerful means both of understanding brain-behavioral relations in the high-dimensional world and of assessing and treating a variety of neural disorders. Here we present a system that combines state-of-the-art, fully immersive, 3D, multi-modal VR with temporally aligned electroencephalographic (EEG) recordings. The VR system is dynamic and interactive across visual, auditory, and haptic interactions, providing sight, sound, touch, and force. Crucially, it does so with simultaneous EEG recordings while subjects actively move about a 20 × 20 ft space. The overall end-to-end latency between real movement and its simulated movement in the VR is approximately 40 ms. Spatial precision of the various devices is on the order of millimeters. The temporal alignment with the neural recordings is accurate to within approximately 1 ms. This powerful combination of systems opens up a new window into brain-behavioral relations and a new means of assessment and rehabilitation of individuals with motor and other disorders.
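
    The core alignment problem in such a system is resampling the motion-capture stream onto the EEG clock and compensating a known pipeline latency. A minimal sketch with assumed illustrative rates (512 Hz EEG, 120 Hz tracker; neither figure is from the paper):

```python
import numpy as np

# Assumed rates: EEG at 512 Hz, motion capture at 120 Hz, over one second
eeg_t = np.arange(0.0, 1.0, 1 / 512)         # EEG sample times (s)
mocap_t = np.arange(0.0, 1.0, 1 / 120)       # tracker sample times (s)
mocap_x = np.sin(2 * np.pi * mocap_t)        # toy hand-position trace

# Resample the slower motion stream onto the EEG clock so that every
# EEG sample has a temporally aligned position estimate.
mocap_on_eeg_clock = np.interp(eeg_t, mocap_t, mocap_x)

# Compensate a known fixed end-to-end latency (e.g. ~40 ms as reported)
# by shifting the query times before interpolating.
latency = 0.040
aligned = np.interp(eeg_t - latency, mocap_t, mocap_x)
```

    In practice a hardware trigger shared by both acquisition systems pins the two clocks together; the interpolation above only handles the differing sample rates.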

  13. Semi-Immersive Virtual Turbine Engine Simulation System

    NASA Astrophysics Data System (ADS)

    Abidi, Mustufa H.; Al-Ahmari, Abdulrahman M.; Ahmad, Ali; Darmoul, Saber; Ameen, Wadea

    2018-05-01

    The design and verification of assembly operations is essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress, and has reached a stage where current environments enable rich and multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. The benefits of building and using Virtual Reality (VR) models in assembly process verification are discussed in this paper. In this paper, we present the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system. The system enables stereoscopic visuals, surround sounds, and ample and intuitive interaction with developed models. A special software architecture is suggested to describe the assembly parts and assembly sequence in VR. A collision detection mechanism is employed that provides visual feedback to check the interference between components. The system is tested for virtual prototype and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, tactile, as well as force feedback. The system is shown to be effective and efficient for validating the design of assembly, part design, and operations planning.
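
    Interference checks of the kind the system uses for visual feedback are typically built on cheap axis-aligned bounding-box (AABB) overlap tests before any finer mesh-level collision query. A minimal sketch with toy part boxes (illustrative only, not the engine geometry from the paper):

```python
def aabb_overlap(a_min, a_max, b_min, b_max):
    """True if two axis-aligned bounding boxes intersect on every axis."""
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

# Toy bounding boxes for three hypothetical turbine parts (arbitrary units)
blade  = ((0.0, 0.0, 0.0), (2.0, 1.0, 1.0))
casing = ((1.5, 0.5, 0.5), (3.0, 2.0, 2.0))
shaft  = ((5.0, 5.0, 5.0), (6.0, 6.0, 6.0))

hit = aabb_overlap(*blade, *casing)   # interference: would be flagged visually
miss = aabb_overlap(*blade, *shaft)   # no interference
```

    Broad-phase AABB tests prune most part pairs each frame; only pairs that overlap proceed to exact triangle-level intersection, which keeps the check fast enough for interactive assembly verification.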

  14. Change Blindness Phenomena for Virtual Reality Display Systems.

    PubMed

    Steinicke, Frank; Bruder, Gerd; Hinrichs, Klaus; Willemsen, Pete

    2011-09-01

    In visual perception, change blindness describes the phenomenon that persons viewing a visual scene may apparently fail to detect significant changes in that scene. These phenomena have been observed in both computer-generated imagery and real-world scenes. Several studies have demonstrated that change blindness effects occur primarily during visual disruptions such as blinks or saccadic eye movements. However, until now the influence of stereoscopic vision on change blindness has not been studied thoroughly in the context of visual perception research. In this paper, we introduce change blindness techniques for stereoscopic virtual reality (VR) systems, providing the ability to substantially modify a virtual scene in a manner that is difficult for observers to perceive. We evaluate the techniques for semi-immersive VR systems (passive and active stereoscopic projection systems) as well as an immersive VR system (a head-mounted display), and compare the results to those of monoscopic viewing conditions. For stereoscopic viewing conditions, we found that change blindness phenomena occur with the same magnitude as in monoscopic viewing conditions. Furthermore, we have evaluated the potential of the presented techniques for allowing abrupt, and yet significant, changes of a stereoscopically displayed virtual reality environment.

  15. Data Visualization in Information Retrieval and Data Mining (SIG VIS).

    ERIC Educational Resources Information Center

    Efthimiadis, Efthimis

    2000-01-01

    Presents abstracts that discuss using data visualization for information retrieval and data mining, including immersive information space and spatial metaphors; spatial data using multi-dimensional matrices with maps; TREC (Text Retrieval Conference) experiments; users' information needs in cartographic information retrieval; and users' relevance…

  16. Feasibility of Using an Augmented Immersive Virtual Reality Learning Environment to Enhance Music Conducting Skills

    ERIC Educational Resources Information Center

    Orman, Evelyn K.; Price, Harry E.; Russell, Christine R.

    2017-01-01

    Acquiring nonverbal skills necessary to appropriately communicate and educate members of performing ensembles is essential for wind band conductors. Virtual reality learning environments (VRLEs) provide a unique setting for developing these proficiencies. For this feasibility study, we used an augmented immersive VRLE to enhance eye contact, torso…

  17. Measuring Flow Experience in an Immersive Virtual Environment for Collaborative Learning

    ERIC Educational Resources Information Center

    van Schaik, P.; Martin, S.; Vallance, M.

    2012-01-01

    In contexts other than immersive virtual environments, theoretical and empirical work has identified flow experience as a major factor in learning and human-computer interaction. Flow is defined as a "holistic sensation that people feel when they act with total involvement". We applied the concept of flow to modeling the experience of…

  18. Using Immersive Virtual Environments for Certification

    NASA Technical Reports Server (NTRS)

    Lutz, R.; Cruz-Neira, C.

    1998-01-01

    Immersive virtual environments (VEs) technology has matured to the point where it can be utilized as a scientific and engineering problem solving tool. In particular, VEs are starting to be used to design and evaluate safety-critical systems that involve human operators, such as flight and driving simulators, complex machinery training, and emergency rescue strategies.

  19. Streamlining Simulation Development using a Commercial Game Engine

    DTIC Science & Technology

    2009-10-01

    few years. The realism is stunning and the Commercial Game Industry fuels the fire of cutting edge advances in hardware and immersive experiences… Technology applies to Military training in more than just the obvious upgrades in game engines and hardware. The increased visual realism and performance… elaborate storytelling and cinematic effects provide a more immersive and compelling experience to the player. The underlying game engine technology

  20. Tackling the challenges of fully immersive head-mounted AR devices

    NASA Astrophysics Data System (ADS)

    Singer, Wolfgang; Hillenbrand, Matthias; Münz, Holger

    2017-11-01

    The optical requirements of fully immersive head-mounted AR devices are inherently determined by the human visual system. The etendue of the visual system is large; as a consequence, the requirements for fully immersive head-mounted AR devices exceed those of almost any high-end optical system. Two promising solutions for achieving the large etendue, and their challenges, are discussed. Head-mounted augmented reality devices have been developed for decades, mostly for application within aircraft and in combination with a heavy and bulky helmet. The established head-up displays for automotive applications typically utilize similar techniques. Recently, there is the vision of eyeglasses with included augmentation, offering a large field of view and being unobtrusively all-day wearable. There seems to be no simple solution that reaches the functional performance requirements: some known technical solution paths appear to be dead ends, while others offer promising perspectives, though with severe limitations. As an alternative, unobtrusively all-day wearable devices with a significantly smaller field of view are already possible.
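
    The etendue argument can be made concrete with the basic relation G = A · Ω (refractive index n = 1 in air): the optic must deliver light over the pupil area across the solid angle of the field of view. A back-of-envelope sketch with assumed numbers; note this counts the pupil alone, so a practical eyebox (which must cover eye rotation and fit tolerance) multiplies the area, and hence the etendue, far beyond what conventional eyepieces deliver:

```python
import math

# Illustrative assumptions: 4 mm pupil, ~100 deg full field of view
pupil_d_mm = 4.0
fov_half_deg = 50.0

# Pupil area (mm^2)
area_mm2 = math.pi * (pupil_d_mm / 2) ** 2

# Solid angle of a cone with the given half-angle (steradians)
omega_sr = 2 * math.pi * (1 - math.cos(math.radians(fov_half_deg)))

# Etendue G = A * Omega, in mm^2 * sr
etendue = area_mm2 * omega_sr
```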

  1. Time Series Data Visualization in World Wide Telescope

    NASA Astrophysics Data System (ADS)

    Fay, J.

    WorldWide Telescope provides a rich set of time-series visualizations for both archival and real-time data. WWT consists of both desktop tools for interactive, immersive visualization and HTML5 web-based controls that can be embedded in customized web pages. WWT supports a range of display options including full dome, power walls, stereo, and virtual reality headsets.

  2. The cranial nerve skywalk: A 3D tutorial of cranial nerves in a virtual platform.

    PubMed

    Richardson-Hatcher, April; Hazzard, Matthew; Ramirez-Yanez, German

    2014-01-01

    Visualization of the complex courses of the cranial nerves by students in the health-related professions is challenging through either diagrams in books or plastic models in the gross laboratory. Furthermore, dissection of the cranial nerves in the gross laboratory is an extremely meticulous task. Teaching and learning the cranial nerve pathways is difficult using two-dimensional (2D) illustrations alone. Three-dimensional (3D) models aid the teacher in describing intricate and complex anatomical structures and help students visualize them. The study of the cranial nerves can be supplemented with 3D, which permits the students to fully visualize their distribution within the craniofacial complex. This article describes the construction and usage of a virtual anatomy platform in Second Life™, which contains 3D models of the cranial nerves III, V, VII, and IX. The Cranial Nerve Skywalk features select cranial nerves and the associated autonomic pathways in an immersive online environment. This teaching supplement was introduced to groups of pre-healthcare professional students in gross anatomy courses at both institutions and student feedback is included. © 2014 American Association of Anatomists.

  3. Evaluation of the Texas Technology Immersion Pilot: First-Year Results

    ERIC Educational Resources Information Center

    Shapley, Kelly; Sheehan, Daniel; Sturges, Keith; Caranikas-Walker, Fanny; Huntsberger, Briana; Maloney, Catherine

    2006-01-01

    The Technology Immersion Pilot (TIP) sets forth a vision for technology immersion in Texas public schools. The Texas Education Agency (TEA) directed nearly $14 million in federal Title II, Part D monies toward funding a wireless learning environment for high-need middle schools through a competitive grant process. A concurrent research project…

  4. Immersive Simulations for Smart Classrooms: Exploring Evolutionary Concepts in Secondary Science

    ERIC Educational Resources Information Center

    Lui, Michelle; Slotta, James D.

    2014-01-01

    This article presents the design of an immersive simulation and inquiry activity for technology-enhanced classrooms. Using a co-design method, researchers worked with a high school biology teacher to create a rainforest simulation, distributed across several large displays in the room to immerse students in the environment. The authors created and…

  5. Use of Immersive Simulations to Enhance Graduate Student Learning: Implications for Educational Leadership Programs

    ERIC Educational Resources Information Center

    Voelkel, Robert H.; Johnson, Christie W.; Gilbert, Kristen A.

    2016-01-01

    The purpose of this article is to present how one university incorporates immersive simulations through platforms which employ avatars to enhance graduate student understanding and learning in educational leadership programs. While using simulations and immersive virtual environments continues to grow, the literature suggests limited evidence of…

  6. Real-time 3D visualization of volumetric video motion sensor data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, J.; Stansfield, S.; Shawver, D.

    1996-11-01

    This paper addresses the problem of improving detection, assessment, and response capabilities of security systems. Our approach combines two state-of-the-art technologies: volumetric video motion detection (VVMD) and virtual reality (VR). This work capitalizes on the ability of VVMD technology to provide three-dimensional (3D) information about the position, shape, and size of intruders within a protected volume. The 3D information is obtained by fusing motion detection data from multiple video sensors. The second component involves the application of VR technology to display information relating to the sensors and the sensor environment. VR technology enables an operator, or security guard, to be immersed in a 3D graphical representation of the remote site. VVMD data is transmitted from the remote site via ordinary telephone lines. There are several benefits to displaying VVMD information in this way. Because the VVMD system provides 3D information and because the sensor environment is a physical 3D space, it seems natural to display this information in 3D. Also, the 3D graphical representation depicts essential details within and around the protected volume in a natural way for human perception. Sensor information can also be more easily interpreted when the operator can 'move' through the virtual environment and explore the relationships between the sensor data, objects, and other visual cues present in the virtual environment. By exploiting the powerful ability of humans to understand and interpret 3D information, we expect to improve the means for visualizing and interpreting sensor information, allow a human operator to assess a potential threat more quickly and accurately, and enable a more effective response. This paper will detail both the VVMD and VR technologies and will discuss a prototype system based upon their integration.
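
    One simple way to fuse motion silhouettes from multiple cameras into 3D occupancy, in the spirit of the VVMD concept described above, is voxel intersection: a voxel is kept only if every view that sees it reports motion there. This is an illustrative sketch, not Sandia's actual fusion algorithm:

```python
import numpy as np

# Toy setup: each sensor contributes a boolean occupancy grid over the
# protected volume (here, motion silhouettes swept along each view axis).
grid = (8, 8, 8)
sensor_a = np.zeros(grid, bool)
sensor_b = np.zeros(grid, bool)
sensor_a[2:5, 2:5, :] = True   # silhouette from camera A, swept along z
sensor_b[3:6, :, 3:6] = True   # silhouette from camera B, swept along y

# Fuse by intersection: keep voxels consistent with both views, yielding
# a 3D blob from which position, shape, and size can be estimated.
fused = sensor_a & sensor_b
occupied = np.argwhere(fused)
centroid = occupied.mean(axis=0)   # intruder position estimate (voxel coords)
size = len(occupied)               # volume estimate (voxel count)
```

    With more cameras the intersection tightens around the true occupied volume; this is essentially the visual-hull idea applied to motion-detection silhouettes.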

  7. The effect of visual-vestibulosomatosensory conflict induced by virtual reality on postural stability in humans.

    PubMed

    Nishiike, Suetaka; Okazaki, Suzuyo; Watanabe, Hiroshi; Akizuki, Hironori; Imai, Takao; Uno, Atsuhiko; Kitahara, Tadashi; Horii, Arata; Takeda, Noriaki; Inohara, Hidenori

    2013-01-01

    In this study, we examined the effects of sensory inputs of visual-vestibulosomatosensory conflict induced by virtual reality (VR) on subjective dizziness, posture stability and visual dependency on postural control in humans. Eleven healthy young volunteers were immersed in two different VR conditions. In the control condition, subjects walked voluntarily with the background images of interactive computer graphics proportionally synchronized to their walking pace. In the visual-vestibulosomatosensory conflict condition, subjects kept still, but the background images that subjects experienced in the control condition were presented. The scores of both Graybiel's and Hamilton's criteria, postural instability and Romberg ratio were measured before and after the two conditions. After immersion in the conflict condition, both subjective dizziness and objective postural instability were significantly increased, and Romberg ratio, an index of the visual dependency on postural control, was slightly decreased. These findings suggest that sensory inputs of visual-vestibulosomatosensory conflict induced by VR induced motion sickness, resulting in subjective dizziness and postural instability. They also suggest that adaptation to the conflict condition decreases the contribution of visual inputs to postural control with re-weighing of vestibulosomatosensory inputs. VR may be used as a rehabilitation tool for dizzy patients by its ability to induce sensory re-weighing of postural control.
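
    The Romberg ratio used above compares postural sway with eyes closed to sway with eyes open, so a decrease indicates reduced visual dependency. A toy computation on center-of-pressure (COP) traces; the data points are illustrative, not from the study:

```python
import math

def sway_path_length(xy):
    """Total center-of-pressure path length over a posturography trial."""
    return sum(math.dist(p, q) for p, q in zip(xy, xy[1:]))

# Hypothetical COP traces (cm) sampled during quiet standing
eyes_open   = [(0, 0), (1, 0), (1, 1)]
eyes_closed = [(0, 0), (2, 0), (2, 2), (0, 2)]

# Romberg ratio: eyes-closed sway relative to eyes-open sway. A lower
# ratio after VR exposure would suggest re-weighting away from vision.
romberg = sway_path_length(eyes_closed) / sway_path_length(eyes_open)
```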

  8. Correcting Distance Estimates by Interacting With Immersive Virtual Environments: Effects of Task and Available Sensory Information

    ERIC Educational Resources Information Center

    Waller, David; Richardson, Adam R.

    2008-01-01

    The tendency to underestimate egocentric distances in immersive virtual environments (VEs) is not well understood. However, previous research (A. R. Richardson & D. Waller, 2007) has demonstrated that a brief period of interaction with the VE prior to making distance judgments can effectively eliminate subsequent underestimation. Here the authors…

  9. The Effects of Instructor-Avatar Immediacy in Second Life, an Immersive and Interactive Three-Dimensional Virtual Environment

    ERIC Educational Resources Information Center

    Lawless-Reljic, Sabine Karine

    2010-01-01

    Growing interest of educational institutions in desktop 3D graphic virtual environments for hybrid and distance education prompts questions on the efficacy of such tools. Virtual worlds, such as Second Life[R], enable computer-mediated immersion and interactions encompassing multimodal communication channels including audio, video, and text.…

  10. Effect of blueberry agro-industrial waste addition to corn starch-based films for the production of a pH-indicator film.

    PubMed

    Luchese, Cláudia Leites; Sperotto, Natalia; Spada, Jordana Corralo; Tessaro, Isabel Cristina

    2017-11-01

    Intelligent packaging is an emerging area of food technology that can provide better preservation and be of further convenience for consumers. It is recommended that biodegradable materials be used to develop low-impact designs for better packaging, which could benefit the environment by simply expanding their use to new areas. In this work, corn starch, glycerol and blueberry powder (with and without prior fruit bleaching) were used to produce films by casting. Blueberry powder, a co-product from juice processing, which is rich in anthocyanins, was added in the films to evaluate its potential as a colorimetric indicator, due to the ability of anthocyanin to change color when placed in an acidic or basic environment. After the films were immersed in different buffer solutions, visual color changes were observed, where the films became reddish at acidic pH and bluish at basic pH. The ΔE* values were greater than 3, suggesting a visually perceptible change to the human eye. The samples with fruit bleaching (CB) were visually darker (lower luminance values), while the samples without bleaching (SB) had a lighter color and higher brightness, represented by larger L* values. These results indicate the potential of blueberry powder as a pH indicator for intelligent food packaging or even for sensing food deterioration. Copyright © 2017 Elsevier B.V. All rights reserved.
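
    The ΔE* threshold mentioned above is the CIE76 color difference: the Euclidean distance between two readings in CIELAB space, with values above roughly 3 commonly taken as perceptible to the human eye. A small sketch with hypothetical film readings (the Lab triples are invented for illustration):

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two CIELAB triples (L*, a*, b*)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Hypothetical colorimeter readings of the indicator film
film_acidic = (45.0, 30.0, 5.0)    # reddish at acidic pH
film_basic  = (40.0, -2.0, -20.0)  # bluish at basic pH

de = delta_e_cie76(film_acidic, film_basic)
visible = de > 3.0   # > ~3 is the visual-perceptibility rule of thumb cited
```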

  11. MinOmics, an Integrative and Immersive Tool for Multi-Omics Analysis.

    PubMed

    Maes, Alexandre; Martinez, Xavier; Druart, Karen; Laurent, Benoist; Guégan, Sean; Marchand, Christophe H; Lemaire, Stéphane D; Baaden, Marc

    2018-06-21

    Proteomic and transcriptomic technologies have produced massive biological datasets whose interpretation requires sophisticated computational strategies. Efficient and intuitive real-time analysis remains challenging. We use proteomic data on 1417 proteins of the green microalga Chlamydomonas reinhardtii to investigate physicochemical parameters governing the selectivity of three cysteine-based redox post-translational modifications (PTMs): glutathionylation (SSG), nitrosylation (SNO), and disulphide bonds (SS) reduced by thioredoxins. We aim to understand the underlying molecular mechanisms and structural determinants through integration of redox proteome data from the gene to the structural level. Our interactive visual analytics approach on an 8.3 m² display wall of 25 MPixel resolution features stereoscopic three-dimensional (3D) representation performed by UnityMol WebGL. Virtual reality headsets complement the range of usage configurations for fully immersive tasks. Our experiments confirm that fast access to a rich cross-linked database is necessary for immersive analysis of structural data. We emphasize the possibility to display complex data structures and relationships in 3D, intrinsic to molecular structure visualization but less common for omics-network analysis. Our setup is powered by MinOmics, an integrated analysis pipeline and visualization framework dedicated to multi-omics analysis. MinOmics integrates data from various sources into a materialized physical repository. We evaluate its performance, a design criterion for the framework.

  12. Interactive modeling and simulation of peripheral nerve cords in virtual environments

    NASA Astrophysics Data System (ADS)

    Ullrich, Sebastian; Frommen, Thorsten; Eckert, Jan; Schütz, Astrid; Liao, Wei; Deserno, Thomas M.; Ntouba, Alexandre; Rossaint, Rolf; Prescher, Andreas; Kuhlen, Torsten

    2008-03-01

    This paper contributes to modeling, simulation and visualization of peripheral nerve cords. Until now, only sparse datasets of nerve cords can be found. In addition, this data has not yet been used in simulators, because it is only static. To build up a more flexible anatomical structure of peripheral nerve cords, we propose a hierarchical tree data structure where each node represents a nerve branch. The shape of the nerve segments itself is approximated by spline curves. Interactive modeling allows for the creation and editing of control points which are used for branching nerve sections, calculating spline curves and editing spline representations via cross sections. Furthermore, the control points can be attached to different anatomic structures. Through this approach, nerve cords deform in accordance to the movement of the connected structures, e.g., muscles or bones. As a result, we have developed an intuitive modeling system that runs on desktop computers and in immersive environments. It allows anatomical experts to create movable peripheral nerve cords for articulated virtual humanoids. Direct feedback of changes induced by movement or deformation is achieved by visualization in real-time. The techniques and the resulting data are already used for medical simulators.
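
    The hierarchical branch tree with spline-approximated segments can be sketched as follows. This is an illustrative data structure, not the authors' implementation; de Casteljau evaluation of a Bezier curve stands in for whatever spline scheme they use, and the control points are invented:

```python
from dataclasses import dataclass, field

@dataclass
class NerveBranch:
    """One node of the hierarchical nerve tree: a branch defined by spline
    control points, with child branches splitting off it."""
    name: str
    control_points: list                 # list of (x, y, z) tuples
    children: list = field(default_factory=list)

def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation."""
    pts = [list(p) for p in points]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return tuple(pts[0])

# A toy trunk with one child branch attached at the trunk's last control point;
# moving a shared control point (e.g. when a bone moves) deforms both curves.
trunk = NerveBranch("median", [(0, 0, 0), (1, 2, 0), (3, 2, 1), (4, 0, 1)])
trunk.children.append(NerveBranch("branch", [(3, 2, 1), (3, 3, 2)]))

midpoint = de_casteljau(trunk.control_points, 0.5)
```

    Attaching control points to anatomical structures then amounts to updating the shared tuples each frame and re-evaluating the curves, which is what makes the nerve cords deform with muscles and bones.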

  13. Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds

    PubMed Central

    Wright, W. Geoffrey

    2014-01-01

    Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether it be to speed recovery, reduce training time, or enhance immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually-depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not, remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed. PMID:24782724

  14. Evaluation of the Texas Technology Immersion Pilot: Findings from the Second Year

    ERIC Educational Resources Information Center

    Shapley, Kelly; Sheehan, Daniel; Maloney, Catherine; Caranikas-Walker, Fanny; Huntsberger, Briana; Sturges, Keith

    2007-01-01

    The Technology Immersion Pilot (TIP) sets forth a vision for technology immersion in Texas public schools. The Texas Education Agency (TEA) originally directed more than $14.5 million in federal Title II, Part D monies toward funding a wireless learning environment for high-need middle schools through a competitive grant process. A concurrent…

  15. Building University Capacity to Visualize Solutions to Complex Problems in the Arctic

    NASA Astrophysics Data System (ADS)

    Broderson, D.; Veazey, P.; Raymond, V. L.; Kowalski, K.; Prakash, A.; Signor, B.

    2016-12-01

    Rapidly changing environments are creating complex problems across the globe, which are particularly magnified in the Arctic. These worldwide challenges can best be addressed through diverse and interdisciplinary research teams. It is incumbent on such teams to promote co-production of knowledge and data-driven decision-making by identifying effective methods to communicate their findings and to engage with the public. Decision Theater North (DTN) is a new semi-immersive visualization system that provides a space for teams to collaborate and develop solutions to complex problems, relying on diverse sets of skills and knowledge. It provides a venue to synthesize the talents of scientists, who gather information (data); modelers, who create models of complex systems; artists, who develop visualizations; communicators, who connect and bridge populations; and policymakers, who can use the visualizations to develop sustainable solutions to pressing problems. The mission of Decision Theater North is to provide a cutting-edge visual environment to facilitate dialogue and decision-making by stakeholders including government, industry, communities and academia. We achieve this mission by adopting a multi-faceted approach reflected in the theater's design, technology, networking capabilities, user support, community relationship building, and strategic partnerships. DTN is a joint project of Alaska's National Science Foundation Experimental Program to Stimulate Competitive Research (NSF EPSCoR) and the University of Alaska Fairbanks (UAF), who have brought the facility up to full operational status and are now expanding its development space to support larger team science efforts. Based in Fairbanks, Alaska, DTN is uniquely poised to address changes taking place in the Arctic and subarctic, and is connected with a larger network of decision theaters that include the Arizona State University Decision Theater Network and the McCain Institute in Washington, DC.

  16. Children's Perception of Gap Affordances: Bicycling Across Traffic-Filled Intersections in an Immersive Virtual Environment

    ERIC Educational Resources Information Center

    Plumert, Jodie M.; Kearney, Joseph K.; Cremer, James F.

    2004-01-01

    This study examined gap choices and crossing behavior in children and adults using an immersive, interactive bicycling simulator. Ten- and 12-year-olds and adults rode a bicycle mounted on a stationary trainer through a virtual environment consisting of a street with 6 intersections. Participants faced continuous cross traffic traveling at 25mph…

  17. Making Web3D Less Scary: Toward Easy-to-Use Web3D e-Learning Content Development Tools for Educators

    ERIC Educational Resources Information Center

    de Byl, Penny

    2009-01-01

    Penny de Byl argues that one of the biggest challenges facing educators today is the integration of rich and immersive three-dimensional environments with existing teaching and learning materials. To empower educators with the ability to embrace emerging Web3D technologies, the Advanced Learning and Immersive Virtual Environment (ALIVE) research…

  18. Cognitive factors associated with immersion in virtual environments

    NASA Technical Reports Server (NTRS)

    Psotka, Joseph; Davison, Sharon

    1993-01-01

    Immersion into the dataspace provided by a computer, and the feeling of really being there or 'presence', are commonly acknowledged as the uniquely important features of virtual reality environments. How immersed one feels appears to be determined by a complex set of physical components and affordances of the environment, and as yet poorly understood psychological processes. Pimentel and Teixeira say that the experience of being immersed in a computer-generated world involves the same mental shift of 'suspending your disbelief for a period of time' as 'when you get wrapped up in a good novel or become absorbed in playing a computer game'. That sounds as if it could be right, but it would be good to get some evidence for these important conclusions. It might be even better to try to connect these statements with theoretical positions that try to do justice to complex cognitive processes. The basic precondition for understanding Virtual Reality (VR) is understanding the spatial representation systems that localize our bodies or egocenters in space. The effort to understand these cognitive processes is being driven with new energy by the pragmatic demands of successful virtual reality environments, but the literature is largely sparse and anecdotal.

  19. Experience with V-STORE: considerations on presence in virtual environments for effective neuropsychological rehabilitation of executive functions.

    PubMed

    Lo Priore, Corrado; Castelnuovo, Gianluca; Liccione, Diego; Liccione, Davide

    2003-06-01

    The paper discusses the use of immersive virtual reality (IVR) systems for the cognitive rehabilitation of dysexecutive syndrome, usually caused by prefrontal brain injuries. With respect to classical paper-and-pencil and flat-screen computer rehabilitative tools, IVR systems might prove capable of evoking a more intense and compelling sense of presence, thanks to the highly naturalistic subject-environment interaction they allow. Within a constructivist framework applied to holistic rehabilitation, we suggest that this difference might enhance the ecological validity of cognitive training, partly overcoming the implicit limits of a lab setting, which seem to affect non-immersive procedures especially when applied to dysexecutive symptoms. We tested presence in a pilot study of a new VR-based rehabilitation tool for executive functions, V-Store; it allows patients to explore a virtual environment where they solve six series of tasks, ordered by complexity and designed to stimulate executive functions, programming, categorical abstraction, short-term memory, and attention. We compared the sense of presence experienced by unskilled normal subjects, randomly assigned to immersive or non-immersive (flat screen) sessions of V-Store, through four different indexes: a self-report questionnaire, a psychophysiological measure (GSR, skin conductance), a neuropsychological measure (incidental recall memory test related to auditory information coming from the "real" environment), and a count of breaks in presence (BIPs). Preliminary results show a significantly higher GSR response during tasks in the immersive group; the neuropsychological data (fewer recalled elements from "reality") and fewer BIPs show a congruent but non-significant advantage for the immersive condition; no differences were evident from the self-report questionnaire. A larger experimental group is currently under examination to evaluate the significance of these data, which also might prove interesting with respect to the question of objective versus subjective measures of presence.

  20. Estimating the relative weights of visual and auditory tau versus heuristic-based cues for time-to-contact judgments in realistic, familiar scenes by older and younger adults.

    PubMed

    Keshavarz, Behrang; Campos, Jennifer L; DeLucia, Patricia R; Oberfeld, Daniel

    2017-04-01

    Estimating time to contact (TTC) involves multiple sensory systems, including vision and audition. Previous findings suggested that the ratio of an object's instantaneous optical size/sound intensity to its instantaneous rate of change in optical size/sound intensity (τ) drives TTC judgments. Other evidence has shown that heuristic-based cues are used, including final optical size or final sound pressure level. Most previous studies have used decontextualized and unfamiliar stimuli (e.g., geometric shapes on a blank background). Here we measured TTC estimates using a traffic scene with an approaching vehicle, in order to evaluate the weights of visual and auditory TTC cues under more realistic conditions. Younger (18-39 years) and older (65+ years) participants made TTC estimates in three sensory conditions: visual-only, auditory-only, and audio-visual. Stimuli were presented within an immersive virtual-reality environment, and cue weights were calculated for both visual cues (e.g., visual τ, final optical size) and auditory cues (e.g., auditory τ, final sound pressure level). The results demonstrated the use of visual τ as well as heuristic cues in the visual-only condition. TTC estimates in the auditory-only condition, however, were primarily based on an auditory heuristic cue (final sound pressure level), rather than on auditory τ. In the audio-visual condition, the visual cues dominated overall, with the highest weight being assigned to visual τ by younger adults, and a more equal weighting of visual τ and heuristic cues in older adults. Overall, better characterizing the effects of combined sensory inputs, stimulus characteristics, and age on the cues used to estimate TTC will provide important insights into how these factors may affect everyday behavior.

  1. Stage Cylindrical Immersive Display

    NASA Technical Reports Server (NTRS)

    Abramyan, Lucy; Norris, Jeffrey S.; Powell, Mark W.; Mittman, David S.; Shams, Khawaja S.

    2011-01-01

    Panoramic images with a wide field of view are intended to provide a better understanding of an environment by placing its objects on one seamless image. However, understanding the sizes and relative positions of the objects in a panorama is not intuitive and is prone to errors because the field of view is unnatural to human perception. Scientists are often faced with the difficult task of interpreting the sizes and relative positions of objects in an environment when viewing an image of the environment on computer monitors or prints. A panorama can display an object that appears to be to the right of the viewer when it is, in fact, behind the viewer. This misinterpretation can be very costly, especially when the environment is remote and/or only accessible by unmanned vehicles. A 270° cylindrical display has been developed that surrounds the viewer with carefully calibrated panoramic imagery that correctly engages their natural kinesthetic senses and provides a more accurate awareness of the environment. The cylindrical immersive display offers a more natural window to the environment than a standard cubic CAVE (Cave Automatic Virtual Environment), and the geometry allows multiple collocated users to simultaneously view data and share important decision-making tasks. A CAVE is an immersive virtual reality environment that allows one or more users to immerse themselves in a virtual environment. A common CAVE setup is a room-sized cube where the cube sides act as projection planes. By nature, all cubic CAVEs face edge-matching problems at the edges and corners of the display. Modern immersive displays have found ways to minimize seams by creating very tight edges, and rely on the user to ignore the seam. One significant deficiency of flat-walled CAVEs is that the sense of orientation and perspective within the scene is broken across adjacent walls.
On any single wall, parallel lines properly converge at their vanishing point as they should, and the sense of perspective within the scene contained on only one wall has integrity. Unfortunately, parallel lines that lie on adjacent walls do not necessarily remain parallel. This results in inaccuracies in the scene that can distract the viewer and detract from the immersive experience of the CAVE.

  2. Virtual Solar Energy Center: A Case Study of the Use of Advanced Visualization Techniques for the Comprehension of Complex Engineering Products and Processes

    NASA Astrophysics Data System (ADS)

    Ritter, Kenneth August, III

    Industry has a continuing need to train its workforce on recent engineering developments, but many engineering products and processes are hard to explain because of limitations of size, visibility, time scale, cost, and safety. The product or process might be difficult to see because it is either very large or very small, because it is enclosed within an opaque container, or because it happens very fast or very slowly. Some engineering products and processes are also costly or unsafe to use for training purposes, and sometimes the domain expert is not physically available at the training location. All these limitations can potentially be addressed using advanced visualization techniques such as virtual reality. This dissertation describes the development of an immersive virtual reality application using the Six Sigma DMADV process to explain the main equipment and processes used in a concentrating solar power plant. The Virtual Solar Energy Center (VEC) application was initially developed and tested in a Cave Automatic Virtual Environment (CAVE) during 2013 and 2014. The software programs used for development were SolidWorks, 3ds Max Design, and Unity 3D. Current hardware and software technologies that could complement this research were analyzed. The NVIDIA GRID Visual Computing Appliance (VCA) was chosen as the rendering solution for animating complex CAD models in this application. The MiddleVR software toolkit was selected for VR interactions and CAVE display. A non-immersive 3D version of the VEC application was tested and shown to be an effective training tool in late 2015. An immersive networked version of the VEC allows the user to receive live instruction from a trainer projected via depth camera imagery from a remote location. Four comparative analysis studies were performed. These studies used the average normalized gain from pre-test scores to determine the effectiveness of the various training methods.
With the DMADV approach, solutions were identified and verified during each iteration of development, which saved valuable time and improved each revision of the application; the final version received 88% positive responses and was as effective as the other training methods assessed.

  3. Cranial implant design using augmented reality immersive system.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that utilize haptics for sculpting precise fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, the fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.

  4. An Evaluation of the English Immersion Approach in the Teaching of Finance in China

    ERIC Educational Resources Information Center

    Zhou, Ruiqi

    2008-01-01

    The English immersion teaching approach adopted by SEIB of GDUFS in China was developed on the basis of the immersion theory which was originally developed in North America. Its purpose is to create a learning environment in which the students acquire knowledge in business with English as the main carrier. The adoption of this approach aims to…

  5. Mobile Immersion: An Experiment Using Mobile Instant Messenger to Support Second-Language Learning

    ERIC Educational Resources Information Center

    Lai, Arthur

    2016-01-01

    Immersion has been an acclaimed approach for second-language acquisition, but is not available to most students. The idea of this study was to create a mobile immersion environment on a smartphone using a mobile instant messenger, WhatsApp™. Forty-five Form-1 (7th grade) students divided into the Mobile Group and Control Group participated in a…

  6. VILLAGE--Virtual Immersive Language Learning and Gaming Environment: Immersion and Presence

    ERIC Educational Resources Information Center

    Wang, Yi Fei; Petrina, Stephen; Feng, Francis

    2017-01-01

    3D virtual worlds are promising for immersive learning in English as a Foreign Language (EFL). Unlike English as a Second Language (ESL), EFL typically takes place in the learners' home countries, and the potential of the language is limited by geography. Although learning contexts where English is spoken is important, in most EFL courses at the…

  7. The effect of degree of immersion upon learning performance in virtual reality simulations for medical education.

    PubMed

    Gutiérrez, Fátima; Pierce, Jennifer; Vergara, Víctor M; Coulter, Robert; Saland, Linda; Caudell, Thomas P; Goldsmith, Timothy E; Alverson, Dale C

    2007-01-01

    Simulations are being used in education and training to enhance understanding, improve performance, and assess competence. However, it is important to measure the performance of these simulations as learning and training tools. This study examined and compared knowledge acquisition using a knowledge structure design. The subjects were first-year medical students at The University of New Mexico School of Medicine. One group used a fully immersive virtual reality (VR) environment with a head-mounted display (HMD) and another group used a partially immersive (computer screen) VR environment. The study aims were to determine whether there were significant differences between the two groups as measured by changes in knowledge structure before and after the VR simulation experience. The results showed that both groups benefited from the VR simulation training, as measured by the significantly increased similarity to the expert knowledge network after the training experience. However, the immersive group showed a significantly higher gain than the partially immersive group. This study demonstrated a positive effect of VR simulation on learning as reflected by improvements in knowledge structure, with an enhanced effect of full immersion using an HMD versus a screen-based VR system.

  8. [The effects of narcissism and self-esteem on immersion in social network games and massively multiplayer online role-playing games].

    PubMed

    Jin, Kato; Igarashi, Tasuku

    2016-04-01

    Recent research has shown growing interest in the process by which narcissism triggers immersion in social network games (SNG). Highly narcissistic individuals are motivated not only by the achievement of goals and monopoly of materials (i.e., self-enhancement), but also by comparison and competition with others (i.e., social comparison). We predicted that the common rules and environments of SNG and massively multiplayer online role-playing games (MMORPG), such as systems of exchanging items and ranking players, facilitate immersion of highly narcissistic individuals during the game. Structural equation modeling of data from 378 SNG players and 150 MMORPG players recruited online showed that self-esteem inhibited game immersion, whereas narcissism increased game immersion via motivation for goal attainment. SNG players were more likely to be immersed in the game via motivation for goal attainment than MMORPG players. These findings suggest that, compared with MMORPG, the environments of SNG provide strong incentives not for those high in self-esteem who seek acceptance of others, but for those high in narcissism who are motivated by self-enhancement via competition with others.

  9. Effects of sensory cueing in virtual motor rehabilitation. A review.

    PubMed

    Palacios-Navarro, Guillermo; Albiol-Pérez, Sergio; García-Magariño García, Iván

    2016-04-01

    To critically identify studies that evaluate the effects of cueing in virtual motor rehabilitation in patients having different neurological disorders and to make recommendations for future studies. MEDLINE®, IEEE Xplore, Science Direct, the Cochrane Library, and Web of Science were searched through February 2015. We included studies that investigate the effects of cueing in virtual motor rehabilitation related to interventions for upper or lower extremities using auditory, visual, and tactile cues on motor performance in non-immersive, semi-immersive, or fully immersive virtual environments. These studies compared virtual cueing with an alternative or no intervention. Ten studies with a total of 153 patients were included in the review. All of them refer to the impact of cueing in virtual motor rehabilitation, regardless of the pathological condition. After selecting the articles, the following variables were extracted: year of publication, sample size, study design, type of cueing, intervention procedures, outcome measures, and main findings. The outcome evaluation was done at baseline and at the end of treatment in most of the studies. All studies except one showed improvements in some or all outcomes after the intervention, in some cases in favor of the virtual rehabilitation group compared to the control group. Virtual cueing seems to be a promising approach to improve motor learning, providing a channel for non-pharmacological therapeutic intervention in different neurological disorders. However, further studies using larger and more homogeneous groups of patients are required to confirm these findings.

  10. Learning about the scale of the solar system using digital planetarium visualizations

    NASA Astrophysics Data System (ADS)

    Yu, Ka Chun; Sahami, Kamran; Dove, James

    2017-07-01

    We studied the use of a digital planetarium for teaching relative distances and sizes in introductory undergraduate astronomy classes. Inspired in part by the classic short film The Powers of Ten and large physical scale models of the Solar System that can be explored on foot, we created lectures using virtual versions of these two pedagogical approaches for classes that saw either an immersive treatment in the planetarium or a non-immersive version in the regular classroom (with N = 973 students participating in total). Students who visited the planetarium not only had the greatest learning gains, but their performance also increased with time, whereas students who saw the same visuals projected onto a flat display in their classroom showed less retention over time. The gains seen in the students who visited the planetarium reveal that this medium is a powerful tool for visualizing scale over multiple orders of magnitude. However, the modest gains for the students in the regular classroom also show the utility of these visualization approaches for the broader category of classroom physics simulations.

  11. Designing the Self: The Transformation of the Relational Self-Concept through Social Encounters in a Virtual Immersive Environment

    ERIC Educational Resources Information Center

    Knutzen, K. Brant; Kennedy, David M.

    2012-01-01

    This article describes the findings of a 3-month study on how social encounters mediated by an online Virtual Immersive Environment (VIE) impacted on the relational self-concept of adolescents. The study gathered data from two groups of students as they took an Introduction to Design and Programming class. Students in group 1 undertook course…

  12. What's There, What If, What Then, and What Can We Do? An Immersive and Embodied Experience of Environment and Place through Children's Literature

    ERIC Educational Resources Information Center

    Burke, Geraldine; Cutter-Mackenzie, Amy

    2010-01-01

    We describe an immersive investigation of children's contemporary picture books, which examines concepts of environment and place. The authors' experience occurred through and alongside a community of learners, of preservice teachers and young children, in an urban coastal community, as part of an undergraduate, pre-service teacher education unit.…

  13. Exploring Design Requirements for Repurposing Dental Virtual Patients From the Web to Second Life: A Focus Group Study

    PubMed Central

    Antoniou, Panagiotis E; Athanasopoulou, Christina A; Dafli, Eleni

    2014-01-01

    Background Since their inception, virtual patients have provided health care educators with a way to engage learners in an experience simulating the clinician’s environment without danger to learners and patients. This has led this learning modality to be accepted as an essential component of medical education. With the advent of the visually and audio-rich 3-dimensional multi-user virtual environment (MUVE), a new deployment platform has emerged for educational content. Immersive, highly interactive, multimedia-rich, MUVEs that seamlessly foster collaboration provide a new hotbed for the deployment of medical education content. Objective This work aims to assess the suitability of the Second Life MUVE as a virtual patient deployment platform for undergraduate dental education, and to explore the requirements and specifications needed to meaningfully repurpose Web-based virtual patients in MUVEs. Methods Through the scripting capabilities and available art assets in Second Life, we repurposed an existing Web-based periodontology virtual patient into Second Life. Through a series of point-and-click interactions and multiple-choice queries, the user experienced a specific periodontology case and was asked to provide the optimal responses for each of the challenges of the case. A focus group of 9 undergraduate dentistry students experienced both the Web-based and the Second Life version of this virtual patient. The group convened 3 times and discussed relevant issues such as the group’s computer literacy, the assessment of Second Life as a virtual patient deployment platform, and compared the Web-based and MUVE-deployed virtual patients. Results A comparison between the Web-based and the Second Life virtual patient revealed the inherent advantages of the more experiential and immersive Second Life virtual environment. However, several challenges for the successful repurposing of virtual patients from the Web to the MUVE were identified. 
The identified challenges for repurposing of Web virtual patients to the MUVE platform from the focus group study were (1) increased case complexity to facilitate the user’s gaming preconception in a MUVE, (2) necessity to decrease textual narration and provide the pertinent information in a more immersive sensory way, and (3) requirement to allow the user to actuate the solutions of problems instead of describing them through narration. Conclusions For a successful systematic repurposing effort of virtual patients to MUVEs such as Second Life, the best practices of experiential and immersive game design should be organically incorporated in the repurposing workflow (automated or not). These findings are pivotal in an era in which open educational content is transferred to and shared among users, learners, and educators of various open repositories/environments. PMID:24927470

  14. Implementation of 3D Tools and Immersive Experience Interaction for Supporting Learning in a Library-Archive Environment. Visions and Challenges

    NASA Astrophysics Data System (ADS)

    Angeletaki, A.; Carrozzino, M.; Johansen, S.

    2013-07-01

    In this paper we present an experimental environment of 3D books combined with a game application that has been developed by a collaboration project between the Norwegian University of Science and Technology in Trondheim, Norway, the NTNU University Library, and the PERCRO laboratory of Sant'Anna University in Pisa, Italy. MUBIL is an international research project involving museums, libraries, and ICT academic partners, aiming to develop a consistent methodology enabling the use of Virtual Environments as a metaphor to present manuscript content through the paradigms of interaction and immersion, evaluating different possible alternatives. This paper presents the results of the application of two prototype books augmented with the use of XVR and IL technology. We explore immersive-reality design strategies in archive and library contexts for attracting new users. Our newly established MUBIL lab has invited school classes to test the books augmented with 3D models and other multimedia content in order to investigate whether immersion in such environments can create wider engagement and support learning. The metaphor of 3D books combined with game designs allows the digital books to be handled through a tactile experience that substitutes for physical browsing. In this paper we present some preliminary results about the enrichment of the user experience in such an environment.

  15. Listeners' expectation of room acoustical parameters based on visual cues

    NASA Astrophysics Data System (ADS)

    Valente, Daniel L.

    Despite many studies investigating auditory spatial impressions in rooms, few have addressed the impact of simultaneous visual cues on localization and the perception of spaciousness. The current research presents an immersive audio-visual study, in which participants are instructed to make spatial congruency and quantity judgments in dynamic cross-modal environments. The results of these psychophysical tests suggest the importance of consilient audio-visual presentation to the legibility of an auditory scene. Several studies have looked into audio-visual interaction in room perception in recent years, but these studies rely on static images, speech signals, or photographs alone to represent the visual scene. Building on these studies, the aim is to propose a testing method that uses monochromatic compositing (blue-screen technique) to position a studio recording of a musical performance in a number of virtual acoustical environments and ask subjects to assess these environments. In the first experiment of the study, video footage was taken from five rooms varying in physical size from a small studio to a small performance hall. Participants were asked to perceptually align two distinct acoustical parameters---early-to-late reverberant energy ratio and reverberation time---of two solo musical performances in five contrasting visual environments according to their expectations of how the room should sound given its visual appearance. In the second experiment in the study, video footage shot from four different listening positions within a general-purpose space was coupled with sounds derived from measured binaural impulse responses (IRs). The relationship between the presented image, sound, and virtual receiver position was examined. It was found that many visual cues caused different perceived events of the acoustic environment. 
This included the visual attributes of the space in which the performance was located as well as the visual attributes of the performer. The addressed visual makeup of the performer included: (1) an actual video of the performance, (2) a surrogate image of the performance, for example, the image of a loudspeaker reproducing the performance, (3) no visual image of the performance (empty room), or (4) a multi-source visual stimulus (actual video of the performance coupled with two images of loudspeakers positioned to the left and right of the performer). For this experiment, perceived auditory events of sound were measured in terms of two subjective spatial metrics: Listener Envelopment (LEV) and Apparent Source Width (ASW). These metrics were hypothesized to be dependent on the visual imagery of the presented performance. Data were also collected by having participants match direct and reverberant sound levels for the presented audio-visual scenes. In the final experiment, participants judged spatial expectations of an ensemble of musicians presented in the five physical spaces from Experiment 1. Supporting data were accumulated in two stages. First, participants were given an audio-visual matching test, in which they were instructed to align the auditory width of a performing ensemble to a varying set of audio and visual cues. In the second stage, a conjoint analysis design paradigm was explored to extrapolate the relative magnitude of the explored audio-visual factors in affecting three assessed response criteria: Congruency (the perceived match-up of the auditory and visual cues in the assessed performance), ASW, and LEV. Results show that both auditory and visual factors affect the collected responses, and that the two sensory modalities coincide in distinct interactions.
This study reveals participant resiliency in the presence of forced auditory-visual mismatch: Participants are able to adjust the acoustic component of the cross-modal environment in a statistically similar way despite randomized starting values for the monitored parameters. Subjective results of the experiments are presented along with objective measurements for verification.

  16. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high-resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience, and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields, from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools range widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects.
The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) that co-locate the user's hands with hand/forearm representations in the virtual workspace. Force feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time, physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design, and simulation of human-system interaction in a desktop-sized work volume.

  17. Evaluation of the Texas Technology Immersion Pilot: An Analysis of the Baseline Conditions and First-Year Implementation of Technology Immersion in Middle Schools. Executive Summary

    ERIC Educational Resources Information Center

    Shapley, Kelly; Sheehan, Daniel; Sturges, Keith; Caranikas-Walker, Fanny; Huntsberger, Briana; Maloney, Catherine

    2006-01-01

    The Texas Education Agency (TEA) used Title II, Part D monies to fund a wireless learning environment for high-need middle schools through the Technology Immersion Pilot (TIP). A concurrent research project funded by a federal Evaluating State Education Technology Programs grant is scientifically evaluating whether student achievement improves…

  19. Immersion Classes in an English Setting: One Way for les Anglais to Learn French. Working Papers on Bilingualism, No. 2.

    ERIC Educational Resources Information Center

    Barik, Henri; And Others

    The results of the evaluation of the French immersion program at a school in a unilingual English environment are described. A battery of tests was administered to a random sample of children from the kindergarten and grade one experimental French immersion classes and to a comparison group composed of children following the regular English…

  20. The Immersive Virtual Reality Experience: A Typology of Users Revealed Through Multiple Correspondence Analysis Combined with Cluster Analysis Technique.

    PubMed

    Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz

    2016-03-01

    Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, and despite users getting actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, there are individual factors, such as age, video game knowledge, and the predisposition to immersion, that may be associated with the quality of virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the right user to be involved in a VRE through head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is more associated with higher levels of virtual reality effectiveness, that is, higher predisposition to be immersed and reduced cybersickness symptoms in the VRE than console gamer and nongamer. These findings can be a useful orientation in clinical practice and future research as they help identify which users are more predisposed to benefit from immersive VREs.

  1. LibIsopach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas

    2016-12-06

    LibIsopach is a toolkit for high performance distributed immersive visualization, leveraging modern OpenGL. It features a multi-process scenegraph, explicit instance rendering, mesh generation, and three-dimensional user interaction event processing.
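
    For readers unfamiliar with the term, instanced rendering draws one mesh many times from a single buffer of per-instance transforms (e.g. via glDrawArraysInstanced in modern OpenGL). The stdlib-only Python sketch below builds such a buffer of column-major 4x4 translation matrices; it is a hedged illustration of the general technique, not LibIsopach's actual code.

```python
def instance_matrices(positions):
    """Build one column-major 4x4 model matrix per instance.

    A buffer like this is uploaded once and consumed by an instanced
    draw call; the flat-list layout is an assumption for illustration.
    """
    mats = []
    for (x, y, z) in positions:
        mats.append([
            1.0, 0.0, 0.0, 0.0,   # column 0
            0.0, 1.0, 0.0, 0.0,   # column 1
            0.0, 0.0, 1.0, 0.0,   # column 2
            x,   y,   z,   1.0,   # column 3: translation
        ])
    return mats

# 100 instances laid out on a 10x10 grid, drawn with a single mesh:
grid = [(i * 2.0, 0.0, j * 2.0) for i in range(10) for j in range(10)]
print(len(instance_matrices(grid)))  # 100
```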

  2. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  3. Cooling hyperthermic firefighters by immersing forearms and hands in 10 degrees C and 20 degrees C water.

    PubMed

    Giesbrecht, Gordon G; Jamieson, Christopher; Cahill, Farrell

    2007-06-01

    Firefighters experience significant heat stress while working with heavy gear in a hot, humid environment. This study compared the cooling effectiveness of immersing the forearms and hands in 10 and 20 degrees C water. Six men (33 +/- 10 yr; 180 +/- 4 cm; 78 +/- 9 kg; 19 +/- 5% body fat) wore firefighter 'turn-out gear' (heavy clothing and breathing apparatus weighing 27 kg) in a protocol including three 20-min exercise bouts (step test, 78 W, 40 degrees C air, 40% RH) each followed by a 20-min rest/cooling (21 degrees C air); i.e., 60 min of exercise, 60 min of cooling. Turn-out gear was removed during rest/cooling periods and subjects either rested (Control), immersed their hands in 10 or 20 degrees C water (H-10, H-20), or immersed their hands and forearms in 10 or 20 degrees C water (HF-10, HF-20). In 20 degrees C water, hand immersion did not reduce core temperature compared with Control; however, including forearm immersion decreased core temperature below Control values after both the second and final exercise periods (p < 0.001). In 10 degrees C water, adding forearm with hand immersion produced a lower core temperature (0.8 degrees C above baseline) than all other conditions (1.1 to 1.4 degrees C above baseline) after the final exercise period (p < 0.001). Sweat loss during Control (1458 g) was greater than all active cooling protocols (1146 g) (p < 0.001), which were not different from each other. Hand and forearm immersion in cool water is simple, reduces heat strain, and may increase work performance in a hot, humid environment. With 20 degrees C water, forearms should be immersed with the hands to be effective. At lower water temperatures, forearm and/or hand immersion will be effective, although forearm immersion will decrease core temperature further.

  4. Narrow band imaging combined with water immersion technique in the diagnosis of celiac disease.

    PubMed

    Valitutti, Francesco; Oliva, Salvatore; Iorfida, Donatella; Aloi, Marina; Gatti, Silvia; Trovato, Chiara Maria; Montuori, Monica; Tiberti, Antonio; Cucchiara, Salvatore; Di Nardo, Giovanni

    2014-12-01

    The "multiple-biopsy" approach both in duodenum and bulb is the best strategy to confirm the diagnosis of celiac disease; however, this increases the invasiveness of the procedure itself and is time-consuming. To evaluate the diagnostic yield of a single biopsy guided by narrow-band imaging combined with water immersion technique in paediatric patients. Prospective assessment of the diagnostic accuracy of narrow-band imaging/water immersion technique-driven biopsy approach versus standard protocol in suspected celiac disease. The experimental approach correctly diagnosed 35/40 children with celiac disease, with an overall diagnostic sensitivity of 87.5% (95% CI: 77.3-97.7). An altered pattern of narrow-band imaging/water immersion technique endoscopic visualization was significantly associated with villous atrophy at guided biopsy (Spearman Rho 0.637, p<0.001). Concordance of narrow-band imaging/water immersion technique endoscopic assessments was high between two operators (K: 0.884). The experimental protocol was highly timesaving compared to the standard protocol. An altered narrow-band imaging/water immersion technique pattern coupled with high anti-transglutaminase antibodies could allow a single guided biopsy to diagnose celiac disease. When no altered mucosal pattern is visible even by narrow-band imaging/water immersion technique, multiple bulbar and duodenal biopsies should be obtained. Copyright © 2014. Published by Elsevier Ltd.
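
    The reported sensitivity and confidence interval can be reproduced from the counts given in the abstract: 35 of 40 children correctly diagnosed. The abstract does not state which interval method was used, but a normal-approximation (Wald) interval matches the reported 77.3-97.7% and can be checked in a few lines of Python:

```python
import math

# 35 of 40 celiac children correctly diagnosed by the experimental approach
tp, n = 35, 40
p = tp / n  # sensitivity = 0.875

# Wald 95% CI: p +/- 1.96 * sqrt(p(1-p)/n)
half_width = 1.96 * math.sqrt(p * (1 - p) / n)
lo, hi = p - half_width, p + half_width
print(f"sensitivity {p:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
# sensitivity 87.5%, 95% CI 77.3%-97.7%
```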

  5. The utilization of infrared imaging for occupational disease study in industrial work.

    PubMed

    Brioschi, Marcos Leal; Okimoto, Maria Lúcia Leite Ribeiro; Vargas, José Viriato Coelho

    2012-01-01

    Infrared imaging has been used to visualize superficial temperatures in industrial employees standing and working in an indoor environment at 22°C. Temperature distributions and changes have been recorded digitally and analyzed. Mean skin temperatures determined by this method have been compared with superficial temperatures obtained with a probe thermocouple. During working hours, surface temperatures were higher over extensor muscles than over other structures and their spatial distributions differed dramatically from those observed before working hours. The authors also analyzed the cold water immersion of the hands during work. This paper showed that working generates different thermal effects on human skin that reflect physiological and pathological occupational conditions and can be monitored by infrared imaging.

  6. Research and Construction Lunar Stereoscopic Visualization System Based on Chang'E Data

    NASA Astrophysics Data System (ADS)

    Gao, Xingye; Zeng, Xingguo; Zhang, Guihua; Zuo, Wei; Li, ChunLai

    2017-04-01

    With the lunar exploration activities carried out by the Chang'E-1, Chang'E-2 and Chang'E-3 lunar probes, a large amount of lunar data has been obtained, including topographical and image data covering the whole moon, as well as panoramic image data of the spot close to the landing point of Chang'E-3. In this paper, we constructed an immersive virtual moon system based on the acquired lunar exploration data by using advanced stereoscopic visualization technology, which will help scholars to carry out research on lunar topography, assist the further exploration of lunar science, and facilitate lunar science outreach to the public. We focus on building the lunar stereoscopic visualization system with a combination of software and hardware, using binocular stereoscopic display technology, a real-time rendering algorithm for massive terrain data, and virtual scene building based on panoramas, to achieve an immersive virtual tour of the whole moon and of the local moonscape at the Chang'E-3 landing point.

  7. Scientific & Intelligence Exascale Visualization Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Money, James H.

    SIEVAS provides an immersive visualization framework for connecting multiple systems in real time for data science. SIEVAS provides the ability to connect multiple COTS and GOTS products in a seamless fashion for data fusion, data analysis, and viewing. It provides this capability by using a combination of microservices, real-time messaging, and a web-service-compliant back-end system.
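
    The topic-based fan-out at the heart of such real-time messaging can be sketched with a toy in-process bus. This stdlib-only Python sketch illustrates the pattern only; it is not SIEVAS's actual middleware or API, and the class and topic names are invented for the example.

```python
from collections import defaultdict

class MessageBus:
    """Toy in-process publish/subscribe bus (illustrative only)."""

    def __init__(self):
        # topic name -> list of handler callables
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, payload):
        # Fan the payload out to every handler on this topic
        for handler in self._subs[topic]:
            handler(payload)

bus = MessageBus()
received = []
bus.subscribe("tracker/pose", received.append)
bus.publish("tracker/pose", {"x": 0.1, "y": 1.7, "z": 0.4})
print(received)  # [{'x': 0.1, 'y': 1.7, 'z': 0.4}]
```

    Production middleware adds delivery over the network, persistence, and back-pressure, but the subscribe/publish contract that lets heterogeneous tools exchange entity data is the same shape.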

  8. Accessible virtual reality therapy using portable media devices.

    PubMed

    Bruck, Susan; Watters, Paul A

    2010-01-01

    Simulated immersive environments displayed on large screens are a valuable therapeutic asset in the treatment of a range of psychological disorders. Permanent environments are expensive to build and maintain, require specialized clinician training and technical support, and often have limited accessibility for clients. Ideally, virtual reality exposure therapy (VRET) could be accessible to the broader community if we could use inexpensive hardware with specifically designed software. This study tested whether watching a handheld non-immersive media device causes nausea and other cybersickness responses. Using a repeated measures design we found that nausea, general discomfort, eyestrain, blurred vision and salivation significantly increased in response to handheld non-immersive media device exposure.

  9. Presence Relates to Distinct Outcomes in Two Virtual Environments Employing Different Learning Modalities

    PubMed Central

    Persky, Susan; Kaphingst, Kimberly A.; McCall, Cade; Lachance, Christina; Beall, Andrew C.; Blascovich, Jim

    2009-01-01

    Presence in virtual learning environments (VLEs) has been associated with a number of outcome factors related to a user’s ability and motivation to learn. The extant but relatively small body of research suggests that a high level of presence is related to better performance on learning outcomes in VLEs. Different configurations of form and content variables such as those associated with active (self-driven, interactive activities) versus didactic (reading or lecture) learning may, however, influence how presence operates and on what content it operates. We compared the influence of presence between two types of immersive VLEs (i.e., active versus didactic techniques) on comprehension and engagement-related outcomes. The findings revealed that the active VLE promoted greater presence. Although we found no relationship between presence and learning comprehension outcomes for either virtual environment, presence was related to information engagement variables in the didactic immersive VLE but not the active environment. Results demonstrate that presence is not uniformly elicited or effective across immersive VLEs. Educational delivery mode and environment complexity may influence the impact of presence on engagement. PMID:19366319

  10. Presence relates to distinct outcomes in two virtual environments employing different learning modalities.

    PubMed

    Persky, Susan; Kaphingst, Kimberly A; McCall, Cade; Lachance, Christina; Beall, Andrew C; Blascovich, Jim

    2009-06-01

    Presence in virtual learning environments (VLEs) has been associated with a number of outcome factors related to a user's ability and motivation to learn. The extant but relatively small body of research suggests that a high level of presence is related to better performance on learning outcomes in VLEs. Different configurations of form and content variables such as those associated with active (self-driven, interactive activities) versus didactic (reading or lecture) learning may, however, influence how presence operates and on what content it operates. We compared the influence of presence between two types of immersive VLEs (i.e., active versus didactic techniques) on comprehension and engagement-related outcomes. The findings revealed that the active VLE promoted greater presence. Although we found no relationship between presence and learning comprehension outcomes for either virtual environment, presence was related to information engagement variables in the didactic immersive VLE but not the active environment. Results demonstrate that presence is not uniformly elicited or effective across immersive VLEs. Educational delivery mode and environment complexity may influence the impact of presence on engagement.

  11. Cue-exposure software for the treatment of bulimia nervosa and binge eating disorder.

    PubMed

    Gutiérrez-Maldonado, José; Pla-Sanjuanelo, Joana; Ferrer-García, Marta

    2016-11-01

    Cue-exposure therapy (CET) has proven its efficacy in treating patients with bulimia nervosa and binge eating disorder who are resistant to standard treatment. Furthermore, incorporating virtual reality (VR) technology is increasingly considered a valid exposure method that may help to increase the efficacy of standard treatments in a variety of eating disorders. Although immersive displays improve the beneficial effects, expensive technology is not always necessary. We aimed to assess whether exposure to food related virtual environments could decrease food craving in a non-clinical sample. In addition, we specifically compared the effects of two VR systems (one non-immersive and one immersive) during CET. We therefore applied a one-session CET to 113 undergraduate students. Decreased food craving was found during exposure to both VR environments compared with pre-treatment levels, supporting the efficacy of VR-CET in reducing food craving. We found no significant differences in craving between immersive and non-immersive systems. Low-cost non-immersive systems applied through 3D laptops can improve the accessibility of this technique. By reducing the costs and improving the usability, VR-CET on 3D laptops may become a viable option that can be readily applied in a greater range of clinical contexts.

  12. Orientation Preferences and Motion Sickness Induced in a Virtual Reality Environment.

    PubMed

    Chen, Wei; Chao, Jian-Gang; Zhang, Yan; Wang, Jin-Kun; Chen, Xue-Wen; Tan, Cheng

    2017-10-01

    Astronauts' orientation preferences tend to correlate with their susceptibility to space motion sickness (SMS). Orientation preferences appear universally, since variable sensory cue priorities are used between individuals. However, SMS susceptibility changes after proper training, while orientation preferences seem to be intrinsic proclivities. The present study was conducted to investigate whether orientation preferences change if susceptibility is reduced after repeated exposure to a virtual reality (VR) stimulus environment that induces SMS. A horizontal supine posture was chosen to create a sensory context similar to weightlessness, and two VR devices were used to produce a highly immersive virtual scene. Subjects were randomly allocated to an experimental group (trained through exposure to a provocative rotating virtual scene) and a control group (untrained). All subjects' orientation preferences were measured twice with the same interval, but the experimental group was trained three times during the interval, while the control group was not. Trained subjects were less susceptible to SMS, with symptom scores reduced by 40%. Compared with untrained subjects, trained subjects' orientation preferences were significantly different between pre- and posttraining assessments. Trained subjects depended less on visual cues, whereas few subjects demonstrated the opposite tendency. Results suggest that visual information may be inefficient and unreliable for body orientation and stabilization in a rotating visual scene, while reprioritizing preferences for different sensory cues was dynamic and asymmetric between individuals. The present findings should facilitate customization of efficient and proper training for astronauts with different sensory prioritization preferences and dynamic characteristics. Chen W, Chao J-G, Zhang Y, Wang J-K, Chen X-W, Tan C. Orientation preferences and motion sickness induced in a virtual reality environment. Aerosp Med Hum Perform. 2017; 88(10):903-910.

  13. Using virtual reality technology for aircraft visual inspection training: presence and comparison studies.

    PubMed

    Vora, Jeenal; Nair, Santosh; Gramopadhye, Anand K; Duchowski, Andrew T; Melloy, Brian J; Kanki, Barbara

    2002-11-01

    The aircraft maintenance industry is a complex system consisting of several interrelated human and machine components. Recognizing this, the Federal Aviation Administration (FAA) has pursued human factors related research. In the maintenance arena the research has focused on the aircraft inspection process and the aircraft inspector. Training has been identified as the primary intervention strategy to improve the quality and reliability of aircraft inspection. If training is to be successful, it is critical that we provide aircraft inspectors with appropriate training tools and environment. In response to this need, the paper outlines the development of a virtual reality (VR) system for aircraft inspection training. VR has generated much excitement but little formal proof that it is useful. However, since VR interfaces are difficult and expensive to build, the computer graphics community needs to be able to predict which applications will benefit from VR. To address this important issue, this research measured the degree of immersion and presence felt by subjects in a virtual environment simulator. Specifically, it conducted two controlled studies using the VR system developed for visual inspection task of an aft-cargo bay at the VR Lab of Clemson University. Beyond assembling the visual inspection virtual environment, a significant goal of this project was to explore subjective presence as it affects task performance. The results of this study indicated that the system scored high on the issues related to the degree of presence felt by the subjects. As a next logical step, this study, then, compared VR to an existing PC-based aircraft inspection simulator. The results showed that the VR system was better and preferred over the PC-based training tool.

  14. Transduction between worlds: using virtual and mixed reality for earth and planetary science

    NASA Astrophysics Data System (ADS)

    Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.

    2017-12-01

    Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.

  15. Development of Three Versions of a Wheelchair Ergometer for Curvilinear Manual Wheelchair Propulsion Using Virtual Reality.

    PubMed

    Salimi, Zohreh; Ferguson-Pell, Martin

    2018-06-01

    Although wheelchair ergometers provide a safe and controlled environment for studying or training wheelchair users, until recently they had a major disadvantage in only being capable of simulating straight-line wheelchair propulsion. Virtual reality has helped overcome this problem and broaden the usability of wheelchair ergometers. However, for a wheelchair ergometer to be validly used in research studies, it needs to be able to simulate the biomechanics of real world wheelchair propulsion. In this paper, three versions of a wheelchair simulator were developed. They provide a sophisticated wheelchair ergometer in an immersive virtual reality environment. They are intended for manual wheelchair propulsion and all are able to simulate simple translational inertia. In addition, each of the systems reported uses a different approach to simulate wheelchair rotation and accommodate rotational inertial effects. The first system does not provide extra resistance against rotation and relies on merely linear inertia, hypothesizing that it can provide acceptable replication of biomechanics of wheelchair maneuvers. The second and third systems, however, are designed to simulate rotational inertia. System II uses mechanical compensation, and System III uses visual compensation simulating the influence that rotational inertia has on the visual perception of wheelchair movement in response to rotation at different speeds.

  16. VRML metabolic network visualizer.

    PubMed

    Rojdestvenski, Igor

    2003-03-01

    A successful data collection visualization should satisfy a set of many requirements: unification of diverse data formats, support for serendipity research, support of hierarchical structures, algorithmizability, vast information density, Internet-readiness, and others. Recently, virtual reality has made significant progress in engineering, architectural design, entertainment and communication. We experiment with the possibility of using immersive abstract three-dimensional visualizations of metabolic networks. We present the trial Metabolic Network Visualizer software, which produces a graphical representation of a metabolic network as a VRML world from a formal description written in a simple SGML-type scripting language.
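
    To give a flavor of what such generation involves, the sketch below is a minimal Python function that emits a labelled VRML97 sphere node for one metabolite. It is an illustrative stand-in under an assumed node layout, not the Visualizer's own scripting language or output.

```python
def metabolite_node(name, x, y, z, radius=0.5):
    """Emit one metabolite as a VRML97 Transform/Shape/Sphere node.

    Node nesting follows the VRML97 grammar; the mapping from
    metabolite to sphere is an assumption made for this example.
    """
    return (
        f"Transform {{\n"
        f"  translation {x} {y} {z}\n"
        f"  children Shape {{\n"
        f"    geometry Sphere {{ radius {radius} }}\n"
        f"  }}\n"
        f"}} # {name}\n"
    )

# A VRML world is the mandatory header plus a sequence of nodes:
world = "#VRML V2.0 utf8\n" + metabolite_node("glucose", 0, 0, 0)
print("Sphere" in world)  # True
```

    A full network generator would emit one such node per metabolite and cylinder or line nodes for the reactions connecting them.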

  17. The "Total Immersion" Meeting Environment.

    ERIC Educational Resources Information Center

    Finkel, Coleman

    1980-01-01

    The designing of intelligently planned meeting facilities can aid management communication and learning. The author examines the psychology of meeting attendance; architectural considerations (lighting, windows, color, etc.); design elements and learning modes (furniture, walls, audiovisuals, materials); and the idea of "total immersion meeting…

  18. [Water immersion as an anti-g protection for pilot. Pro et contra].

    PubMed

    Barer, A S

    2007-01-01

    In the period of 1988-1990 the ZVEZDA Aerospace Medicine Department carried out comprehensive physiological investigations to assess the prospects for water immersion as an anti-g gear for pilots of high-maneuver aircraft. Both dry and open water immersion methods were used. More than 150 centrifuge runs were conducted to define limits for the acceleration value and time of 9-g tolerance. Volunteer subjects in the pilot's posture were inclined at 35 degrees and 55 degrees to the total inertial force vector. The obvious subjective discomfort felt during acceleration and the absence of clinical aftereffects were qualified as a positive outcome. The subjects were ready for repeated runs even after a very brief repose. The main impediment to the professional anti-g use of immersion is visual disorders, which in this case are not predictors of a coming loss of consciousness and are attributed to alterations in regional hemodynamics. The authors assert that there is good reason to continue the search for implementation of the immersion principle in g-protection of pilots, to reduce the rate of professional pathologies and to intensify flights.

  19. LVC interaction within a mixed-reality training system

    NASA Astrophysics Data System (ADS)

    Pollock, Brice; Winer, Eliot; Gilbert, Stephen; de la Cruz, Julio

    2012-03-01

    The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. Combining the advantages of realistic training environments and virtual worlds, mixed reality LVC training systems can enable live and virtual trainee interaction as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to empower LVC interaction in a reconfigurable, mixed reality environment. This system was developed and tested in an immersive, reconfigurable, and mixed reality LVC training system for the dismounted warfighter at ISU, known as the Veldt, to overcome LVC interaction challenges and as a test bed for cutting-edge technology to meet future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and developed game engines. Evaluation involving military trained personnel found this system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real-time. This system achieves rare LVC interaction within multiple physical and virtual immersive environments for training in real-time across many distributed systems.

  20. Modulation of cortical activity in 2D versus 3D virtual reality environments: an EEG study.

    PubMed

    Slobounov, Semyon M; Ray, William; Johnson, Brian; Slobounov, Elena; Newell, Karl M

    2015-03-01

    There is growing empirical evidence that virtual reality (VR) is valuable for education, training, entertainment and medical rehabilitation due to its capacity to represent real-life events and situations. However, the neural mechanisms underlying behavioral confounds in VR environments are still poorly understood. In two experiments, we examined the effect of fully immersive 3D stereoscopic presentations and less immersive 2D VR environments on brain functions and behavioral outcomes. In Experiment 1 we examined behavioral and neural underpinnings of spatial navigation tasks using electroencephalography (EEG). In Experiment 2, we examined EEG correlates of postural stability and balance. Our major findings showed that fully immersive 3D VR induced a higher subjective sense of presence along with an enhanced success rate of spatial navigation compared to 2D. In Experiment 1, power of frontal midline EEG theta (FM-theta) was significantly higher during the encoding phase of route presentation in the 3D VR. In Experiment 2, the 3D VR resulted in greater postural instability and modulation of EEG patterns as a function of 3D versus 2D environments. The findings support the inference that the fully immersive 3D enriched environment requires allocation of more brain and sensory resources for cognitive/motor control during both tasks than 2D presentations. This is further evidence that 3D VR tasks using EEG may be a promising approach for performance enhancement and potential applications in clinical/rehabilitation settings. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. User Directed Tools for Exploiting Expert Knowledge in an Immersive Segmentation and Visualization Environment

    NASA Technical Reports Server (NTRS)

    Senger, Steven O.

    1998-01-01

    Volumetric data sets have become common in medicine and many sciences through technologies such as computed x-ray tomography (CT), magnetic resonance (MR), positron emission tomography (PET), confocal microscopy and 3D ultrasound. When presented with 2D images, humans immediately and unconsciously begin a visual analysis of the scene. The viewer surveys the scene, identifying significant landmarks and building an internal mental model of the presented information. The identification of features is strongly influenced by the viewer's expectations based upon their expert knowledge of what the image should contain. While not a conscious activity, the viewer makes a series of choices about how to interpret the scene. These choices occur in parallel with viewing the scene and effectively change the way the viewer sees the image. It is this interaction of viewing and choice which is the basis of many familiar visual illusions. This is especially important in the interpretation of medical images, where it is the expert knowledge of the radiologist that interprets the image. For 3D data sets this interaction of viewing and choice is frustrated because choices must precede the visualization of the data set. It is not possible to visualize the data set without making some initial choices which determine how the volume of data is presented to the eye. These choices include viewpoint orientation, region identification, and color and opacity assignments. Further compounding the problem is the fact that these visualization choices are defined in terms of computer graphics as opposed to the language of the expert's knowledge. The long-term goal of this project is to develop an environment where the user can interact with volumetric data sets using tools which promote the utilization of expert knowledge by incorporating visualization and choice into a tight computational loop. The tools will support activities involving the segmentation of structures, construction of surface meshes and local filtering of the data set. To conform to this environment, tools should have several key attributes. First, they should rely only on computations over a local neighborhood of the probe position. Second, they should operate iteratively over time, converging towards a limit behavior. Third, they should adapt to user input, modifying their operational parameters over time.

  2. Habituation of the cold shock response may include a significant perceptual component.

    PubMed

    Barwood, Martin J; Corbett, Jo; Wagstaff, Christopher R D

    2014-02-01

Accidental immersion in cold water is a risk factor for many occupations. Habituation to cold-water immersion (CWI) is one practical means of reducing the cold shock response (CSR) on immersion. We investigated whether repeated thermoneutral water immersion (TWI) induced a perceptual habituation (i.e., could lessen perceived threat and anxiety) and consequently reduce the CSR on subsequent CWI. Twelve subjects completed seven 7-min head-out immersions. Immersions one and seven were CWIs [15.0 (0.1) degrees C], and immersions two to six were TWI [34.9 (0.10) degrees C]. Anxiety (20-cm visual analogue scale) and the cardiorespiratory responses [heart rate (f(C)), respiratory frequency (f(R)), tidal volume (V(T)), and minute ventilation (V(E))] to immersion were measured throughout. Data were compared within subject between conditions using ANOVA to an alpha level of 0.05. Acute anxiety was significantly reduced after repeated exposure to the immersion scenario (i.e., TWI): CWI-1: 6.3 (4.4) cm; CWI-2: 4.5 (4.0) cm [condition mean (SD)]. These differences did not influence the peak in the CSR. The f(C), f(R), and V(E) responses were similar between CWI-1 and CWI-2. The V(T) response was significantly lower in CWI-2; mean (SD) across the immersion: CWI-1 1.27 (0.17) vs. CWI-2 1.11 (0.21) L. Repeated TWI lessened the anxiety associated with CWI (perceptual habituation). This had a negligible effect on the primary components of the CSR, but did lower V(T), which may reduce the volume of any aspirated water in an emergency situation. Reducing the threat appraisal of an environmental stressor may be a useful byproduct of survival training, thereby minimizing psychophysiological strain.

  3. Why Does the Buddha Laugh? Exploring Ethnic Visual Culture

    ERIC Educational Resources Information Center

    Shin, Ryan

    2010-01-01

    As an art educator and a native Korean immersed in Asian culture until 30 years of age, and one who has gained some insights into the two cultures of East Asia and America, the author is constantly thinking of what students will learn from embracing Asian visuals and objects in art curriculum. He asks if their history, identity, form and function,…

  4. Computed microtomography visualization and quantification of mouse ischemic brain lesion by nonionic radio contrast agents

    PubMed Central

    Dobrivojević, Marina; Bohaček, Ivan; Erjavec, Igor; Gorup, Dunja; Gajović, Srećko

    2013-01-01

Aim To explore the possibility of brain imaging by microcomputed tomography (microCT) using x-ray contrasting methods to visualize mouse brain ischemic lesions after middle cerebral artery occlusion (MCAO). Methods Isolated brains were immersed in ionic or nonionic radio contrast agent (RCA) for 5 days and subsequently scanned using a microCT scanner. To verify whether ex-vivo microCT brain images can be used to characterize ischemic lesions, they were compared to Nissl stained serial histological sections of the same brains. To verify if brains immersed in RCA may be used afterwards for other methods, subsequent immunofluorescent labeling with anti-NeuN was performed. Results Nonionic RCA showed better gray to white matter contrast in the brain, and therefore was selected for further studies. MicroCT measurement of ischemic lesion size and cerebral edema significantly correlated with the values determined by Nissl staining (ischemic lesion size: P=0.0005; cerebral edema: P=0.0002). Brain immersion in nonionic RCA did not affect subsequent immunofluorescent analysis and NeuN immunoreactivity. Conclusion The microCT method was proven to be suitable for delineation of the ischemic lesion from the non-infarcted tissue, and quantification of lesion volume and cerebral edema. PMID:23444240

  5. Computed microtomography visualization and quantification of mouse ischemic brain lesion by nonionic radio contrast agents.

    PubMed

    Dobrivojević, Marina; Bohaček, Ivan; Erjavec, Igor; Gorup, Dunja; Gajović, Srećko

    2013-02-01

To explore the possibility of brain imaging by microcomputed tomography (microCT) using x-ray contrasting methods to visualize mouse brain ischemic lesions after middle cerebral artery occlusion (MCAO). Isolated brains were immersed in ionic or nonionic radio contrast agent (RCA) for 5 days and subsequently scanned using a microCT scanner. To verify whether ex-vivo microCT brain images can be used to characterize ischemic lesions, they were compared to Nissl stained serial histological sections of the same brains. To verify if brains immersed in RCA may be used afterwards for other methods, subsequent immunofluorescent labeling with anti-NeuN was performed. Nonionic RCA showed better gray to white matter contrast in the brain, and therefore was selected for further studies. MicroCT measurement of ischemic lesion size and cerebral edema significantly correlated with the values determined by Nissl staining (ischemic lesion size: P=0.0005; cerebral edema: P=0.0002). Brain immersion in nonionic RCA did not affect subsequent immunofluorescent analysis and NeuN immunoreactivity. The microCT method was proven to be suitable for delineation of the ischemic lesion from the non-infarcted tissue, and quantification of lesion volume and cerebral edema.

  6. Real-time 3D video compression for tele-immersive environments

    NASA Astrophysics Data System (ADS)

    Yang, Zhenyu; Cui, Yi; Anwar, Zahid; Bocchino, Robert; Kiyanclar, Nadir; Nahrstedt, Klara; Campbell, Roy H.; Yurcik, William

    2006-01-01

Tele-immersive systems can improve productivity and aid communication by allowing distributed parties to exchange information via a shared immersive experience. The TEEVE research project at the University of Illinois at Urbana-Champaign and the University of California at Berkeley seeks to foster the development and use of tele-immersive environments by a holistic integration of existing components that capture, transmit, and render three-dimensional (3D) scenes in real time to convey a sense of immersive space. However, the transmission of 3D video poses significant challenges. First, it is bandwidth-intensive, as it requires the transmission of multiple large-volume 3D video streams. Second, existing schemes for 2D color video compression such as MPEG, JPEG, and H.263 cannot be applied directly because the 3D video data contains depth as well as color information. Our goal is to explore a different angle of the 3D compression design space, with factors including complexity, compression ratio, quality, and real-time performance. To investigate these trade-offs, we present and evaluate two simple 3D compression schemes. For the first scheme, we use color reduction to compress the color information, which we then compress along with the depth information using zlib. For the second scheme, we use motion JPEG to compress the color information and run-length encoding followed by Huffman coding to compress the depth information. We apply both schemes to 3D videos captured from a real tele-immersive environment. Our experimental results show that: (1) the compressed data preserves enough information to communicate the 3D images effectively (min. PSNR > 40) and (2) even without inter-frame motion estimation, very high compression ratios (avg. > 15) are achievable at speeds sufficient to allow real-time communication (avg. ~ 13 ms per 3D video frame).
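The first scheme the abstract describes (color reduction, then zlib over both color and depth) can be sketched in a few lines. The frame layout, bit depths, and function names below are illustrative assumptions, not the TEEVE implementation:

```python
import zlib

import numpy as np


def compress_frame(color, depth, bits_per_channel=4):
    """Scheme one, roughly: reduce color precision, then zlib color + depth.

    color: (H, W, 3) uint8 RGB image; depth: (H, W) uint16 depth map.
    """
    shift = 8 - bits_per_channel
    reduced = (color >> shift).astype(np.uint8)    # lossy color reduction
    payload = reduced.tobytes() + depth.tobytes()  # pack both planes
    return zlib.compress(payload, level=6)


def decompress_frame(blob, shape, bits_per_channel=4):
    """Invert compress_frame; the dropped color bits are lost (lossy step)."""
    h, w = shape
    raw = zlib.decompress(blob)
    color = np.frombuffer(raw[:h * w * 3], dtype=np.uint8).reshape(h, w, 3)
    depth = np.frombuffer(raw[h * w * 3:], dtype=np.uint16).reshape(h, w)
    return color, depth


# Deterministic toy frame standing in for one captured 3D video frame.
color = np.arange(48, dtype=np.uint8).reshape(4, 4, 3)
depth = np.arange(16, dtype=np.uint16).reshape(4, 4)
blob = compress_frame(color, depth)
color_out, depth_out = decompress_frame(blob, (4, 4))
```

Depth survives the round trip exactly, while the color plane comes back at reduced precision; the zlib stage itself is lossless, so all the quality loss is in the color-reduction step.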

  7. Development of an audio-based virtual gaming environment to assist with navigation skills in the blind.

    PubMed

    Connors, Erin C; Yazzolino, Lindsay A; Sánchez, Jaime; Merabet, Lotfi B

    2013-03-27

    Audio-based Environment Simulator (AbES) is virtual environment software designed to improve real world navigation skills in the blind. Using only audio based cues and set within the context of a video game metaphor, users gather relevant spatial information regarding a building's layout. This allows the user to develop an accurate spatial cognitive map of a large-scale three-dimensional space that can be manipulated for the purposes of a real indoor navigation task. After game play, participants are then assessed on their ability to navigate within the target physical building represented in the game. Preliminary results suggest that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building as indexed by their performance on a series of navigation tasks. These tasks included path finding through the virtual and physical building, as well as a series of drop off tasks. We find that the immersive and highly interactive nature of the AbES software appears to greatly engage the blind user to actively explore the virtual environment. Applications of this approach may extend to larger populations of visually impaired individuals.

  8. Science Education Using a Computer Model-Virtual Puget Sound

    NASA Astrophysics Data System (ADS)

    Fruland, R.; Winn, W.; Oppenheimer, P.; Stahr, F.; Sarason, C.

    2002-12-01

    We created an interactive learning environment based on an oceanographic computer model of Puget Sound-Virtual Puget Sound (VPS)-as an alternative to traditional teaching methods. Students immersed in this navigable 3-D virtual environment observed tidal movements and salinity changes, and performed tracer and buoyancy experiments. Scientific concepts were embedded in a goal-based scenario to locate a new sewage outfall in Puget Sound. Traditional science teaching methods focus on distilled representations of agreed-upon knowledge removed from real-world context and scientific debate. Our strategy leverages students' natural interest in their environment, provides meaningful context and engages students in scientific debate and knowledge creation. Results show that VPS provides a powerful learning environment, but highlights the need for research on how to most effectively represent concepts and organize interactions to support scientific inquiry and understanding. Research is also needed to ensure that new technologies and visualizations do not foster misconceptions, including the impression that the model represents reality rather than being a useful tool. In this presentation we review results from prior work with VPS and outline new work for a modeling partnership recently formed with funding from the National Ocean Partnership Program (NOPP).

  9. Walking in fully immersive virtual environments: an evaluation of potential adverse effects in older adults and individuals with Parkinson's disease.

    PubMed

    Kim, Aram; Darakjian, Nora; Finley, James M

    2017-02-21

    Virtual reality (VR) has recently been explored as a tool for neurorehabilitation to enable individuals with Parkinson's disease (PD) to practice challenging skills in a safe environment. Current technological advances have enabled the use of affordable, fully immersive head-mounted displays (HMDs) for potential therapeutic applications. However, while previous studies have used HMDs in individuals with PD, these were only used for short bouts of walking. Clinical applications of VR for gait training would likely involve an extended exposure to the virtual environment, which has the potential to cause individuals with PD to experience simulator-related adverse effects due to their age or pathology. Thus, our objective was to evaluate the safety of using an HMD for longer bouts of walking in fully immersive VR for older adults and individuals with PD. Thirty-three participants (11 healthy young, 11 healthy older adults, and 11 individuals with PD) were recruited for this study. Participants walked for 20 min while viewing a virtual city scene through an HMD (Oculus Rift DK2). Safety was evaluated using the mini-BESTest, measures of center of pressure (CoP) excursion, and questionnaires addressing symptoms of simulator sickness (SSQ) and measures of stress and arousal. Most participants successfully completed all trials without any discomfort. There were no significant changes for any of our groups in symptoms of simulator sickness or measures of static and dynamic balance after exposure to the virtual environment. Surprisingly, measures of stress decreased in all groups while the PD group also increased the level of arousal after exposure. Older adults and individuals with PD were able to successfully use immersive VR during walking without adverse effects. This provides systematic evidence supporting the safety of immersive VR for gait training in these populations.

  10. Effects of iron content in Ni-Cr-xFe alloys and immersion time on the oxide films formed in a simulated PWR water environment

    NASA Astrophysics Data System (ADS)

    Ru, Xiangkun; Lu, Zhanpeng; Chen, Junjie; Han, Guangdong; Zhang, Jinlong; Hu, Pengfei; Liang, Xue

    2017-12-01

    The iron content in Ni-Cr-xFe (x = 0-9 at.%) alloys strongly affected the properties of oxide films after 978 h of immersion in the simulated PWR primary water environment at 310 °C. Increasing the iron content in the alloys increased the amount of iron-bearing polyhedral spinel oxide particles in the outer oxide layer and increased the local oxidation penetrations into the alloy matrix from the chromium-rich inner oxide layer. The effects of iron content in the alloys on the oxide film properties after 500 h of immersion were less significant than those after 978 h. Iron content increased, and chromium content decreased, in the outer oxide layer with increasing iron content in the alloys. Increasing the immersion time facilitated the formation of the local oxidation penetrations along the matrix/film interface and the nickel-bearing spinel oxides in the outer oxide layer.

  11. Evaluating the Effects of Immersive Embodied Interaction on Cognition in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Parmar, Dhaval

Virtual reality is on the verge of becoming mainstream household technology, as technologies such as head-mounted displays, trackers, and interaction devices become affordable and easily available. Virtual reality (VR) has immense potential in enhancing the fields of education and training, and its power can be used to spark interest and enthusiasm among learners. It is, therefore, imperative to evaluate the risks and benefits that immersive virtual reality poses to the field of education. Research suggests that learning is an embodied process. Learning depends on grounded aspects of the body including action, perception, and interactions with the environment. This research aims to study whether immersive embodiment through the means of virtual reality facilitates embodied cognition. A pedagogical VR solution which takes advantage of embodied cognition can lead to enhanced learning benefits. Towards achieving this goal, this research presents a linear continuum for immersive embodied interaction within virtual reality. This research evaluates the effects of three levels of immersive embodied interaction on cognitive thinking, presence, usability, and satisfaction among users in the fields of science, technology, engineering, and mathematics (STEM) education. Results from the presented experiments show that immersive virtual reality is highly effective in knowledge acquisition and retention, and greatly enhances user satisfaction, interest, and enthusiasm. Users experience high levels of presence and are profoundly engaged in the learning activities within the immersive virtual environments. The studies presented in this research evaluate pedagogical VR software to train and motivate students in STEM education, and provide an empirical analysis comparing desktop VR (DVR), immersive VR (IVR), and immersive embodied VR (IEVR) conditions for learning. This research also proposes a fully immersive embodied interaction metaphor (IEIVR) for learning of computational concepts as a future direction, and presents the challenges faced in implementing the IEIVR metaphor due to extended periods of immersion. Results from the conducted studies help in formulating guidelines for virtual reality and education researchers working in STEM education and training, and for educators and curriculum developers seeking to improve student engagement in the STEM fields.

  12. A Full Body Steerable Wind Display for a Locomotion Interface.

    PubMed

    Kulkarni, Sandip D; Fisher, Charles J; Lefler, Price; Desai, Aditya; Chakravarthy, Shanthanu; Pardyjak, Eric R; Minor, Mark A; Hollerbach, John M

    2015-10-01

    This paper presents the Treadport Active Wind Tunnel (TPAWT)-a full-body immersive virtual environment for the Treadport locomotion interface designed for generating wind on a user from any frontal direction at speeds up to 20 kph. The goal is to simulate the experience of realistic wind while walking in an outdoor virtual environment. A recirculating-type wind tunnel was created around the pre-existing Treadport installation by adding a large fan, ducting, and enclosure walls. Two sheets of air in a non-intrusive design flow along the side screens of the back-projection CAVE-like visual display, where they impinge and mix at the front screen to redirect towards the user in a full-body cross-section. By varying the flow conditions of the air sheets, the direction and speed of wind at the user are controlled. Design challenges to fit the wind tunnel in the pre-existing facility, and to manage turbulence to achieve stable and steerable flow, were overcome. The controller performance for wind speed and direction is demonstrated experimentally.

  13. Applied virtual reality at the Research Triangle Institute

    NASA Technical Reports Server (NTRS)

    Montoya, R. Jorge

    1994-01-01

Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric data bases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object data base is created using data captured by hand or electronically. The object's realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the data base using software tools which also support interactions with and immersivity in the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and the maintenance training simulator for the National Guard.

  14. Fish in the matrix: motor learning in a virtual world.

    PubMed

    Engert, Florian

    2012-01-01

One of the large remaining challenges in the field of zebrafish neuroscience is the establishment of techniques and preparations that permit the recording and perturbation of neural activity in animals that can interact meaningfully with the environment. Since it is very difficult to do this in freely behaving zebrafish, I describe here two alternative approaches that meet this goal via tethered preparations. The first uses head-fixation in agarose in combination with online imaging and analysis of tail motion. In the second method, paralyzed fish are suspended with suction pipettes in mid-water and nerve root recordings serve as indicators for intended locomotion. In both cases, fish can be immersed into a virtual environment and allowed to interact with this virtual world via real or fictive tail motions. The specific examples given in this review focus primarily on the role of visual feedback, but the general principles certainly extend to other modalities, including proprioception, hearing, balance, and somatosensation.

  15. Fish in the matrix: motor learning in a virtual world

    PubMed Central

    Engert, Florian

    2013-01-01

One of the large remaining challenges in the field of zebrafish neuroscience is the establishment of techniques and preparations that permit the recording and perturbation of neural activity in animals that can interact meaningfully with the environment. Since it is very difficult to do this in freely behaving zebrafish, I describe here two alternative approaches that meet this goal via tethered preparations. The first uses head-fixation in agarose in combination with online imaging and analysis of tail motion. In the second method, paralyzed fish are suspended with suction pipettes in mid-water and nerve root recordings serve as indicators for intended locomotion. In both cases, fish can be immersed into a virtual environment and allowed to interact with this virtual world via real or fictive tail motions. The specific examples given in this review focus primarily on the role of visual feedback, but the general principles certainly extend to other modalities, including proprioception, hearing, balance, and somatosensation. PMID:23355810

  16. A software system for evaluation and training of spatial reasoning and neuroanatomical knowledge in a virtual environment.

    PubMed

    Armstrong, Ryan; de Ribaupierre, Sandrine; Eagleson, Roy

    2014-04-01

    This paper describes the design and development of a software tool for the evaluation and training of surgical residents using an interactive, immersive, virtual environment. Our objective was to develop a tool to evaluate user spatial reasoning skills and knowledge in a neuroanatomical context, as well as to augment their performance through interactivity. In the visualization, manually segmented anatomical surface images of MRI scans of the brain were rendered using a stereo display to improve depth cues. A magnetically tracked wand was used as a 3D input device for localization tasks within the brain. The movement of the wand was made to correspond to movement of a spherical cursor within the rendered scene, providing a reference for localization. Users can be tested on their ability to localize structures within the 3D scene, and their ability to place anatomical features at the appropriate locations within the rendering. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Hummingbirds control hovering flight by stabilizing visual motion.

    PubMed

    Goller, Benjamin; Altshuler, Douglas L

    2014-12-23

    Relatively little is known about how sensory information is used for controlling flight in birds. A powerful method is to immerse an animal in a dynamic virtual reality environment to examine behavioral responses. Here, we investigated the role of vision during free-flight hovering in hummingbirds to determine how optic flow--image movement across the retina--is used to control body position. We filmed hummingbirds hovering in front of a projection screen with the prediction that projecting moving patterns would disrupt hovering stability but stationary patterns would allow the hummingbird to stabilize position. When hovering in the presence of moving gratings and spirals, hummingbirds lost positional stability and responded to the specific orientation of the moving visual stimulus. There was no loss of stability with stationary versions of the same stimulus patterns. When exposed to a single stimulus many times or to a weakened stimulus that combined a moving spiral with a stationary checkerboard, the response to looming motion declined. However, even minimal visual motion was sufficient to cause a loss of positional stability despite prominent stationary features. Collectively, these experiments demonstrate that hummingbirds control hovering position by stabilizing motions in their visual field. The high sensitivity and persistence of this disruptive response is surprising, given that the hummingbird brain is highly specialized for sensory processing and spatial mapping, providing other potential mechanisms for controlling position.

  18. Can Simulator Immersion Change Cognitive Style? Results from a Cross-Sectional Study of Field-Dependence--Independence in Air Traffic Control Students

    ERIC Educational Resources Information Center

    Van Eck, Richard N.; Fu, Hongxia; Drechsel, Paul V. J.

    2015-01-01

    Air traffic control (ATC) operations are critical to the U.S. aviation infrastructure, making ATC training a critical area of study. Because ATC performance is heavily dependent on visual processing, it is important to understand how to screen for or promote relevant visual processing abilities. While conventional wisdom has maintained that such…

  19. Neuromuscular function during knee extension exercise after cold water immersion.

    PubMed

    Wakabayashi, Hitoshi; Wijayanto, Titis; Tochihara, Yutaka

    2017-06-23

Human adaptability to cold environments has been a focus of physiological anthropology and related research areas. Concerning the human acclimatization process in the natural climate, it is necessary to conduct research assessing the comprehensive effect of cold environments and physical activity in the cold. This study investigated the effect of cold water immersion on exercise performance and neuromuscular function during maximal and submaximal isometric knee extension. Nine healthy males participated in this study. They performed maximal and submaximal (20, 40, and 60% maximal load) isometric knee extension pre- and post-immersion in 23, 26, and 34 °C water. The muscle activity of the rectus femoris (RF) and vastus lateralis (VL) was measured using surface electromyography (EMG). The percentage of maximum voluntary contraction (%MVC) and the mean power frequency (MPF) of the EMG data were analyzed. The post-immersion maximal force was significantly lower in 23 °C than in 26 and 34 °C conditions (P < 0.05). The post-immersion %MVC of RF was significantly higher than pre-immersion during 60% maximal exercise in 23 and 26 °C conditions (P < 0.05). In the VL, the post-immersion %MVC was significantly higher than pre-immersion in 23 and 26 °C conditions during 20% maximal exercise and in 26 °C at 40 and 60% maximal intensities (P < 0.05). The post-immersion %MVC of VL was significantly higher in 26 °C than in 34 °C at 20 and 60% maximal load (P < 0.05). The post-immersion MPF of RF during 20% maximal intensity was significantly lower in 23 °C than in 26 and 34 °C conditions (P < 0.05), and significantly different among the three water temperature conditions at 40 and 60% maximal intensities (P < 0.05). The post-immersion MPF of VL during the three submaximal trials was significantly lower in 23 and 26 °C than in 34 °C conditions (P < 0.05). The lower shift of EMG frequency would be connected with a decrease in nerve and muscle fiber conduction velocity. To compensate for the impairment of each muscle fiber's function, more muscle fibers might be recruited to maintain the working load. This might result in the greater EMG amplitude after cold immersion.
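The two EMG measures in this record follow standard definitions: %MVC is the RMS amplitude of a task contraction normalized to the maximal-contraction trial, and MPF is the power-weighted mean frequency of the signal's spectrum. A minimal sketch of those definitions (the function names and synthetic signals are my own, not the authors' analysis pipeline):

```python
import numpy as np


def rms(signal):
    """Root-mean-square amplitude of an EMG epoch."""
    return np.sqrt(np.mean(np.square(signal)))


def percent_mvc(task_emg, mvc_emg):
    """Task EMG amplitude normalized to the maximal voluntary contraction."""
    return 100.0 * rms(task_emg) / rms(mvc_emg)


def mean_power_frequency(signal, fs):
    """MPF: power-weighted mean of the spectrum's frequencies, in Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    power[0] = 0.0  # ignore the DC offset
    return np.sum(freqs * power) / np.sum(power)


# Synthetic check: a pure 50 Hz sine sampled at 1 kHz has MPF = 50 Hz,
# and a half-amplitude copy of it sits at 50 %MVC.
fs = 1000
t = np.arange(fs) / fs
mvc = np.sin(2 * np.pi * 50 * t)
task = 0.5 * mvc
mpf = mean_power_frequency(mvc, fs)
pct = percent_mvc(task, mvc)
```

A leftward shift of MPF after cooling, as reported above, is what slowed conduction velocity looks like in this measure, while a rise in %MVC at the same external load indicates extra fiber recruitment.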

  20. 3d visualization of atomistic simulations on every desktop

    NASA Astrophysics Data System (ADS)

    Peled, Dan; Silverman, Amihai; Adler, Joan

    2013-08-01

    Once upon a time, after making simulations, one had to go to a visualization center with fancy SGI machines to run a GL visualization and make a movie. More recently, OpenGL and its mesa clone have let us create 3D on simple desktops (or laptops), whether or not a Z-buffer card is present. Today, 3D a la Avatar is a commodity technique, presented in cinemas and sold for home TV. However, only a few special research centers have systems large enough for entire classes to view 3D, or special immersive facilities like visualization CAVEs or walls, and not everyone finds 3D immersion easy to view. For maximum physics with minimum effort a 3D system must come to each researcher and student. So how do we create 3D visualization cheaply on every desktop for atomistic simulations? After several months of attempts to select commodity equipment for a whole room system, we selected an approach that goes back a long time, even predating GL. The old concept of anaglyphic stereo relies on two images, slightly displaced, and viewed through colored glasses, or two squares of cellophane from a regular screen/projector or poster. We have added this capability to our AViz atomistic visualization code in its new, 6.1 version, which is RedHat, CentOS and Ubuntu compatible. Examples using data from our own research and that of other groups will be given.
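The anaglyph principle the authors build on is easy to reproduce outside AViz. A minimal red/cyan compositor (assuming 8-bit RGB arrays; this is the generic technique, not AViz's actual code) takes the red channel from the left-eye rendering and green and blue from the right-eye one:

```python
import numpy as np


def red_cyan_anaglyph(left, right):
    """Combine two slightly displaced renderings into one red/cyan anaglyph.

    left, right: (H, W, 3) uint8 RGB images of the same scene from two
    horizontally offset viewpoints. Red/cyan glasses route the red channel
    to one eye and green + blue to the other, restoring the disparity.
    """
    out = np.empty_like(left)
    out[..., 0] = left[..., 0]   # red from the left-eye image
    out[..., 1] = right[..., 1]  # green from the right-eye image
    out[..., 2] = right[..., 2]  # blue from the right-eye image
    return out


# A one-pixel horizontal shift stands in for the two displaced renderings.
left = np.zeros((8, 8, 3), dtype=np.uint8)
left[:, 2, :] = 255                # a white vertical stripe
right = np.roll(left, 1, axis=1)   # the same stripe, displaced one column
anaglyph = red_cyan_anaglyph(left, right)
```

In a real renderer the two inputs come from two camera positions separated by a small interocular offset; the channel mix is all the "stereo hardware" this scheme needs, which is why it works on any screen or projector.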

  1. Impact of immersion oils and mounting media on the confocal imaging of dendritic spines

    PubMed Central

    Peterson, Brittni M.; Mermelstein, Paul G.; Meisel, Robert L.

    2015-01-01

    Background Structural plasticity, such as changes in dendritic spine morphology and density, reflect changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. New Method Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Results Both choice of mounting media and immersion oil affected the visualization of dendritic spines, with choosing the appropriate immersion oil as being more imperative. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Comparison with Existing Method Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can impact the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Conclusion Collectively, these data suggest that choosing the appropriate immersion oil and mounting media is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. PMID:25601477

  2. Impact of immersion oils and mounting media on the confocal imaging of dendritic spines.

    PubMed

    Peterson, Brittni M; Mermelstein, Paul G; Meisel, Robert L

    2015-03-15

    Structural plasticity, such as changes in dendritic spine morphology and density, reflect changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Both choice of mounting media and immersion oil affected the visualization of dendritic spines, with choosing the appropriate immersion oil as being more imperative. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can impact the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Collectively, these data suggest that choosing the appropriate immersion oil and mounting media is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. New Desktop Virtual Reality Technology in Technical Education

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2008-01-01

    Virtual reality (VR) that immerses users in a 3D environment through use of headwear, body suits, and data gloves has demonstrated effectiveness in technical and professional education. Immersive VR is highly engaging and appealing to technically skilled young Net Generation learners. However, technical difficulty and very high costs have kept…

  4. Criticality for Global Citizenship in Korean English Immersion Camps

    ERIC Educational Resources Information Center

    Ahn, So-Yeon

    2015-01-01

    Given a heavy social, ideological pressure for parents to pursue better English education for their children in the globalized world, short-term English immersion camp programs have emerged as an educational option in South Korea, promoted as environments for intercultural communication between native English-speaking teachers and local Korean…

  5. Pleasant music as a countermeasure against visually induced motion sickness.

    PubMed

    Keshavarz, Behrang; Hecht, Heiko

    2014-05-01

    Visually induced motion sickness (VIMS) is a well-known side-effect in virtual environments or simulators. However, effective behavioral countermeasures against VIMS are still sparse. In this study, we tested whether music can reduce the severity of VIMS. Ninety-three volunteers were immersed in an approximately 14-minute-long video taken during a bicycle ride. Participants were randomly assigned to one of four experimental groups, either including relaxing music, neutral music, stressful music, or no music. Sickness scores were collected using the Fast Motion Sickness Scale and the Simulator Sickness Questionnaire. Results showed an overall trend for relaxing music to reduce the severity of VIMS. When factoring in the subjective pleasantness of the music, a significant reduction of VIMS occurred only when the presented music was perceived as pleasant, regardless of the music type. In addition, we found a gender effect with women reporting more sickness than men. We assume that the presentation of pleasant music can be an effective, low-cost, and easy-to-administer method to reduce VIMS. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  6. Modeling of luminance distribution in CAVE-type virtual reality systems

    NASA Astrophysics Data System (ADS)

    Meironke, Michał; Mazikowski, Adam

    2017-08-01

    At present, CAVE-type (Cave Automatic Virtual Environment) installations are among the most advanced virtual reality systems. Such systems usually consist of four, five, or six projection screens; in the six-screen case these are arranged in the form of a cube. Providing the user with a strong feeling of immersion in such systems depends largely on the optical properties of the system. Modeling of physical phenomena now plays a major role in most fields of science and technology, as it allows the operation of a device to be simulated without modifying its physical construction. In this paper, the luminance distribution in CAVE-type virtual reality systems was modeled. Calculations were performed for a model of a six-walled CAVE-type installation, based on the Immersive 3D Visualization Laboratory at the Faculty of Electronics, Telecommunications and Informatics of the Gdańsk University of Technology. Tests were carried out for two different scattering distributions of the screen material in order to check how these characteristics influence the luminance distribution of the whole CAVE. The basic assumptions and simplifications of the modeled CAVE-type installation are presented together with the results, followed by a brief discussion of the results and the usefulness of the developed model.
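    The kind of quantity being modeled can be sketched with a toy screen model (hypothetical: the Gaussian scattering lobe and all parameter values below are our own illustrative assumptions, not the paper's model):

```python
import math

def luminance_lambertian(illuminance_lux, reflectance):
    """On-axis luminance (cd/m^2) of an ideal diffuse (Lambertian) screen:
    L = E * rho / pi, identical at every viewing angle."""
    return illuminance_lux * reflectance / math.pi

def luminance_gain_screen(illuminance_lux, gain, angle_deg, lobe_deg=30.0):
    """Toy directional screen: on-axis luminance boosted by `gain`,
    falling off with a Gaussian scattering lobe of width `lobe_deg`."""
    falloff = math.exp(-(angle_deg / lobe_deg) ** 2)
    return illuminance_lux * gain * falloff / math.pi

# 500 lux of projector light on a unit-reflectance screen:
base = luminance_lambertian(500, 1.0)            # ~159 cd/m^2 at all angles
on_axis = luminance_gain_screen(500, 1.8, 0)     # brighter viewed head-on...
off_axis = luminance_gain_screen(500, 1.8, 45)   # ...much dimmer off-axis
```

A wide (near-Lambertian) lobe gives uniform walls but lower peak luminance; a narrow, high-gain lobe is brighter head-on but makes luminance vary across the CAVE, which is exactly the trade-off such a model lets one quantify before building anything.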

  7. Integration of the HTC Vive into the medical platform MeVisLab

    NASA Astrophysics Data System (ADS)

    Egger, Jan; Gall, Markus; Wallner, Jürgen; de Almeida Germano Boechat, Pedro; Hann, Alexander; Li, Xing; Chen, Xiaojun; Schmalstieg, Dieter

    2017-03-01

    Virtual Reality (VR) is an immersive technology that replicates an environment via computer-simulated reality. VR receives a lot of attention in computer games but also has great potential in other areas, such as the medical domain. Examples are the planning, simulation, and training of medical interventions, e.g. for facial surgeries where an aesthetic outcome is important. However, importing medical data into VR devices is not trivial, especially when a direct connection and visualization from one's own application is needed. Furthermore, most researchers don't build their medical applications from scratch; rather, they use platforms like MeVisLab, Slicer, or MITK. These platforms have in common that they integrate and build upon libraries like ITK and VTK, providing a more convenient graphical interface to them for the user. In this contribution, we demonstrate the usage of a VR device for medical data under MeVisLab. To this end, we integrated the OpenVR library into MeVisLab as a dedicated module. This enables the direct and uncomplicated usage of head-mounted displays, like the HTC Vive, under MeVisLab. In summary, medical data from other MeVisLab modules can be connected directly, per drag-and-drop, to our VR module and will be rendered inside the HTC Vive for an immersive inspection.

  8. Virtually driving: are the driving environments "real enough" for exposure therapy with accident victims? An explorative study.

    PubMed

    Walshe, David; Lewis, Elizabeth; O'Sullivan, Kathleen; Kim, Sun I

    2005-12-01

    There is a small but growing body of research supporting the effectiveness of computer-generated environments in exposure therapy for driving phobia. However, research also suggests that difficulties can readily arise whereby patients do not immerse in simulated driving scenes. The simulated driving environments are not "real enough" to undertake exposure therapy. This sets a limitation to the use of virtual reality (VR) exposure therapy as a treatment modality for driving phobia. The aim of this study was to investigate if a clinically acceptable immersion/presence rate of >80% could be achieved for driving phobia subjects in computer generated environments by modifying external factors in the driving environment. Eleven patients referred from the Accident and Emergency Department of a general hospital or from their General Practitioner following a motor vehicle accident, who met DSM-IV criteria for Specific Phobia-driving were exposed to a computer-generated driving environment using computer driving games (London Racer/Midtown Madness). In an attempt to make the driving environments "real enough," external factors were modified by (a) projection of images onto a large screen, (b) viewing the scene through a windscreen, (c) using car seats for both driver and passenger, and (d) increasing vibration sense through use of more powerful subwoofers. Patients undertook a trial session involving driving through computer environments with graded risk of an accident. "Immersion/presence" was operationally defined as a subjective rating by the subject that the environment "feels real," together with an increase in subjective units of distress (SUD) ratings of >3 and/or an increase of heart rate of >15 beats per minute (BPM). Ten of 11 (91%) of the driving phobic subjects met the criteria for immersion/presence in the driving environment enabling progression to VR exposure therapy. 
    These provisional findings suggest that the paradigm adopted in this study might be an effective and relatively inexpensive means of developing driving environments that are "real enough" to make VR exposure therapy a viable treatment modality for driving phobia following a motor vehicle accident (MVA).
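    The study's operational immersion/presence criterion is simple enough to state as a predicate (a sketch; the function name and the example values are ours):

```python
def immersed(feels_real, sud_increase, hr_increase_bpm):
    """Operational immersion/presence criterion from the study:
    the scene must subjectively 'feel real', together with a rise of
    more than 3 SUD points and/or a heart-rate rise above 15 BPM."""
    return feels_real and (sud_increase > 3 or hr_increase_bpm > 15)

# Example classifications (invented values):
a = immersed(True, sud_increase=4, hr_increase_bpm=5)    # meets criterion
b = immersed(True, sud_increase=2, hr_increase_bpm=10)   # physiological rise too small
c = immersed(False, sud_increase=8, hr_increase_bpm=20)  # never "feels real"
```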

  9. Moving Past Curricula and Strategies: Language and the Development of Adaptive Pedagogy for Immersive Learning Environments

    NASA Astrophysics Data System (ADS)

    Hand, Brian; Cavagnetto, Andy; Chen, Ying-Chih; Park, Soonhye

    2016-04-01

    Given current concerns internationally about student performance in science and the need to shift how science is being learnt in schools, as a community, we need to shift how we approach the issue of learning and teaching in science. In the future, we are going to have to close the gap between how students construct and engage with knowledge in a media-rich environment, and how school classroom environments engage them. This is going to require a shift to immersive environments where attention is paid to the knowledge bases and resources students bring into the classroom. Teachers will have to adopt adaptive pedagogical approaches that are framed around a more nuanced understanding of epistemological orientation, language and the nature of prosocial environments.

  10. Volume Attenuation and High Frequency Loss as Auditory Depth Cues in Stereoscopic 3D Cinema

    NASA Astrophysics Data System (ADS)

    Manolas, Christos; Pauletto, Sandra

    2014-09-01

    Assisted by the technological advances of the past decades, stereoscopic 3D (S3D) cinema is currently being established as a mainstream form of entertainment. The main focus of this collaborative effort is placed on the creation of immersive S3D visuals. However, with few exceptions, little attention has so far been given to the potential effect of the soundtrack on such environments. Sound has great potential both as a means to enhance the impact of the S3D visual information and as a means to extend the S3D cinematic world beyond the boundaries of the visuals. This article reports on our research into the possibilities of using auditory depth cues within the soundtrack to affect the perception of depth within cinematic S3D scenes. We study two main distance-related auditory cues: high-end frequency loss and overall volume attenuation. A series of experiments explored the effectiveness of these auditory cues. The results, although not conclusive, indicate that the studied auditory cues can influence the audience's judgement of depth in cinematic 3D scenes, sometimes in unexpected ways. We conclude that 3D filmmaking can benefit from further studies on the effectiveness of specific sound design techniques to enhance S3D cinema.
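    The two cues studied can be sketched as simple functions of distance (hypothetical models and parameter values; the exponential cutoff law in particular is our own illustrative choice, not the authors' implementation):

```python
import math

def gain_db(distance_m, ref_m=1.0):
    """Inverse-distance law for overall volume attenuation:
    level drops ~6 dB per doubling of distance from the reference."""
    return -20.0 * math.log10(distance_m / ref_m)

def highcut_hz(distance_m, cutoff_at_0m=20000.0, loss_per_m=0.02):
    """Toy high-frequency loss: lower the low-pass cutoff exponentially
    with distance, mimicking air absorption of high frequencies."""
    return cutoff_at_0m * math.exp(-loss_per_m * distance_m)

near_gain, far_gain = gain_db(2.0), gain_db(16.0)  # far source ~18 dB quieter
near_cut, far_cut = highcut_hz(2.0), highcut_hz(50.0)  # far source duller
```

Applying both transforms to a sound object, and varying them against the disparity-defined visual depth of the matching S3D image, is the kind of manipulation the experiments above evaluate.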

  11. A Web-Based Tool for Automatic Data Collection, Curation, and Visualization of Complex Healthcare Survey Studies including Social Network Analysis.

    PubMed

    Benítez, José Alberto; Labra, José Emilio; Quiroga, Enedina; Martín, Vicente; García, Isaías; Marqués-Sánchez, Pilar; Benavides, Carmen

    2017-01-01

    There is great concern nowadays regarding alcohol consumption and drug abuse, especially in young people. By analyzing the social environment in which these adolescents are immersed, as well as a series of measures determining alcohol abuse risk or personal situation and perception using questionnaires such as AUDIT, FAS, KIDSCREEN, and others, it is possible to gain insight into the current situation of a given individual regarding his/her consumption behavior. Achieving this analysis, however, requires tools that can ease the processes of questionnaire creation; data gathering, curation, and representation; and the later analysis and visualization of results for the user. This research presents the design and construction of a web-based platform that facilitates each of these processes by integrating the different phases into an intuitive system with a graphical user interface, hiding the complexity underlying each of the questionnaires and techniques used and presenting the results in a flexible and visual way, avoiding any manual handling of data during the process. Advantages of this approach are shown and compared to the previous situation, where some of the tasks were accomplished by time-consuming and error-prone manipulations of data.

  12. Nickel and chromium ion release from stainless steel bracket on immersion various types of mouthwashes

    NASA Astrophysics Data System (ADS)

    Mihardjanti, M.; Ismah, N.; Purwanegara, M. K.

    2017-08-01

    The stainless steel bracket is widely used in orthodontics because of its mechanical properties, strength, and good biocompatibility. However, under certain conditions, it can be susceptible to corrosion. Studies have reported that the nickel and chromium ions released by corrosion can cause allergic reactions in some individuals and are mutagenic. The condition of the oral environment can lead to corrosion, and one factor that can alter the oral environment is mouthwash. The aim of this study was to measure the nickel and chromium ions released from stainless steel brackets when immersed in mouthwash and aquadest (distilled water). The samples consisted of four groups of 17 maxillary premolar brackets with .022 slots. Each group was immersed in a different mouthwash or in aquadest and incubated at 37 °C for 30 days. After 30 days of immersion, the released ions were measured using ICP-MS (Inductively Coupled Plasma-Mass Spectrometry). For statistical analysis, both the Kruskal-Wallis and Mann-Whitney tests were used. The results showed differences among the four groups in the nickel ions released (p < 0.05) and the chromium ions released (p < 0.05). In conclusion, the ions released as a result of mouthwash immersion were at levels below the daily-intake limit recommended by the World Health Organization.
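    The Mann-Whitney test used for the pairwise comparisons is built on a simple rank statistic, sketched here with hypothetical ion-release values (the data below are invented for illustration; real analyses would also compute a p-value from U):

```python
def mann_whitney_u(a, b):
    """U statistic for group `a`: the number of (a_i, b_j) pairs with
    a_i > b_j, counting ties as half a pair. U ranges from 0 to len(a)*len(b);
    values near either extreme indicate the groups' ranks barely overlap."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical nickel-release values (ppb) for two mouthwash groups:
group1 = [5.1, 6.2, 7.0]
group2 = [2.0, 2.5, 3.1]
u12 = mann_whitney_u(group1, group2)  # 9.0: complete separation of ranks
u21 = mann_whitney_u(group2, group1)  # 0.0: the two U values sum to n*m
```

Such pairwise U tests are typically run only after the omnibus Kruskal-Wallis test (a rank-based analogue of one-way ANOVA across all four groups) indicates some difference exists.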

  13. Habituation of the cold shock response is inhibited by repeated anxiety: Implications for safety behaviour on accidental cold water immersions.

    PubMed

    Barwood, Martin J; Corbett, Jo; Tipton, Mike; Wagstaff, Christopher; Massey, Heather

    2017-05-15

    Accidental cold-water immersion (CWI) triggers the life-threatening cold shock response (CSR), which is a precursor to sudden death on immersion. One practical means of reducing the CSR is to induce a habituation by undergoing repeated short CWIs. Habituation of the CSR is known to be partially reversed by the concomitant experience of acute anxiety, raising the possibility that repeated anxiety could prevent CSR habituation; we tested this hypothesis. Sixteen participants (12 male, 4 female) completed seven 7-minute immersions into cold water (15°C). Immersion one acted as a control (CON1). During immersions two to five, which would ordinarily induce a habituation, anxiety levels were repeatedly increased (CWI-ANXrep) by deception and a demanding mathematical task. Immersions six and seven were counter-balanced between another high-anxiety condition (CWI-ANXrep) and a further control (CON2). Anxiety (20-cm visual analogue scale) and cardiorespiratory responses (cardiac frequency [fc], respiratory frequency [fR], tidal volume [VT], minute ventilation [VE]) were measured. Comparisons were made between experimental immersions (CON1, final CWI-ANXrep, CON2), across habituation immersions, and with data from a previous study. Anxiety levels were sustained at a similar level throughout the experimental and habituation immersions (mean [SD] CON1: 7.0 [4.0] cm; CON2: 5.8 [5.2] cm; cf. CWI-ANXrep: 7.3 [5.5] cm; p>0.05). This culminated in failure of the CSR to habituate even when anxiety levels were not manipulated (i.e., CON2). These data differed (p<0.05) from previous studies in which anxiety levels were allowed to fall across habituation immersions and the CSR consequently habituated. Repeated anxiety prevented CSR habituation. A protective strategy that includes inducing habituation for those at risk should include techniques to lower anxiety associated with the immersion event, or habituation may not be beneficial in the emergency scenario.
Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Cultural immersion alters emotion perception: Neurophysiological evidence from Chinese immigrants to Canada.

    PubMed

    Liu, Pan; Rigoulot, Simon; Pell, Marc D

    2017-12-01

    To explore how cultural immersion modulates emotion processing, this study examined how Chinese immigrants to Canada process multisensory emotional expressions, compared with existing data from two groups, Chinese and North Americans. Stroop and Oddball paradigms were employed to examine different stages of emotion processing. The Stroop task presented face-voice pairs expressing congruent/incongruent emotions, and participants actively judged the emotion of one modality while ignoring the other. A significant effect of cultural immersion was observed in the immigrants' behavioral performance, which showed greater interference from to-be-ignored faces, comparable with what was observed in North Americans. However, this effect was absent in their N400 data, which retained the same pattern as the Chinese. In the Oddball task, where immigrants passively viewed facial expressions with/without simultaneous vocal emotions, they exhibited a larger visual MMN for faces accompanied by voices, again mirroring patterns observed in Chinese. Correlation analyses indicated that the immigrants' duration of residence in Canada was associated with neural patterns (N400 and visual mismatch negativity) more closely resembling North Americans. Our data suggest that in multisensory emotion processing, adapting to a new culture first leads to behavioral accommodation followed by alterations in brain activities, providing new evidence on the neurocognitive plasticity of human communication.

  15. Designing for Learning Conversations: How Parents Support Children's Science Learning within an Immersive Simulation

    ERIC Educational Resources Information Center

    Tscholl, Michael; Lindgren, Robb

    2016-01-01

    This research investigates the social learning affordances of a room-sized, immersive, and interactive augmented reality simulation environment designed to support children's understanding of basic physics concepts in a science center. Conversations between 97 parent-child pairs were analyzed in relation to categories of talk through which…

  16. A Framework for Aligning Instructional Design Strategies with Affordances of CAVE Immersive Virtual Reality Systems

    ERIC Educational Resources Information Center

    Ritz, Leah T.; Buss, Alan R.

    2016-01-01

    Increasing availability of immersive virtual reality (IVR) systems, such as the Cave Automatic Virtual Environment (CAVE) and head-mounted displays, for use in education contexts is providing new opportunities and challenges for instructional designers. By highlighting the affordances of IVR specific to the CAVE, the authors emphasize the…

  17. Immersive Learning Technologies: Realism and Online Authentic Learning

    ERIC Educational Resources Information Center

    Herrington, Jan; Reeves, Thomas C.; Oliver, Ron

    2007-01-01

    The development of immersive learning technologies in the form of virtual reality and advanced computer applications has meant that realistic creations of simulated environments are now possible. Such simulations have been used to great effect in training in the military, air force, and in medical training. But how realistic do problems need to be…

  18. The Utility of Using Immersive Virtual Environments for the Assessment of Science Inquiry Learning

    ERIC Educational Resources Information Center

    Code, Jillianne; Clarke-Midura, Jody; Zap, Nick; Dede, Chris

    2013-01-01

    Determining the effectiveness of any educational technology depends upon teachers' and learners' perception of the functional utility of that tool for teaching, learning, and assessment. The Virtual Performance project at Harvard University is developing and studying the feasibility of using immersive technology to develop performance…

  19. Student Responses to Their Immersion in a Virtual Environment.

    ERIC Educational Resources Information Center

    Taylor, Wayne

    Undertaken in conjunction with a larger study that investigated the educational efficacy of students building their own virtual worlds, this study measures the reactions of students in grades 4-12 to the experience of being immersed in virtual reality (VR). The study investigated the sense of "presence" experienced by the students, the…

  20. The Design, Development and Evaluation of a Virtual Reality Based Learning Environment

    ERIC Educational Resources Information Center

    Chen, Chwen Jen

    2006-01-01

    Many researchers and instructional designers increasingly recognise the benefits of utilising three dimensional virtual reality (VR) technology in instruction. In general, there are two types of VR system, the immersive system and the non-immersive system. This article focuses on the latter system that merely uses the conventional personal…

  1. The Effect of Exposure on Syntactic Parsing in Spanish-English Bilinguals

    ERIC Educational Resources Information Center

    Dussias, Paola E.; Sagarra, Nuria

    2007-01-01

    An eye tracking experiment examined how exposure to a second language (L2) influences sentence parsing in the first language. Forty-four monolingual Spanish speakers, 24 proficient Spanish-English bilinguals with limited immersion experience in the L2 environment and 20 proficient Spanish-English bilinguals with extensive L2 immersion experience…

  2. Virtual Reality as a Medium for Sensorimotor Adaptation Training and Spaceflight Countermeasures

    NASA Technical Reports Server (NTRS)

    Madansingh, S.; Bloomberg, J. J.

    2014-01-01

    Astronauts experience a profound sensorimotor adaptation during transition to and from the microgravity environment of space. With the upcoming shift to extra-long duration missions (upwards of 1 year) aboard the International Space Station, the immediate risks to astronauts during these transitory periods become more important than ever to understand and prepare for. Recent advances in virtual reality technology enable everyday adoption of these tools for entertainment and use in training. Embedding an individual in a virtual environment (VE) allows the ability to change the perception of visual flow, elicit automatic motor behavior and produce sensorimotor adaptation, not unlike those required during long duration microgravity exposure. The overall goal of this study is to determine the feasibility of present head mounted display technology (HMD) to produce reliable visual flow information and the expected adaptation associated with virtual environment manipulation to be used in future sensorimotor adaptability countermeasures. To further understand the influence of visual flow on gait adaptation during treadmill walking, a series of discordant visual flow manipulations in a virtual environment are proposed. Six healthy participants (3 male and 3 female) will observe visual flow information via HMD (Oculus Rift DK2) while walking on an instrumented treadmill at their preferred walking speed. Participants will be immersed in a series of VE's resembling infinite hallways with different visual characteristics: an office hallway, a hallway with pillars and the hallway of a fictional spacecraft. Participants will perform three trials of 10 min. each, which include walking on the treadmill while receiving congruent or incongruent visual information via the HMD. In the first trial, participants will experience congruent visual information (baseline) where the hallway is perceived to move at the same rate as their walking speed. 
The final two trials will be randomized among participants where the hallway is perceived to move at either half (0.5x) or twice (2.0x) their preferred walking speed. Participants will remain on the treadmill between trials and will not be warned of the upcoming change to visual flow to minimize preparatory adjustments. Stride length, step frequency and dual-support time will be quantified during each trial. We hypothesize that participants will experience a rapid modification in gait performance during periods of adaptive change, expressed as a decrease in step length, an increase in step frequency and an increase in dual-support time, followed by a period of adaptation where these movement parameters will return to near-baseline levels. As stride length, step frequency and dual support times return to baseline values, an adaptation time constant will be derived to establish individual time-to-adapt (TTA). HMD technology represents a paradigm shift in sensorimotor adaptation training where gait adaptability can be stressed using off-the-shelf consumer products and minimal experimental equipment, allowing for greater training flexibility in astronaut and terrestrial applications alike.
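    The proposed time-to-adapt estimate amounts to fitting an exponential return toward baseline. A minimal sketch, assuming a clean single-exponential and noise-free data (the log-linear fitting approach and all values are illustrative, not the study's analysis plan):

```python
import math

def fit_time_constant(times_s, values, baseline):
    """Fit v(t) = baseline + A * exp(-t / tau) by least squares on
    log(v - baseline), which is linear in t; returns tau in seconds."""
    ys = [math.log(v - baseline) for v in values]   # requires v > baseline
    n = len(times_s)
    mx, my = sum(times_s) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times_s, ys))
             / sum((x - mx) ** 2 for x in times_s))
    return -1.0 / slope

# Hypothetical stride-length deviation (m) decaying toward baseline 0
# with a true time constant of 30 s:
data_t = [0, 15, 30, 45, 60]
data_v = [0.10 * math.exp(-t / 30.0) for t in data_t]
tau = fit_time_constant(data_t, data_v, baseline=0.0)  # recovers ~30 s
```

Fitting the same curve to each participant's stride length, step frequency, and dual-support time gives the per-measure time-to-adapt (TTA) the study proposes to derive.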

  3. Evaluation of the biocompatibility of NiTi dental wires: a comparison of laboratory experiments and clinical conditions.

    PubMed

    Toker, S M; Canadinc, D

    2014-07-01

    The effects of the intraoral environment on the surface degradation of nickel-titanium (NiTi) shape memory alloy orthodontic wires were simulated through ex situ static immersion experiments in artificial saliva. The tested wires were compared with companion wires retrieved from patients in terms of chemical changes and the formation of new structures on the surface. Results of the ex situ experiments revealed that the acidic erosion effective at the earlier stages of immersion led to the formation of new structures as the immersion period approached 30 days. Moreover, comparison of these results with the analysis of wires used in clinical treatment showed that ex situ experiments are reliable in terms of predicting C-rich structure formation on the wire surfaces. However, the formation of C pileups at the contact sites of arch wires and brackets could not be simulated with static immersion experiments alone, warranting simulation of the intraoral environment in terms of both chemical and physical conditions, including mechanical loading, when evaluating the biocompatibility of NiTi orthodontic arch wires. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Comparative study on collaborative interaction in non-immersive and immersive systems

    NASA Astrophysics Data System (ADS)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki

    2007-09-01

    This research studies Virtual Reality simulation for collaborative interaction, so that different people in different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where the object's behavior is determined by the combination of the multiple inputs. The issues addressed in this research are: 1) the effects of using haptics on a collaborative interaction, and 2) the possibilities of collaboration between users in different environments. We conducted user tests on our system in several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. Two case studies are considered: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse from existing building blocks under gravity effects. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.

  5. A succinct overview of virtual reality technology use in Alzheimer's disease.

    PubMed

    García-Betances, Rebeca I; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer's disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers' education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments.

  6. Experiencing Soil Science from your office through virtual experiences

    NASA Astrophysics Data System (ADS)

    Beato, M. Carmen; González-Merino, Ramón; Campillo, M. Carmen; Fernández-Ahumada, Elvira; Ortiz, Leovigilda; Taguas, Encarnación V.; Guerrero, José Emilio

    2017-04-01

    Currently, numerous tools based on the new information and communication technologies offer a wide range of possibilities for the implementation of interactive methodologies in Education and Science. In particular, virtual reality and immersive worlds - artificially generated computer environments where users interact through a figurative individual that represents them in that environment (their "avatar") - have been identified as the technology that will change the way we live, particularly in educational terms and in the product development and entertainment areas (Schmorrow, 2009). Gisbert-Cervera et al. (2011) consider that 3D worlds in education provide, among other things, a unique environment for training and knowledge exchange that allows goal-oriented reflection to support activities and achieve learning outcomes. In Soil Science, the experimental component is essential to acquire the knowledge necessary to understand the biogeochemical processes taking place and their interactions with time, climate, topography, and the living organisms present. In this work, an immersive virtual environment reproducing a series of soil pits has been developed for educational purposes, to evaluate and differentiate soil characteristics such as texture, structure, consistency, color, and other physical-chemical and biological properties. Bibliographical material such as pictures, books, and papers was collected in order to organize the information needed and to build the soil profiles in the virtual environment. The virtual recreation was built with Unreal Engine 4 (UE4; https://www.unrealengine.com/unreal-engine-4). This engine was chosen because it provides two toolsets for programmers that can be used in tandem to accelerate development workflows. In addition, Unreal Engine 4 technology powers hundreds of games as well as real-time 3D films, training simulations, and visualizations, and it creates very realistic graphics.
    For the evaluation of its impact and its usefulness in teaching, a series of surveys will be presented to undergraduate students and teachers. REFERENCES: Gisbert-Cervera, M., Esteve-Gonzalez, V., Camacho-Marti, M.M. (2011). Delve into the Deep: Learning Potential in Metaverses and 3D Worlds. eLearning Papers (25), ISSN: 1887-1542. Schmorrow, D.D. (2009). Why virtual? Theoretical Issues in Ergonomics Science 10(3): 279-282.

  7. High-immersion three-dimensional display of the numerical computer model

    NASA Astrophysics Data System (ADS)

    Xing, Shujun; Yu, Xunbo; Zhao, Tianqi; Cai, Yuanfa; Chen, Duo; Chen, Zhidong; Sang, Xinzhu

    2013-08-01

    High-immersion three-dimensional (3D) displays are valuable tools for many applications, such as architectural and building design, industrial design, aeronautics, scientific research, entertainment, media advertisement and military use. However, most technologies present the 3D image in front of a screen that is parallel to the wall, which decreases the sense of immersion. To obtain correct multi-view stereo ground images, the cameras' photosensitive surfaces should be parallel to the common focal plane, and the cameras' optical axes should be offset toward the center of the common focal plane in both the vertical and the horizontal direction. It is very common to use virtual cameras, which are ideal pinhole cameras, to display a 3D model in a computer system, and virtual cameras can simulate the shooting method for multi-view ground-based stereo images. Here, two virtual shooting methods for ground-based high-immersion 3D display are presented. The position of the virtual camera is determined by the viewer's eye position in the real world. When the observer stands inside the circumcircle of the 3D ground display, offset perspective projection virtual cameras are used; when the observer stands outside the circumcircle, offset perspective projection cameras together with orthogonal projection cameras are adopted. In this paper, we mainly discuss the parameter settings of the virtual cameras: the near clip plane setting is the main point of the first method, while the rotation angle of the virtual cameras is the main point of the second. To validate the results, we use Direct3D and OpenGL to render scenes from different viewpoints and generate a stereoscopic image. A realistic visualization system for 3D models, viewed horizontally, is constructed and demonstrated, providing high-immersion 3D visualization. The displayed 3D scenes are compared with real objects in the real world.
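
    The offset projection described above can be illustrated with a small sketch. The function below is a generic off-axis frustum computation, not the paper's actual parameter derivation; it shows how the near-clip-plane bounds of an offset perspective camera follow from the tracked eye position relative to the display plane:

    ```python
    def off_axis_frustum(eye, screen_w, screen_h, near):
        """Near-clip-plane bounds for an offset (off-axis) perspective camera.

        eye: viewer position relative to the screen centre, as (x, y, z) with
        x to the right, y up, and z the perpendicular distance to the display
        plane. The screen rectangle is screen_w x screen_h, centred at the
        origin. Returns (left, right, bottom, top) scaled onto the near plane,
        as consumed by a glFrustum-style asymmetric projection.
        """
        ex, ey, ez = eye
        scale = near / ez  # similar triangles: display plane -> near plane
        left = (-screen_w / 2.0 - ex) * scale
        right = (screen_w / 2.0 - ex) * scale
        bottom = (-screen_h / 2.0 - ey) * scale
        top = (screen_h / 2.0 - ey) * scale
        return left, right, bottom, top
    ```

    For a centred viewer the frustum is symmetric; as head tracking reports an off-centre eye, the frustum skews so that the rendered scene stays registered with the physical display.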

  8. Crowd behaviour during high-stress evacuations in an immersive virtual environment

    PubMed Central

    Kapadia, Mubbasir; Thrash, Tyler; Sumner, Robert W.; Gross, Markus; Helbing, Dirk; Hölscher, Christoph

    2016-01-01

    Understanding the collective dynamics of crowd movements during stressful emergency situations is central to reducing the risk of deadly crowd disasters. Yet, their systematic experimental study remains a challenging open problem due to ethical and methodological constraints. In this paper, we demonstrate the viability of shared three-dimensional virtual environments as an experimental platform for conducting crowd experiments with real people. In particular, we show that crowds of real human subjects moving and interacting in an immersive three-dimensional virtual environment exhibit typical patterns of real crowds as observed in real-life crowded situations. These include the manifestation of social conventions and the emergence of self-organized patterns during egress scenarios. High-stress evacuation experiments conducted in this virtual environment reveal movements characterized by mass herding and dangerous overcrowding as they occur in crowd disasters. We describe the behavioural mechanisms at play under such extreme conditions and identify critical zones where overcrowding may occur. Furthermore, we show that herding spontaneously emerges from a density effect without the need to assume an increase of the individual tendency to imitate peers. Our experiments reveal the promise of immersive virtual environments as an ethical, cost-efficient, yet accurate platform for exploring crowd behaviour in high-risk situations with real human subjects. PMID:27605166

  9. Crowd behaviour during high-stress evacuations in an immersive virtual environment.

    PubMed

    Moussaïd, Mehdi; Kapadia, Mubbasir; Thrash, Tyler; Sumner, Robert W; Gross, Markus; Helbing, Dirk; Hölscher, Christoph

    2016-09-01

    Understanding the collective dynamics of crowd movements during stressful emergency situations is central to reducing the risk of deadly crowd disasters. Yet, their systematic experimental study remains a challenging open problem due to ethical and methodological constraints. In this paper, we demonstrate the viability of shared three-dimensional virtual environments as an experimental platform for conducting crowd experiments with real people. In particular, we show that crowds of real human subjects moving and interacting in an immersive three-dimensional virtual environment exhibit typical patterns of real crowds as observed in real-life crowded situations. These include the manifestation of social conventions and the emergence of self-organized patterns during egress scenarios. High-stress evacuation experiments conducted in this virtual environment reveal movements characterized by mass herding and dangerous overcrowding as they occur in crowd disasters. We describe the behavioural mechanisms at play under such extreme conditions and identify critical zones where overcrowding may occur. Furthermore, we show that herding spontaneously emerges from a density effect without the need to assume an increase of the individual tendency to imitate peers. Our experiments reveal the promise of immersive virtual environments as an ethical, cost-efficient, yet accurate platform for exploring crowd behaviour in high-risk situations with real human subjects. © 2016 The Authors.

  10. Knowledge Acquisition and Job Training for Advanced Technical Skills Using Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Watanuki, Keiichi; Kojima, Kazuyuki

    The environment in which Japanese industry earned its strong reputation is changing tremendously due to the globalization of world economies, while Asian countries are undergoing economic and technical development and benefiting from advances in information technology. For example, in the design of custom-made casting products, a designer who lacks knowledge of casting may not be able to produce a good design. In order to obtain a good design and manufacturing result, it is necessary to equip the designer and manufacturer with a support system for casting design, a so-called knowledge transfer and creation system. This paper proposes a new virtual-reality-based knowledge acquisition and job training system for casting design, composed of explicit and tacit knowledge transfer systems using synchronized multimedia and a knowledge internalization system using a portable virtual environment. In our proposed system, the educational content is displayed in the immersive virtual environment, whereby a trainee may experience work at a virtual operation site. Provided that the trainee has gained explicit and tacit knowledge of casting through the multimedia-based knowledge transfer system, the immersive virtual environment catalyzes the internalization of knowledge and enables the trainee to gain tacit knowledge before undergoing on-the-job training at a real operation site.

  11. Transformational Play as a Curricular Scaffold: Using Videogames to Support Science Education

    NASA Astrophysics Data System (ADS)

    Barab, Sasha A.; Scott, Brianna; Siyahhan, Sinem; Goldstone, Robert; Ingram-Goble, Adam; Zuiker, Steven J.; Warren, Scott

    2009-08-01

    Drawing on game-design principles and an underlying situated theoretical perspective, we developed and researched a 3D game-based curriculum designed to teach water quality concepts. We compared undergraduate students assigned randomly to four instructional design conditions with increasing levels of contextualization: (a) an expository textbook condition, (b) a simplistic framing condition, (c) an immersive-world condition, and (d) a single-user immersive-world condition. Results indicated that the immersive-world dyad and single-user immersive-world conditions performed significantly better than the expository textbook group on standardized items. The immersive-world dyad condition also performed significantly better than either the expository textbook or the simplistic framing condition on a performance-based transfer task. Implications for science education, consistent with the goals of this special issue, are that immersive game-based learning environments provide a powerful new form of curriculum for teaching and learning science.

  12. Effects of immersion depth on super-resolution properties of index-different microsphere-assisted nanoimaging

    NASA Astrophysics Data System (ADS)

    Zhou, Yi; Tang, Yan; He, Yu; Liu, Xi; Hu, Song

    2018-03-01

    In applications of microsphere-assisted super-resolution imaging to biomedical visualization and microfluidic detection, liquids are widely used as background media. For the first time, we quantitatively demonstrate that the maximum irradiances, focal lengths, and waists of photonic nanojets (PNJs) vary systematically with immersion depth (IMD). The experimental observations also illustrate the trends of the lateral magnification and field of view (FOV) during the gradual evaporation of ethyl alcohol. This work provides exact quantitative guidance for the proper selection of microspheres and IMD for high-quality discernment of nanostructures.

  13. Three-Dimensional User Interfaces for Immersive Virtual Reality

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1997-01-01

    The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.

  14. 3D movies for teaching seafloor bathymetry, plate tectonics, and ocean circulation in large undergraduate classes

    NASA Astrophysics Data System (ADS)

    Peterson, C. D.; Lisiecki, L. E.; Gebbie, G.; Hamann, B.; Kellogg, L. H.; Kreylos, O.; Kronenberger, M.; Spero, H. J.; Streletz, G. J.; Weber, C.

    2015-12-01

    Geologic problems and datasets are often 3D or 4D in nature, yet projected onto a 2D surface such as a piece of paper or a projection screen. Reducing the dimensionality of data forces the reader to "fill in" that collapsed dimension in their minds, creating a cognitive challenge for the reader, especially new learners. Scientists and students can visualize and manipulate 3D datasets using the virtual reality software developed for the immersive, real-time interactive 3D environment at the KeckCAVES at UC Davis. The 3DVisualizer software (Billen et al., 2008) can also operate on a desktop machine to produce interactive 3D maps of earthquake epicenter locations and 3D bathymetric maps of the seafloor. With 3D projections of seafloor bathymetry and ocean circulation proxy datasets in a virtual reality environment, we can create visualizations of carbon isotope (δ13C) records for academic research and to aid in demonstrating thermohaline circulation in the classroom. Additionally, 3D visualization of seafloor bathymetry allows students to see features of the seafloor that most people cannot observe first-hand. To enhance lessons on mid-ocean ridges and ocean basin genesis, we have created movies of seafloor bathymetry for a large-enrollment undergraduate-level class, Introduction to Oceanography. In the past four quarters, students have enjoyed watching 3D movies, and in the fall quarter (2015), we will assess how well 3D movies enhance learning. The class will be split into two groups: one will learn about the Mid-Atlantic Ridge from diagrams and lecture, and the other will learn with a supplemental 3D visualization. Both groups will be asked "what does the seafloor look like?" before and after the Mid-Atlantic Ridge lesson. Then the whole class will watch the 3D movie and respond to an additional question, "did the 3D visualization enhance your understanding of the Mid-Atlantic Ridge?", with the opportunity to further elaborate on the effectiveness of the visualization.

  15. Visible Geology - Interactive online geologic block modelling

    NASA Astrophysics Data System (ADS)

    Cockett, R.

    2012-12-01

    Geology is a highly visual science, and many disciplines require spatial awareness and manipulation. For example, interpreting cross-sections, geologic maps, or plotting data on a stereonet all require various levels of spatial abilities. These skills are often not focused on in undergraduate geoscience curricula, and many students struggle with spatial relations, manipulations, and penetrative abilities (e.g. Titus & Horsman, 2009). A newly developed program, Visible Geology, allows students to be introduced to many geologic concepts and spatial skills in a virtual environment. Visible Geology is a web-based, three-dimensional environment where students can create and interrogate their own geologic block models. The program begins with a blank model; users then add geologic beds (with custom thickness and color) and can add geologic deformation events like tilting, folding, and faulting. Additionally, simple intrusive dikes can be modelled, as well as unconformities. Students can also explore the interaction of geology with topography by drawing elevation contours to produce their own topographic models. Students can not only spatially manipulate their model, but can create cross-sections and boreholes to practice their visual penetrative abilities. Visible Geology is easy to access and use, with no downloads required, so it can be incorporated into current, paper-based lab activities. Sample learning activities are being developed that target introductory and structural geology curricula with learning objectives such as relative geologic history, fault characterization, apparent dip and thickness, interference folding, and stereonet interpretation. Visible Geology provides a richly interactive and immersive environment for students to explore geologic concepts and practice their spatial skills. (Screenshot: Visible Geology showing folding and faulting interactions on a ridge topography.)
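
    The kind of block model the abstract describes can be sketched minimally. The function below is a toy parametrisation invented for illustration (bed_depth and all of its parameters are hypothetical, not Visible Geology's actual model): a planar bed is deformed by tilting, sinusoidal folding, and a vertical-throw fault, applied in that order:

    ```python
    import math

    def bed_depth(x, y, base_depth, tilt=(0.0, 0.0), fold_amp=0.0,
                  fold_wavelen=1.0, fault_x=None, fault_throw=0.0):
        """Depth of a geologic bed at map position (x, y) after a sequence of
        deformation events: tilting (constant dip gradients in x and y),
        sinusoidal folding along x, and a vertical dip-slip fault at
        x = fault_x that drops the block on the positive-x side."""
        z = base_depth + tilt[0] * x + tilt[1] * y                 # tilting
        z += fold_amp * math.sin(2 * math.pi * x / fold_wavelen)   # folding
        if fault_x is not None and x > fault_x:                    # faulting
            z += fault_throw
        return z
    ```

    Evaluating such a function on a grid and intersecting it with a topographic surface yields an outcrop pattern; sampling it along a vertical line gives a synthetic borehole log, the two interrogation modes the abstract mentions.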

  16. Automatic depth grading tool to successfully adapt stereoscopic 3D content to digital cinema and home viewing environments

    NASA Astrophysics Data System (ADS)

    Thébault, Cédric; Doyen, Didier; Routhier, Pierre; Borel, Thierry

    2013-03-01

    To ensure an immersive, yet comfortable experience, significant work is required during post-production to adapt the stereoscopic 3D (S3D) content to the targeted display and its environment. On the one hand, the content needs to be reconverged using horizontal image translation (HIT) so as to harmonize the depth across the shots. On the other hand, to prevent edge violation, specific re-convergence is required and depending on the viewing conditions floating windows need to be positioned. In order to simplify this time-consuming work we propose a depth grading tool that automatically adapts S3D content to digital cinema or home viewing environments. Based on a disparity map, a stereo point of interest in each shot is automatically evaluated. This point of interest is used for depth matching, i.e. to position the objects of interest of consecutive shots in a same plane so as to reduce visual fatigue. The tool adapts the re-convergence to avoid edge-violation, hyper-convergence and hyper-divergence. Floating windows are also automatically positioned. The method has been tested on various types of S3D content, and the results have been validated by a stereographer.
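
    The HIT re-convergence step can be illustrated on a single scan line. The sketch below is a simplified illustration, not the tool's actual algorithm; it assumes disparity is measured in pixels as the right-view x minus the left-view x of the point of interest, and splits the shift between the two views so the point of interest lands at zero disparity, i.e. in the screen plane:

    ```python
    def hit_shift(row, dx, fill=0):
        """Shift one scan line horizontally by dx pixels (positive = right),
        filling vacated pixels -- in practice the vacated band is where a
        floating window would be placed."""
        n = len(row)
        if dx > 0:
            return [fill] * dx + list(row[:n - dx])
        if dx < 0:
            return list(row[-dx:]) + [fill] * (-dx)
        return list(row)

    def reconverge(left_row, right_row, poi_disparity):
        """Horizontal image translation: split the point-of-interest disparity
        between the two views so the POI ends up at zero disparity."""
        s = poi_disparity // 2
        return hit_shift(left_row, s), hit_shift(right_row, s - poi_disparity)
    ```

    Applying the same shift to every scan line re-converges a shot; doing this with the stereo point of interest of consecutive shots places their objects of interest in the same depth plane, which is the depth-matching step described above.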

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayes, Birchard P; Michel, Kelly D; Few, Douglas A

    From stereophonic, positional sound to high-definition imagery that is crisp and clean, high-fidelity computer graphics enhance our view, insight, and intuition regarding our environments and conditions. Contemporary 3-D modeling tools offer an open architecture framework that enables integration with other technologically innovative arenas. One innovation of great interest is Augmented Reality: the merging of virtual, digital environments with physical, real-world environments, creating a mixed reality where relevant data and information augment the actual experience in real time by spatial or semantic context. Pairing 3-D virtual immersive models with a dynamic platform such as semi-autonomous robotics or personnel odometry systems to create a mixed reality offers new and innovative capabilities for design information verification inspections, evaluation accuracy, and information gathering at nuclear facilities. Our paper discusses the integration of two innovative technologies, 3-D visualizations and inertial positioning systems, and the resulting augmented reality offered to the human inspector. The discussion in the paper includes an exploration of human and non-human (surrogate) inspections of a nuclear facility, integrated safeguards knowledge within a synchronized virtual model operated, or worn, by a human inspector, and the anticipated benefits to safeguards evaluations of facility operations.

  18. Virtual Reconstruction of Lost Architectures: from the Tls Survey to AR Visualization

    NASA Astrophysics Data System (ADS)

    Quattrini, R.; Pierdicca, R.; Frontoni, E.; Barcaglioni, R.

    2016-06-01

    The exploitation of high-quality 3D models for the dissemination of archaeological heritage is a topic of current investigation, although Mobile Augmented Reality platforms for historical architecture that would allow low-cost pipelines for effective content are not yet available. The paper presents a virtual anastylosis, starting from historical sources and from a 3D model based on a TLS survey. Several efforts and outputs in augmented or immersive environments exploiting this reconstruction are discussed. The work demonstrates the feasibility of a 3D reconstruction approach for complex architectural shapes starting from point clouds, and its AR/VR exploitation, allowing superimposition with archaeological evidence. The major contributions consist of the presentation and discussion of a pipeline from the virtual model to its simplification, showing several outcomes and comparing the supported data qualities and the advantages/disadvantages due to MAR and VR limitations.

  19. Immersive Environments: Using Flow and Sound to Blur Inhabitant and Surroundings

    NASA Astrophysics Data System (ADS)

    Laverty, Luke

    Following in the footsteps of motif-reviving, aesthetically-focused Postmodern and deconstructivist architecture, purely computer-generated formalist contemporary architecture (i.e. blobitecture) has been reduced to vast, empty sculptural, and therefore, purely ocularcentric gestures for their own sake. Taking precedent over the deliberate relation to the people inhabiting them beyond scaleless visual stimulation, the forms become separated from and hostile toward their inhabitants; a boundary appears. This thesis calls for a reintroduction of human-centered design beyond Modern functionalism and ergonomics and Postmodern form and metaphor into architecture by exploring ecological psychology (specifically how one becomes attached to objects) and phenomenology (specifically sound) in an attempt to reach a contemporary human scale using the technology of today: the physiological mind. Psychologist Dr. Mihaly Csikszentmihalyi's concept of flow---when one becomes so mentally immersed within the current activity and immediate surroundings that the boundary between inhabitant and environment becomes transparent through a form of trance---is the embodiment of this thesis' goal, but it is limited to only specific moments throughout the day and typically studied without regard to the environment. Physiologically, the area within the brain---the medial prefrontal cortex---stimulated during flow experiences is also stimulated by the synthesis of sound, memory, and emotion. By exploiting sound (a sense not typically focused on within phenomenology) as a form of constant nuance within the everyday productive dissonance, the engagement and complete concentration on one's own interpretation of this sensory input affords flow experiences and, therefore, a blurred boundary with one's environment. This thesis aims to answer the question: How does the built environment embody flow? 
The above concept will be illustrated within a ubiquitous building type---the everyday housing tower---in the form of a live-work vertical artist commune in New York City---the antithesis of intimate, human architectural environments---coupled with the design of a sound sensory experiential walk through the surrounding blurred neighborhood boundaries in the attempt to exploit and create an environment one becomes absorbed within and feels comfortable enough with which to experience flow. To do so, the characteristics of flow lead to the capturing of the senses, interaction, and flexibility. This thesis will explore and exploit how one perceives, interacts with, and becomes attached to when confronted with a space or artifact; reintroducing the humanity into contemporary architecture.

  20. Using Immersive Visualizations to Improve Decision Making and Enhancing Public Understanding of Earth Resource and Climate Issues

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Raynolds, R. G.; Dechesne, M.

    2008-12-01

    New visualization technologies, from ArcGIS to Google Earth, have allowed for the integration of complex, disparate data sets to produce visually rich and compelling three-dimensional models of sub-surface and surface resource distribution patterns. The rendering of these models allows the public to quickly understand complicated geospatial relationships that would otherwise take much longer to explain using traditional media. We have impacted the community through topical policy presentations at both state and city levels, adult education classes at the Denver Museum of Nature and Science (DMNS), and public lectures at DMNS. We have constructed three-dimensional models from well data and surface observations which allow policy makers to better understand the distribution of groundwater in sandstone aquifers of the Denver Basin. Our presentations to local governments in the Denver metro area have allowed resource managers to better project future ground water depletion patterns, and to encourage development of alternative sources. DMNS adult education classes on water resources, geography, and regional geology, as well as public lectures on global issues such as earthquakes, tsunamis, and resource depletion, have utilized the visualizations developed from these research models. In addition to presenting GIS models in traditional lectures, we have also made use of the immersive display capabilities of the digital "fulldome" Gates Planetarium at DMNS. The real-time Uniview visualization application installed at Gates was designed for teaching astronomy, but it can be re-purposed for displaying our model datasets in the context of the Earth's surface. The 17-meter diameter dome of the Gates Planetarium allows an audience to have an immersive experience---similar to virtual reality CAVEs employed by the oil exploration industry---that would otherwise not be available to the general public. 
Public lectures in the dome allow audiences of over 100 people to comprehend dynamically changing geospatial datasets in an exciting and engaging fashion. In our presentation, we will demonstrate how new software tools like Uniview can be used to dramatically enhance and accelerate public comprehension of complex, multi-scale geospatial phenomena.

  1. Immersive Virtual Environment Technology to Supplement Environmental Perception, Preference and Behavior Research: A Review with Applications

    PubMed Central

    Smith, Jordan W.

    2015-01-01

    Immersive virtual environment (IVE) technology offers a wide range of potential benefits to research focused on understanding how individuals perceive and respond to built and natural environments. In an effort to broaden awareness and use of IVE technology in perception, preference and behavior research, this review paper describes how IVE technology can be used to complement more traditional methods commonly applied in public health research. The paper also describes a relatively simple workflow for creating and displaying 360° virtual environments of built and natural settings and presents two freely-available and customizable applications that scientists from a variety of disciplines, including public health, can use to advance their research into human preferences, perceptions and behaviors related to built and natural settings. PMID:26378565

  2. Immersive Virtual Environment Technology to Supplement Environmental Perception, Preference and Behavior Research: A Review with Applications.

    PubMed

    Smith, Jordan W

    2015-09-11

    Immersive virtual environment (IVE) technology offers a wide range of potential benefits to research focused on understanding how individuals perceive and respond to built and natural environments. In an effort to broaden awareness and use of IVE technology in perception, preference and behavior research, this review paper describes how IVE technology can be used to complement more traditional methods commonly applied in public health research. The paper also describes a relatively simple workflow for creating and displaying 360° virtual environments of built and natural settings and presents two freely-available and customizable applications that scientists from a variety of disciplines, including public health, can use to advance their research into human preferences, perceptions and behaviors related to built and natural settings.

  3. Anxiety provocation and measurement using virtual reality in patients with obsessive-compulsive disorder.

    PubMed

    Kim, Kwanguk; Kim, Chan-Hyung; Cha, Kyung Ryeol; Park, Junyoung; Han, Kiwan; Kim, Yun Ki; Kim, Jae-Jin; Kim, In Young; Kim, Sun I

    2008-12-01

    The current study is a preliminary test of a virtual reality (VR) anxiety-provoking tool using a sample of participants with obsessive-compulsive disorder (OCD). The tasks were administered to 33 participants with OCD and 30 healthy control participants. In the VR task, participants navigated through a virtual environment using a joystick and head-mounted display. The virtual environment consisted of three phases: training, distraction, and the main task. After the training and distraction phases, participants were allowed to check (a common OCD behavior) freely, as they would in the real world, and a visual analog scale of anxiety was recorded during VR. Participants' anxiety in the virtual environment was measured with a validated measure of psychiatric symptoms and functions and analyzed with a VR questionnaire. Results revealed that those with OCD had significantly higher anxiety in the virtual environment than did healthy controls, and the rate of anxiety decrease in participants with OCD was also higher than that of healthy controls. Moreover, the degree of anxiety of an individual with OCD was positively correlated with his or her symptom score and immersive tendency score. These results suggest that VR technology has value as an anxiety-provoking or treatment tool for OCD.

  4. An Analysis of VR Technology Used in Immersive Simulations with a Serious Game Perspective.

    PubMed

    Menin, Aline; Torchelsen, Rafael; Nedel, Luciana

    2018-03-01

    Using virtual environments (VEs) is a safer and cost-effective alternative to executing dangerous tasks, such as training firefighters and industrial operators. Immersive virtual reality (VR) combined with game aspects have the potential to improve the user experience in the VE by increasing realism, engagement, and motivation. This article investigates the impact of VR technology on 46 immersive gamified simulations with serious purposes and classifies it towards a taxonomy. Our findings suggest that immersive VR improves simulation outcomes, such as increasing learning gain and knowledge retention and improving clinical outcomes for rehabilitation. However, it also has limitations such as motion sickness and restricted access to VR hardware. Our contributions are to provide a better understanding of the benefits and limitations of using VR in immersive simulations with serious purposes, to propose a taxonomy that classifies them, and to discuss whether methods and participants profiles influence results.

  5. The Case of Literacy Motivation: Playful 3D Immersive Learning Environments and Problem-Focused Education for Blended Digital Storytelling

    ERIC Educational Resources Information Center

    Mystakidis, Stylianos; Berki, Eleni

    2018-01-01

    The University of Patras' Library Services designed and offered to primary and secondary schools the pilot educational program "From the Ancient to the Modern Tablets," featuring immersive multimedia learning experiences about the book history. The pilot program consisted of three stages: a playful library tour, followed by an…

  6. Early Childhood Bilingualism in the Montessori Children's House: Guessable Context and the Planned Environment. Spotlight: Montessori--Multilingual, Multicultural.

    ERIC Educational Resources Information Center

    Rosanova, Michael

    1998-01-01

    Describes the InterCultura Montessori School language immersion program in Oak Park, Illinois. Profiles the work of several children to illustrate important language learning strategies. Recommends that language immersion programs include: survival vocabulary skills; repetition of key grammatical forms; use of objects, pictures, and dramatization;…

  7. Early Childhood Bilingualism in the Montessori Children's House: Guessable Context and the Planned Environment.

    ERIC Educational Resources Information Center

    Rosanova, M. J.

    The language immersion approach of the Intercultural Montessori School (Oak Park, Illinois) for children aged 2-6 years is described and discussed. An introductory section gives background information on early work with immersion by Maria Montessori, a personal experience leading to the school's establishment, and the response of language and…

  8. Hands-on Learning in the Virtual World

    ERIC Educational Resources Information Center

    Branson, John; Thomson, Diane

    2013-01-01

    The U.S. military has long understood the value of immersive simulations in education. Before the Navy entrusts a ship to a crew, crew members must first practice and demonstrate their competency in a fully immersive, simulated environment. Why not teach students in the same way? K-12 educators in Pennsylvania, USA, recently did just that when…

  9. Collaborative Science Learning in Three-Dimensional Immersive Virtual Worlds: Pre-Service Teachers' Experiences in Second Life

    ERIC Educational Resources Information Center

    Nussli, Natalie; Oh, Kevin; McCandless, Kevin

    2014-01-01

    The purpose of this mixed methods study was to help pre-service teachers experience and evaluate the potential of Second Life, a three-dimensional immersive virtual environment, for potential integration into their future teaching. By completing collaborative assignments in Second Life, nineteen pre-service general education teachers explored an…

  10. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca

    2009-01-01

    The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…

  11. Computer-Assisted Culture Learning in an Online Augmented Reality Environment Based on Free-Hand Gesture Interaction

    ERIC Educational Resources Information Center

    Yang, Mau-Tsuen; Liao, Wan-Che

    2014-01-01

    Physical-virtual immersion and real-time interaction play an essential role in cultural and language learning. Augmented reality (AR) technology can be used to seamlessly merge virtual objects with real-world images to realize immersion. Additionally, computer vision (CV) technology can recognize free-hand gestures from live images to enable…

  12. Creating a Transformational Learning Experience: Immersing Students in an Intensive Interdisciplinary Learning Environment

    ERIC Educational Resources Information Center

    White, Shelley K.; Nitkin, Mindell Reiss

    2014-01-01

    The Simmons World Challenge is a unique, interdisciplinary program recently developed at Simmons College. It immerses students in an intensive winter-session course that challenges them to tackle a pressing social issue, such as poverty or hunger, and create actionable solutions to the problem. The program was conceived and designed to harness the…

  13. How 3D immersive visualization is changing medical diagnostics

    NASA Astrophysics Data System (ADS)

    Koning, Anton H. J.

    2011-03-01

    Originally, the only way to look inside the human body without opening it up was by means of two-dimensional (2D) images obtained using X-ray equipment. The fact that human anatomy is inherently three-dimensional leads to ambiguities in interpretation and problems of occlusion. Three-dimensional (3D) imaging modalities such as CT, MRI and 3D ultrasound remove these drawbacks and are now part of routine medical care. While most hospitals 'have gone digital', meaning that the images are no longer printed on film, they are still being viewed on 2D screens. In this way, valuable depth information is lost, and some interactions become unnecessarily complex or even unfeasible. Using a virtual reality (VR) system to present volumetric data means that depth information is presented to the viewer and 3D interaction is made possible. At the Erasmus MC we have developed V-Scope, an immersive volume visualization system for visualizing a variety of (bio-)medical volumetric datasets, ranging from 3D ultrasound, via CT and MRI, to confocal microscopy, OPT and 3D electron-microscopy data. In this talk we will address the advantages of such a system for both medical diagnostics and (bio)medical research.

  14. The influence of visual characteristics of barriers on railway noise perception.

    PubMed

    Maffei, Luigi; Masullo, Massimiliano; Aletta, Francesco; Di Gabriele, Maria

    2013-02-15

    Noise annoyance is considered the main effect of noise; it is a complex and multifaceted psychological concept involving immediate behavioral and evaluative aspects. In recent decades, research has intensively investigated the correlation between noise exposure and noise annoyance; nevertheless, recent studies confirm that non-auditory factors influence the noise perception of individuals. In particular, audio-video interaction can play a fundamental role. Today, Immersive Virtual Reality (IVR) systems allow laboratory tests to be built that provide realistic experiences of the surrounding environment, yielding more accurate information about the reactions of the local population. Among interventions for environmental noise control, barriers represent the main solution; however, some aspects related to their visual characteristics need further investigation. This paper presents a case study in which a sample of residents living close to a railway line assessed noise-related aspects of several barriers with different visual characteristics in an IVR laboratory test. In particular, three main factors were analyzed: the barrier type, concerning the visibility of the noise source through the screen; the visual aspect of the barrier, concerning aesthetic issues; and the noise level at the receiver, concerning the acoustic performance of the barrier and the magnitude of the sound source. The main results of the ANOVA showed that Perceived Loudness and Noise Annoyance were judged lower for transparent barriers than for opaque barriers; this difference increased as the noise level increased. Copyright © 2012. Published by Elsevier B.V.
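    As a hedged illustration of the ANOVA mentioned above (the ratings below are synthetic, not the study's data), a minimal one-way ANOVA over Perceived Loudness ratings for a single factor, barrier type, could look like:

```python
from statistics import mean

def one_way_anova(*groups):
    """Minimal one-way ANOVA: returns the F statistic."""
    all_vals = [x for g in groups for x in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Synthetic Perceived Loudness ratings (0-10 scale), one value per
# subject, for a transparent vs. an opaque barrier at one noise level.
transparent = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.4]
opaque = [4.2, 4.6, 3.9, 4.4, 4.8, 4.1, 4.5, 4.3]
print(f"F = {one_way_anova(transparent, opaque):.2f}")
```

    A full replication would need a repeated-measures design with barrier type, visual aspect, and noise level as factors; the one-way version above only shows the mechanics.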

  15. Simulation of Physical Experiments in Immersive Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Wasfy, Tamer M.

    2001-01-01

    An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.
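    The object model described above, with each object encapsulating properties, methods, and events, and a container grouping several objects, can be sketched in miniature. All class and method names below are illustrative assumptions, not taken from the actual VL implementation:

```python
class VLObject:
    """Primitive software object: encapsulates properties, methods, events."""
    def __init__(self, name, **properties):
        self.name = name
        self.properties = properties
        self._handlers = {}              # event name -> list of callbacks

    def on(self, event, handler):        # register an event handler
        self._handlers.setdefault(event, []).append(handler)

    def fire(self, event, *args):        # dispatch an event to its handlers
        for handler in self._handlers.get(event, []):
            handler(*args)

class Container(VLObject):
    """Groups several objects so they can be manipulated as one."""
    def __init__(self, name):
        super().__init__(name)
        self.children = []

    def add(self, obj):
        self.children.append(obj)

# Example: a wind-tunnel model grouped from geometric entities.
wing = VLObject("wing", material="aluminium")
tunnel = Container("virtual_wind_tunnel")
tunnel.add(wing)
wing.on("selected", lambda: print("wing selected"))
wing.fire("selected")
```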

  16. Walking through doorways causes forgetting: Further explorations.

    PubMed

    Radvansky, Gabriel A; Krawietz, Sabine A; Tamplin, Andrea K

    2011-08-01

    Previous research using virtual environments has revealed a location-updating effect in which there is a decline in memory when people move from one location to another. Here we assess whether this effect reflects the influence of the experienced context, in terms of the degree of immersion of a person in an environment, as suggested by some work in spatial cognition, or a shift in context. In Experiment 1, the degree of immersion was reduced by using smaller displays. In comparison, in Experiment 2 an actual, rather than a virtual, environment was used to maximize immersion. Location-updating effects were observed under both of these conditions. In Experiment 3, the original encoding context was reinstated by having a person return to the original room in which objects were first encoded. However, inconsistent with an encoding-specificity account, memory did not improve when this context was reinstated. Finally, we further analyzed the results of this and previous experiments to assess the differential influence of foregrounding and retrieval interference. Overall, these data are interpreted in terms of the event horizon model of event cognition and memory.

  17. A Conceptual Framework for Mediated Environments

    ERIC Educational Resources Information Center

    Childs, Mark

    2010-01-01

    Background: Immersive virtual worlds are one of a range of different platforms that can be grouped under the concept of mediated environments, i.e. environments that create a metaphorical space in which participants can position themselves and be embodied. Synthesising the literatures concerning the various mediated environment technologies…

  18. Classification of Movement and Inhibition Using a Hybrid BCI.

    PubMed

    Chmura, Jennifer; Rosing, Joshua; Collazos, Steven; Goodwin, Shikha J

    2017-01-01

    Brain-computer interfaces (BCIs) are an emerging technology capable of turning brain electrical activity into commands for an external device. Motor imagery (MI)—when a person imagines a motion without executing it—is widely employed in BCI devices for motor control because of the endogenous origin of its neural control mechanisms and the similarity in brain activation to actual movements. Challenges with translating an MI-BCI into a practical device used outside laboratories include the extensive training required, often due to poor user engagement and visual feedback response delays; poor user flexibility/freedom to time the execution/inhibition of their movements and to control the movement type (right arm vs. left leg) and characteristics (reaching vs. grabbing); and high false positive rates of motion control. Solutions to improve sensorimotor activation and user performance of MI-BCIs have been explored. Virtual reality (VR) motor-execution tasks have replaced simpler visual feedback (smiling faces, arrows) and have solved this problem to an extent. Hybrid BCIs (hBCIs) implementing an additional control signal alongside MI have improved user control capabilities to a limited extent. These hBCIs either fail to allow patients to gain asynchronous control of their movements or have a high false positive rate. We propose an immersive VR environment that provides visual feedback that is both engaging and immediate, and that also uniquely engages a different cognitive process in the patient, generating event-related potentials (ERPs). These ERPs provide a key executive function for users to execute/inhibit movements. Additionally, we propose signal processing strategies and machine learning algorithms to move BCIs toward long-term signal stability in patients with distinctive brain signals and capabilities to control motor signals. The hBCI itself and the VR environment we propose would help to move BCI technology outside laboratory environments, for motor rehabilitation in hospitals and potentially for controlling a prosthetic.
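    The hybrid idea above, requiring the motor-imagery signal and the ERP-based executive signal to agree before a movement command is issued, can be sketched as a simple decision rule. The feature names and thresholds are illustrative assumptions, not the authors' pipeline:

```python
# Hedged sketch: a movement command fires only when BOTH the MI feature
# (mu-band desynchronization, where a negative value indicates ERD) and
# the ERP feature (a P300-like amplitude) are present, one way to lower
# the false-positive rate of asynchronous control.

def hybrid_decision(mi_erd, erp_amplitude,
                    mi_threshold=-0.2, erp_threshold=3.0):
    """Return True (execute movement) only if both control signals fire.

    mi_erd        -- relative mu-band power change (negative = ERD)
    erp_amplitude -- peak ERP amplitude in microvolts
    """
    mi_active = mi_erd < mi_threshold            # imagined movement detected
    erp_active = erp_amplitude > erp_threshold   # executive "go" detected
    return mi_active and erp_active

print(hybrid_decision(-0.35, 4.1))  # both signals present -> True
print(hybrid_decision(-0.35, 1.0))  # ERP absent -> inhibit (False)
```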

  19. Classification of Movement and Inhibition Using a Hybrid BCI

    PubMed Central

    Chmura, Jennifer; Rosing, Joshua; Collazos, Steven; Goodwin, Shikha J.

    2017-01-01

    Brain-computer interfaces (BCIs) are an emerging technology capable of turning brain electrical activity into commands for an external device. Motor imagery (MI)—when a person imagines a motion without executing it—is widely employed in BCI devices for motor control because of the endogenous origin of its neural control mechanisms and the similarity in brain activation to actual movements. Challenges with translating an MI-BCI into a practical device used outside laboratories include the extensive training required, often due to poor user engagement and visual feedback response delays; poor user flexibility/freedom to time the execution/inhibition of their movements and to control the movement type (right arm vs. left leg) and characteristics (reaching vs. grabbing); and high false positive rates of motion control. Solutions to improve sensorimotor activation and user performance of MI-BCIs have been explored. Virtual reality (VR) motor-execution tasks have replaced simpler visual feedback (smiling faces, arrows) and have solved this problem to an extent. Hybrid BCIs (hBCIs) implementing an additional control signal alongside MI have improved user control capabilities to a limited extent. These hBCIs either fail to allow patients to gain asynchronous control of their movements or have a high false positive rate. We propose an immersive VR environment that provides visual feedback that is both engaging and immediate, and that also uniquely engages a different cognitive process in the patient, generating event-related potentials (ERPs). These ERPs provide a key executive function for users to execute/inhibit movements. Additionally, we propose signal processing strategies and machine learning algorithms to move BCIs toward long-term signal stability in patients with distinctive brain signals and capabilities to control motor signals. The hBCI itself and the VR environment we propose would help to move BCI technology outside laboratory environments, for motor rehabilitation in hospitals and potentially for controlling a prosthetic. PMID:28860986

  20. Comparisons of ice packs, hot water immersion, and analgesia injection for the treatment of centipede envenomations in Taiwan.

    PubMed

    Chaou, Chung-Hsien; Chen, Chian-Kuang; Chen, Jih-Chang; Chiu, Te-Fa; Lin, Chih-Chuan

    2009-08-01

    To compare the effectiveness of ice packs and hot water immersion for the treatment of centipede envenomations. Sixty patients envenomated by centipedes were randomized into three groups and were treated with ice packs, hot water immersion, or analgesia injection. The visual analog scale (VAS) score for pain was measured before the treatment and 15 min afterward. Demographic data and data on local and systemic effects after centipede bites were collected. The VAS scores and the pain decrease (ΔVAS) were compared between the three groups. All patients suffered from pain at the affected sites; other local effects included redness (n = 49, 81.7%), swelling (n = 32, 53.3%), heat (n = 14, 23.3%), itchiness (n = 5, 8.3%), and bullae formation (n = 3, 5.0%). Systemic effects were rarely reported. All three groups had similar VAS scores before and after treatment. They also had similar effectiveness in reducing pain caused by centipede bites (ΔVAS = 2.55 ± 1.88, 2.33 ± 1.78, and 1.55 ± 1.68 with ice packs, analgesia, and hot water immersion, respectively; p = 0.165). Ice packs, hot water immersion, and analgesics all improved the pain from centipede envenomation. Ice pack treatment is a safe, inexpensive, and non-invasive method for pre-hospital management in patients with centipede envenomation.

  1. Software for math and science education for the deaf.

    PubMed

    Adamo-Villani, Nicoletta; Wilbur, Ronnie

    2010-01-01

    In this article, we describe the development of two novel approaches to teaching math and science concepts to deaf children using 3D animated interactive software. One approach, Mathsigner, is non-immersive and the other, SMILE, is a virtual reality immersive environment. The content is curriculum-based, and the animated signing characters are constructed with state-of-the-art technology and design. We report preliminary development findings of usability and appeal based on programme features (e.g. 2D/3D, immersiveness, interaction type, avatar and interface design) and subject features (hearing status, gender and age). Programme features of 2D/3D, immersiveness and interaction type were strongly affected by subject features. Among subject features, we find significant effects of hearing status (deaf children take longer and make more mistakes than hearing children) and gender (girls take longer than boys; girls prefer immersive environments over desktop presentation; girls are more interested in content than technology compared to boys). For avatar type, we found a preference for seamless, deformable characters over segmented ones. For interface comparisons, there were no subject effects, but an animated interface resulted in reduced time to task completion compared to static interfaces with and without sound and highlighting. These findings identify numerous features that affect software design and appeal and suggest that designers must be careful in their assumptions during programme development.

  2. Usage of stereoscopic visualization in the learning contents of rotational motion.

    PubMed

    Matsuura, Shu

    2013-01-01

    Rotational motion plays an essential role in physics, even at an introductory level. In addition, the stereoscopic display of three-dimensional graphics is advantageous for the presentation of rotational motions, particularly for depth recognition. However, the immersive visualization of rotational motion has been known to lead to dizziness and even nausea for some viewers. Therefore, the purpose of this study is to examine the onset of nausea and visual fatigue when learning rotational motion through the use of a stereoscopic display. The findings show that an instruction method with intermittent exposure to the stereoscopic display and a simplification of its visual components reduced the onset of nausea and visual fatigue for viewers while maintaining the overall effect of instantaneous spatial recognition.

  3. Quantifying kinematic differences between land and water during squats, split squats, and single-leg squats in a healthy population.

    PubMed

    Severin, Anna C; Burkett, Brendan J; McKean, Mark R; Wiegand, Aaron N; Sayers, Mark G L

    2017-01-01

    Aquatic exercises can be used in clinical and sporting disciplines for both rehabilitation and sports training. However, there is limited knowledge of the influence of water immersion on the kinematics of exercises commonly used in rehabilitation and fitness programs. The aim of this study was to use inertial sensors to quantify differences in kinematics and movement variability of bodyweight squats, split squats, and single-leg squats performed on dry land and whilst immersed to the level of the greater trochanter. During two separate testing sessions, 25 active healthy university students (22.3±2.9 yr.) performed ten repetitions of each exercise, whilst tri-axial inertial sensors (100 Hz) recorded their trunk and lower body kinematics. Repeated-measures statistics tested for differences in segment orientation and speed, movement variability, and waveform patterns between environments, while the coefficient of variation was used to assess differences in movement variability. Between-environment differences in segment orientation and speed were portrayed by plotting the mean difference ±95% confidence intervals (CI) throughout the tasks. The results showed that the depth of the squat and split squat were unaffected by the changed environment, while water immersion allowed for a deeper single-leg squat. The different environments had significant effects on the sagittal plane orientations and speeds for all segments. Water immersion increased the degree of movement variability of the segments in all exercises, except for the shank in the frontal plane, which showed more variability on land. Without compromising movement depth, the aquatic environment induces more upright trunk and shank postures during squats and split squats. The aquatic environment allows for increased squat depth during the single-leg squat, and increased shank motions in the frontal plane. Our observations therefore support the use of water-based squat tasks for rehabilitation, as they appear to improve technique without compromising movement depth.
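    The variability measure named above, the coefficient of variation (standard deviation divided by the mean), can be sketched directly; the segment-angle values below are synthetic, not the study's data:

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    """CV = standard deviation / mean, computed across repetitions."""
    return stdev(values) / mean(values)

# Synthetic peak trunk angles (degrees) across five squat repetitions.
land = [42.0, 41.5, 43.1, 42.4, 41.8]    # land squats
water = [40.2, 44.9, 38.7, 43.5, 41.1]   # immersed squats

print(f"CV land = {coefficient_of_variation(land):.3f}, "
      f"CV water = {coefficient_of_variation(water):.3f}")
```

    A higher CV in water, as in this synthetic example, would mirror the study's finding of greater movement variability for most segments during immersion.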

  4. Quantifying kinematic differences between land and water during squats, split squats, and single-leg squats in a healthy population

    PubMed Central

    2017-01-01

    Aquatic exercises can be used in clinical and sporting disciplines for both rehabilitation and sports training. However, there is limited knowledge on the influence of water immersion on the kinematics of exercises commonly used in rehabilitation and fitness programs. The aim of this study was to use inertial sensors to quantify differences in kinematics and movement variability of bodyweight squats, split squats, and single-leg squats performed on dry land and whilst immersed to the level of the greater trochanter. During two separate testing sessions, 25 active healthy university students (22.3±2.9 yr.) performed ten repetitions of each exercise, whilst tri-axial inertial sensors (100 Hz) recorded their trunk and lower body kinematics. Repeated-measures statistics tested for differences in segment orientation and speed, movement variability, and waveform patterns between environments, while coefficient of variance was used to assess differences in movement variability. Between-environment differences in segment orientation and speed were portrayed by plotting the mean difference ±95% confidence intervals (CI) throughout the tasks. The results showed that the depth of the squat and split squat were unaffected by the changed environment while water immersion allowed for a deeper single leg squat. The different environments had significant effects on the sagittal plane orientations and speeds for all segments. Water immersion increased the degree of movement variability of the segments in all exercises, except for the shank in the frontal plane, which showed more variability on land. Without compromising movement depth, the aquatic environment induces more upright trunk and shank postures during squats and split squats. The aquatic environment allows for increased squat depth during the single-leg squat, and increased shank motions in the frontal plane. 
Our observations therefore support the use of water-based squat tasks for rehabilitation as they appear to improve the technique without compromising movement depth. PMID:28767683

  5. Life in unexpected places: Employing visual thinking strategies in global health training.

    PubMed

    Allison, Jill; Mulay, Shree; Kidd, Monica

    2017-01-01

    The desire to make meaning out of images, metaphor, and other representations indicates higher-order cognitive skills that can be difficult to teach, especially in complex and unfamiliar environments like those encountered in many global health experiences. Because reflecting on art can help develop medical students' imaginative and interpretive skills, we used visual thinking strategies (VTS) during an immersive 4-week global health elective for medical students to help them construct new understanding of the social determinants of health in a low-resource setting. We were aware of no previous formal efforts to use art in global health training. We assembled a group of eight medical students in front of a street mural in Kathmandu and used VTS methods to interpret the scene with respect to the social determinants of health. We recorded and transcribed the conversation and conducted a thematic analysis of student responses. Students shared observations about the mural in a supportive, nonjudgmental fashion. Two main themes emerged from their observations: human-environment interactions (specifically community dynamics, subsistence land use, resources, and health) and entrapment/control, particularly relating to expectations of, and demands on, women in traditional farming communities. They used the images as well as their experience in Nepali communities to consolidate complex community health concepts. VTS helped students articulate their deepening understanding of the social determinants of health in Nepal, suggesting that reflection on visual art can help learners apply, analyze, and evaluate complex concepts in global health. We demonstrate the relevance of drawing upon many aspects of cultural learning, regarding art as a kind of text that holds valuable information. These findings may help provide innovative opportunities for teaching and evaluating global health training in the future.

  6. Implementing virtual reality interfaces for the geosciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, W.; Jacobsen, J.; Austin, A.

    1996-06-01

    For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum running the gamut from command-line interfaces to completely immersive environments. In the former, one uses the keyboard to enter three- or six-dimensional parameters; in the latter, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g. attached to a head-mounted display. Rich, extensible and often complex languages are a vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is the obstacle in that typing is cumbersome, error-prone and typically slow. With trackers, the user can instead interact with these parameters by means of motor skills which are highly developed. Two specific geoscience application areas will be highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.
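    The "rake" described above can be sketched as a seed-point generator for the streamline algorithm: a line segment whose evenly spaced points seed the streamlines, repositioned by moving the virtual-well icon. This is an illustrative reconstruction, not LBNL's code:

```python
# Hypothetical sketch: a rake is a 3D line segment; its evenly spaced
# points are the seed positions passed to a streamline integrator.

def rake_seed_points(start, end, n):
    """Return n evenly spaced 3D seed points between start and end."""
    return [tuple(s + (e - s) * i / (n - 1) for s, e in zip(start, end))
            for i in range(n)]

seeds = rake_seed_points((0.0, 0.0, 0.0), (1.0, 0.0, 2.0), 5)
print(seeds[2])  # midpoint of the rake -> (0.5, 0.0, 1.0)
```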

  7. The forensic holodeck: an immersive display for forensic crime scene reconstructions.

    PubMed

    Ebert, Lars C; Nguyen, Tuan T; Breitbeck, Robert; Braun, Marcel; Thali, Michael J; Ross, Steffen

    2014-12-01

    In forensic investigations, crime scene reconstructions are created based on a variety of three-dimensional imaging modalities. Although the data gathered are three-dimensional, their presentation on computer screens and paper is two-dimensional, which incurs a loss of information. By applying immersive virtual reality (VR) techniques, we propose a system that allows a crime scene to be viewed as if the investigator were present at the scene. In our system, we used a low-cost VR headset originally developed for computer gaming. The headset offers a large viewing volume and tracks the user's head orientation in real time, and an optical tracker is used for positional information. In addition, we created a crime scene reconstruction to demonstrate the system. In this article, we present a low-cost system that allows immersive, three-dimensional, and interactive visualization of forensic incident scene reconstructions.

  8. The influence of the aquatic environment on the control of postural sway.

    PubMed

    Marinho-Buzelli, Andresa R; Rouhani, Hossein; Masani, Kei; Verrier, Mary C; Popovic, Milos R

    2017-01-01

    Balance training in the aquatic environment is often used in rehabilitation practice to improve static and dynamic balance. Although aquatic therapy is widely used in clinical practice, we still lack evidence on how immersion in water actually impacts postural control. We examined how postural sway, measured using centre of pressure and trunk acceleration parameters, is influenced by the aquatic environment along with the effects of visual information. Our results suggest that the aquatic environment increases postural instability, measured by the centre of pressure parameters in the time domain. The mean velocity and area were most affected when individuals stood with eyes closed in the aquatic environment. In addition, a more forward posture was assumed in water with eyes closed in comparison to standing on land. In water, the low frequencies of sway were more dominant compared to standing on dry land. Trunk acceleration differed between water and dry land only in the larger upper trunk acceleration in the mediolateral direction while standing in water. This finding shows that the study participants potentially resorted to using their upper trunk to compensate for postural instability in the mediolateral direction. Only the lower trunk seemed to change acceleration pattern in the anteroposterior and mediolateral directions when the eyes were closed, and it did so depending on the environmental conditions. The increased postural instability and the change in postural control strategies that the aquatic environment offers may be a beneficial stimulus for improving balance control. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. A Practical Guide, with Theoretical Underpinnings, for Creating Effective Virtual Reality Learning Environments

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.; Domingo, Jelia

    2017-01-01

    With the advent of open source virtual environments, the associated cost reductions, and the more flexible options, avatar-based virtual reality environments are within reach of educators. By using and repurposing readily available virtual environments, instructors can bring engaging, community-building, and immersive learning opportunities to…

  10. a Low-Cost and Lightweight 3d Interactive Real Estate-Purposed Indoor Virtual Reality Application

    NASA Astrophysics Data System (ADS)

    Ozacar, K.; Ortakci, Y.; Kahraman, I.; Durgut, R.; Karas, I. R.

    2017-11-01

    Interactive 3D architectural indoor design has become more popular as it has benefited from Virtual Reality (VR) technologies. VR brings computer-generated 3D content to real-life scale and enables users to observe immersive indoor environments and modify them directly. This opportunity enables buyers to purchase a property off-the-plan more cheaply through virtual models. Instead of showing the property through 2D plans or renders, the visualized interior architecture of an unbuilt property on sale is demonstrated beforehand, so that investors have an impression as if they were in the physical building. However, current applications either use highly resource-consuming software, are non-interactive, or require specialists to create such environments. In this study, we have created a real-estate-purposed, low-cost, high-quality, fully interactive VR application that provides a realistic interior architecture of the property by using free and lightweight software: Sweet Home 3D and Unity. A preliminary study showed that participants generally liked the proposed real-estate-purposed VR application, and that it satisfied the expectations of property buyers.

  11. NPSNET: Aural cues for virtual world immersion

    NASA Astrophysics Data System (ADS)

    Dahl, Leif A.

    1992-09-01

    NPSNET is a low-cost visual and aural simulation system designed and implemented at the Naval Postgraduate School. NPSNET is an example of a virtual world simulation environment that incorporates real-time aural cues through software-hardware interaction. In the current implementation of NPSNET, a graphics workstation functions as the sound server, sending and receiving networked sound message packets across a Local Area Network composed of multiple graphics workstations. The network messages contain sound file identification information that is transmitted from the sound server across an RS-422 communication line to a serial-to-MIDI (Musical Instrument Digital Interface) converter. The MIDI converter, in turn, relays the sound byte to a sampler, an electronic recording and playback device. The sampler maps the hexadecimal input to a specific note or stored sound and sends it as an audio signal to speakers via an amplifier. The realism of a simulation is improved by involving multiple participant senses and removing external distractions. This thesis describes the incorporation of sound as aural cues and the enhancement they provide in the virtual simulation environment of NPSNET.
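    The final hop in the pipeline above, the message the sampler receives, is a standard MIDI channel message. As an illustration (the channel, note, and velocity values are examples, not NPSNET's actual mapping), a three-byte Note On message can be built like this:

```python
# Byte layout follows the MIDI 1.0 specification: a status byte
# (0x9n = Note On, channel n) followed by two 7-bit data bytes.

def note_on(channel, note, velocity):
    """Build a 3-byte MIDI Note On message (channel 0-15)."""
    status = 0x90 | (channel & 0x0F)
    return bytes([status, note & 0x7F, velocity & 0x7F])

msg = note_on(channel=0, note=60, velocity=100)  # middle C
print(msg.hex())  # -> "903c64"
```

    In the architecture described above, a byte sequence like this would travel over the RS-422 line to the converter, which relays it to the sampler for playback.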

  12. Introducing an Avatar Acceptance Model: Student Intention to Use 3D Immersive Learning Tools in an Online Learning Classroom

    ERIC Educational Resources Information Center

    Kemp, Jeremy William

    2011-01-01

    This quantitative survey study examines the willingness of online students to adopt an immersive virtual environment as a classroom tool and compares this with their feelings about more traditional learning modes including our ANGEL learning management system and the Elluminate live Web conferencing tool. I surveyed 1,108 graduate students in…

  13. Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life

    NASA Astrophysics Data System (ADS)

    Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia

    2011-03-01

    Users of immersive virtual reality environments have reported a wide variety of side effects and aftereffects, including confusion of characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and the real can be turned around to explore the possibilities for immersion with minimal technological support in virtual-world group training simulations. This paper describes observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and the Veterans Administration San Diego Healthcare System (VASDHS) to develop training exercises for nurses, doctors, and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, moments when the boundaries between real and virtual blur, we can better understand methods for creating immersion. I use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies, to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I examine moments of training to learn the software interface, moments within the drill, and interviews after the drill.

  14. Female artists and the VR crucible: expanding the aesthetic vocabulary

    NASA Astrophysics Data System (ADS)

    Morie, Jacquelyn Ford

    2012-03-01

    Virtual Reality was a technological wonder in its early days, and it was widely held to be a domain where men were the main practitioners. However, a survey done in 2007 of VR Artworks (Immersive Virtual Environments or VEs) showed that women have actually created the majority of artistic immersive works. This argues against the popular idea that the field has been totally dominated by men. While men have made great contributions in advancing the field, especially technologically, it appears most artistic works emerge from a decidedly feminine approach. Such an approach seems well suited to immersive environments as it incorporates aspects of inclusion, wholeness, and a blending of the body and the spirit. Female attention to holistic concerns fits the gestalt approach needed to create in a fully functional yet open-ended virtual world, which focuses not so much on producing a finished object (like a text or a sculpture) but rather on creating a possibility for becoming, like bringing a child into the world. Immersive VEs are not objective works of art to be hung on a wall and critiqued. They are vehicles for experience, vessels to live within for a piece of time.

  15. A Succinct Overview of Virtual Reality Technology Use in Alzheimer’s Disease

    PubMed Central

    García-Betances, Rebeca I.; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer’s disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers’ education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments. PMID:26029101

  16. An unusual case of an immersion hand presentation in a military signaller operating in the jungle in Belize.

    PubMed

    Forbes, Kirstie E; Foster, P

    2017-12-01

    Belize, hosting one of the British Army's overseas training areas, provides access to challenging terrain and austere environments, which allows the delivery of training to soldiers on survival and combat within the jungle environment. A 26-year-old infanteer on exercise in Belize presented with progressive bilateral dry, painful, oedematous hands, secondary to the harsh environmental conditions of the jungle and inadequate drying of his hands resulting in his inability to perform his combat duties. The symptoms completely resolved with drying, emollient application and analgesia. While there are no reported cases of immersion hand, comparisons can be made with the well-reported warm weather immersion foot. This case highlights the importance of force preparation and soldier education for units deploying to the jungle. Simple preventive measures, including adequate 'wet-dry' drills and use of emollients can reduce the prevalence of immersion hand, a preventable condition, which can have a significant impact on the overall combat effectiveness of the unit. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. Immersive telepresence system using high-resolution omnidirectional movies and a locomotion interface

    NASA Astrophysics Data System (ADS)

    Ikeda, Sei; Sato, Tomokazu; Kanbara, Masayuki; Yokoya, Naokazu

    2004-05-01

    Technology that enables users to experience a remote site virtually is called telepresence. A telepresence system using real environment images is expected to be used in the fields of entertainment, medicine, education, and so on. This paper describes a novel telepresence system that enables users to walk through a photorealistic virtualized environment by actually walking. To realize such a system, a wide-angle high-resolution movie is projected on an immersive multi-screen display to present the virtualized environment to users, and a treadmill is controlled according to the user's detected locomotion. In this study, we use an omnidirectional multi-camera system to acquire images of a real outdoor scene. The proposed system provides users with a rich sense of walking in a remote site.

  18. Laboratory Investigation of Space and Planetary Dust Grains

    NASA Technical Reports Server (NTRS)

    Spann, James

    2005-01-01

    Dust in space is ubiquitous and impacts diverse observed phenomena in various ways. Understanding the dominant mechanisms that control dust grain properties, and their impact on surrounding environments, is basic to improving our understanding of the observed processes at work in space. There is a substantial body of work on the theory and modeling of dust in space and dusty plasmas. To substantiate and validate theory and models, laboratory investigations and space-borne observations have been conducted. Laboratory investigations are largely confined to an assembly of dust grains immersed in a plasma environment. Frequently, the behavior of these complex dusty plasmas in the laboratory has raised more questions than it has verified theories. Space-borne observations have helped us characterize planetary environments. The complex behavior of dust grains in space indicates the need to understand the microphysics of individual grains immersed in a plasma or space environment.

  19. Virtually numbed: immersive video gaming alters real-life experience.

    PubMed

    Weger, Ulrich W; Loughnan, Stephen

    2014-04-01

    As actors in a highly mechanized environment, we are citizens of a world populated not only by fellow humans but also by virtual characters (avatars). Does immersive video gaming, during which the player takes on the mantle of an avatar, prompt people to adopt the coldness and rigidity associated with robotic behavior and desensitize them to real-life experience? In one study, we correlated participants' reported video-gaming behavior with their emotional rigidity (as indicated by the number of paperclips that they removed from ice-cold water). In a second experiment, we manipulated immersive and nonimmersive gaming behavior and then likewise measured the extent of the participants' emotional rigidity. Both studies yielded reliable effects, suggesting that immersion into a robotic viewpoint desensitizes people to real-life experiences in oneself and others.

  20. Training wheelchair navigation in immersive virtual environments for patients with spinal cord injury - end-user input to design an effective system.

    PubMed

    Nunnerley, Joanne; Gupta, Swati; Snell, Deborah; King, Marcus

    2017-05-01

    A user-centred design was used to develop and test the feasibility of an immersive 3D virtual reality wheelchair training tool for people with spinal cord injury (SCI). A Wheelchair Training System was designed and modelled using the Oculus Rift headset and a Dynamic Control wheelchair joystick. The system was tested by clinicians and expert wheelchair users with SCI. Data from focus groups and individual interviews were analysed using a general inductive approach to thematic analysis. Four themes emerged: Realistic System, which described the advantages of a realistic virtual environment; a Wheelchair Training System, which described participants' thoughts on the wheelchair training applications; Overcoming Resistance to Technology, the obstacles to introducing technology within the clinical setting; and Working outside the Rehabilitation Bubble which described the protective hospital environment. The Oculus Rift Wheelchair Training System has the potential to provide a virtual rehabilitation setting which could allow wheelchair users to learn valuable community wheelchair use in a safe environment. Nausea appears to be a side effect of the system, which will need to be resolved before this can be a viable clinical tool. Implications for Rehabilitation Immersive virtual reality shows promising benefit for wheelchair training in a rehabilitation setting. Early engagement with consumers can improve product development.

  1. Auditory and visual 3D virtual reality therapy as a new treatment for chronic subjective tinnitus: Results of a randomized controlled trial.

    PubMed

    Malinvaud, D; Londero, A; Niarra, R; Peignard, Ph; Warusfel, O; Viaud-Delmon, I; Chatellier, G; Bonfils, P

    2016-03-01

    Subjective tinnitus (ST) is a frequent audiologic condition that still requires effective treatment. This study aimed at evaluating two therapeutic approaches: Virtual Reality (VR) immersion in auditory and visual 3D environments, and Cognitive Behaviour Therapy (CBT). This open, randomized therapeutic equivalence trial used bilateral testing of VR versus CBT. Adult patients displaying unilateral or predominantly unilateral ST and fulfilling the inclusion criteria were included after giving their written informed consent. We measured the therapeutic effects by comparing the mean scores of validated questionnaires and visual analog scales pre- and post-protocol. Equivalence was established if the two strategies did not differ by more than a predetermined limit. We used univariate and multivariate analyses adjusted on baseline values to assess treatment efficacy. In addition to this trial, a purely exploratory comparison to a waiting-list group (WL) was provided. Between August 2009 and November 2011, 148 of 162 screened patients were enrolled (VR n = 61, CBT n = 58, WL n = 29). The groups did not differ at baseline in demographic data. Three months after the end of treatment, we did not find any difference between the VR and CBT groups in either tinnitus severity (p = 0.99) or tinnitus handicap (p = 0.36). VR appears to be at least as effective as CBT in unilateral ST patients. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Effect of predictive sign of acceleration on heart rate variability in passive translation situation: preliminary evidence using visual and vestibular stimuli in VR environment

    PubMed Central

    Watanabe, Hiroshi; Teramoto, Wataru; Umemura, Hiroyuki

    2007-01-01

    Objective We studied the effects of the presentation of a visual sign that warned subjects of acceleration around the yaw and pitch axes in virtual reality (VR) on their heart rate variability. Methods Synchronization of the immersive virtual reality equipment (CAVE) and motion base system generated a driving scene and provided subjects with dynamic and wide-ranging depth information and vestibular input. The heart rate variability of 21 subjects was measured while the subjects observed a simulated driving scene for 16 minutes under three different conditions. Results When the predictive sign of the acceleration appeared 3500 ms before the acceleration, the index of the activity of the autonomic nervous system (low/high frequency ratio; LF/HF ratio) of subjects did not change much, whereas when no sign appeared the LF/HF ratio increased over the observation time. When the predictive sign of the acceleration appeared 750 ms before the acceleration, no systematic change occurred. Conclusion The visual sign which informed subjects of the acceleration affected the activity of the autonomic nervous system when it appeared long enough before the acceleration. Also, our results showed the importance of the interval between the sign and the event and the relationship between the gradual representation of events and their quantity. PMID:17903267
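    The LF/HF ratio used above as an index of autonomic nervous system activity is conventionally computed by integrating spectral power over the standard HRV bands (LF 0.04-0.15 Hz, HF 0.15-0.40 Hz) of a uniformly resampled RR-interval series. A minimal stdlib-only sketch of that computation, assuming an already-resampled tachogram, and not the authors' actual analysis code:

    ```python
    import cmath

    def band_power(x, fs, f_lo, f_hi):
        """Power of uniformly sampled signal x (sampling rate fs, Hz) in [f_lo, f_hi),
        computed with a plain DFT periodogram (O(n^2); fine for short segments)."""
        n = len(x)
        mean = sum(x) / n
        xc = [v - mean for v in x]  # remove DC so it does not leak into the LF band
        power = 0.0
        for k in range(1, n // 2 + 1):
            f = k * fs / n
            if f_lo <= f < f_hi:
                X = sum(xc[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
                power += abs(X) ** 2 / n ** 2
        return power

    def lf_hf_ratio(tachogram, fs=4.0):
        """LF/HF ratio with the standard HRV bands: LF 0.04-0.15 Hz, HF 0.15-0.40 Hz.
        `tachogram` is the RR-interval series resampled at fs Hz (4 Hz is common)."""
        lf = band_power(tachogram, fs, 0.04, 0.15)
        hf = band_power(tachogram, fs, 0.15, 0.40)
        return lf / hf
    ```

    A rising LF/HF ratio over observation time, as reported in the no-sign condition, would correspond to this value increasing across successive analysis windows.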

  3. Immersive virtual reality improves movement patterns in patients after ACL reconstruction: implications for enhanced criteria-based return-to-sport rehabilitation.

    PubMed

    Gokeler, Alli; Bisschop, Marsha; Myer, Gregory D; Benjaminse, Anne; Dijkstra, Pieter U; van Keeken, Helco G; van Raay, Jos J A M; Burgerhof, Johannes G M; Otten, Egbert

    2016-07-01

    The purpose of this study was to evaluate the influence of immersion in a virtual reality environment on knee biomechanics in patients after ACL reconstruction (ACLR). It was hypothesized that virtual reality techniques aimed at changing attentional focus would influence the altered knee flexion angle, knee extension moment, and peak vertical ground reaction force (vGRF) in patients following ACLR. Twenty athletes following ACLR and 20 healthy controls (CTRL) performed a step-down task in both a non-virtual reality environment and a virtual reality environment displaying a pedestrian traffic scene. A motion analysis system and force plates were used to measure kinematics and kinetics during the step-down task to analyse each single-leg landing. No significant main effect of environment was found for knee flexion excursion (P = n.s.). Significant interactions between environment and group were found for vGRF (P = 0.004), knee moment (P < 0.001), knee angle at peak vGRF (P = 0.01), and knee flexion excursion (P = 0.03). There was a larger effect of the virtual reality environment on knee biomechanics in patients after ACLR compared with controls. Patients after ACLR immersed in the virtual reality environment demonstrated knee joint biomechanics that approximate those of CTRL. The results of this study indicate that a realistic virtual reality scenario may distract patients after ACLR from conscious motor control. Application of clinically available technology may aid current rehabilitation programmes in targeting altered movement patterns after ACLR. Diagnostic study, Level III.

  4. Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!

    NASA Astrophysics Data System (ADS)

    Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.

    2015-04-01

    Virtual reality and immersive worlds refer to artificial computer-generated environments with which users act and interact as in a known environment, through the use of figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that most dramatically changes the way we live, particularly in the areas of training and education, product development, and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proven in different fields. They reduce geographic and social barriers between stakeholders and create virtual social spaces that can positively impact learning and discussion outcomes (Lorenzo et al., 2012). In this work we present a series of interactive meetings in a virtual building to celebrate the International Year of Soils and to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of senior researchers will meet young scientists' avatars to talk about: 1) what remains to be done in Soil Sciences; 2) the main current limitations and difficulties; and 3) the future hot research lines. Interactive participation does not require physical attendance at the EGU Assembly 2015. In addition, this virtual building inspired by Soil Sciences can be completed with teaching resources from different locations around the world, and it will be used to improve the learning of Soil Sciences in a multicultural context. REFERENCES: Lorenzo, C.M., Sicilia, M.A., Sánchez, S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59, 1361-1376. Schmorrow, D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.

  5. In Vitro Investigation of the Effect of Oral Bacteria in the Surface Oxidation of Dental Implants.

    PubMed

    Sridhar, Sathyanarayanan; Wilson, Thomas G; Palmer, Kelli L; Valderrama, Pilar; Mathew, Mathew T; Prasad, Shalini; Jacobs, Michael; Gindri, Izabelle M; Rodrigues, Danieli C

    2015-10-01

    Bacteria are major contributors to the rising number of dental implant failures. Inflammation secondary to bacterial colonization and bacterial biofilm is a major etiological factor associated with early and late implant failure (peri-implantitis). Even though there is a strong association between bacteria and bacterial biofilm and the failure of dental implants, their effect on the surface of implants is not yet clear. The aim was to develop and establish an in vitro testing methodology to investigate the effect of early planktonic bacterial colonization on the surface of dental implants over a period of 60 days. Commercial dental implants were immersed in bacterial (Streptococcus mutans in brain-heart infusion broth) and control (broth only) media. Immersion testing was performed for a period of 60 days. During testing, the optical density and pH of the immersion media were monitored. The implant surface was surveyed with different microscopy techniques post-immersion. Metal ion release in solution was detected with an electrochemical impedance spectroscopy sensor platform called the metal ion electrochemical biosensor (MIEB). Bacteria grew in the implant-containing medium and provided a sustained acidic environment. Implants immersed in bacterial culture displayed various corrosion features, including surface discoloration, deformation of rough and smooth interfaces, pitting attack, and severe surface rusting. The surface features were confirmed by microscopic techniques, and metal particle generation was detected by the MIEB. Implant surface oxidation occurred in the bacteria-containing medium even at early stages of immersion (2 days). The incremental corrosion resulted in the dissolution of metal ions and debris into the testing solution. Dissolution of metal ions and particles in the oral environment can trigger or contribute to the development of peri-implantitis at later stages. © 2015 Wiley Periodicals, Inc.

  6. Media and Literacy: What's Good?

    ERIC Educational Resources Information Center

    Newkirk, Thomas

    2006-01-01

    For schools to effectively teach literacy, they should work with, not against, the cultural tools that students bring to school. Outside school, students' lives are immersed in visually mediated narratives. By tapping into the cultural, artistic, and linguistic resources of popular culture and multimedia, teachers can create more willing readers…

  7. 3D visualization of optical ray aberration and its broadcasting to smartphones by ray aberration generator

    NASA Astrophysics Data System (ADS)

    Hellman, Brandon; Bosset, Erica; Ender, Luke; Jafari, Naveed; McCann, Phillip; Nguyen, Chris; Summitt, Chris; Wang, Sunglin; Takashima, Yuzuru

    2017-11-01

    The ray formalism is critical to understanding light propagation, yet current pedagogy relies on inadequate 2D representations. We present a system in which real light rays are visualized through an optical system by using a collimated laser bundle of light and a fog chamber. Implementation for remote and immersive access is enabled by leveraging a commercially available 3D viewer and gesture-based remote controlling of the tool via bi-directional communication over the Internet.

  8. Effects of Exercise in Immersive Virtual Environments on Cortical Neural Oscillations and Mental State

    PubMed Central

    Vogt, Tobias; Herpers, Rainer; Askew, Christopher D.; Scherfgen, David; Strüder, Heiko K.; Schneider, Stefan

    2015-01-01

    Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of a real exercise within a virtual environment alters sense of presence perception, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by posttrial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures and this likely contributed to an enhanced sense of presence. PMID:26366305

  9. Effects of Exercise in Immersive Virtual Environments on Cortical Neural Oscillations and Mental State.

    PubMed

    Vogt, Tobias; Herpers, Rainer; Askew, Christopher D; Scherfgen, David; Strüder, Heiko K; Schneider, Stefan

    2015-01-01

    Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of a real exercise within a virtual environment alters sense of presence perception, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by posttrial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures and this likely contributed to an enhanced sense of presence.

  10. Workstations for people with disabilities: an example of a virtual reality approach

    PubMed Central

    Budziszewski, Paweł; Grabowski, Andrzej; Milanowicz, Marcin; Jankowski, Jarosław

    2016-01-01

    This article describes a method of adapting workstations for workers with motion disability using computer simulation and virtual reality (VR) techniques. A workstation for grinding spring faces was used as an example. It was adjusted for two people with a disabled right upper extremity. The study had two stages. In the first, a computer human model with a visualization of maximal arm reach and preferred workspace was used to develop a preliminary modification of a virtual workstation. In the second stage, an immersive VR environment was used to assess the virtual workstation and to add further modifications. All modifications were assessed by measuring the efficiency of work and the number of movements involved. The results of the study showed that a computer simulation could be used to determine whether a worker with a disability could access all important areas of a workstation and to propose necessary modifications. PMID:26651540

  11. Decoupling, situated cognition and immersion in art.

    PubMed

    Reboul, Anne

    2015-09-01

    Situated cognition seems incompatible with strong decoupling, where representations are deployed in the absence of their targets and are not oriented toward physical action. Yet, in art consumption, the epitome of a strongly decoupled cognitive process, the artwork is a physical part of the environment and partly controls the perception of its target by the audience, leading to immersion. Hence, art consumption combines strong decoupling with situated cognition.

  12. Psychometric Assessment of Stereoscopic Head-Mounted Displays

    DTIC Science & Technology

    2016-06-29

    Journal article; dates covered: Jan 2015 - Dec 2015. …to render an immersive three-dimensional constructive environment. The purpose of this effort was to quantify the impact of aircrew vision on …simulated tasks requiring precise depth discrimination. This work will provide an example validation method for future stereoscopic virtual immersive…

  13. Virtual Reality to Train Diagnostic Skills in Eating Disorders. Comparison of two Low Cost Systems.

    PubMed

    Gutiérrez-Maldonado, José; Ferrer-García, Marta; Pla-Sanjuanelo, Joana; Andrés-Pueyo, Antonio; Talarn-Caparrós, Antoni

    2015-01-01

    Enhancing the ability to perform differential diagnosis and psychopathological exploration is important for students who wish to work in the clinical field, as well as for professionals already working in this area. Virtual reality (VR) simulations can immerse students totally in educational experiences in a way that is not possible using other methods. Learning in a VR environment can also be more effective and motivating than usual classroom practices. Traditionally, immersion has been considered central to the quality of a VR system; immersive VR is considered a special and unique experience that cannot be achieved by three-dimensional (3D) interactions on desktop PCs. However, some authors have suggested that if the content design is emotionally engaging, immersive systems are not always necessary. The main purpose of this study is to compare the efficacy and usability of two low-cost VR systems, offering different levels of immersion, in order to develop the ability to perform diagnostic interviews in eating disorders by means of simulations of psychopathological explorations.

  14. Mobile Virtual Reality : A Solution for Big Data Visualization

    NASA Astrophysics Data System (ADS)

    Marshall, E.; Seichter, N. D.; D'sa, A.; Werner, L. A.; Yuen, D. A.

    2015-12-01

    Pursuits in the geological sciences and other branches of quantitative science often require data visualization frameworks in continual need of improvement and new ideas. Virtual reality is a visualization medium with large audiences, originally designed for gaming purposes; virtual reality can also be delivered in CAVE-like environments, but these are unwieldy and expensive to maintain. Recent efforts by major companies such as Facebook have focused on the mass market, and the Oculus Rift is among the first of this kind of mobile device. The Unity engine makes it possible for us to convert data files into a mesh of isosurfaces and render them in 3D. A user is immersed inside the virtual reality and is able to move within and around the data using arrow keys and other steering devices, similar to those employed with an Xbox. With the introduction of products like the Oculus Rift and HoloLens, combined with ever-increasing mobile computing strength, mobile virtual reality data visualization can be implemented for better analysis of 3D geological and mineralogical data sets. As more new products like the Surface Pro 4 and other high-power yet very mobile computers enter the market, the RAM and graphics card capacity necessary to run these models is more widely available, opening doors to this new reality. The computing requirements needed to run these models are a mere 8 GB of RAM and a 2 GHz CPU, which many mobile computers now exceed. Using Unity 3D software to create a virtual environment containing a visual representation of the data, any data set converted into FBX or OBJ format can be traversed while wearing the Oculus Rift device. This new method of analysis, in conjunction with 3D scanning, has potential applications in many fields, including the analysis of precious stones and jewelry. Using hologram technology to capture in high resolution the 3D shape, color, and imperfections of minerals and stones, detailed review and analysis of a stone can be done remotely without ever seeing the real thing. This strategy could be a game-changer for shoppers, who would no longer need to visit a store.
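    The pipeline in this abstract hinges on converting data sets into OBJ (or FBX) meshes that Unity can load. The Wavefront OBJ format is plain text, which makes the conversion step easy to sketch; the writer below is a minimal hypothetical illustration (function name and triangle mesh invented for the example), not the authors' tooling.

    ```python
    def write_obj(path, vertices, faces):
        """Write a minimal Wavefront OBJ file: one 'v' line per vertex (x, y, z)
        and one 'f' line per triangle. OBJ face indices are 1-based, so the
        0-based indices in `faces` are shifted by one on output."""
        with open(path, "w") as fh:
            for x, y, z in vertices:
                fh.write(f"v {x} {y} {z}\n")
            for a, b, c in faces:
                fh.write(f"f {a + 1} {b + 1} {c + 1}\n")

    # Smallest possible mesh: a single triangle in the z = 0 plane.
    write_obj("tri.obj", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
    ```

    An isosurface extraction step (e.g. marching cubes over a volumetric data set) would supply the `vertices` and `faces` lists; the resulting file can be dragged into a Unity project as an imported mesh asset.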

  15. New Technologies for Acquisition and 3-D Visualization of Geophysical and Other Data Types Combined for Enhanced Understandings and Efficiencies of Oil and Gas Operations, Deepwater Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Thomson, J. A.; Gee, L. J.; George, T.

    2002-12-01

    This presentation shows results of a visualization method used to display and analyze multiple data types in a geospatially referenced three-dimensional (3-D) space. The integrated data types include sonar and seismic geophysical data, pipeline and geotechnical engineering data, and 3-D facilities models. Visualization of these data collectively in proper 3-D orientation yields insights and synergistic understandings not previously obtainable. Key technological components of the method are: 1) high-resolution geophysical data obtained using a newly developed autonomous underwater vehicle (AUV), 2) 3-D visualization software that delivers correctly positioned display of multiple data types and full 3-D flight navigation within the data space and 3) a highly immersive visualization environment (HIVE) where multidisciplinary teams can work collaboratively to develop enhanced understandings of geospatially complex data relationships. The initial study focused on an active deepwater development area in the Green Canyon protraction area, Gulf of Mexico. Here several planned production facilities required detailed, integrated data analysis for design and installation purposes. To meet the challenges of tight budgets and short timelines, an innovative new method was developed based on the combination of newly developed technologies. Key benefits of the method include enhanced understanding of geologically complex seabed topography and marine soils yielding safer and more efficient pipeline and facilities siting. Environmental benefits include rapid and precise identification of potential locations of protected deepwater biological communities for avoidance and protection during exploration and production operations. In addition, the method allows data presentation and transfer of learnings to an audience outside the scientific and engineering team. This includes regulatory personnel, marine archaeologists, industry partners and others.

  16. Corrosion of RoHS-Compliant Surface Finishes in Corrosive Mixed Flowing Gas Environments

    NASA Astrophysics Data System (ADS)

    Hannigan, K.; Reid, M.; Collins, M. N.; Dalton, E.; Xu, C.; Wright, B.; Demirkan, K.; Opila, R. L.; Reents, W. D.; Franey, J. P.; Fleming, D. A.; Punch, J.

    2012-03-01

    Recently, the corrosion resistance of printed wiring board (PWB) finishes has generated considerable interest due to field failures observed in various parts of the world. This study investigates the corrosion issues associated with the different lead-free PWB surface finishes. Corrosion products on various PWB surface finishes generated in mixed flowing gas (MFG) environments were studied, and analysis techniques such as scanning electron microscopy, energy-dispersive x-ray, x-ray diffraction, focused ion beam, and scanning Auger microscopy were used to quantify the corrosion layer thickness and determine the composition of corrosion products. The corrosion on organic solderability preservative samples shows similar corrosion products to bare copper and is mainly due to direct attack of copper traces by corrosive gases. The corrosion on electroless nickel immersion gold occurs primarily through the porosity in the film and is accelerated by the galvanic potential between gold and copper; similar results were observed on immersion silver. Immersion tin shows excellent corrosion resistance due to its inherent corrosion resistance in the MFG environment as well as the opposite galvanic potential between tin and copper compared with gold or silver and copper.

  17. Sonic environment of aircraft structure immersed in a supersonic jet flow stream

    NASA Technical Reports Server (NTRS)

    Guinn, W. A.; Balena, F. J.; Soovere, J.

    1976-01-01

Test methods for determining the sonic environment of aircraft structure that is immersed in the flow stream of a high velocity jet, or that is subjected to the noise field surrounding the jet, were investigated. Sonic environment test data measured on a SCAT 15-F model in the flow field of Mach 1.5 and 2.5 jets were processed. Narrow band, lateral cross correlation and noise contour plots are presented. Data acquisition and reduction methods are depicted. A computer program for scaling the model data is given that accounts for model size, jet velocity, transducer size, and jet density. Comparisons of scaled model data and full size aircraft data are made for the L-1011, S-3A, and a V/STOL lower surface blowing concept. Sonic environment predictions are made for an engine-over-the-wing SST configuration.
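The report's scaling program is not reproduced in this record. As an illustrative sketch only, model-to-full-scale corrections of this kind often assume Strouhal-number frequency scaling and a density-ratio level correction; the actual program's formulas may differ:

```python
import math

def scale_frequency(f_model, v_model, v_full, l_model, l_full):
    """Scale a model-test frequency to full scale, assuming the
    Strouhal number St = f*L/V is preserved between model and aircraft
    (an assumption; not necessarily the report's exact method)."""
    return f_model * (v_full / v_model) * (l_model / l_full)

def density_level_correction(rho_model, rho_full):
    """Sound-pressure-level correction (dB) for a change in jet density,
    assuming mean-square pressure scales with the density squared."""
    return 20.0 * math.log10(rho_full / rho_model)
```

For example, a 1/10-scale model tested at the full-scale jet velocity maps a 1000 Hz model tone to 100 Hz at full scale.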

  18. The Fidelity of ’Feel’: Emotional Affordance in Virtual Environments

    DTIC Science & Technology

    2005-07-01

The Fidelity of “Feel”: Emotional Affordance in Virtual Environments. Jacquelyn Ford Morie, Josh Williams, Aimee Dozois, Donat-Pierre Luigi... environment but also the participant. We do this with the focus on what emotional affordances this manipulation will provide. Our first evaluation scenario... emotionally affective VEs. Keywords: Immersive Environments, Virtual Environments, VEs, Virtual Reality, emotion, affordance, fidelity, presence

  19. Bonding capacity of the GFRP-S on strengthened RC beams after sea water immersion

    NASA Astrophysics Data System (ADS)

    Sultan, Mufti Amir; Djamaluddin, Rudy

    2017-11-01

Concrete structures located in extreme environments, such as coastal areas, suffer reduced strength or even structural damage. Chloride contained in sea water is largely responsible for this strength reduction or failure, so maintenance and repair of concrete structures are urgently needed. One popular strengthening method under investigation uses Glass Fibre Reinforced Polymer (GFRP), which offers advantages such as corrosion resistance. This research conducted experimental studies to investigate the bonding capacity behavior of reinforced concrete beams strengthened with GFRP-S and immersed in sea water for one, three, six, and twelve months. The test specimens consisted of 12 reinforced concrete beams with dimensions of (150x200x3000) mm, reinforced with GFRP-S in the bending region: a beam without immersion (B0) and beams immersed for one month (B1), three months (B3), six months (B6), and twelve months (B12). The specimens were cured for 28 days before application of the GFRP sheet. Specimens B1, B3, B6, and B12 were immersed in a sea water pool for 1, 3, 6, and 12 months, respectively. Each specimen was then tested under static load until failure; strain gauges mounted on the surfaces of the specimen and the GFRP recorded strain values during the test. The results show that, relative to the non-immersed specimen, bonding capacity decreased in the specimens immersed for one, three, six, and twelve months by 8.85%, 8.89%, 9.33%, and 11.04%, respectively.

  20. Ecophysiological importance of cloud immersion in a relic spruce-fir forest at elevational limits, southern Appalachian Mountains, USA.

    PubMed

    Berry, Z Carter; Smith, William K

    2013-11-01

Climate warming predicts changes to the frequency and height of cloud-immersion events in mountain communities. Threatened southern Appalachian spruce-fir forests have been suggested to persist because of frequent periods of cloud immersion. These relic forests exist on only seven mountaintop areas, grow only above ca. 1,500 m elevation (maximum 2,037 m), and harbor the endemic Abies fraseri. To predict future distribution, we examined the ecophysiological effects of cloud immersion on saplings of A. fraseri and Picea rubens at their upper and lower elevational limits. Leaf photosynthesis, conductance, transpiration, xylem water potentials, and general abiotic variables were measured simultaneously on individuals at the top (1,960 m) and bottom (1,510 m) of their elevation limits on numerous clear and cloud-immersed days throughout the growing season. The high elevation sites had 1.5 times as many cloud-immersed days (75% of days) as the low elevation sites (56% of days). Cloud immersion resulted in higher photosynthesis, leaf conductance, and xylem water potentials, particularly during afternoon measurements. Leaf conductance remained higher throughout the day, with corresponding increases in photosynthesis and transpiration despite low photon flux density levels, leading to an increase in water potentials from morning to afternoon. The endemic A. fraseri showed a greater carbon-gain and water-balance response to cloud immersion. Climate models predict warmer temperatures with a decrease in the frequency of cloud immersion for this region, leading to an environment on these peaks similar to elevations where spruce-fir communities currently do not exist. Because spruce-fir communities may rely on cloud immersion for improved carbon gain and water conservation, an upslope shift is likely if cloud ceilings rise. Their ultimate survival will likely depend on the magnitude of changes in cloud regimes.

  1. Use of Visual and Proprioceptive Feedback to Improve Gait Speed and Spatiotemporal Symmetry Following Chronic Stroke: A Case Series

    PubMed Central

    Feasel, Jeff; Wentz, Erin; Brooks, Frederick P.; Whitton, Mary C.

    2012-01-01

    Background and Purpose Persistent deficits in gait speed and spatiotemporal symmetry are prevalent following stroke and can limit the achievement of community mobility goals. Rehabilitation can improve gait speed, but has shown limited ability to improve spatiotemporal symmetry. The incorporation of combined visual and proprioceptive feedback regarding spatiotemporal symmetry has the potential to be effective at improving gait. Case Description A 60-year-old man (18 months poststroke) and a 53-year-old woman (21 months poststroke) each participated in gait training to improve gait speed and spatiotemporal symmetry. Each patient performed 18 sessions (6 weeks) of combined treadmill-based gait training followed by overground practice. To assist with relearning spatiotemporal symmetry, treadmill-based training for both patients was augmented with continuous, real-time visual and proprioceptive feedback from an immersive virtual environment and a dual belt treadmill, respectively. Outcomes Both patients improved gait speed (patient 1: 0.35 m/s improvement; patient 2: 0.26 m/s improvement) and spatiotemporal symmetry. Patient 1, who trained with step-length symmetry feedback, improved his step-length symmetry ratio, but not his stance-time symmetry ratio. Patient 2, who trained with stance-time symmetry feedback, improved her stance-time symmetry ratio. She had no step-length asymmetry before training. Discussion Both patients made improvements in gait speed and spatiotemporal symmetry that exceeded those reported in the literature. Further work is needed to ascertain the role of combined visual and proprioceptive feedback for improving gait speed and spatiotemporal symmetry after chronic stroke. PMID:22228605
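The case series reports step-length and stance-time symmetry ratios without stating its exact formula; a conventional definition, sketched below under that assumption, is the ratio of the paretic to the non-paretic value, with 1.0 indicating perfect symmetry:

```python
def symmetry_ratio(paretic, nonparetic):
    """Spatiotemporal gait symmetry ratio; 1.0 = perfect symmetry.
    Assumes the conventional paretic/non-paretic definition, which the
    abstract does not explicitly confirm."""
    return paretic / nonparetic

# Example: step lengths of 0.55 m (paretic) and 0.50 m (non-paretic)
# give a ratio of 1.1, i.e., 10% asymmetry.
```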

  2. Real-time visualization of magnetic flux densities for transcranial magnetic stimulation on commodity and fully immersive VR systems

    NASA Astrophysics Data System (ADS)

    Kalivarapu, Vijay K.; Serrate, Ciro; Hadimani, Ravi L.

    2017-05-01

    Transcranial Magnetic Stimulation (TMS) is a non-invasive procedure that uses time varying short pulses of magnetic fields to stimulate nerve cells in the brain. In this method, a magnetic field generator ("TMS coil") produces small electric fields in the region of the brain via electromagnetic induction. This technique can be used to excite or inhibit firing of neurons, which can then be used for treatment of various neurological disorders such as Parkinson's disease, stroke, migraine, and depression. It is however challenging to focus the induced electric field from TMS coils to smaller regions of the brain. Since electric and magnetic fields are governed by laws of electromagnetism, it is possible to numerically simulate and visualize these fields to accurately determine the site of maximum stimulation and also to develop TMS coils that can focus the fields on the targeted regions. However, current software to compute and visualize these fields are not real-time and can work for only one position/orientation of TMS coil, severely limiting their usage. This paper describes the development of an application that computes magnetic flux densities (h-fields) and visualizes their distribution for different TMS coil position/orientations in real-time using GPU shaders. The application is developed for desktop, commodity VR (HTC Vive), and fully immersive VR CAVETM systems, for use by researchers, scientists, and medical professionals to quickly and effectively view the distribution of h-fields from MRI brain scans.
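The paper's GPU-shader field computation is not reproduced in this record. As a CPU-side sanity check of the kind of quantity being visualized, the standard Biot-Savart result for the flux density on the axis of a single circular current loop (the basic building block of a TMS coil model) can be sketched as:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def b_on_axis(current, radius, z):
    """Magnetic flux density (tesla) on the axis of a circular current
    loop of the given radius, at distance z from its center; standard
    Biot-Savart result, not the paper's shader implementation."""
    return MU0 * current * radius**2 / (2.0 * (radius**2 + z**2) ** 1.5)

# At the loop center (z = 0) this reduces to mu0*I/(2R), and the field
# falls off monotonically with distance along the axis.
```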

  3. VISUAL3D - An EIT network on visualization of geomodels

    NASA Astrophysics Data System (ADS)

    Bauer, Tobias

    2017-04-01

When it comes to interpreting data and understanding deep geological structures and bodies at different scales, modelling tools and modelling experience are vital for deep exploration. Geomodelling provides a platform for integration of different types of data, including new kinds of information (e.g., from new, improved measuring methods). EIT Raw Materials, initiated by the EIT (European Institute of Innovation and Technology) and funded by the European Commission, is the largest and strongest consortium in the raw materials sector worldwide. The VISUAL3D network of infrastructure is an initiative by EIT Raw Materials that aims at bringing together partners with 3D-4D visualisation infrastructure and 3D-4D modelling experience. The recently formed network collaboration interlinks hardware, software, and expert knowledge in model visualization and output. A special focus will be linking research, education, and industry, integrating multi-disciplinary data, and visualizing those data in three and four dimensions. By aiding network collaborations, we aim at improving the combination of geomodels with differing file formats and data characteristics. This will create increased competency in model visualization and the ability to interchange and communicate models more easily. By combining knowledge and experience in geomodelling with expertise in Virtual Reality visualization, partners of EIT Raw Materials, as well as external parties, will be able to visualize, analyze, and validate their geomodels in immersive VR environments.
The current network combines partners from universities, research institutes, geological surveys and industry with a strong background in geological 3D-modelling and 3D visualization and comprises: Luleå University of Technology, Geological Survey of Finland, Geological Survey of Denmark and Greenland, TUBA Freiberg, Uppsala University, Geological Survey of France, RWTH Aachen, DMT, KGHM Cuprum, Boliden, Montan Universität Leoben, Slovenian National Building and Civil Engineering Institute, Tallinn University of Technology and Turku University. The infrastructure within the network comprises different types of capturing and visualization hardware, ranging from high resolution cubes, VR walls, VR goggle solutions, high resolution photogrammetry, UAVs, lidar-scanners, and many more.

  4. Determination of immersion factors for radiance sensors in marine and inland waters: a semi-analytical approach using refractive index approximation

    NASA Astrophysics Data System (ADS)

    Dev, Pravin J.; Shanmugam, P.

    2016-05-01

Underwater radiometers are generally calibrated in air using a standard source. Immersion factors are required for these radiometers to account for the change in in-water measurements relative to in-air measurements due to the different refractive index of the medium. The immersion factors previously determined for the RAMSES series of commercial radiometers manufactured by TriOS are applicable to clear oceanic waters. In typical inland and turbid productive coastal waters, these experimentally determined immersion factors yield significantly large errors in water-leaving radiances (Lw) and hence remote sensing reflectances (Rrs). To overcome this limitation, a semi-analytical method based on the refractive index approximation is proposed in this study, with the aim of obtaining reliable Lw and Rrs from RAMSES radiometers for turbid and productive waters within coastal and inland water environments. We also briefly show the effect of pure-water immersion factors (Ifw) and the newly derived If on Lw and Rrs for clear and turbid waters. Remaining issues beyond the immersion factor coefficients, such as transmission and air-water and water-air Fresnel reflectances, are also discussed.
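The paper's semi-analytical, water-type-dependent coefficients are not given in the abstract. For orientation, a commonly cited closed form for the immersion factor of a radiance sensor with a plane window (e.g., in the NASA ocean optics protocols) depends only on the water and window refractive indices; the sketch below uses that textbook form, not the authors' method:

```python
def immersion_factor(n_w, n_g):
    """Commonly cited immersion factor for an in-water radiance sensor
    with a plane window: If = n_w * (n_w + n_g)**2 / (n_g + 1)**2,
    where n_w is the water and n_g the window refractive index.
    Textbook form only; the paper's coefficients are not reproduced."""
    return n_w * (n_w + n_g) ** 2 / (n_g + 1.0) ** 2

# Sanity check: in air (n_w = 1) the factor reduces to 1, i.e., the
# in-air calibration needs no correction.
```

For sea water (n_w ≈ 1.34) and a fused-silica window (n_g ≈ 1.46) this gives If ≈ 1.74, the order of magnitude typically reported for clear waters.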

  5. Renal and cardiovascular responses to water immersion in trained runners and swimmers

    NASA Technical Reports Server (NTRS)

    Convertino, V. A.; Tatro, D. L.; Rogan, R. B.

    1993-01-01

    The purpose of this study was to determine if fluid-electrolyte, renal, hormonal, and cardiovascular responses during and after multi-hour water immersion were associated with aerobic training. Additionally, we compared these responses in those who trained in a hypogravic versus a 1-g environment. Seventeen men comprised three similarly aged groups: six long-distance runners, five competitive swimmers, and six untrained control subjects. Each subject underwent 5 h of immersion in water [mean (SE)] 36.0 (0.5) degrees C to the neck. Immediately before and at each hour of immersion, blood and urine samples were collected and analyzed for sodium (Na), potassium, osmolality, and creatinine (Cr). Plasma antidiuretic hormone and aldosterone were also measured. Hematocrits were used to calculate relative changes in plasma volume (% delta Vpl). Heart rate response to submaximal cycle ergometer exercise (35% peak oxygen uptake) was measured before and after water immersion. Water immersion induced significant increases in urine flow, Na clearance (CNa), and a 3-5% decrease in Vpl. Urine flow during immersion was greater (P < 0.05) in runners [2.4 (0.4) ml.min-1] compared to controls [1.3 (0.1) ml.min-1]. However, % delta Vpl, CCr, CNa and CH2O during immersion were not different (P > 0.05) between runners, swimmers, and controls. After 5 h of immersion, there was an increase (P < 0.05) in submaximal exercise heart rate of 9 (3) and 10 (3) beats.min-1 in both runners and controls, respectively, but no change (P > 0.05) was observed in swimmers.(ABSTRACT TRUNCATED AT 250 WORDS).
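The abstract states that hematocrits were used to calculate relative plasma volume changes but does not give the equation; the van Beaumont (1972) formula is a standard choice for this calculation and is sketched below under that assumption:

```python
def pct_delta_pv(hct_pre, hct_post):
    """Percent change in plasma volume from pre/post hematocrit values
    (given in %), using the van Beaumont (1972) formula. A standard
    choice; the abstract does not state which equation was used."""
    return (100.0 / (100.0 - hct_pre)) * (100.0 * (hct_pre - hct_post) / hct_post)

# Example: a rise in hematocrit from 44% to 46% during immersion
# implies roughly a 7.8% decrease in plasma volume.
```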

  6. Low clouds and cloud immersion enhance photosynthesis in understory species of a southern Appalachian spruce-fir forest (USA).

    PubMed

    Johnson, Daniel M; Smith, William K

    2006-11-01

High-altitude forests of the southern Appalachian Mountains (USA) are frequently immersed in clouds, as are many mountain forests. They may be particularly sensitive to predicted increases in cloud base altitude with global warming. However, few studies have addressed the impacts of immersion on incident sunlight and photosynthesis. Understory sunlight (photosynthetically active radiation, PAR) was measured during clear, low cloud, and cloud-immersed conditions at Mount Mitchell and Roan Mountain, NC (USA) along with accompanying photosynthesis in four representative understory species. Understory PAR was substantially less variable on immersed vs. clear days. Photosynthesis became light-saturated between ∼100 and 400 μmol · m-2 · s-1 PAR for all species measured, corresponding closely to the sunlight environment measured during immersion. Estimated daily carbon gain was 26% greater on clear days at a more open canopy site but was 22% greater on immersed/cloudy days at a more closed canopy site. Fv/Fm (maximum photosystem II efficiency) in Abies fraseri seedlings exposed to 2.5 min full sunlight was significantly reduced (10%), indicating potential reductions in photosynthesis on clear days. In addition, photosynthesis in microsites with canopy cover was nearly 3-fold greater under immersed (2.6 mmol · m-2 · h-1) vs. clear conditions (0.9 mmol · m-2 · h-1). Thus, cloud immersion provided more constant PAR regimes that enhanced photosynthesis, especially in shaded microsites. Future studies are needed to predict the survival of these refugial forests under potential changes in cloud regimes.

  7. ARENA - A Collaborative Immersive Environment for Virtual Fieldwork

    NASA Astrophysics Data System (ADS)

    Kwasnitschka, T.

    2012-12-01

    Whenever a geoscientific study area is not readily accessible, as is the case on the deep seafloor, it is difficult to apply traditional but effective methods of fieldwork, which often require physical presence of the observer. The Artificial Research Environment for Networked Analysis (ARENA), developed at GEOMAR | Helmholtz Centre for Ocean Research Kiel within the Cluster of Excellence "The Future Ocean", provides a backend solution to robotic research on the seafloor by means of an immersive simulation environment for marine research: A hemispherical screen of 6m diameter covering the entire lower hemisphere surrounds a group of up to four researchers at once. A variety of open source (e.g. Microsoft Research World Wide Telescope) and commercial software platforms allow the interaction with e.g. in-situ recorded video, vector maps, terrain, textured geometry, point cloud and volumetric data in four dimensions. Data can be put into a holistic, georeferenced context and viewed on scales stretching from centimeters to global. Several input devices from joysticks to gestures and vocalized commands allow interaction with the simulation, depending on individual preference. Annotations added to the dataset during the simulation session catalyze the following quantitative evaluation. Both the special simulator design, making data perception a group experience, and the ability to connect remote instances or scaled down versions of ARENA over the Internet are significant advantages over established immersive simulation environments.

  8. Evaluation of knowledge transfer in an immersive virtual learning environment for the transportation community : [tech summary].

    DOT National Transportation Integrated Search

    2014-05-01

Immersive Virtual Learning Environments (IVLEs) are extensively used in training, but few rigorous scientific investigations regarding the transfer of learning have been conducted. Measurement of learning transfer through evaluative methods is key...

  9. Brave New World

    ERIC Educational Resources Information Center

    Panettieri, Joseph C.

    2007-01-01

    Across the globe, progressive universities are embracing any number of MUVEs (multi-user virtual environments), 3D environments, and "immersive" virtual reality tools. And within the next few months, several universities are expected to test so-called "telepresence" videoconferencing systems from Cisco Systems and other leading…

  10. Altering User Movement Behaviour in Virtual Environments.

    PubMed

    Simeone, Adalberto L; Mavridou, Ifigeneia; Powell, Wendy

    2017-04-01

In immersive Virtual Reality systems, users tend to move in a Virtual Environment as they would in an analogous physical environment. In this work, we investigated how user behaviour is affected when the Virtual Environment differs from the physical space. We created two sets of four environments each, plus a virtual replica of the physical environment as a baseline. The first focused on aesthetic discrepancies, such as a water surface in place of solid ground. The second focused on mixing immaterial objects together with those paired to tangible objects, for example, barring an area with walls or obstacles. We designed a study where participants had to reach three waypoints laid out in such a way as to prompt a decision on which path to follow, based on the conflict between the mismatching visual stimuli and their awareness of the real layout of the room. We analysed their performances to determine whether their trajectories deviated significantly from the shortest route. Our results indicate that participants altered their trajectories in the presence of surfaces representing higher walking difficulty (for example, water instead of grass). However, when the graphical appearance was found to be ambiguous, there was no significant trajectory alteration. The environments mixing immaterial with physical objects had the most impact on trajectories, with a mean deviation from the shortest route of 60 cm against the 37 cm of environments with aesthetic alterations. The co-existence of paired and unpaired virtual objects was reported to support the idea that all objects participants saw were backed by physical props. From these results and our observations, we derive guidelines on how to alter user movement behaviour in Virtual Environments.
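The deviation metric is not fully specified in the abstract. One plausible reading, sketched below as an assumption rather than the authors' analysis code, is the mean distance of sampled trajectory points from the straight segment between start and goal:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from 2D point p to the line segment a-b (tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:  # degenerate segment: distance to the point a
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping to the endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def mean_deviation(trajectory, start, goal):
    """Mean lateral deviation of sampled trajectory points from the
    straight (shortest) route between start and goal."""
    return sum(point_segment_distance(p, start, goal)
               for p in trajectory) / len(trajectory)
```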

  11. Constructing Image-Based Culture Definitions Using Metaphors: Impact of a Cross-Cultural Immersive Experience

    ERIC Educational Resources Information Center

    Tuleja, Elizabeth A.

    2017-01-01

    This study provides an approach to teaching and learning in the international business (IB) classroom about cultural values, beliefs, attitudes, and norms through the study of cultural metaphor. The methodology is based on established qualitative methods by using participants' visual pictures and written explanations--representative of their…

  12. The Flatworld Simulation Control Architecture (FSCA): A Framework for Scalable Immersive Visualization Systems

    DTIC Science & Technology

    2004-12-01

    handling using the X10 home automation protocol. Each 3D graphics client renders its scene according to an assigned virtual camera position. By having...control protocol. DMX is a versatile and robust framework which overcomes limitations of the X10 home automation protocol which we are currently using

  13. Compensating Scientism through "The Black Hole."

    ERIC Educational Resources Information Center

    Roth, Lane

    The focal image of the film "The Black Hole" functions as a visual metaphor for the sacred, order, unity, and eternal time. The black hole is a symbol that unites the antinomic pairs of conscious/unconscious, water/fire, immersion/emersion, death/rebirth, and hell/heaven. The black hole is further associated with the quest for…

  14. Evidence of Blocking with Geometric Cues in a Virtual Watermaze

    ERIC Educational Resources Information Center

    Redhead, Edward S.; Hamilton, Derek A.

    2009-01-01

    Three computer based experiments, testing human participants in a non-immersive virtual watermaze task, used a blocking design to assess whether two sets of geometric cues would compete in a manner described by associative models of learning. In stage 1, participants were required to discriminate between visually distinct platforms. In stage 2,…

  15. Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters

    NASA Astrophysics Data System (ADS)

    Apostolellis, Panagiotis; Daradoumis, Thanasis

Informal learning in public spaces like museums, science centers, and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like Virtual Reality systems. An extensive literature review led us to conclude that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments presents a promising approach to support effective learning in groups of school-aged children.

  16. Immersive virtual reality platform for medical training: a "killer-application".

    PubMed

    2000-01-01

The Medical Readiness Trainer (MRT) integrates fully immersive Virtual Reality (VR), highly advanced medical simulation technologies, and medical data to enable unprecedented medical education and training. The flexibility offered by the MRT environment serves as a practical teaching tool today; in the near future it will serve as an ideal vehicle for facilitating the transition to the next level of medical practice, i.e., telepresence and next-generation Internet-based collaborative learning.

  17. Assessment of refractive index of pigments by Gaussian fitting of light backscattering data in context of the liquid immersion method.

    PubMed

    Niskanen, Ilpo; Peiponen, Kai-Erik; Räty, Jukka

    2010-05-01

    Using a multifunction spectrophotometer, the refractive index of a pigment can be estimated by measuring the backscattering of light from the pigment in immersion liquids having slightly different refractive indices. A simple theoretical Gaussian function model related to the optical path distribution is introduced that makes it possible to describe quantitatively the backscattering signal from transparent pigments using a set of only a few immersion liquids. With the aid of the data fitting by a Gaussian function, the measurement time of the refractive index of the pigment can be reduced. The backscattering measurement technique is suggested to be useful in industrial measurement environments of pigments.
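The Gaussian fitting procedure itself is not reproduced in the abstract. The sketch below illustrates the underlying idea of the liquid immersion method, that backscattering is minimal when the immersion liquid's refractive index matches the pigment's, using simple parabolic interpolation around the minimum instead of a full Gaussian fit (an illustrative stand-in, not the authors' code):

```python
def fit_minimum_parabolic(ns, signal):
    """Estimate the immersion-liquid refractive index at which the
    backscattering signal is minimal (index match with the pigment),
    by parabolic interpolation through the three evenly spaced points
    around the observed minimum."""
    # Index of the smallest interior signal value.
    i = min(range(1, len(ns) - 1), key=lambda k: signal[k])
    x0, x1, x2 = ns[i - 1], ns[i], ns[i + 1]
    y0, y1, y2 = signal[i - 1], signal[i], signal[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0.0:  # flat neighborhood: return the grid minimum
        return x1
    # Vertex of the parabola through the three points (even spacing).
    return x1 + 0.5 * (x1 - x0) * (y0 - y2) / denom
```

With a symmetric dip measured at liquid indices 1.40-1.60, the estimate lands at the center of the dip, mirroring how the fitted Gaussian center yields the pigment index from only a few immersion liquids.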

  18. Development of simulation interfaces for evaluation task with the use of physiological data and virtual reality applied to a vehicle simulator

    NASA Astrophysics Data System (ADS)

    Miranda, Mateus R.; Costa, Henrik; Oliveira, Luiz; Bernardes, Thiago; Aguiar, Carla; Miosso, Cristiano; Oliveira, Alessandro B. S.; Diniz, Alberto C. G. C.; Domingues, Diana Maria G.

    2015-03-01

This paper describes an experimental platform used to evaluate the performance of individuals during training with immersive physiological games. The proposed platform is embedded in an immersive environment in a Virtual Reality CAVE and consists of a base frame with actuators with three degrees of freedom, a sensor array interface, and physiological sensors. Physiological data on breathing, galvanic skin response (GSR), and the pressure of the user's hand, together with a subjective questionnaire, were collected during the experiments. The theoretical background draws on Software Engineering, Biomedical Engineering in the field of Ergonomics, and Creative Technologies to present this case study, an evaluation of a vehicular simulator located inside the CAVE. The analysis of the simulator uses physiological data from the drivers obtained at rest and after the experience, with and without motion of the simulator. Screen images are also captured over time during the embedded experience, and data are collected through physiological data visualization (average frequency and RMS graphics), complemented by the subjective questionnaire on the lived experience provided by the technological apparatus. The immersion experience inside the CAVE makes it possible to replicate behaviors from physical spaces within a data space enhanced by physiological properties. In this context, the biocybrid condition is expanded beyond art and entertainment, as it is applied to automotive engineering and biomedical engineering. In fact, the kinesthetic sensations, amplified by synesthesia, replicate the sensation of displacement in the interior of an automobile, as well as the vibrations and vertical movements typical of a vehicle, different speeds, collisions, etc. 
The contribution of this work is the possibility of establishing a stress-analysis protocol for drivers operating a vehicle, deriving affective behaviors from physiological data combined with embedded simulation in Mixed Reality.

  19. Desktop-VR system for preflight 3D navigation training

    NASA Astrophysics Data System (ADS)

    Aoki, Hirofumi; Oman, Charles M.; Buckland, Daniel A.; Natapoff, Alan

Crews who inhabit spacecraft with complex 3D architecture frequently report inflight disorientation and navigation problems. Preflight virtual reality (VR) training may reduce those risks. Although immersive VR techniques may better support spatial orientation training in a local environment, a non-immersive desktop (DT) system may be more convenient for navigation training in "building scale" spaces, especially if the two methods achieve comparable results. In this study, trainees' orientation and navigation performance during simulated space station emergency egress tasks was compared while using immersive head-mounted display (HMD) and DT-VR systems. Analyses showed no differences in pointing angular error or egress time among the groups. The HMD group was significantly faster than the DT group when pointing from destination to start location and from start toward a different destination. However, this may be attributed to differences in the input device used (a head-tracker for the HMD group vs. a keyboard touchpad or a gamepad in the DT group). All other 3D navigation performance measures were similar using the immersive and non-immersive VR systems, suggesting that the simpler desktop VR system may be useful for astronaut 3D navigation training.

  20. Silver nanoparticles enhance wound healing in zebrafish (Danio rerio).

    PubMed

    Seo, Seung Beom; Dananjaya, S H S; Nikapitiya, Chamilani; Park, Bae Keun; Gooneratne, Ravi; Kim, Tae-Yoon; Lee, Jehee; Kim, Cheol-Hee; De Zoysa, Mahanama

    2017-09-01

Silver nanoparticles (AgNPs) were successfully synthesized by a chemical reduction method, physico-chemically characterized, and their effect on wound-healing activity in zebrafish was investigated. The prepared AgNPs were circular in shape and water soluble, with an average diameter of 72.66 nm and a zeta potential of -0.45 mV. Following the creation of a laser skin wound on zebrafish, the effect of AgNPs on wound-healing activity was tested by two methods, direct skin application (2 μg/wound) and immersion in a solution of AgNPs in water (50 μg/L). The zebrafish were followed for 20 days post-wounding (dpw) by visual observation of wound size, calculation of wound healing percentage (WHP), and histological examination. Visually, both the direct skin application and the immersion AgNPs treatments displayed clear and faster wound closure at 5, 10, and 20 dpw compared to the controls, which was confirmed by 5 dpw histology data. At 5 dpw, WHP was highest in the AgNPs immersion group (36.6%) > AgNPs direct application group (23.7%) > controls (18.2%), showing that WHP was most effective in fish immersed in AgNPs solution. In general, exposure to AgNPs induced gene expression of selected wound-healing-related genes, namely transforming growth factor (TGF-β), matrix metalloproteinases (MMP) -9 and -13, pro-inflammatory cytokines (IL-1β and TNF-α), and antioxidant enzymes (superoxide dismutase and catalase), with differential expression observed at 12 and 24 h relative to the control; however, the results were not consistently significant, and many either returned to basal levels or were down-regulated at 5 dpw in the wounded muscle. These results suggest that AgNPs accelerate wound healing and alter the expression of some wound-healing-related genes. However, the detailed mechanism of enhanced wound healing in fish remains to be investigated. Copyright © 2017 Elsevier Ltd. All rights reserved.
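The study's exact WHP formula is not given in the abstract; the conventional definition, the fraction of the initial wound area that has closed expressed as a percent, can be sketched as:

```python
def wound_healing_pct(area_initial, area_current):
    """Wound healing percentage (WHP): percent of the initial wound
    area that has closed. Conventional definition; the study's exact
    formula is not stated in the abstract."""
    return 100.0 * (area_initial - area_current) / area_initial

# Example: a wound that shrinks from 4.0 mm^2 to 2.5 mm^2 is 37.5% healed.
```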

Top