Sample records for interactive immersive visualization

  1. Adoption of the Creative Process According to the Immersive Method

    ERIC Educational Resources Information Center

    Vuk, Sonja; Tacol, Tonka; Vogrinc, Janez

    2015-01-01

    The immersive method is a new concept of visual education that is better suited to the needs of students in contemporary post-industrial society. The features of the immersive method are: (1) it emerges from interaction with visual culture; (2) it encourages understanding of contemporary art (as an integral part of visual culture); and (3) it…

  2. Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.

    PubMed

    Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E

    2007-01-01

    This paper describes a set of tools for performing measurements of objects in a virtual-reality-based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that had hitherto been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue-engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.

  3. KinImmerse: Macromolecular VR for NMR ensembles

    PubMed Central

    Block, Jeremy N; Zielinski, David J; Chen, Vincent B; Davis, Ian W; Vinson, E Claire; Brady, Rachael; Richardson, Jane S; Richardson, David C

    2009-01-01

    Background In molecular applications, virtual reality (VR) and immersive virtual environments have generally been used and valued for the visual and interactive experience – to enhance intuition and communicate excitement – rather than as part of the actual research process. In contrast, this work develops a software infrastructure for research use and illustrates such use on a specific case. Methods The Syzygy open-source toolkit for VR software was used to write the KinImmerse program, which translates the molecular capabilities of the kinemage graphics format into software for display and manipulation in the DiVE (Duke immersive Virtual Environment) or other VR system. KinImmerse is supported by the flexible display construction and editing features in the KiNG kinemage viewer and it implements new forms of user interaction in the DiVE. Results In addition to molecular visualizations and navigation, KinImmerse provides a set of research tools for manipulation, identification, co-centering of multiple models, free-form 3D annotation, and output of results. The molecular research test case analyzes the local neighborhood around an individual atom within an ensemble of nuclear magnetic resonance (NMR) models, enabling immersive visual comparison of the local conformation with the local NMR experimental data, including target curves for residual dipolar couplings (RDCs). Conclusion The promise of KinImmerse for production-level molecular research in the DiVE is shown by the locally co-centered RDC visualization developed there, which gave new insights now being pursued in wider data analysis. PMID:19222844

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric A. Wernert; William R. Sherman; Patrick O'Leary

    Immersive visualization makes use of the medium of virtual reality (VR) - it is a subset of virtual reality focused on the application of VR technologies to scientific and information visualization. As the name implies, there is a particular focus on the physically immersive aspect of VR that more fully engages the perceptual and kinesthetic capabilities of the scientist with the goal of producing greater insight. The immersive visualization community is uniquely positioned to address the analysis needs of the wide spectrum of domain scientists who are becoming increasingly overwhelmed by data. The outputs of computational science simulations and high-resolution sensors are creating a data deluge. Data is coming in faster than it can be analyzed, and there are countless opportunities for discovery that are missed as the data speeds by. By more fully utilizing the scientist's visual and other sensory systems, and by offering a more natural user interface with which to interact with computer-generated representations, immersive visualization offers great promise in taming this data torrent. However, increasing the adoption of immersive visualization in scientific research communities can only happen by simultaneously lowering the engagement threshold while raising the measurable benefits of adoption. Scientists' time spent immersed with their data will thus be rewarded with higher productivity, deeper insight, and improved creativity. Immersive visualization ties together technologies and methodologies from a variety of related but frequently disjoint areas, including hardware, software, and human-computer interaction (HCI) disciplines. In many ways, hardware is a solved problem. There are well-established technologies, including large walk-in systems such as the CAVE™ and head-based systems such as the Wide-5™.
The advent of new consumer-level technologies now enables an entirely new generation of immersive displays, with smaller footprints and costs, widening the potential consumer base. While one would be hard-pressed to call software a solved problem, we now understand considerably more about best practices for designing and developing sustainable, scalable software systems, and we have useful software examples that illuminate the way to even better implementations. As with any research endeavour, HCI will always be exploring new topics in interface design, but we now have a sizable knowledge base of the strengths and weaknesses of the human perceptual systems, and we know how to design effective interfaces for immersive systems. So, in a research landscape with a clear need for better visualization and analysis tools, a methodology in immersive visualization that has been shown to effectively address some of those needs, and vastly improved supporting technologies and knowledge of hardware, software, and HCI, why hasn't immersive visualization 'caught on' more with scientists? What can we do as a community of immersive visualization researchers and practitioners to facilitate greater adoption by scientific communities so as to make the transition from 'the promise of virtual reality' to 'the reality of virtual reality'?

  5. Spherical Panoramas for Astrophysical Data Visualization

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2017-05-01

    Data immersion has advantages in astrophysical visualization. Complex multi-dimensional data and phase spaces can be explored in a seamless and interactive viewing environment. Putting the user in the data is a first step toward immersive data analysis. We present a technique for creating 360° spherical panoramas from astrophysical data. The three-dimensional software package Blender and the Google Spatial Media module are used together to immerse users in data exploration. Several examples demonstrate how the technique works with different types of astronomical data.
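The abstract does not include code; as a hedged sketch of the geometry underlying any 360° spherical panorama (the function name and y-up axis convention are illustrative, not from the paper), the equirectangular projection maps a viewing direction to pixel coordinates:

```python
import math

def direction_to_equirect(dx, dy, dz, width, height):
    """Map a unit view direction (y-up, -z forward) to (column, row)
    in an equirectangular 360 x 180 degree panorama image."""
    lon = math.atan2(dx, -dz)                  # longitude in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, dy)))   # latitude in [-pi/2, pi/2]
    col = (lon / (2 * math.pi) + 0.5) * width  # wrap left-to-right
    row = (0.5 - lat / math.pi) * height       # top row = zenith
    return col, row
```

For example, the straight-ahead direction lands at the center of the panorama, and the zenith maps to the top row; a renderer such as Blender evaluates the inverse of this mapping for every output pixel.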

  6. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications that support the teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow users to access and visualize large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation, and gives students an environment in which to learn about Earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of its users.

  7. Immersive Visual Analytics for Transformative Neutron Scattering Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Daniel, Jamison R; Drouhard, Margaret

    The ORNL Spallation Neutron Source (SNS) provides the most intense pulsed neutron beams in the world for scientific research and development across a broad range of disciplines. SNS experiments produce large volumes of complex data that are analyzed by scientists with varying degrees of experience using 3D visualization and analysis systems. However, it is notoriously difficult to achieve proficiency with 3D visualizations. Because 3D representations are key to understanding the neutron scattering data, scientists are unable to analyze their data in a timely fashion, resulting in inefficient use of the limited and expensive SNS beam time. We believe a more intuitive interface for exploring neutron scattering data can be created by combining immersive virtual reality technology with high performance data analytics and human interaction. In this paper, we present our initial investigations of immersive visualization concepts as well as our vision for an immersive visual analytics framework that could lower the barriers to 3D exploratory data analysis of neutron scattering data at the SNS.

  8. Novel Safranin-Tinted Candida rugosa Lipase Nanoconjugates Reagent for Visualizing Latent Fingerprints on Stainless Steel Knives Immersed in a Natural Outdoor Pond.

    PubMed

    Azman, Aida Rasyidah; Mahat, Naji Arafat; Abdul Wahab, Roswanira; Abdul Razak, Fazira Ilyana; Hamzah, Hafezul Helmi

    2018-05-25

    Waterways are popular locations for the disposal of criminal evidence because the recovery of latent fingerprints from such evidence is difficult. Currently, small particle reagent is a method often used to visualize latent fingerprints, but it contains carcinogenic and hazardous compounds. This study proposes an eco-friendly, safranin-tinted Candida rugosa lipase (triacylglycerol ester hydrolase, EC 3.1.1.3) with functionalized carbon nanotubes (CRL-MWCNTS/GA/SAF) as an alternative reagent to the small particle reagent. The CRL-MWCNTS/GA/SAF reagent was compared with the small particle reagent for visualizing groomed, full fingerprints deposited on stainless steel knives immersed in a natural outdoor pond for 30 days. The quality of fingerprints visualized using the new reagent was similar (modified Centre for Applied Science and Technology grade: 4; p > 0.05) to that of the small particle reagent, even after 15 days of immersion. Despite the slight decrease in quality of fingerprints visualized using the CRL-MWCNTS/GA/SAF over the last three immersion periods, the fingerprints remained forensically identifiable (modified Centre for Applied Science and Technology grade: 3). The possible chemical interactions that enabled successful visualization are also discussed. Thus, this novel reagent may provide a relatively greener alternative for the visualization of latent fingerprints on immersed non-porous objects.

  9. The ALIVE Project: Astronomy Learning in Immersive Virtual Environments

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Sahami, K.; Denn, G.

    2008-06-01

    The Astronomy Learning in Immersive Virtual Environments (ALIVE) project seeks to discover learning modes and optimal teaching strategies using immersive virtual environments (VEs). VEs are computer-generated, three-dimensional environments that can be navigated to provide multiple perspectives. Immersive VEs provide the additional benefit of surrounding a viewer with the simulated reality. ALIVE evaluates the incorporation of an interactive, real-time "virtual universe" into formal college astronomy education. In the experiment, pre-course, post-course, and curriculum tests will be used to determine the efficacy of immersive visualizations presented in a digital planetarium versus the same visual simulations in the non-immersive setting of a normal classroom, as well as a control case using traditional classroom multimedia. To normalize for inter-instructor variability, each ALIVE instructor will teach at least one of each class in each of the three test groups.

  10. An Immersive VR System for Sports Education

    NASA Astrophysics Data System (ADS)

    Song, Peng; Xu, Shuhong; Fong, Wee Teck; Chin, Ching Ling; Chua, Gim Guan; Huang, Zhiyong

    The development of new technologies has undoubtedly promoted the advances of modern education, and Virtual Reality (VR) technologies in particular have made education more visually accessible for students. However, VR applications have focused largely on classroom education, and little research has been done on promoting sports education with VR technologies. In this paper, an immersive VR system is designed and implemented to create a more intuitive and visual way of teaching tennis. A scalable system architecture is proposed in addition to the hardware setup layout, which can be used for various immersive interactive applications such as architecture walkthroughs, military training simulations, other sports game simulations, interactive theaters, and telepresent exhibitions. A realistic interaction experience is achieved through accurate and robust hybrid tracking technology, while the virtual human opponent is animated in real time using shader-based skin deformation. Potential future extensions to improve the teaching/learning experience are also discussed.

  11. Hybrid 2-D and 3-D Immersive and Interactive User Interface for Scientific Data Visualization

    DTIC Science & Technology

    2017-08-01

    Keywords: 3-D interactive visualization, scientific visualization, virtual reality, real-time ray tracing. Abstract fragments: "…scientists to employ in the real world. Other than user-friendly software and hardware setup, scientists also need to be able to perform their usual…"; "…the VR and scientific visualization communities mostly have different research priorities. For the VR community, the ability to support real-time user…"

  12. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

    Immersive virtual reality environments such as the IQ-Station or CAVE™ (Cave Automatic Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, enabling scientists to interact with and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments requires software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration, and several methods of recording and playback are investigated, including: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE, and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.
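As a rough illustration of the LiDAR-raster integration step described above (not the authors' actual code; the function name and gridding scheme are assumptions), binning point-cloud returns onto a raster-aligned grid might look like:

```python
def rasterize_lidar(points, origin, cell, ncols, nrows):
    """Bin LiDAR (x, y, z) returns onto a raster-aligned grid,
    keeping the maximum return height per cell - a simple surface
    model that hyperspectral pixels can then be draped over."""
    dsm = [[None] * ncols for _ in range(nrows)]
    x0, y0 = origin
    for x, y, z in points:
        c = int((x - x0) / cell)   # column index on the raster grid
        r = int((y - y0) / cell)   # row index on the raster grid
        if 0 <= c < ncols and 0 <= r < nrows:
            if dsm[r][c] is None or z > dsm[r][c]:
                dsm[r][c] = z
    return dsm
```

Each non-empty cell then carries both an elevation and the spectral vector of the co-located hyperspectral pixel, which is the kind of fused "3D image cube" the abstract refers to.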

  13. Molecular Dynamics Visualization (MDV): Stereoscopic 3D Display of Biomolecular Structure and Interactions Using the Unity Game Engine.

    PubMed

    Wiebrands, Michael; Malajczuk, Chris J; Woods, Andrew J; Rohl, Andrew L; Mancera, Ricardo L

    2018-06-21

    Molecular graphics systems are visualization tools which, upon integration into a 3D immersive environment, provide a unique virtual reality experience for research and teaching of biomolecular structure, function and interactions. We have developed a molecular structure and dynamics application, the Molecular Dynamics Visualization tool, that uses the Unity game engine combined with large scale, multi-user, stereoscopic visualization systems to deliver an immersive display experience, particularly with a large cylindrical projection display. The application is structured to separate the biomolecular modeling and visualization systems. The biomolecular model loading and analysis system was developed as a stand-alone C# library and provides the foundation for the custom visualization system built in Unity. All visual models displayed within the tool are generated using Unity-based procedural mesh building routines. A 3D user interface was built to allow seamless dynamic interaction with the model while being viewed in 3D space. Biomolecular structure analysis and display capabilities are exemplified with a range of complex systems involving cell membranes, protein folding and lipid droplets.
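The tool's mesh routines are written in C# against Unity's Mesh API and are not shown in the abstract; as a hedged, language-neutral sketch of procedural mesh building, a UV-sphere generator of the sort used to render atoms could look like this (names and tessellation scheme are illustrative):

```python
import math

def uv_sphere(radius, n_lat, n_lon):
    """Generate vertex positions and triangle index triples for a
    UV sphere: n_lat latitude bands by n_lon longitude segments."""
    verts = []
    for i in range(n_lat + 1):
        theta = math.pi * i / n_lat            # 0 (north pole) .. pi
        for j in range(n_lon):
            phi = 2 * math.pi * j / n_lon
            verts.append((radius * math.sin(theta) * math.cos(phi),
                          radius * math.cos(theta),
                          radius * math.sin(theta) * math.sin(phi)))
    tris = []
    for i in range(n_lat):                     # stitch adjacent rings
        for j in range(n_lon):
            a = i * n_lon + j
            b = i * n_lon + (j + 1) % n_lon    # wrap around the seam
            c = a + n_lon
            d = b + n_lon
            tris.extend([(a, c, b), (b, c, d)])
    return verts, tris
```

In Unity the equivalent arrays would be assigned to `Mesh.vertices` and `Mesh.triangles`; the point of generating meshes procedurally is that sphere resolution, bond cylinders, and ribbon geometry can all be rebuilt on the fly as the loaded model changes.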

  14. Virtual hydrology observatory: an immersive visualization of hydrology modeling

    NASA Astrophysics Data System (ADS)

    Su, Simon; Cruz-Neira, Carolina; Habib, Emad; Gerndt, Andreas

    2009-02-01

    The Virtual Hydrology Observatory will provide students with the ability to observe an integrated hydrology simulation through an instructional interface, using either a desktop-based or an immersive virtual reality setup. The goal of the virtual hydrology observatory application is to facilitate the introduction of field experience and observational skills into hydrology courses through innovative virtual techniques that mimic activities during actual field visits. The simulation part of the application is developed from the integrated atmospheric forecast model, Weather Research and Forecasting (WRF), and the hydrology model, Gridded Surface/Subsurface Hydrologic Analysis (GSSHA). The outputs from both the WRF and GSSHA models are then used to generate the final visualization components of the Virtual Hydrology Observatory. The visualization data processing techniques provided by VTK include 2D Delaunay triangulation and data optimization. Once all the visualization components are generated, they are integrated with the simulation data using the VRFlowVis and VR Juggler software toolkits. VR Juggler is used primarily to provide the Virtual Hydrology Observatory application with a fully immersive, real-time 3D interaction experience, while VRFlowVis provides the integration framework for the hydrologic simulation data, graphical objects, and user interaction. A six-sided CAVE™-like system is used to run the Virtual Hydrology Observatory, providing students with a fully immersive experience.

  15. Time Series Data Visualization in World Wide Telescope

    NASA Astrophysics Data System (ADS)

    Fay, J.

    WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of desktop tools for interactive, immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options, including full dome, power walls, stereo, and virtual reality headsets.

  16. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

    The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional data parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life, and more recently VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.), in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D Game Engine, coded using C#, JavaScript, and the Unity Scripting Language. This visualization tool can be used through a standard web browser, or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects whose shapes, colors, sizes, and XYZ positions encode various dimensions of the parameter space and can be assigned interactively. Multiple users can navigate through this data space simultaneously, either with their own independent vantage points or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking on a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added.
We expect to make this visualization tool freely available to the academic community within a few months, on an experimental (beta testing) basis.
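As a hedged illustration of the channel-encoding idea described above (3 spatial dimensions plus shape, color, and size channels), and not the authors' actual implementation, mapping one normalized data record to visual channels might look like:

```python
# Hypothetical encoding of an 8-dimensional record into visual channels.
# Attribute names d1..d8 and the channel assignments are illustrative.
SHAPES = ["sphere", "cube", "cone", "cylinder"]

def encode_point(rec):
    """rec: dict of attributes d1..d7 normalized to [0, 1], plus an
    optional d8 label. Returns one renderable object description."""
    shape_idx = min(int(rec["d4"] * len(SHAPES)), len(SHAPES) - 1)
    return {
        "position": (rec["d1"], rec["d2"], rec["d3"]),  # 3 spatial dims
        "shape": SHAPES[shape_idx],                     # 4th dim
        "color": (rec["d5"], rec["d6"], 0.5),           # 5th and 6th dims
        "size": 0.1 + 0.9 * rec["d7"],                  # 7th dim
        "label": rec.get("d8"),                         # linked info
    }
```

A game engine then instantiates one object per record, which is what keeps ~ 100,000 points tractable: the encoding is a cheap per-record transform, and the rendering cost dominates.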

  17. Immersive Interaction, Manipulation and Analysis of Large 3D Datasets for Planetary and Earth Sciences

    NASA Astrophysics Data System (ADS)

    Pariser, O.; Calef, F.; Manning, E. M.; Ardulov, V.

    2017-12-01

    We will present an implementation and study of several use-cases of utilizing Virtual Reality (VR) for immersive display, interaction, and analysis of large and complex 3D datasets. These datasets have been acquired by instruments across several Earth, planetary, and solar space robotics missions. First, we will describe the architecture of the common application framework that was developed to input data, interface with VR display devices, and program input controllers in various computing environments. Tethered and portable VR technologies will be contrasted and the advantages of each highlighted. We will then present experimental immersive analytics visual constructs that enable augmentation of 3D datasets with 2D ones such as images and statistical and abstract data. We will conclude with a comparative analysis against traditional visualization applications and share the feedback provided by our users: scientists and engineers.

  18. 3D Immersive Visualization with Astrophysical Data

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2017-01-01

    We present the refinement of a new 3D immersion technique for astrophysical data visualization. Methodology to create 360-degree spherical panoramas is reviewed. The 3D software package Blender, coupled with Python and the Google Spatial Media module, is used to create the final data products. Data can be viewed interactively on a mobile phone or tablet, or in a web browser. The technique can be applied to different kinds of astronomical data, including 3D stellar and galaxy catalogs, images, and planetary maps.

  19. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system in its various visualization and interaction modes.

  20. JackIn Head: Immersive Visual Telepresence System with Omnidirectional Wearable Camera.

    PubMed

    Kasahara, Shunichi; Nagai, Shohei; Rekimoto, Jun

    2017-03-01

    Sharing one's own immersive experience over the Internet is one of the ultimate goals of telepresence technology. In this paper, we present JackIn Head, a visual telepresence system featuring an omnidirectional wearable camera with image motion stabilization. Spherical omnidirectional video footage taken around the head of a local user is stabilized and then broadcast to others, allowing remote users to explore the immersive visual environment independently of the local user's head direction. We describe the system design of JackIn Head and report the evaluation results of real-time image stabilization and alleviation of cybersickness. Then, through an exploratory observation study, we investigate how individuals can remotely interact with, communicate with, and assist each other with our system. We report our observation and analysis of interpersonal communication, demonstrating the effectiveness of our system in augmenting remote collaboration.
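The paper's stabilization pipeline is not reproduced here; as a simplified sketch of one component only, a pure yaw rotation of an equirectangular frame reduces to a horizontal wrap-around column shift (pitch and roll require full spherical resampling and are omitted):

```python
def stabilize_yaw(frame, yaw_degrees):
    """Counter-rotate an equirectangular frame (a list of pixel rows)
    against a measured camera yaw. Because longitude maps linearly to
    the horizontal axis, a yaw is just a wrap-around column shift."""
    width = len(frame[0])
    shift = int(round(yaw_degrees / 360.0 * width)) % width
    return [row[shift:] + row[:shift] for row in frame]
```

In the real system the head orientation would come from an IMU or visual tracking, and the shift would be applied per frame so the broadcast panorama stays world-aligned regardless of where the wearer looks.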

  21. Immersive Molecular Visualization with Omnidirectional Stereoscopic Ray Tracing and Remote Rendering

    PubMed Central

    Stone, John E.; Sherman, William R.; Schulten, Klaus

    2016-01-01

    Immersive molecular visualization provides the viewer with intuitive perception of complex structures and spatial relationships that are of critical interest to structural biologists. The recent availability of commodity head mounted displays (HMDs) provides a compelling opportunity for widespread adoption of immersive visualization by molecular scientists, but HMDs pose additional challenges due to the need for low-latency, high-frame-rate rendering. State-of-the-art molecular dynamics simulations produce terabytes of data that can be impractical to transfer from remote supercomputers, necessitating routine use of remote visualization. Hardware-accelerated video encoding has profoundly increased frame rates and image resolution for remote visualization; however, round-trip network latencies would cause simulator sickness when using HMDs. We present a novel two-phase rendering approach that overcomes network latencies with the combination of omnidirectional stereoscopic progressive ray tracing and high performance rasterization, and its implementation within VMD, a widely used molecular visualization and analysis tool. The new rendering approach enables immersive molecular visualization with rendering techniques such as shadows, ambient occlusion lighting, depth-of-field, and high quality transparency, which are particularly helpful for the study of large biomolecular complexes. We describe ray tracing algorithms that are used to optimize interactivity and quality, and we report key performance metrics of the system. The new techniques can also benefit many other application domains. PMID:27747138
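The VMD implementation itself is not shown; as a hedged sketch of the omnidirectional stereo camera model such rendering builds on, each panorama longitude gets an eye position offset tangentially by half the interpupillary distance (function name and axis conventions are illustrative):

```python
import math

def ods_ray_origin(lon, ipd, eye):
    """Ray origin and direction for omnidirectional stereo at panorama
    longitude `lon` (radians), with eye = +1 (right) or -1 (left).
    Each eye sits on a circle of radius ipd/2 around the viewer,
    offset perpendicular to the viewing direction for that column."""
    r = ipd / 2.0
    # Viewing direction for this panorama column (y-up, -z forward):
    dx, dz = math.sin(lon), -math.cos(lon)
    # Tangential eye offset, perpendicular to the view direction:
    ox, oz = eye * r * math.cos(lon), eye * r * math.sin(lon)
    return (ox, 0.0, oz), (dx, 0.0, dz)
```

Because both eyes' full panoramas are precomputed this way, the HMD can re-rasterize the view locally at display rate while the remote ray tracer refines the panorama in the background, which is how the two-phase approach hides network latency.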

  22. Radiological tele-immersion for next generation networks.

    PubMed

    Ai, Z; Dech, F; Rasmussen, M; Silverstein, J C

    2000-01-01

    Since the acquisition of high-resolution three-dimensional patient images has become widespread, medical volumetric datasets (CT or MR) larger than 100 MB and encompassing more than 250 slices are common. It is important to make this patient-specific data quickly available and usable to many specialists at different geographical sites. Web-based systems have been developed to provide volume or surface rendering of medical data over networks at low fidelity, but these cannot adequately handle stereoscopic visualization or huge datasets. State-of-the-art virtual reality techniques and high-speed networks have made it possible to create an environment in which geographically distributed clinicians can immersively share these massive datasets in real time. An object-oriented method for instantaneously importing medical volumetric data into Tele-Immersive environments has been developed at the Virtual Reality in Medicine Laboratory (VRMedLab) at the University of Illinois at Chicago (UIC). This networked-VR setup is based on LIMBO, an application framework or template that provides the basic capabilities of Tele-Immersion. We have developed a modular, general-purpose Tele-Immersion program that automatically combines 3D medical data with the methods for handling the data. For this purpose a DICOM loader for IRIS Performer has been developed. The loader was designed for SGI machines as a shared object, which is executed at LIMBO's runtime. The loader loads not only the selected DICOM dataset but also methods for rendering, handling, and interacting with the data, bringing networked, real-time, stereoscopic interaction with radiological data to reality. Collaborative, interactive methods currently implemented in the loader include cutting planes and windowing. The Tele-Immersive environment has been tested on the UIC campus over an ATM network.
We tested the environment with three nodes: an ImmersaDesk at the VRMedLab, a CAVE at the Electronic Visualization Laboratory (EVL) on the east campus, and a CT scanner in the UIC Hospital. CT data was pulled directly from the scanner to the Tele-Immersion server in our laboratory, and the data was then synchronously distributed by our Onyx2 Rack server to all the VR setups. Rather than permitting medical volume visualization at only one VR device, the Tele-Immersive environment combines teleconferencing, tele-presence, and virtual reality to enable geographically distributed clinicians to intuitively interact with the same medical volumetric models, point, gesture, converse, and see each other. This environment will bring together clinicians at different geographic locations to participate in Tele-Immersive consultation and collaboration.
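Of the collaborative methods mentioned, windowing is simple enough to sketch; the following is an illustrative implementation of the standard CT window center/width mapping, not code from the loader:

```python
def apply_window(hu, center, width):
    """Map a CT value in Hounsfield units to a display gray level in
    [0, 255] using the DICOM window center/width convention: values
    below the window clip to black, above it to white, and the window
    itself maps linearly in between."""
    lo = center - width / 2.0
    hi = center + width / 2.0
    if hu <= lo:
        return 0
    if hu >= hi:
        return 255
    return int(round((hu - lo) / (hi - lo) * 255))
```

Interactively dragging the center and width re-runs this mapping over the volume, which is why windowing pairs naturally with cutting planes as a shared, real-time exploration tool.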

  3. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. The Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data because of the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  4. Coupled auralization and virtual video for immersive multimedia displays

    NASA Astrophysics Data System (ADS)

    Henderson, Paul D.; Torres, Rendell R.; Shimizu, Yasushi; Radke, Richard; Lonsway, Brian

    2003-04-01

    The implementation of maximally-immersive interactive multimedia in exhibit spaces requires not only the presentation of realistic visual imagery but also the creation of a perceptually accurate aural experience. While conventional implementations treat audio and video problems as essentially independent, this research seeks to couple the visual sensory information with dynamic auralization in order to enhance perceptual accuracy. An implemented system has been developed for integrating accurate auralizations with virtual video techniques for both interactive presentation and multi-way communication. The current system utilizes a multi-channel loudspeaker array and real-time signal processing techniques for synthesizing the direct sound, early reflections, and reverberant field excited by a moving sound source whose path may be interactively defined in real-time or derived from coupled video tracking data. In this implementation, any virtual acoustic environment may be synthesized and presented in a perceptually-accurate fashion to many participants over a large listening and viewing area. Subject tests support the hypothesis that the cross-modal coupling of aural and visual displays significantly affects perceptual localization accuracy.

  5. Immersive virtual reality for visualization of abdominal CT

    NASA Astrophysics Data System (ADS)

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A.; Bodenheimer, Robert E.

    2013-03-01

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.
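
    Viewing scans "at their true scale" hinges on a small piece of arithmetic: converting voxel indices into physical patient-space coordinates using the scan's per-axis spacing. A minimal sketch (illustrative only; the function name is hypothetical, and real DICOM geometry also involves the ImageOrientationPatient direction cosines):

    ```python
    def voxel_to_mm(index, spacing, origin=(0.0, 0.0, 0.0)):
        """Physical position (mm) of a voxel, given per-axis voxel
        spacing (mm) and the position of voxel (0, 0, 0)."""
        return tuple(o + i * s for i, s, o in zip(index, spacing, origin))

    # A 0.5 x 0.5 mm in-plane grid with 3.0 mm slice spacing:
    print(voxel_to_mm((10, 20, 5), (0.5, 0.5, 3.0)))  # → (5.0, 10.0, 15.0)
    ```

    Rendering the resulting millimeter coordinates one-to-one in the head-mounted display's world units is what lets the virtual abdomen appear at life size.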

  6. Immersive Virtual Reality for Visualization of Abdominal CT.

    PubMed

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A; Bodenheimer, Robert E

    2013-03-28

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.

  7. Headphone and Head-Mounted Visual Displays for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Begault, Duran R.; Ellis, Stephen R.; Wenzel, Elizabeth M.; Trejo, Leonard J. (Technical Monitor)

    1998-01-01

    A realistic auditory environment can contribute to both the overall subjective sense of presence in a virtual display, and to a quantitative metric predicting human performance. Here, the role of audio in a virtual display and the importance of auditory-visual interaction are examined. Conjectures are proposed regarding the effectiveness of audio compared to visual information for creating a sensation of immersion, the frame of reference within a virtual display, and the compensation of visual fidelity by supplying auditory information. Future areas of research are outlined for improving simulations of virtual visual and acoustic spaces. This paper will describe some of the intersensory phenomena that arise during operator interaction within combined visual and auditory virtual environments. Conjectures regarding audio-visual interaction will be proposed.

  8. LibIsopach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas

    2016-12-06

    LibIsopach is a toolkit for high performance distributed immersive visualization, leveraging modern OpenGL. It features a multi-process scenegraph, explicit instance rendering, mesh generation, and three-dimensional user interaction event processing.

  9. Enhancements to VTK enabling Scientific Visualization in Immersive Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick; Jhaveri, Sankhesh; Chaudhary, Aashish

    Modern scientific, engineering, and medical computational simulations, as well as experimental and observational sensing/measuring devices, produce enormous amounts of data. While statistical analysis provides insight into this data, scientific visualization is tactically important for scientific discovery, product design, and data analysis. These benefits are impeded, however, when scientific visualization algorithms are implemented from scratch, a time-consuming and redundant process in immersive application development. This process can greatly benefit from leveraging the state-of-the-art open-source Visualization Toolkit (VTK) and its community. Over the past two (almost three) decades, integrating VTK with a virtual reality (VR) environment has been attempted with only varying degrees of success. In this paper, we demonstrate two new approaches that simplify this amalgamation of an immersive interface with visualization rendering from VTK. In addition, we cover several enhancements to VTK that provide near real-time updates and efficient interaction. Finally, we demonstrate the combination of VTK with both the Vrui and OpenVR immersive environments in example applications.

  10. Simulation Exploration through Immersive Parallel Planes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
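
    The geometric mapping this record describes, pairs of dimensions placed on a stack of parallel rectangles, can be sketched in a few lines. This is a minimal illustration with hypothetical names, not the authors' code; it assumes an even number of dimensions and known per-dimension bounds:

    ```python
    def observation_to_polyline(obs, mins, maxs, plane_spacing=1.0):
        """Map one multivariate observation to 3D polyline vertices:
        each consecutive pair of dimensions becomes an (x, y) point
        on one of a series of parallel planes stacked along z."""
        # normalize each dimension to [0, 1] within its bounds
        norm = [(v - lo) / (hi - lo) for v, lo, hi in zip(obs, mins, maxs)]
        verts = []
        for i in range(0, len(norm) - 1, 2):
            plane = i // 2
            verts.append((norm[i], norm[i + 1], plane * plane_spacing))
        return verts

    # A 4-dimensional observation becomes a 2-vertex polyline spanning two planes:
    print(observation_to_polyline([0, 5, 10, 20], [0, 0, 0, 0], [10, 10, 10, 40]))
    ```

    Rendering one such polyline per observation, and testing its vertices against brushed rectangle regions, gives the selection behavior the abstract describes.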

  11. Simulation Exploration through Immersive Parallel Planes: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas; Bush, Brian W.; Gruchalla, Kenny

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.

  12. Wireless physiological monitoring and ocular tracking: 3D calibration in a fully-immersive virtual health care environment.

    PubMed

    Zhang, Lelin; Chi, Yu Mike; Edelstein, Eve; Schulze, Jurgen; Gramann, Klaus; Velasquez, Alvaro; Cauwenberghs, Gert; Macagno, Eduardo

    2010-01-01

    Wireless physiological/neurological monitoring in virtual reality (VR) offers a unique opportunity for unobtrusively quantifying human responses to precisely controlled and readily modulated VR representations of health care environments. Here we present such a wireless, light-weight head-mounted system for measuring electrooculogram (EOG) and electroencephalogram (EEG) activity in human subjects interacting with and navigating in the Calit2 StarCAVE, a five-sided immersive 3-D visualization VR environment. The system can be easily expanded to include other measurements, such as cardiac activity and galvanic skin responses. We demonstrate the capacity of the system to track focus of gaze in 3-D and report a novel calibration procedure for estimating eye movements from responses to the presentation of a set of dynamic visual cues in the StarCAVE. We discuss cyber and clinical applications that include a 3-D cursor for visual navigation in VR interactive environments, and the monitoring of neurological and ocular dysfunction in vision/attention disorders.
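
    The calibration step this record alludes to, estimating eye movements from responses to known visual cues, reduces in its simplest single-channel form to fitting a linear map from measured EOG voltage to gaze angle. A hedged sketch using ordinary least squares (the actual StarCAVE procedure is three-dimensional and more elaborate; the names here are hypothetical):

    ```python
    def fit_linear_calibration(eog, angles):
        """Ordinary least squares for angle ≈ a * eog + b, fit from
        paired (EOG voltage, known cue angle) calibration samples."""
        n = len(eog)
        mean_x = sum(eog) / n
        mean_y = sum(angles) / n
        sxx = sum((x - mean_x) ** 2 for x in eog)
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(eog, angles))
        a = sxy / sxx                # slope: degrees per volt
        b = mean_y - a * mean_x      # offset at zero voltage
        return a, b

    # Fit against three calibration cues at known angles:
    a, b = fit_linear_calibration([1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
    print(a, b)  # → 10.0 0.0
    ```

    After calibration, gaze is estimated on each new sample as `a * voltage + b`, which is what drives a 3D cursor of the kind the authors mention.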

  13. Scalable metadata environments (MDE): artistically impelled immersive environments for large-scale data exploration

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Margolis, Todd; Prudhomme, Andrew; Schulze, Jürgen P.; Mostafavi, Iman; Lewis, J. P.; Gossmann, Joachim; Singh, Rajvikram

    2014-02-01

    Scalable Metadata Environments (MDEs) are an artistic approach to designing immersive environments for large-scale data exploration in which users interact with data by forming multiscale patterns that they alternately disrupt and reform. Developed and prototyped as part of an art-science research collaboration, we define an MDE as a 4D virtual environment structured by quantitative and qualitative metadata describing multidimensional data collections. Entire data sets (e.g., tens of millions of records) can be visualized and sonified at multiple scales and at different levels of detail so they can be explored interactively in real time within MDEs. They are designed to reflect similarities and differences in the underlying data or metadata such that patterns can be visually/aurally sorted in an exploratory fashion by an observer who is not familiar with the details of the mapping from data to visual, auditory, or dynamic attributes. While many approaches for visual and auditory data mining exist, MDEs are distinct in that they utilize qualitative and quantitative data and metadata to construct multiple interrelated conceptual coordinate systems. These "regions" function as conceptual lattices for scalable auditory and visual representations within virtual environments computationally driven by multi-GPU CUDA-enabled fluid dynamics systems.

  14. 3D Immersive Visualization: An Educational Tool in Geosciences

    NASA Astrophysics Data System (ADS)

    Pérez-Campos, N.; Cárdenas-Soto, M.; Juárez-Casas, M.; Castrejón-Pineda, R.

    2007-05-01

    3D immersive visualization is an innovative tool currently used in various disciplines, such as medicine, architecture, engineering, and video games. Recently, the Universidad Nacional Autónoma de México (UNAM) mounted a visualization theater (Ixtli) with leading-edge technology for academic and research purposes that require immersive 3D tools for a better understanding of the concepts involved. The Division of Engineering in Earth Sciences of the School of Engineering, UNAM, is running a project focused on visualization of geoscience data. Its objective is to incorporate educational material into geoscience courses in order to support and improve the teaching-learning process, especially for topics that are well known to be difficult for students. As part of the project, professors and students are trained in visualization techniques; their data are then adapted and visualized in Ixtli as part of a class or a seminar, where all the attendants can interact not only with each other but also with the object under study. As part of our results, we present specific examples used in basic geophysics courses, such as interpreted seismic cubes, seismic-wave propagation models, and structural models from bathymetric, gravimetric, and seismological data, as well as examples from ongoing applied projects, such as a modeled SH upward wave, the occurrence of an earthquake cluster in 1999 in the Popocatepetl volcano, and a risk atlas from Delegación Alvaro Obregón in Mexico City. All these examples, plus those to come, constitute a library for students and professors willing to explore another dimension of the teaching-learning process. Furthermore, this experience can be enhanced by rich discussions and interactions through videoconferences with other universities and researchers.

  15. Stereoscopic applications for design visualization

    NASA Astrophysics Data System (ADS)

    Gilson, Kevin J.

    2007-02-01

    Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinkerhoff (PB) is a transportation consulting firm that has used digital visualization tools from their inception and has helped pioneer the application of those tools to large scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system, and shutter-glass systems for smaller groups. PB is using commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo. These presentations create more immersive and spatially realistic presentations of the proposed designs. This paper will present the basic display tools and applications, and the 3D modeling techniques PB is using to produce interactive stereoscopic content. The paper will discuss several architectural and engineering design visualizations we have produced.

  16. The effect of visual and interaction fidelity on spatial cognition in immersive virtual environments.

    PubMed

    Mania, Katerina; Wooldridge, Dave; Coxon, Matthew; Robinson, Andrew

    2006-01-01

    Accuracy of memory performance per se is an imperfect reflection of the cognitive activity (awareness states) that underlies performance in memory tasks. The aim of this research is to investigate the effect of varied visual and interaction fidelity of immersive virtual environments on memory awareness states. A between groups experiment was carried out to explore the effect of rendering quality on location-based recognition memory for objects and associated states of awareness. The experimental space, consisting of two interconnected rooms, was rendered either flat-shaded or using radiosity rendering. The computer graphics simulations were displayed on a stereo head-tracked Head Mounted Display. Participants completed a recognition memory task after exposure to the experimental space and reported one of four states of awareness following object recognition. These reflected the level of visual mental imagery involved during retrieval, the familiarity of the recollection, and also included guesses. Experimental results revealed variations in the distribution of participants' awareness states across conditions while memory performance failed to reveal any. Interestingly, results revealed a higher proportion of recollections associated with mental imagery in the flat-shaded condition. These findings comply with similar effects revealed in two earlier studies summarized here, which demonstrated that the less "naturalistic" interaction interface or interface of low interaction fidelity provoked a higher proportion of recognitions based on visual mental images.

  17. The use of ambient audio to increase safety and immersion in location-based games

    NASA Astrophysics Data System (ADS)

    Kurczak, John Jason

    The purpose of this thesis is to propose an alternative type of interface for mobile software being used while walking or running. Our work addresses the problem of visual user interfaces for mobile software being potentially unsafe for pedestrians, and not being very immersive when used for location-based games. In addition, location-based games and applications can be difficult to develop when directly interfacing with the sensors used to track the user's location. These problems need to be addressed because portable computing devices are becoming a popular tool for navigation, playing games, and accessing the internet while walking. This poses a safety problem for mobile users, who may be paying too much attention to their device to notice and react to hazards in their environment. The difficulty of developing location-based games and other location-aware applications may significantly hinder the prevalence of applications that explore new interaction techniques for ubiquitous computing. We created the TREC toolkit to address the issues with tracking sensors while developing location-based games and applications. We have developed functional location-based applications with TREC to demonstrate the amount of work that can be saved by using this toolkit. In order to have a safer and more immersive alternative to visual interfaces, we have developed ambient audio interfaces for use with mobile applications. Ambient audio uses continuous streams of sound over headphones to present information to mobile users without distracting them from walking safely. In order to test the effectiveness of ambient audio, we ran a study to compare ambient audio with handheld visual interfaces in a location-based game. We compared players' ability to safely navigate the environment, their sense of immersion in the game, and their performance at the in-game tasks.
We found that ambient audio was able to significantly increase players' safety and sense of immersion compared to a visual interface, while players performed significantly better at the game tasks when using the visual interface. This makes ambient audio a legitimate alternative to visual interfaces for mobile users when safety and immersion are a priority.
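
One plausible core of such an ambient audio display is panning and attenuating a continuous sound by the target's bearing and distance from the player. A minimal sketch, not the TREC toolkit API; the function name, sign conventions, and attenuation curve are all assumptions:

```python
import math

def ambient_pan(listener_pos, listener_heading, target_pos):
    """Stereo pan (-1 = hard left, +1 = hard right, under the
    convention chosen here) and gain for a sound at target_pos,
    relative to a listener at listener_pos facing listener_heading
    (radians, 0 = along +x)."""
    dx = target_pos[0] - listener_pos[0]
    dy = target_pos[1] - listener_pos[1]
    bearing = math.atan2(dy, dx) - listener_heading
    pan = math.sin(bearing)        # lateral component of the bearing
    dist = math.hypot(dx, dy)
    gain = 1.0 / (1.0 + dist)      # simple distance attenuation
    return pan, gain

# Target directly ahead of a listener facing +x: centered pan, attenuated gain
print(ambient_pan((0.0, 0.0), 0.0, (1.0, 0.0)))  # → (0.0, 0.5)
```

Recomputing pan and gain from each GPS fix keeps the soundscape continuous without ever requiring the player to look at the screen.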

  18. Designing EvoRoom: An Immersive Simulation Environment for Collective Inquiry in Secondary Science

    NASA Astrophysics Data System (ADS)

    Lui, Michelle Mei Yee

    This dissertation investigates the design of complex inquiry for co-located students to work as a knowledge community within a mixed-reality learning environment. It presents the design of an immersive simulation called EvoRoom and corresponding collective inquiry activities that allow students to explore concepts around topics of evolution and biodiversity in a Grade 11 Biology course. EvoRoom is a room-sized simulation of a rainforest, modeled after Borneo in Southeast Asia, where several projected displays are stitched together to form a large, animated simulation on each opposing wall of the room. This serves to create an immersive environment in which students work collaboratively as individuals, in small groups and a collective community to investigate science topics using the simulations as an evidentiary base. Researchers and a secondary science teacher co-designed a multi-week curriculum that prepared students with preliminary ideas and expertise, then provided them with guided activities within EvoRoom, supported by tablet-based software as well as larger visualizations of their collective progress. Designs encompassed the broader curriculum, as well as all EvoRoom materials (e.g., projected displays, student tablet interfaces, collective visualizations) and activity sequences. This thesis describes a series of three designs that were developed and enacted iteratively over two and a half years, presenting key features that enhanced students' experiences within the immersive environment, their interactions with peers, and their inquiry outcomes. Primary research questions are concerned with the nature of effective design for such activities and environments, and the kinds of interactions that are seen at the individual, collaborative and whole-class levels. 
The findings fall under one of three themes: 1) the physicality of the room, 2) the pedagogical script for student observation and reflection and collaboration, and 3) ways of including collective visualizations in the activity. Discrete findings demonstrate how the above variables, through their design as inquiry components (i.e., activity, room, scripts and scaffolds on devices, collective visualizations), can mediate the students' interactions with one another, with their teacher, and impact the outcomes of their inquiry. A set of design recommendations is drawn from the results of this research to guide future design or research efforts.

  19. Art-Science-Technology collaboration through immersive, interactive 3D visualization

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2014-12-01

    At the W. M. Keck Center for Active Visualization in Earth Sciences (KeckCAVES), a group of geoscientists and computer scientists collaborate to develop and use interactive, immersive, 3D visualization technology to view, manipulate, and interpret data for scientific research. The visual impact of immersion in a CAVE environment can be extremely compelling, and from the outset KeckCAVES scientists have collaborated with artists to bring this technology to creative works, including theater and dance performance, installations, and gamification. The first full-fledged collaboration designed and produced a performance called "Collapse: Suddenly falling down", choreographed by Della Davidson, which investigated the human and cultural response to natural and man-made disasters. Scientific data (lidar scans of disaster sites, such as landslides and mine collapses) were fully integrated into the performance by the Sideshow Physical Theatre. This presentation will discuss both the technological and creative characteristics of, and lessons learned from, the collaboration. Many parallels between the artistic and scientific processes emerged. We observed that both artists and scientists set out to investigate a topic, solve a problem, or answer a question. Refining that question or problem is an essential part of both the creative and scientific workflow. Both artists and scientists seek understanding (in this case, understanding of natural disasters). Differences also emerged; the group noted that the scientists sought clarity (including but not limited to quantitative measurements) as a means to understanding, while the artists embraced ambiguity, also as a means to understanding. Subsequent art-science-technology collaborations have responded to evolving technology for visualization and include gamification as a means to explore data, and the use of augmented reality for informal learning in museum settings.

  20. Stereoscopic display of 3D models for design visualization

    NASA Astrophysics Data System (ADS)

    Gilson, Kevin J.

    2006-02-01

    Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinkerhoff (PB) is a transportation consulting firm that has used digital visualization tools from their inception and has helped pioneer the application of those tools to large scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system, and shutter-glass systems for smaller groups. PB is using commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo. These presentations create more immersive and spatially realistic presentations of the proposed designs. This paper will present the basic display tools and applications, and the 3D modeling techniques PB is using to produce interactive stereoscopic content. The paper will discuss several architectural and engineering design visualizations we have produced.

  1. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods by which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  2. Highly immersive virtual reality laparoscopy simulation: development and future aspects.

    PubMed

    Huber, Tobias; Wunderling, Tom; Paschold, Markus; Lang, Hauke; Kneist, Werner; Hansen, Christian

    2018-02-01

    Virtual reality (VR) applications with head-mounted displays (HMDs) have had an impact on information and multimedia technologies. The current work aimed to describe the process of developing a highly immersive VR simulation for laparoscopic surgery. We combined a VR laparoscopy simulator (LapSim) and a VR-HMD to create a user-friendly VR simulation scenario. Continuous clinical feedback was an essential aspect of the development process. We created an artificial VR (AVR) scenario by integrating the simulator video output with VR game components of figures and equipment in an operating room. We also created a highly immersive VR surrounding (IVR) by integrating the simulator video output with a [Formula: see text] video of a standard laparoscopy scenario in the department's operating room. Clinical feedback led to optimization of the visualization, synchronization, and resolution of the virtual operating rooms (in both the IVR and the AVR). Preliminary testing results revealed that individuals experienced a high degree of exhilaration and presence, with rare events of motion sickness. The technical performance showed no significant difference compared to that achieved with the standard LapSim. Our results provided a proof of concept for the technical feasibility of a custom highly immersive VR-HMD setup. Future technical research is needed to improve the visualization, immersion, and capability of interacting within the virtual scenario.

  3. Using Interactive Visualization to Analyze Solid Earth Data and Geodynamics Models

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.; Kreylos, O.; Billen, M. I.; Hamann, B.; Jadamec, M. A.; Rundle, J. B.; van Aalsburg, J.; Yikilmaz, M. B.

    2008-12-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. Major projects such as EarthScope and GeoEarthScope are producing the data needed to characterize the structure and kinematics of Earth's surface and interior at unprecedented resolution. At the same time, high-performance computing enables high-precision and fine-detail simulation of geodynamics processes, complementing the observational data. To facilitate interpretation and analysis of these datasets, to evaluate models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. VR has traditionally been used primarily as a presentation tool allowing active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for accelerated scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. Our approach to VR takes advantage of the specialized skills of geoscientists who are trained to interpret geological and geophysical data generated from field observations. Interactive tools allow the scientist to explore and interpret geodynamic models, tomographic models, and topographic observations, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulations or field observations. The use of VR technology enables us to improve our interpretation of crust and mantle structure and of geodynamical processes. Mapping tools based on computer visualization allow virtual "field studies" in inaccessible regions, and an interactive tool allows us to construct digital fault models for use in numerical models.
Using the interactive tools on a high-end platform, such as an immersive virtual reality room known as a Cave Automatic Virtual Environment (CAVE), enables the scientist to stand inside a three-dimensional dataset while taking measurements. The CAVE consists of three or more projection surfaces arranged as the walls of a room. Stereo projectors combined with a motion-tracking system recreate the immersive experience of carrying out research in the field. This high-end system provides significant advantages for scientists working with complex volumetric data. The interactive tools also work on low-cost platforms that provide stereo views and the potential for interactivity, such as a Geowall or a 3D-enabled TV. The Geowall is also a well-established tool for education, and in combination with the tools we have developed, it enables the rapid transfer of research data and new knowledge to the classroom. The interactive visualization tools can also be used on a desktop or laptop with or without stereo capability. Further information about the Virtual Reality User Interface (VRUI), the 3DVisualizer, the virtual mapping tools, and the LIDAR viewer can be found on the KeckCAVES website, www.keckcaves.org.
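    The head-tracked stereo projection that CAVE-style walls rely on can be sketched as follows. This is a minimal illustration, assuming an axis-aligned wall in tracker coordinates; the function and parameter names are hypothetical and not taken from VRUI or the KeckCAVES tools.

    ```python
    import math

    def off_axis_frustum(eye, wall_lo, wall_hi, near):
        """Frustum extents (left, right, bottom, top) at the near plane for an
        eye at (ex, ey, ez) viewing a wall that occupies the rectangle
        [x0, x1] x [y0, y1] in the plane z = 0; ez is the eye's perpendicular
        distance to the wall. Similar triangles scale wall-plane coordinates
        down to the near plane, yielding an asymmetric (off-axis) frustum."""
        ex, ey, ez = eye
        (x0, y0), (x1, y1) = wall_lo, wall_hi
        s = near / ez
        return ((x0 - ex) * s, (x1 - ex) * s, (y0 - ey) * s, (y1 - ey) * s)

    def stereo_eyes(head, ipd=0.064):
        """Left/right eye positions for a tracked head position, assuming the
        gaze is roughly perpendicular to the wall (half the inter-pupillary
        distance to each side)."""
        hx, hy, hz = head
        return (hx - ipd / 2, hy, hz), (hx + ipd / 2, hy, hz)

    # A centered head yields a symmetric frustum; a tracked, off-center eye
    # skews it so the image stays correct from the viewer's position.
    left_eye, right_eye = stereo_eyes((0.0, 0.0, 2.0))
    l, r, b, t = off_axis_frustum(left_eye, (-1.0, -1.0), (1.0, 1.0), near=0.1)
    ```

    The four returned extents are exactly what an asymmetric-frustum projection call (e.g. OpenGL's `glFrustum`) expects; recomputing them every frame from the tracked head position is what makes the walls read as windows into the dataset.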

  4. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D

    PubMed Central

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron

    2017-01-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with the concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by the addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. PMID:28814063
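    The iCAVE abstract mentions 3D extensions of known 2D layout algorithms without giving code. The general idea of lifting a force-directed (Fruchterman-Reingold-style) layout into three dimensions can be sketched as below; the function name, constants, and cooling schedule are illustrative assumptions, not iCAVE's implementation.

    ```python
    import math
    import random

    def spring_layout_3d(nodes, edges, iters=150, k=1.0, seed=7):
        """Simple 3D force-directed layout: all node pairs repel with force
        k^2/d, adjacent nodes attract with force d^2/k, and a shrinking step
        limit ("temperature") lets positions settle."""
        rng = random.Random(seed)
        pos = {v: [rng.uniform(-1, 1) for _ in range(3)] for v in nodes}
        temp = 0.1
        for _ in range(iters):
            disp = {v: [0.0, 0.0, 0.0] for v in nodes}
            vs = list(nodes)
            for i in range(len(vs)):           # pairwise repulsion
                for j in range(i + 1, len(vs)):
                    u, v = vs[i], vs[j]
                    d = [pos[u][a] - pos[v][a] for a in range(3)]
                    dist = max(math.sqrt(sum(x * x for x in d)), 1e-9)
                    f = k * k / dist
                    for a in range(3):
                        disp[u][a] += d[a] / dist * f
                        disp[v][a] -= d[a] / dist * f
            for u, v in edges:                 # attraction along edges
                d = [pos[u][a] - pos[v][a] for a in range(3)]
                dist = max(math.sqrt(sum(x * x for x in d)), 1e-9)
                f = dist * dist / k
                for a in range(3):
                    disp[u][a] -= d[a] / dist * f
                    disp[v][a] += d[a] / dist * f
            for v in nodes:                    # move, capped by temperature
                dlen = max(math.sqrt(sum(x * x for x in disp[v])), 1e-9)
                step = min(dlen, temp)
                for a in range(3):
                    pos[v][a] += disp[v][a] / dlen * step
            temp *= 0.99
        return pos

    layout = spring_layout_3d(["a", "b", "c", "d"],
                              [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")])
    ```

    The third coordinate is what a stereoscopic or CAVE renderer then maps to depth; the force model itself is unchanged from the familiar 2D case.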

  5. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D.

    PubMed

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron; Gümüs, Zeynep H

    2017-08-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with the concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by the addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. © The Authors 2017. Published by Oxford University Press.

  6. Multisensory Integration in the Virtual Hand Illusion with Active Movement

    PubMed Central

    Satoh, Satoru; Hachimura, Kozaburo

    2016-01-01

    Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. A virtual xylophone playing system which can interactively provide synchronous visual, tactile, and auditory stimulation was constructed. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality. PMID:27847822

  7. Chemistry in Second Life

    PubMed Central

    Lang, Andrew SID; Bradley, Jean-Claude

    2009-01-01

    This review will focus on the current level of chemistry research, education, and visualization possible within the multi-user virtual environment of Second Life. We discuss how Second Life has been used as a platform for the interactive and collaborative visualization of data from molecules and proteins to spectra and experimental data. We then review how these visualizations can be scripted for immersive educational activities and real-life collaborative research. We also discuss the benefits of the social networking affordances of Second Life for both chemists and chemistry students. PMID:19852781

  8. Chemistry in second life.

    PubMed

    Lang, Andrew S I D; Bradley, Jean-Claude

    2009-10-23

    This review will focus on the current level of chemistry research, education, and visualization possible within the multi-user virtual environment of Second Life. We discuss how Second Life has been used as a platform for the interactive and collaborative visualization of data from molecules and proteins to spectra and experimental data. We then review how these visualizations can be scripted for immersive educational activities and real-life collaborative research. We also discuss the benefits of the social networking affordances of Second Life for both chemists and chemistry students.

  9. Interactive and Stereoscopic Hybrid 3D Viewer of Radar Data with Gesture Recognition

    NASA Astrophysics Data System (ADS)

    Goenetxea, Jon; Moreno, Aitor; Unzueta, Luis; Galdós, Andoni; Segura, Álvaro

    This work presents an interactive and stereoscopic 3D viewer of weather information coming from a Doppler radar. The hybrid system shows a GIS model of the regional zone where the radar is located and the corresponding reconstructed 3D volume weather data. To enhance the immersiveness of the navigation, stereoscopic visualization has been added to the viewer, using a polarized-glasses-based system. The user can interact with the 3D virtual world using a Nintendo Wiimote for navigating through it and a Nintendo Wii Nunchuk for giving commands by means of hand gestures. We also present a dynamic gesture recognition procedure that measures the temporal advance of the performed gesture postures. Experimental results show how dynamic gestures are effectively recognized, so that more natural interaction and immersive navigation in the virtual world are achieved.

  10. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  11. How 3D immersive visualization is changing medical diagnostics

    NASA Astrophysics Data System (ADS)

    Koning, Anton H. J.

    2011-03-01

    Originally the only way to look inside the human body without opening it up was by means of two-dimensional (2D) images obtained using X-ray equipment. The fact that human anatomy is inherently three-dimensional leads to ambiguities in interpretation and problems of occlusion. Three-dimensional (3D) imaging modalities such as CT, MRI and 3D ultrasound remove these drawbacks and are now part of routine medical care. While most hospitals 'have gone digital', meaning that the images are no longer printed on film, the images are still being viewed on 2D screens. However, in this way valuable depth information is lost, and some interactions become unnecessarily complex or even unfeasible. Using a virtual reality (VR) system to present volumetric data means that depth information is presented to the viewer and 3D interaction is made possible. At the Erasmus MC we have developed V-Scope, an immersive volume visualization system for visualizing a variety of (bio-)medical volumetric datasets, ranging from 3D ultrasound, via CT and MRI, to confocal microscopy, OPT and 3D electron-microscopy data. In this talk we will address the advantages of such a system for both medical diagnostics and (bio)medical research.

  12. PC-Based Virtual Reality for CAD Model Viewing

    ERIC Educational Resources Information Center

    Seth, Abhishek; Smith, Shana S.-F.

    2004-01-01

    Virtual reality (VR), as an emerging visualization technology, has introduced an unprecedented communication method for collaborative design. VR refers to an immersive, interactive, multisensory, viewer-centered, 3D computer-generated environment and the combination of technologies required to build such an environment. This article introduces the…

  13. Learning immersion without getting wet

    NASA Astrophysics Data System (ADS)

    Aguilera, Julieta C.

    2012-03-01

    This paper describes the teaching of an immersive environments class in the Spring of 2011. The class had students from undergraduate as well as graduate art-related majors. Their digital backgrounds and interests were also diverse. These variables were channeled as different approaches throughout the semester. Class components included fundamentals of stereoscopic computer graphics to explore spatial depth; 3D modeling and skeleton animation to explore presence; exposure to formats like a stereo projection wall and dome environments to compare field of view across devices; and finally, interaction and tracking to explore issues of embodiment. All these components were supported by theoretical readings discussed in class. Guest artists presented their work in virtual reality, dome environments and other immersive formats. Museum professionals also introduced students to space science visualizations, which utilize immersive formats. Here I present the assignments and their outcomes, together with insights as to how the creation of immersive environments can be learned through constraints that expose students to situations of embodied cognition.

  14. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive, virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools as well as uncommon tools such as the Unity3d game engine. Game engines provide capabilities not only to explore data but also to build and interact with dynamic models collaboratively. These virtual worlds can display and illustrate data in a form that is often more understandable and plausible to both stakeholders and policy makers than traditional maps. Within this context we will present funded research that has been developed utilizing virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competencies. The results will emphasize the importance of visualization and interactive virtual environments and the link with emerging research disciplines within Visual Analytics.

  15. Understanding Immersivity: Image Generation and Transformation Processes in 3D Immersive Environments

    PubMed Central

    Kozhevnikov, Maria; Dhond, Rupali P.

    2012-01-01

    Most research on three-dimensional (3D) visual-spatial processing has been conducted using traditional non-immersive 2D displays. Here we investigated how individuals generate and transform mental images within 3D immersive (3DI) virtual environments, in which the viewers perceive themselves as being surrounded by a 3D world. In Experiment 1, we compared participants’ performance on the Shepard and Metzler (1971) mental rotation (MR) task across the following three types of visual presentation environments; traditional 2D non-immersive (2DNI), 3D non-immersive (3DNI – anaglyphic glasses), and 3DI (head mounted display with position and head orientation tracking). In Experiment 2, we examined how the use of different backgrounds affected MR processes within the 3DI environment. In Experiment 3, we compared electroencephalogram data recorded while participants were mentally rotating visual-spatial images presented in 3DI vs. 2DNI environments. Overall, the findings of the three experiments suggest that visual-spatial processing is different in immersive and non-immersive environments, and that immersive environments may require different image encoding and transformation strategies than the two other non-immersive environments. Specifically, in a non-immersive environment, participants may utilize a scene-based frame of reference and allocentric encoding whereas immersive environments may encourage the use of a viewer-centered frame of reference and egocentric encoding. These findings also suggest that MR performed in laboratory conditions using a traditional 2D computer screen may not reflect spatial processing as it would occur in the real world. PMID:22908003

  16. The effect of visual-vestibulosomatosensory conflict induced by virtual reality on postural stability in humans.

    PubMed

    Nishiike, Suetaka; Okazaki, Suzuyo; Watanabe, Hiroshi; Akizuki, Hironori; Imai, Takao; Uno, Atsuhiko; Kitahara, Tadashi; Horii, Arata; Takeda, Noriaki; Inohara, Hidenori

    2013-01-01

    In this study, we examined the effects of sensory inputs of visual-vestibulosomatosensory conflict induced by virtual reality (VR) on subjective dizziness, postural stability and visual dependency on postural control in humans. Eleven healthy young volunteers were immersed in two different VR conditions. In the control condition, subjects walked voluntarily with the background images of interactive computer graphics proportionally synchronized to their walking pace. In the visual-vestibulosomatosensory conflict condition, subjects kept still, but the background images that subjects experienced in the control condition were presented. The scores of both Graybiel's and Hamilton's criteria, postural instability and the Romberg ratio were measured before and after the two conditions. After immersion in the conflict condition, both subjective dizziness and objective postural instability were significantly increased, and the Romberg ratio, an index of the visual dependency on postural control, was slightly decreased. These findings suggest that sensory inputs of visual-vestibulosomatosensory conflict induced by VR induced motion sickness, resulting in subjective dizziness and postural instability. They also suggest that adaptation to the conflict condition decreases the contribution of visual inputs to postural control, with re-weighting of vestibulosomatosensory inputs. VR may be used as a rehabilitation tool for dizzy patients through its ability to induce sensory re-weighting of postural control.
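    The Romberg ratio used in this record is conventionally the ratio of postural sway measured with eyes closed to sway measured with eyes open, so values above 1 indicate greater visual dependency. A minimal sketch, with sway quantified here by an assumed, illustrative metric (mean distance of center-of-pressure samples from their centroid):

    ```python
    import math

    def sway(cop_path):
        """Mean radial deviation of center-of-pressure (COP) samples (x, y)
        from their centroid -- one simple scalar measure of postural sway."""
        n = len(cop_path)
        cx = sum(x for x, _ in cop_path) / n
        cy = sum(y for _, y in cop_path) / n
        return sum(math.hypot(x - cx, y - cy) for x, y in cop_path) / n

    def romberg_ratio(eyes_closed_path, eyes_open_path):
        """Ratio > 1 means more sway with eyes closed, i.e. stronger
        reliance on vision for postural control."""
        return sway(eyes_closed_path) / sway(eyes_open_path)

    open_path = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    closed_path = [(2, 0), (-2, 0), (0, 2), (0, -2)]  # twice the excursion
    ratio = romberg_ratio(closed_path, open_path)  # → 2.0
    ```

    Real posturography typically uses sway path length or sway area rather than this toy metric, but the ratio's interpretation is the same.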

  17. MinOmics, an Integrative and Immersive Tool for Multi-Omics Analysis.

    PubMed

    Maes, Alexandre; Martinez, Xavier; Druart, Karen; Laurent, Benoist; Guégan, Sean; Marchand, Christophe H; Lemaire, Stéphane D; Baaden, Marc

    2018-06-21

    Proteomic and transcriptomic technologies have produced massive biological datasets whose interpretation requires sophisticated computational strategies. Efficient and intuitive real-time analysis remains challenging. We use proteomic data on 1417 proteins of the green microalga Chlamydomonas reinhardtii to investigate the physicochemical parameters governing the selectivity of three cysteine-based redox post-translational modifications (PTMs): glutathionylation (SSG), nitrosylation (SNO) and disulphide bonds (SS) reduced by thioredoxins. We aim to understand the underlying molecular mechanisms and structural determinants through integration of redox proteome data from the gene to the structural level. Our interactive visual analytics approach on an 8.3 m² display wall of 25-megapixel resolution features stereoscopic three-dimensional (3D) representation performed by UnityMol WebGL. Virtual reality headsets complement the range of usage configurations for fully immersive tasks. Our experiments confirm that fast access to a rich cross-linked database is necessary for immersive analysis of structural data. We emphasize the possibility to display complex data structures and relationships in 3D, intrinsic to molecular structure visualization, but less common for omics-network analysis. Our setup is powered by MinOmics, an integrated analysis pipeline and visualization framework dedicated to multi-omics analysis. MinOmics integrates data from various sources into a materialized physical repository. We evaluate its performance, a design criterion for the framework.

  18. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges

    PubMed Central

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET, researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants’ behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants’ height) can be easily obtained with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother). PMID:26157414

  19. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases.

    PubMed

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-04-15

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii and its peripheral Nunchuk and Balance Board, head-mounted displays and joysticks allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes in virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.

  20. Studying social interactions through immersive virtual environment technology: virtues, pitfalls, and future challenges.

    PubMed

    Bombari, Dario; Schmid Mast, Marianne; Canadas, Elena; Bachmann, Manuel

    2015-01-01

    The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET, researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that studies conducted with IVET can replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Besides the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).

  1. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases

    PubMed Central

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-01-01

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii and its peripheral Nunchuk and Balance Board, head-mounted displays and joysticks allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes in virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients’ brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies. PMID:25206907

  2. Developing effective serious games: the effect of background sound on visual fidelity perception with varying texture resolution.

    PubMed

    Rojas, David; Kapralos, Bill; Cristancho, Sayra; Collins, Karen; Hogue, Andrew; Conati, Cristina; Dubrowski, Adam

    2012-01-01

    Despite the benefits associated with virtual learning environments and serious games, there are open, fundamental issues regarding simulation fidelity and multi-modal cue interaction and their effect on immersion, transfer of knowledge, and retention. Here we describe the results of a study that examined the effect of ambient (background) sound on the perception of visual fidelity (defined with respect to texture resolution). Results suggest that the perception of visual fidelity is dependent on ambient sound and, more specifically, that white noise can have detrimental effects on our perception of high-quality visuals. The results of this study will guide future studies that will ultimately aid in developing an understanding of the role that fidelity and multi-modal interactions play with respect to knowledge transfer and retention for users of virtual simulations and serious games.

  3. Interactive Visualization to Advance Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn

    2008-04-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual “field studies” in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations.

  4. A new multimodal interactive way of subjective scoring of 3D video quality of experience

    NASA Astrophysics Data System (ADS)

    Kim, Taewan; Lee, Kwanghyun; Lee, Sanghoon; Bovik, Alan C.

    2014-03-01

    People who watch today's 3D visual programs, such as 3D cinema, 3D TV and 3D games, experience wide and dynamically varying ranges of 3D visual immersion and 3D quality of experience (QoE). It is necessary to be able to deploy reliable methodologies that measure each viewer's subjective experience. We propose a new methodology that we call Multimodal Interactive Continuous Scoring of Quality (MICSQ). MICSQ is composed of a device interaction process between the 3D display and a separate device (PC, tablet, etc.) used as an assessment tool, and a human interaction process between the subject(s) and the device. The scoring process is multimodal, using aural and tactile cues to help engage and focus the subject(s) on their tasks. Moreover, the wireless device interaction process makes it possible for multiple subjects to assess 3D QoE simultaneously in a large space such as a movie theater, and at different visual angles and distances.

  5. Recent Advances in Immersive Visualization of Ocean Data: Virtual Reality Through the Web on Your Laptop Computer

    NASA Astrophysics Data System (ADS)

    Hermann, A. J.; Moore, C.; Soreide, N. N.

    2002-12-01

    Ocean circulation is irrefutably three dimensional, and powerful new measurement technologies and numerical models promise to expand our three-dimensional knowledge of the dynamics further each year. Yet, most ocean data and model output is still viewed using two-dimensional maps. Immersive visualization techniques allow the investigator to view their data as a three dimensional world of surfaces and vectors which evolves through time. The experience is not unlike holding a part of the ocean basin in one's hand, turning and examining it from different angles. While immersive, three dimensional visualization has been possible for at least a decade, the technology was until recently inaccessible (both physically and financially) for most researchers. It is not yet fully appreciated by practicing oceanographers how new, inexpensive computing hardware and software (e.g. graphics cards and controllers designed for the huge PC gaming market) can be employed for immersive, three dimensional, color visualization of their increasingly huge datasets and model output. In fact, the latest developments allow immersive visualization through web servers, giving scientists the ability to "fly through" three-dimensional data stored half a world away. Here we explore what additional insight is gained through immersive visualization, describe how scientists of very modest means can easily avail themselves of the latest technology, and demonstrate its implementation on a web server for Pacific Ocean model output.

  6. Hybrid Reality Lab Capabilities - Video 2

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2016-01-01

    Our Hybrid Reality and Advanced Operations Lab is developing highly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.) and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), then the information used to create the virtual scenes can be stale (i.e. visualized long after physical objects have moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the live view. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g. camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects.
The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in a virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video accompanying this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.

  7. Vroom: designing an augmented environment for remote collaboration in digital cinema production

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy

    2013-03-01

    As media technologies become increasingly affordable, compact and inherently networked, new generations of telecollaborative platforms continue to arise which integrate these new affordances. Virtual reality has been primarily concerned with creating simulations of environments that can transport participants to real or imagined spaces that replace the "real world". Meanwhile, Augmented Reality systems have evolved to interleave objects from Virtual Reality environments into the physical landscape. Perhaps now there is a new class of systems that reverse this precept to enhance dynamic media landscapes and immersive physical display environments to enable intuitive data exploration through collaboration. Vroom (Virtual Room) is a next-generation reconfigurable tiled display environment in development at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. Vroom enables freely scalable digital collaboratories, connecting distributed, high-resolution visualization resources for collaborative work in the sciences, engineering and the arts. Vroom transforms a physical space into an immersive media environment with large format interactive display surfaces, video teleconferencing and spatialized audio built on a high-speed optical network backbone. Vroom enables group collaboration for local and remote participants to share knowledge and experiences. Possible applications include: remote learning, command and control, storyboarding, post-production editorial review, high resolution video playback, 3D visualization, screencasting and image, video and multimedia file sharing. To support these various scenarios, Vroom features support for multiple user interfaces (optical tracking, touch UI, gesture interface, etc.), support for directional and spatialized audio, giga-pixel image interactivity, 4K video streaming, 3D visualization and telematic production.
This paper explains the design process that has been utilized to make Vroom an accessible and intuitive immersive environment for remote collaboration specifically for digital cinema production.

  8. Art, science, and immersion: data-driven experiences

    NASA Astrophysics Data System (ADS)

    West, Ruth G.; Monroe, Laura; Ford Morie, Jacquelyn; Aguilera, Julieta

    2013-03-01

    This panel and dialog-paper explores the potentials at the intersection of art, science, immersion and highly dimensional, "big" data to create new forms of engagement, insight and cultural forms. We will address questions such as: "What kinds of research questions can be identified at the intersection of art + science + immersive environments that can't be expressed otherwise?" "How is art+science+immersion distinct from state-of-the-art visualization?" "What does working with immersive environments and visualization offer that other approaches don't or can't?" "Where does immersion fall short?" We will also explore current trends in the application of immersion for gaming, scientific data, entertainment, simulation, social media and other new forms of big data. We ask what expressive, arts-based approaches can contribute to these forms in the broad cultural landscape of immersive technologies.

  9. Inclusion of Immersive Virtual Learning Environments and Visual Control Systems to Support the Learning of Students with Asperger Syndrome

    ERIC Educational Resources Information Center

    Lorenzo, Gonzalo; Pomares, Jorge; Lledo, Asuncion

    2013-01-01

    This paper presents the use of immersive virtual reality systems in the educational intervention with Asperger students. The starting points of this study are features of these students' cognitive style that requires an explicit teaching style supported by visual aids and highly structured environments. The proposed immersive virtual reality…

  10. Immersive Virtual Moon Scene System Based on Panoramic Camera Data of Chang'E-3

    NASA Astrophysics Data System (ADS)

    Gao, X.; Liu, J.; Mu, L.; Yan, W.; Zeng, X.; Zhang, X.; Li, C.

    2014-12-01

    The "Immersive Virtual Moon Scene" system shows a virtual environment of the lunar surface in an immersive setting. Utilizing stereo 360-degree imagery from the panoramic camera of the Yutu rover, the system enables the operator to visualize the terrain and the celestial background from the rover's point of view in 3D. To avoid image distortion, a stereo 360-degree panorama stitched from 112 images is projected onto the inside surface of a sphere according to the panorama orientation coordinates and camera parameters to build the virtual scene. Stars can be seen from the Moon at any time, so we render the Sun, planets and stars according to the time and the rover's location, based on the Hipparcos catalogue, as the background on the sphere. Immersed in the stereo virtual environment created by this image-based rendering technique, the operator can zoom and pan to interact with the virtual Moon scene and mark interesting objects. The hardware of the immersive virtual Moon system is made up of four high-lumen projectors and a large curved screen, 31 meters long and 5.5 meters high. This system, which takes all available panoramic camera data and uses it to create an immersive environment that lets the operator interact with the scene and mark interesting objects, contributed heavily to the establishment of science mission goals in the Chang'E-3 mission. After the Chang'E-3 mission, the lab housing this system will be open to the public. Besides this application, Moon terrain stereo animations based on Chang'E-1 and Chang'E-2 data will be shown to the public on the large screen in the lab. Based on lunar exploration data, we will make more immersive virtual Moon scenes and animations to help the public understand more about the Moon in the future.
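    The projection step described above can be sketched as a mapping from a viewing direction to normalized pixel coordinates on an equirectangular 360-degree panorama. This is a minimal illustration only; the function name and the coordinate conventions are assumptions, not the mission's actual stitching code.

```python
import math

def direction_to_equirect_uv(x, y, z):
    """Map a unit viewing direction (x, y, z) to normalized (u, v)
    texture coordinates on an equirectangular panorama, as used when
    projecting a stitched 360-degree image onto the inside of a
    sphere. Hypothetical helper; axis conventions are assumed."""
    lon = math.atan2(y, x)                    # azimuth in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, z)))   # elevation in [-pi/2, pi/2]
    u = (lon + math.pi) / (2.0 * math.pi)     # u = 0 at lon = -pi
    v = (math.pi / 2.0 - lat) / math.pi       # v = 0 at the zenith
    return u, v

# Looking straight ahead along +x lands at the panorama center.
print(direction_to_equirect_uv(1.0, 0.0, 0.0))  # (0.5, 0.5)
```

    The renderer would evaluate this per sphere vertex (or per fragment) to sample the stitched panorama texture.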

  11. Visualization of reservoir simulation data with an immersive virtual reality system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, B.K.

    1996-10-01

    This paper discusses an investigation into the use of an immersive virtual reality (VR) system to visualize reservoir simulation output data. The hardware and software configurations of the test-immersive VR system are described and compared to a nonimmersive VR system and to an existing workstation screen-based visualization system. The structure of 3D reservoir simulation data and the actions to be performed on the data within the VR system are discussed. The subjective results of the investigation are then presented, followed by a discussion of possible future work.

  12. Single-shot water-immersion microscopy platform for qualitative visualization and quantitative phase imaging of biosamples

    NASA Astrophysics Data System (ADS)

    Picazo-Bueno, José Ángel; Cojoc, Dan; Torre, Vincent; Micó, Vicente

    2017-07-01

    We present the combination of single-shot water-immersion digital holographic microscopy with broadband illumination for the simultaneous visualization of coherent and incoherent images, using microbeads and different biosamples.

  13. Immersive visualization of rail simulation data.

    DOT National Transportation Integrated Search

    2016-01-01

    The prime objective of this project was to create scientific, immersive visualizations of a Rail-simulation. This project is a part of a larger initiative that consists of three distinct parts. The first step consists of performing a finite element a...

  14. Enhancing Tele-robotics with Immersive Virtual Reality

    DTIC Science & Technology

    2017-11-03

    graduate and undergraduate students within the Digital Gaming and Simulation, Computer Science, and psychology programs have actively collaborated...investigates the use of artificial intelligence and visual computing. Numerous fields across the human-computer interaction and gaming research areas...invested in digital gaming and simulation to cognitively stimulate humans by computers, forming a $10.5B industry [1]. On the other hand, cognitive

  15. Scientific Visualization Made Easy for the Scientist

    NASA Astrophysics Data System (ADS)

    Westerhoff, M.; Henderson, B.

    2002-12-01

    amira is an application program used in creating 3D visualizations and geometric models of 3D image data sets from various application areas, e.g. medicine, biology, biochemistry, chemistry, physics, and engineering. It has demonstrated significant adoption in the market place since becoming commercially available in 2000. The rapid adoption has expanded the features being requested by the user base and broadened the scope of the amira product offering. The amira product offering includes amira Standard; amiraDev, used to extend the product capabilities by users; amiraMol, used for molecular visualization; amiraDeconv, used to improve quality of image data; and amiraVR, used in immersive VR environments. amira allows the user to construct a visualization tailored to his or her needs without requiring any programming knowledge. It also allows 3D objects to be represented as grids suitable for numerical simulations, notably as triangular surfaces and volumetric tetrahedral grids. The amira application also provides methods to generate such grids from voxel data representing an image volume, and it includes a general-purpose interactive 3D viewer. amiraDev provides an application-programming interface (API) that allows the user to add new components by C++ programming. amira supports many import formats including a 'raw' format allowing immediate access to your native uniform data sets. amira uses the power and speed of the OpenGL and Open Inventor graphics libraries and 3D graphics accelerators to allow you to access over 145 modules, enabling you to process, probe, analyze and visualize your data. The amiraMol extension adds powerful tools for molecular visualization to the existing amira platform. amiraMol contains support for standard molecular file formats, tools for visualization and analysis of static molecules as well as molecular trajectories (time series). amiraDeconv adds tools for the deconvolution of 3D microscopic images.
Deconvolution is the process of increasing image quality and resolution by computationally compensating for artifacts of the recording process. amiraDeconv supports 3D wide-field microscopy as well as 3D confocal microscopy. It offers both non-blind and blind image deconvolution algorithms. Non-blind deconvolution uses an individually measured point spread function, while blind algorithms work on the basis of only a few recording parameters (such as numerical aperture or zoom factor). amiraVR is a specialized and extended version of the amira visualization system dedicated to use in immersive installations, such as large-screen stereoscopic projections, CAVE or Holobench systems. Among others, it supports multi-threaded multi-pipe rendering, head-tracking, advanced 3D interaction concepts, and 3D menus allowing interaction with any amira object in the same way as on the desktop. With its unique set of features, amiraVR represents both a VR (Virtual Reality)-ready application for scientific and medical visualization in immersive environments, and a development platform that allows building VR applications.
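As an illustration of the non-blind case (a measured point spread function), here is a toy 1D Richardson-Lucy deconvolution. Richardson-Lucy is a standard algorithm chosen here only for illustration; the record does not specify which algorithms amiraDeconv actually implements.

```python
def correlate_same(signal, kernel):
    """'Same'-size 1D correlation; kernel is assumed odd-length."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < len(signal):
                acc += w * signal[j]
        out.append(acc)
    return out

def richardson_lucy_1d(observed, psf, iterations=50):
    """Toy non-blind Richardson-Lucy: iteratively refine the estimate
    so that estimate * psf matches the observed (blurred) signal."""
    total = sum(psf)
    psf = [w / total for w in psf]          # normalize the PSF
    flipped = psf[::-1]                     # convolution = correlation with flipped kernel
    mean = sum(observed) / len(observed)
    estimate = [mean] * len(observed)       # flat initial guess
    for _ in range(iterations):
        blurred = correlate_same(estimate, flipped)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        update = correlate_same(ratio, psf)
        estimate = [e * u for e, u in zip(estimate, update)]
    return estimate

# Blur a single spike with the PSF, then recover its location.
psf = [0.25, 0.5, 0.25]
spike = [0.0] * 21
spike[10] = 1.0
observed = correlate_same(spike, psf)   # symmetric PSF: conv == corr
restored = richardson_lucy_1d(observed, psf)
print(restored.index(max(restored)))  # 10
```

Blind variants instead estimate the PSF jointly from a parametric model of the optics, which is why they need only a few recording parameters.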

  16. The Worldviews Network: Transformative Global Change Education in Immersive Environments

    NASA Astrophysics Data System (ADS)

    Hamilton, H.; Yu, K. C.; Gardiner, N.; McConville, D.; Connolly, R.; Irving, L. S.

    2011-12-01

    Our modern age is defined by an astounding capacity to generate scientific information. From DNA to dark matter, human ingenuity and technologies create an endless stream of data about ourselves and the world of which we are a part. Yet we largely founder in transforming information into understanding, and understanding into rational action for our society as a whole. Earth and biodiversity scientists are especially frustrated by this impasse because the data they gather often point to a clash between Earth's capacity to sustain life and the decisions that humans make to garner the planet's resources. Immersive virtual environments offer an underexplored link in the translation of scientific data into public understanding, dialogue, and action. The Worldviews Network is a collaboration of scientists, artists, and educators focused on developing best practices for the use of immersive environments for science-based ecological literacy education. A central tenet of the Worldviews Network is that there are multiple ways to know and experience the world, so we are developing scientifically accurate, geographically relevant, and culturally appropriate programming to promote ecological literacy within informal science education programs across the United States. The goal of the Worldviews Network is to offer transformative learning experiences, in which participants are guided through a process integrating immersive visual explorations, critical reflection and dialogue, and design-oriented approaches to action - or more simply, seeing, knowing, and doing. Our methods center on live presentations, interactive scientific visualizations, and sustainability dialogues hosted at informal science institutions. Our approach uses datasets from the life, Earth, and space sciences to illuminate the complex conditions that support life on Earth and the ways in which ecological systems interact.
We are leveraging scientific data from federal agencies, non-governmental organizations, and our own research to develop a library of immersive visualization stories and templates that explore ecological relationships across time at cosmic, global, and bioregional scales, with learning goals aligned to climate and earth science literacy principles. These experiential narratives are used to increase participants' awareness of global change issues as well as to engage them in dialogues and design processes focused on steps they can take within their own communities to systemically address these interconnected challenges. More than 600 digital planetariums in the U.S. collectively represent a pioneering opportunity for distributing Earth systems messages over large geographic areas. By placing the viewer-and Earth itself-within the context of the rest of the universe, digital planetariums can uniquely provide essential transcalar perspectives on the complex interdependencies of Earth's interacting physical and biological systems. The Worldviews Network is creating innovative, data-driven approaches for engaging the American public in dialogues about human-induced global changes.

  17. The forensic holodeck: an immersive display for forensic crime scene reconstructions.

    PubMed

    Ebert, Lars C; Nguyen, Tuan T; Breitbeck, Robert; Braun, Marcel; Thali, Michael J; Ross, Steffen

    2014-12-01

    In forensic investigations, crime scene reconstructions are created based on a variety of three-dimensional image modalities. Although the data gathered are three-dimensional, their presentation on computer screens and paper is two-dimensional, which incurs a loss of information. By applying immersive virtual reality (VR) techniques, we propose a system that allows a crime scene to be viewed as if the investigator were present at the scene. We used a low-cost VR headset originally developed for computer gaming in our system. The headset offers a large viewing volume and tracks the user's head orientation in real-time, and an optical tracker is used for positional information. In addition, we created a crime scene reconstruction to demonstrate the system. In this article, we present a low-cost system that allows immersive, three-dimensional and interactive visualization of forensic incident scene reconstructions.
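    The tracking setup described above fuses two sources: head orientation from the gaming headset and position from the external optical tracker. A minimal sketch of composing the two into a single rigid pose follows; the function names and matrix conventions are assumptions, not the paper's implementation.

```python
def compose_pose(rotation, position):
    """Combine a 3x3 head-orientation matrix (e.g. from an HMD's
    inertial tracker) with a position from an external optical
    tracker into one 4x4 rigid pose (row-major, assumed convention)."""
    pose = [row[:] + [p] for row, p in zip(rotation, position)]
    pose.append([0.0, 0.0, 0.0, 1.0])
    return pose

def apply_pose(pose, point):
    """Transform a 3D point (homogeneous w = 1) by the pose."""
    x, y, z = point
    return tuple(r[0] * x + r[1] * y + r[2] * z + r[3] for r in pose[:3])

# With no head rotation, the viewpoint simply sits at the tracked position.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
head_pose = compose_pose(identity, [1.0, 2.0, 3.0])
print(apply_pose(head_pose, (0.0, 0.0, 0.0)))  # (1.0, 2.0, 3.0)
```

    The renderer would invert this pose each frame to obtain the view matrix for the reconstructed scene.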

  18. Planning, Implementation and Optimization of Future Space Missions using an Immersive Visualization Environment (IVE) Machine

    NASA Astrophysics Data System (ADS)

    Harris, E.

    Planning, Implementation and Optimization of Future Space Missions using an Immersive Visualization Environment (IVE) Machine E. N. Harris, Lockheed Martin Space Systems, Denver, CO and George W. Morgenthaler, U. of Colorado at Boulder History: A team of 3-D engineering visualization experts at the Lockheed Martin Space Systems Company has developed innovative virtual prototyping simulation solutions for ground processing and real-time visualization of design and planning of aerospace missions over the past 6 years. At the University of Colorado, a team of 3-D visualization experts is developing the science of 3-D visualization and immersive visualization at the newly founded BP Center for Visualization, which began operations in October 2001. (See IAF/IAA-01-13.2.09, "The Use of 3-D Immersive Visualization Environments (IVEs) to Plan Space Missions," G. A. Dorn and G. W. Morgenthaler.) Progressing from Today's 3-D Engineering Simulations to Tomorrow's 3-D IVE Mission Planning, Simulation and Optimization Techniques: 3-D IVEs and visualization simulation tools can be combined for efficient planning and design engineering of future aerospace exploration and commercial missions. This technology is currently being developed and will be demonstrated by Lockheed Martin in the IVE at the BP Center using virtual simulation for clearance checks, collision detection, ergonomics and reachability analyses to develop fabrication and processing flows for spacecraft and launch vehicle ground support operations and to optimize mission architecture and vehicle design subject to realistic constraints. Demonstrations: Immediate aerospace applications to be demonstrated include developing streamlined processing flows for Reusable Space Transportation Systems and Atlas Launch Vehicle operations and Mars Polar Lander visual work instructions.
Long-range goals include future international human and robotic space exploration missions such as the development of a Mars Reconnaissance Orbiter and Lunar Base construction scenarios. Innovative solutions utilizing Immersive Visualization provide the key to streamlining the mission planning and optimizing engineering design phases of future aerospace missions.

  19. The Hologram in My Hand: How Effective is Interactive Exploration of 3D Visualizations in Immersive Tangible Augmented Reality?

    PubMed

    Bach, Benjamin; Sicat, Ronell; Beyer, Johanna; Cordeil, Maxime; Pfister, Hanspeter

    2018-01-01

    We report on a controlled user study comparing three visualization environments for common 3D exploration. Our environments differ in how they exploit natural human perception and interaction capabilities. We compare an augmented-reality head-mounted display (Microsoft HoloLens), a handheld tablet, and a desktop setup. The novel head-mounted HoloLens display projects stereoscopic images of virtual content into a user's real world and allows for interaction in-situ at the spatial position of the 3D hologram. The tablet is able to interact with 3D content through touch, spatial positioning, and tangible markers; however, 3D content is still presented on a 2D surface. Our hypothesis is that visualization environments that better match human perceptual and interaction capabilities to the task at hand improve understanding of 3D visualizations. To better understand the space of display and interaction modalities in visualization environments, we first propose a classification based on three dimensions: perception, interaction, and the spatial and cognitive proximity of the two. Each technique in our study is located at a different position along these three dimensions. We asked 15 participants to perform four tasks, each task having different levels of difficulty for both spatial perception and degrees of freedom for interaction. Our results show that each of the tested environments is more effective for certain tasks, but that generally the desktop environment is still fastest and most precise in almost all cases.

  20. OnSight: Multi-platform Visualization of the Surface of Mars

    NASA Astrophysics Data System (ADS)

    Abercrombie, S. P.; Menzies, A.; Winter, A.; Clausen, M.; Duran, B.; Jorritsma, M.; Goddard, C.; Lidawer, A.

    2017-12-01

    A key challenge of planetary geology is to develop an understanding of an environment that humans cannot (yet) visit. Instead, scientists rely on visualizations created from images sent back by robotic explorers, such as the Curiosity Mars rover. OnSight is a multi-platform visualization tool that helps scientists and engineers to visualize the surface of Mars. Terrain visualization allows scientists to understand the scale and geometric relationships of the environment around the Curiosity rover, both for scientific understanding and for tactical consideration in safely operating the rover. OnSight includes a web-based 2D/3D visualization tool, as well as an immersive mixed reality visualization. In addition, OnSight offers a novel feature for communication among the science team. Using the multiuser feature of OnSight, scientists can meet virtually on Mars to discuss geology in a shared spatial context. Combining web-based visualization with immersive visualization allows OnSight to leverage the strengths of both platforms. This project demonstrates how 3D visualization can be adapted to either an immersive environment or a computer screen; we discuss the advantages and disadvantages of both platforms.

  1. The Selimiye Mosque of Edirne, Turkey - AN Immersive and Interactive Virtual Reality Experience Using Htc Vive

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A. P.

    2017-05-01

    Recent advances in contemporary Virtual Reality (VR) technologies are going to have a significant impact on everyday life. Through VR it is possible to virtually explore a computer-generated environment as a different reality, and to immerse oneself into the past or in a virtual museum without leaving the current real-life situation. For the ultimate VR experience, the user should see only the virtual world. Currently, the user must wear a VR headset which fits around the head and over the eyes to visually separate themselves from the physical world. Via the headset, images are fed to the eyes through two small lenses. Cultural heritage monuments are ideally suited both for thorough multi-dimensional geometric documentation and for realistic interactive visualisation in immersive VR applications. Additionally, the game industry offers tools for interactive visualisation of objects to motivate users to virtually visit objects and places. In this paper the generation of a virtual 3D model of the Selimiye mosque in the city of Edirne, Turkey and its processing for data integration into the game engine Unity is presented. The project has been carried out as a co-operation between BİMTAŞ, a company of the Greater Municipality of Istanbul, Turkey and the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg, Germany to demonstrate an immersive and interactive visualisation using the new VR system HTC Vive. The workflow from data acquisition to VR visualisation, including the necessary programming for navigation, is described. Furthermore, the possible use (including simultaneous multi-user environments) of such a VR visualisation for a CH monument is discussed in this contribution.

  2. Virtual environment display for a 3D audio room simulation

    NASA Technical Reports Server (NTRS)

    Chapin, William L.; Foster, Scott H.

    1992-01-01

    The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations.

  3. Real-time, interactive, visually updated simulator system for telepresence

    NASA Technical Reports Server (NTRS)

    Schebor, Frederick S.; Turney, Jerry L.; Marzwell, Neville I.

    1991-01-01

    Time delays and limited sensory feedback of remote telerobotic systems tend to disorient teleoperators and dramatically decrease the operator's performance. To remove the effects of time delays, we designed and developed key components of a prototype forward simulation subsystem, the Global-Local Environment Telerobotic Simulator (GLETS), that buffers the operator from the remote task. GLETS totally immerses an operator in a real-time, interactive, simulated, visually updated artificial environment of the remote telerobotic site. Using GLETS, the operator will, in effect, enter into a telerobotic virtual reality and can easily form a gestalt of the virtual 'local site' that matches the operator's normal interactions with the remote site. In addition to its use in space-based telerobotics, GLETS, due to its extendable architecture, can also be used in other teleoperational environments such as toxic material handling, construction, and undersea exploration.

  4. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    Aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual reality system capable of immersing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary tests of the proposed system are encouraging.

  5. Visualization of molecular structures using HoloLens-based augmented reality

    PubMed Central

    Hoffman, MA; Provance, JB

    2017-01-01

    Biological molecules and biologically active small molecules are complex three-dimensional structures. Current flat-screen monitors are limited in their ability to convey the full three-dimensional characteristics of these molecules. Augmented reality devices, including the Microsoft HoloLens, offer an immersive platform to change how we interact with molecular visualizations. We describe a process to incorporate the three-dimensional structures of small molecules and complex proteins into the Microsoft HoloLens using aspirin and the human leukocyte antigen (HLA) as examples. Small molecular structures can be introduced into the HoloStudio application, which provides native support for rotating, resizing, and performing other interactions with these molecules. Larger molecules can be imported through the Unity gaming development platform and then Microsoft Visual Developer. The processes described here can be modified to import a wide variety of molecular structures into augmented reality systems and improve our comprehension of complex structural features. PMID:28815109

  6. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    PubMed Central

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-01-01

    In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680

  8. Planning, implementation and optimization of future space missions using an immersive visualization environment (IVE) machine

    NASA Astrophysics Data System (ADS)

    Nathan Harris, E.; Morgenthaler, George W.

    2004-07-01

    Beginning in 1995, a team of 3-D engineering visualization experts assembled at the Lockheed Martin Space Systems Company and began to develop innovative virtual prototyping simulation tools for performing ground processing and real-time visualization of design and planning of aerospace missions. At the University of Colorado (CU), a team of 3-D visualization experts also began developing the science of 3-D and immersive visualization at the newly founded British Petroleum (BP) Center for Visualization, which began operations in October 2001. BP acquired ARCO in 2000 and awarded the flexible 3-D IVE developed by ARCO (beginning in 1990) to the University of Colorado, the winner of a competition among six universities. CU then hired Dr. G. Dorn, the leader of the ARCO team, as Center Director, along with the other experts, to apply 3-D immersive visualization to aerospace and other university research fields, while continuing research on surface interpretation of seismic data and 3-D volumes. This paper recounts further progress and outlines plans in aerospace applications at Lockheed Martin and CU.

  9. How virtual reality works: illusions of vision in "real" and virtual environments

    NASA Astrophysics Data System (ADS)

    Stark, Lawrence W.

    1995-04-01

    Visual illusions abound in normal vision--illusions of clarity and completeness, of continuity in time and space, of presence and vivacity--and are part and parcel of the visual world in which we live. These illusions are discussed in terms of the human visual system, with its high-resolution fovea, moved from point to point in the visual scene by rapid saccadic eye movements (EMs). This sampling of visual information is supplemented by a low-resolution, wide peripheral field of view, especially sensitive to motion. Cognitive-spatial models controlling perception, imagery, and 'seeing' also control the EMs that shift the fovea in the Scanpath mode. These illusions provide for presence, the sense of being within an environment. They equally well lead to 'telepresence,' the sense of being within a virtual display, especially if the operator is intensely interacting within an eye-hand and head-eye human-machine interface that provides congruent visual and motor frames of reference. Interaction, immersion, and interest compel telepresence; intuitive functioning and engineered information flows can optimize human adaptation to the artificial new world of virtual reality as it expands into entertainment, simulation, telerobotics, scientific visualization, and other professional work.

  10. Immersive Training Systems: Virtual Reality and Education and Training.

    ERIC Educational Resources Information Center

    Psotka, Joseph

    1995-01-01

    Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR; analyzes cognitive variables connected to immersion, how it is generated in synthetic environments, and its benefits. Discusses the value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…

  11. Journey to the centre of the cell: Virtual reality immersion into scientific data.

    PubMed

    Johnston, Angus P R; Rae, James; Ariotti, Nicholas; Bailey, Benjamin; Lilja, Andrew; Webb, Robyn; Ferguson, Charles; Maher, Sheryl; Davis, Thomas P; Webb, Richard I; McGhee, John; Parton, Robert G

    2018-02-01

    Visualization of scientific data is crucial not only for scientific discovery but also to communicate science and medicine to both experts and a general audience. Until recently, we have been limited to visualizing the three-dimensional (3D) world of biology in two dimensions. Renderings of 3D cells are still traditionally displayed using two-dimensional (2D) media, such as on a computer screen or paper. However, the advent of consumer-grade virtual reality (VR) headsets such as the Oculus Rift and HTC Vive means it is now possible to visualize and interact with scientific data in a 3D virtual world. In addition, new microscopic methods provide an unprecedented opportunity to obtain new 3D data sets. In this perspective article, we highlight how we have used cutting-edge imaging techniques to build a 3D virtual model of a cell from serial block-face scanning electron microscope (SBEM) imaging data. This model allows scientists, students and members of the public to explore and interact with a "real" cell. Early testing of this immersive environment indicates a significant improvement in students' understanding of cellular processes and points to a new future of learning and public engagement. In addition, we speculate that VR can become a new tool for researchers studying cellular architecture and processes by populating VR models with molecular data. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Augmented Virtuality: A Real-time Process for Presenting Real-world Visual Sensory Information in an Immersive Virtual Environment for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.

    2017-12-01

    Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, a virtual object is projected into the real world with which researchers can interact. There are several limitations to purely VR or AR applications when taken within the context of remote planetary exploration. For example, in a purely VR environment, contents of the planet surface (e.g. rocks, terrain, or other features) must be created off-line from a multitude of images using image processing techniques to generate 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real-time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames will lack 3D visual information, i.e. depth information. In this paper, we present a technique that utilizes a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real-time into the virtual environment.
Notice the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purposes of taking screenshots.

  13. Research on spatial features of streets under the influence of immersion communication technology brought by new media

    NASA Astrophysics Data System (ADS)

    Xu, Hua-wei; Feng, Chen

    2017-04-01

    The rapid development of new media has exacerbated the complexity of information interaction in urban street space. Under the influence of immersion communication, the streetscape has constructed a special scene like ‘media convergence’, which has brought a huge challenge for maintaining urban streetscape order. The spatial visual communication research method, which breaks the limitations of traditional aesthetic space research, can provide a brand-new perspective for researching this phenomenon. This study aims to analyze and summarize the communication characteristics of new media and their context, which will be helpful for understanding the social meaning within the change of order in the street’s spatial and physical environment.

  14. Integrating Intelligence and Building Teams Within the Infantry Immersion Trainer

    DTIC Science & Technology

    2009-09-01

    understanding that a motivated learner (trainee) has the potential to be the best learner (trainee). Yuhas et al. (2008) provided a list of...unit member needs to be fully trained and engaged and directly supporting the strategy of “every Marine collector, every Marine reporter.” Marines also...recreation of the battlefield environment—physically, visually, aurally, and aromatically. Mission actions are carried out through interaction with

  15. Terrain Modelling for Immersive Visualization for the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, J.; Hartman, F.; Cooper, B.; Maxwell, S.; Yen, J.; Morrison, J.

    2004-01-01

    Immersive environments are being used to support mission operations at the Jet Propulsion Laboratory. This technology contributed to the Mars Pathfinder Mission in planning sorties for the Sojourner rover and is being used for the Mars Exploration Rover (MER) missions. The stereo imagery captured by the rovers is used to create 3D terrain models, which can be viewed from any angle, to provide a powerful and information rich immersive visualization experience. These technologies contributed heavily to both the mission success and the phenomenal level of public outreach achieved by Mars Pathfinder and MER. This paper will review the utilization of terrain modelling for immersive environments in support of MER.
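
    The stereo-to-terrain step this record describes rests on standard triangulation: for an idealized rectified stereo pair, the depth of a feature is the focal length times the camera baseline divided by the pixel disparity between the two views. A minimal sketch follows; the focal length, baseline, and disparity values are illustrative assumptions, not actual rover camera parameters.

```python
# Hypothetical sketch of depth recovery from a rectified stereo pair,
# the geometric basis for building terrain meshes from rover imagery.
# Values below are illustrative, not real MER camera parameters.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth (m) of a point seen with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 1000-px focal length, 0.2 m baseline.
# A feature with 10 px disparity lies at 1000 * 0.2 / 10 = 20 m.
print(depth_from_disparity(10.0, 1000.0, 0.2))  # 20.0
```

    Repeating this per matched pixel yields a point cloud, which is then triangulated into the 3D terrain model viewed in the immersive environment.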

  16. Influence of moving visual environment on sit-to-stand kinematics in children and adults.

    PubMed

    Slaboda, Jill C; Barton, Joseph E; Keshner, Emily A

    2009-08-01

    The effect of visual field motion on the sit-to-stand kinematics of adults and children was investigated. Children (8 to 12 years of age) and adults (21 to 49 years of age) were seated in a virtual environment that rotated in the pitch and roll directions. Participants stood up either (1) concurrent with onset of visual motion or (2) after an immersion period in the moving visual environment, and (3) without visual input. Angular velocities of the head with respect to the trunk, and of the trunk with respect to the environment, were calculated, as were the head and trunk centers of mass. Both adults and children reduced head and trunk angular velocity after immersion in the moving visual environment. Unlike adults, children demonstrated significant differences in displacement of the head center of mass during the immersion and concurrent trials when compared to trials without visual input. Results suggest a time-dependent effect of vision on sit-to-stand kinematics in adults, whereas children are influenced by the immediate presence or absence of vision.

  17. ConfocalVR: Immersive Visualization Applied to Confocal Microscopy.

    PubMed

    Stefani, Caroline; Lacy-Hulbert, Adam; Skillman, Thomas

    2018-06-24

    ConfocalVR is a virtual reality (VR) application created to improve the ability of researchers to study the complexity of cell architecture. Confocal microscopes take pictures of fluorescently labeled proteins or molecules at different focal planes to create a stack of 2D images throughout the specimen. Current software applications reconstruct the 3D image and render it as a 2D projection onto a computer screen, where users need to rotate the image to expose the full 3D structure. This process is mentally taxing, breaks down if you stop the rotation, and does not take advantage of the eye's full field of view. ConfocalVR exploits consumer-grade VR systems to fully immerse the user in the 3D cellular image. In this virtual environment the user can: 1) adjust image viewing parameters without leaving the virtual space, 2) reach out and grab the image to quickly rotate and scale it to focus on key features, and 3) interact with other users in a shared virtual space, enabling real-time collaborative exploration and discussion. We found that immersive VR technology allows the user to rapidly understand cellular architecture and protein or molecule distribution. We note that it is impossible to understand the value of immersive visualization without experiencing it first hand, so we encourage readers to get access to a VR system, download this software, and evaluate it for yourself. The ConfocalVR software is available for download at http://www.confocalvr.com, and is free for nonprofits. Copyright © 2018. Published by Elsevier Ltd.
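
    The reconstruction step the abstract alludes to (a stack of 2D focal-plane images assembled into a 3D volume that a renderer can then display) can be sketched in a few lines. This is a generic illustration with synthetic data, not ConfocalVR's actual pipeline, and the function names are ours:

```python
import numpy as np

# Stack confocal focal-plane slices into a (z, y, x) volume, then flatten
# it back to 2D with a maximum-intensity projection -- the kind of screen
# rendering the abstract contrasts with immersive viewing.

def stack_slices(slices):
    """Stack equally sized 2D focal-plane images into a (z, y, x) volume."""
    return np.stack(slices, axis=0)

def max_intensity_projection(volume, axis=0):
    """A simple 2D projection of the volume, as a flat screen would show it."""
    return volume.max(axis=axis)

# Synthetic stand-ins for fluorescence images: slice z has intensity z.
slices = [np.full((64, 64), z, dtype=np.float32) for z in range(10)]
volume = stack_slices(slices)           # shape (10, 64, 64)
mip = max_intensity_projection(volume)  # shape (64, 64), all values 9.0
```

    A VR renderer instead keeps the full volume and lets the user walk around it, which is what removes the need for the mentally taxing rotate-and-project step.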

  18. An optimized web-based approach for collaborative stereoscopic medical visualization

    PubMed Central

    Kaspar, Mathias; Parsad, Nigel M; Silverstein, Jonathan C

    2013-01-01

    Objective Medical visualization tools have traditionally been constrained to tethered imaging workstations or proprietary client viewers, typically part of hospital radiology systems. To improve accessibility to real-time, remote, interactive, stereoscopic visualization and to enable collaboration among multiple viewing locations, we developed an open source approach requiring only a standard web browser with no added client-side software. Materials and Methods Our collaborative, web-based, stereoscopic, visualization system, CoWebViz, has been used successfully for the past 2 years at the University of Chicago to teach immersive virtual anatomy classes. It is a server application that streams server-side visualization applications to client front-ends, comprised solely of a standard web browser with no added software. Results We describe optimization considerations, usability, and performance results, which make CoWebViz practical for broad clinical use. We clarify technical advances including: enhanced threaded architecture, optimized visualization distribution algorithms, a wide range of supported stereoscopic presentation technologies, and the salient theoretical and empirical network parameters that affect our web-based visualization approach. Discussion The implementations demonstrate usability and performance benefits of a simple web-based approach for complex clinical visualization scenarios. Using this approach overcomes technical challenges that require third-party web browser plug-ins, resulting in the most lightweight client. Conclusions Compared to special software and hardware deployments, unmodified web browsers enhance remote user accessibility to interactive medical visualization. 
Whereas local hardware and software deployments may provide better interactivity than remote applications, our implementation demonstrates that a simplified, stable, client approach using standard web browsers is sufficient for high quality three-dimensional, stereoscopic, collaborative and interactive visualization. PMID:23048008

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drouhard, Margaret MEG G; Steed, Chad A; Hahn, Steven E

    In this paper, we propose strategies and objectives for immersive data visualization with applications in materials science using the Oculus Rift virtual reality headset. We provide background on currently available analysis tools for neutron scattering data and other large-scale materials science projects. In the context of the current challenges facing scientists, we discuss immersive virtual reality visualization as a potentially powerful solution. We introduce a prototype immersive visualization system, developed in conjunction with materials scientists at the Spallation Neutron Source, which we have used to explore large crystal structures and neutron scattering data. Finally, we offer our perspective on the greatest challenges that must be addressed to build effective and intuitive virtual reality analysis tools that will be useful for scientists in a wide range of fields.

  20. The role of vestibular and support-tactile-proprioceptive inputs in visual-manual tracking

    NASA Astrophysics Data System (ADS)

    Kornilova, Ludmila; Naumov, Ivan; Glukhikh, Dmitriy; Khabarova, Ekaterina; Pavlova, Aleksandra; Ekimovskiy, Georgiy; Sagalovitch, Viktor; Smirnov, Yuriy; Kozlovskaya, Inesa

    Sensorimotor disorders in weightlessness are caused by changes in the functioning of gravity-dependent systems, first of all the vestibular and support systems. The question arises: what are the role and the specific contribution of support afferentation in the development of the observed disorders? To determine the role and effects of vestibular, support, tactile, and proprioceptive afferentation on characteristics of visual-manual tracking (VMT), we conducted a comparative analysis of data obtained after prolonged spaceflight and in a model of weightlessness, horizontal “dry” immersion. Altogether we examined 16 Russian cosmonauts before and after prolonged spaceflights (129-215 days) and 30 subjects who stayed in an immersion bath for 5-7 days, evaluating the state of the vestibular function (VF) using videooculography and characteristics of the VMT using electrooculography and a joystick with biological visual feedback. Evaluation of the VF showed that both after immersion and after prolonged spaceflight there was a significant decrease of the static torsional otolith-cervical-ocular reflex (OCOR) and a simultaneous significant increase of the dynamic vestibular-cervical-ocular reactions (VCOR), with a negative correlation between parameters of the otolith and canal reactions, as well as significant changes in accuracy of perception of the subjective visual vertical, which correlated with changes in OCOR. Analysis of the VMT showed that significant disorders of visual tracking (VT) occurred from the beginning of immersion up to days 3-4, while in cosmonauts similar but much more pronounced oculomotor disorders and significant changes from baseline were observed up to day R+9 postflight. Significant changes of manual tracking (MT) were revealed only for gain and occurred on days 1 and 3 of immersion, while after spaceflight such changes were observed up to day R+5 postflight.
We found correlations between characteristics of the VT and MT, and between characteristics of the VF and VT, but no correlation between VF and MT. It was found that removal of support and minimization of proprioceptive afferentation have a greater impact on the accuracy of the VT than on the accuracy of the MT. Hand tracking accuracy was higher than eye tracking accuracy for all subjects. The hand's motor coordination was more stable to changes in support-proprioceptive afferentation than visual tracking was. The changes observed in and after immersion are similar to, but less pronounced than, those observed in cosmonauts after prolonged spaceflight. Keywords: visual-manual tracking, vestibular function, weightlessness, immersion.

  1. Scientific work environments in the next decade

    NASA Technical Reports Server (NTRS)

    Gomez, Julian E.

    1989-01-01

    The applications of contemporary computer graphics to scientific visualization are described, with emphasis on the nonintuitive problems. A radically different approach is proposed which centers on the idea of the scientist being in the simulation display space rather than observing it on a screen. Interaction is performed with nonstandard input devices to preserve the feeling of being immersed in the three-dimensional display space. Construction of such a system could begin now with currently available technology.

  2. High End Visualization of Geophysical Datasets Using Immersive Technology: The SIO Visualization Center.

    NASA Astrophysics Data System (ADS)

    Newman, R. L.

    2002-12-01

    How many images can you display at one time with PowerPoint without getting "postage stamps"? Do you have fantastic datasets that you cannot view because your computer is too slow/small? Do you assume a few 2-D images of a 3-D picture are sufficient? High-end visualization centers can minimize and often eliminate these problems. The new visualization center [http://siovizcenter.ucsd.edu] at Scripps Institution of Oceanography [SIO] immerses users into a virtual world by projecting 3-D images onto a Panoram GVR-120E wall-sized floor-to-ceiling curved screen [7' x 23'] that has 3.2 mega-pixels of resolution. The Infinite Reality graphics subsystem is driven by a single-pipe SGI Onyx 3400 with a system bandwidth of 44 Gbps. The Onyx is powered by 16 MIPS R12K processors and 16 GB of addressable memory. The system is also equipped with transmitters and LCD shutter glasses which permit stereographic 3-D viewing of high-resolution images. This center is ideal for groups of up to 60 people who can simultaneously view these large-format images. A wide range of hardware and software is available, giving the users a totally immersive working environment in which to display, analyze, and discuss large datasets. The system enables simultaneous display of video and audio streams from sources such as SGI megadesktop and stereo megadesktop, S-VHS video, DVD video, and video from a Macintosh or PC. For instance, one-third of the screen might be displaying S-VHS video from a remotely-operated vehicle [ROV], while the remaining portion of the screen might be used for an interactive 3-D flight over the same parcel of seafloor. The video and audio combinations using this system are numerous, allowing users to combine and explore data and images in innovative ways, greatly enhancing scientists' ability to visualize, understand and collaborate on complex datasets.
In the not-too-distant future, with the rapid growth in networking speeds in the US, it will be possible for Earth sciences departments to collaborate effectively while limiting the amount of physical travel required. This includes porting visualization content to the popular, low-cost GeoWall visualization systems, and providing web-based access to databanks filled with stock geoscience visualizations.

  3. Development of Techniques for Visualization of Scalar and Vector Fields in the Immersive Environment

    NASA Technical Reports Server (NTRS)

    Bidasaria, Hari B.; Wilson, John W.; Nealy, John E.

    2005-01-01

    Visualization of scalar and vector fields in the immersive environment (CAVE - Cave Automated Virtual Environment) is important for its application to radiation shielding research at NASA Langley Research Center. A complete methodology and the underlying software for this purpose have been developed. The developed software has been put to use for the visualization of the Earth's magnetic field, and in particular for the study of the South Atlantic Anomaly. The methodology has also been put to use for the visualization of geomagnetically trapped protons and electrons within Earth's magnetosphere.

  4. The Worldviews Network: Digital Planetariums for Engaging Public Audiences in Global Change Issues

    NASA Astrophysics Data System (ADS)

    Wyatt, R. J.; Koontz, K.; Yu, K.; Gardiner, N.; Connolly, R.; Mcconville, D.

    2013-12-01

    Utilizing the capabilities of digital planetariums, the Denver Museum of Nature & Science, the California Academy of Sciences, NOVA/WGBH, The Elumenati, and affiliates of the National Oceanic & Atmospheric Administration formed the Worldviews Network. The network's mission is to place Earth in its cosmic context to encourage participants to explore connections between social & ecological issues in their backyards. Worldviews launched with informal science institution partners: the American Museum of Natural History, the Perot Museum of Nature & Science, the Journey Museum, the Bell Museum of Natural History, the University of Michigan Natural History Museum, and the National Environmental Modeling & Analysis Center. Worldviews uses immersive visualization technology to engage public audiences on issues of global environmental change at a bioregional level. An immersive planetarium show and dialogue deepens public engagement and awareness of complex human-natural system interactions. People have altered the global climate system. Our communities are increasingly vulnerable to extreme weather events. Land use decisions that people make every day put both human lives and biodiversity at risk through direct and indirect effects. The Worldviews programs demonstrate the complex linkages between Earth's physical and biological systems and their relationship to human health, agriculture, infrastructure, water resources, and energy. We have focused on critical thresholds, such as freshwater use, biodiversity loss, land use change, and anthropogenic changes to the nitrogen and phosphorus cycles. We have been guided by environmental literacy principles to help our audiences understand that humans drive current trends in coupled human-natural systems--and that humans could choose to play an important role in reversing these trends. 
Museum and planetarium staff members join the Worldviews Network team and external advisers to produce programs that span cosmic, global, and bioregional scales. Each presentation employs a 'See, Know, Do' transformative learning model. 'Seeing' involves the creation, presentation, and experience of viewing immersive visualizations within the planetarium to engage visitors' visual-spatial intelligence. For 'Knowing,' the narratives are constructed to help visitors understand the web of physical-ecological-social systems that interact on Earth. The 'Doing' component emerges from interaction among participants: for example, researchers and non-governmental organizations help audience members conceive of their own relationship to the highlighted issue and ways they may remain involved in systemically addressing problems the audience identifies.

  5. Solving Fluid Structure Interaction Problems with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Barad, Michael F.; Brehm, Christoph; Kiris, Cetin C.

    2016-01-01

    An immersed boundary method for the compressible Navier-Stokes equations that can be used for moving boundary problems as well as fully coupled fluid-structure interaction is presented. The underlying Cartesian immersed boundary method of the Launch Ascent and Vehicle Aerodynamics (LAVA) framework, based on the locally stabilized immersed boundary method previously presented by the authors, is extended to account for unsteady boundary motion and coupled to linear and geometrically nonlinear structural finite element solvers. The approach is validated for moving boundary problems with prescribed body motion and fully coupled fluid-structure interaction problems. Keywords: Immersed Boundary Method, Higher-Order Finite Difference Method, Fluid Structure Interaction.

  6. An Examination of the Effects of Collaborative Scientific Visualization via Model-Based Reasoning on Science, Technology, Engineering, and Mathematics (STEM) Learning within an Immersive 3D World

    ERIC Educational Resources Information Center

    Soleimani, Ali

    2013-01-01

    Immersive 3D worlds can be designed to effectively engage students in peer-to-peer collaborative learning activities, supported by scientific visualization, to help with understanding complex concepts associated with learning science, technology, engineering, and mathematics (STEM). Previous research studies have shown STEM learning benefits…

  7. Virtual reality: a reality for future military pilotage?

    NASA Astrophysics Data System (ADS)

    McIntire, John P.; Martinsen, Gary L.; Marasco, Peter L.; Havig, Paul R.

    2009-05-01

Virtual reality (VR) systems provide exciting new ways to interact with information and with the world. The visual VR environment can be synthetic (computer generated) or be an indirect view of the real world using sensors and displays. With the potential opportunities of a VR system, the question arises about what benefits or detriments a military pilot might incur by operating in such an environment. Immersive and compelling VR displays could be accomplished with an HMD (e.g., imagery on the visor), large area collimated displays, or by putting the imagery on an opaque canopy. But what issues arise when, instead of viewing the world directly, a pilot views a "virtual" image of the world? Is 20/20 visual acuity in a VR system good enough? To deliver this acuity over the entire visual field would require over 43 megapixels (MP) of display surface for an HMD or about 150 MP for an immersive CAVE system, either of which presents a serious challenge with current technology. Additionally, the same number of sensor pixels would be required to drive the displays at this resolution (along with formidable network architectures to relay this information), or massive computer clusters would be necessary to create an entirely computer-generated virtual reality at this resolution. Can we presently implement such a system? What other visual requirements or engineering issues should be considered? With the evolving technology, there are many technological issues and human factors considerations that need to be addressed before a pilot is placed within a virtual cockpit.
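The quoted pixel counts follow from treating 20/20 acuity as roughly one arcminute per pixel. A quick check (the field-of-view figures below are assumptions chosen to be consistent with the abstract's 43 MP and ~150 MP numbers):

```python
# Back-of-the-envelope pixel budget for 20/20 acuity (~1 arcminute per pixel).
# FOV values are assumptions: ~120 x 100 deg for a full-visual-field HMD,
# and a 360 x ~115 deg surround for a CAVE-style system.
def megapixels(fov_h_deg, fov_v_deg, arcmin_per_px=1.0):
    px_h = fov_h_deg * 60 / arcmin_per_px   # 60 arcminutes per degree
    px_v = fov_v_deg * 60 / arcmin_per_px
    return px_h * px_v / 1e6

print(round(megapixels(120, 100)))  # ~43 MP for an HMD
print(round(megapixels(360, 115)))  # ~149 MP for a CAVE-style surround display
```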

  8. Authentic Astronomical Discovery in Planetariums: Data-Driven Immersive Lectures

    NASA Astrophysics Data System (ADS)

    Wyatt, Ryan Jason

    2018-01-01

Planetariums are akin to “branch offices” for astronomy in major cities and other locations around the globe. With immersive, fulldome video technology, modern digital planetariums offer the opportunity to integrate authentic astronomical data into both pre-recorded shows and live lectures. At the California Academy of Sciences Morrison Planetarium, we host the monthly Benjamin Dean Astronomy Lecture Series, which features researchers describing their cutting-edge work to well-informed lay audiences. The Academy’s visualization studio and engineering teams work with researchers to visualize their data in both pre-rendered and real-time formats, and these visualizations are integrated into a variety of programs, including lectures. The assets are then made available to any other planetariums with similar software to support their programming. A lecturer can thus give the same immersive presentation to audiences in a variety of planetariums. The Academy has also collaborated with Chicago’s Adler Planetarium to bring the Kavli Fulldome Lecture Series to San Francisco, and the two theaters have also linked together in live “domecasts” to share real-time content with audiences in both cities. These lecture series and other, similar projects suggest a bright future for astronomers to bring their research to the public in an immersive and visually compelling format.

  9. Study on Collaborative Object Manipulation in Virtual Environment

    NASA Astrophysics Data System (ADS)

    Mayangsari, Maria Niken; Yong-Moo, Kwon

This paper presents a comparative study of network collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared through several experiments. The user tests on our system cover several cases: 1) comparison between non-haptic and haptic collaborative interaction over LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments.

  10. Color stability of ceramic brackets immersed in potentially staining solutions

    PubMed Central

    Guignone, Bruna Coser; Silva, Ludimila Karsbergen; Soares, Rodrigo Villamarim; Akaki, Emilio; Goiato, Marcelo Coelho; Pithon, Matheus Melo; Oliveira, Dauro Douglas

    2015-01-01

OBJECTIVE: To assess the color stability of five types of ceramic brackets after immersion in potentially staining solutions. METHODS: Ninety brackets were divided into 5 groups (n = 18) according to the brackets' commercial brands and the solutions in which they were immersed (coffee, red wine, coke and artificial saliva). The brackets assessed were Transcend (3M/Unitek, Monrovia, CA, USA), Radiance (American Orthodontics, Sheboygan, WI, USA), Mystique (GAC International Inc., Bohemia, NY, USA) and Luxi II (Rocky Mountain Orthodontics, Denver, CO, USA). Chromatic changes were analyzed with the aid of a reflectance spectrophotometer and by visual inspection at five specific time intervals. Assessment periods were as received from the manufacturer (T0), 24 hours (T1), 72 hours (T2), as well as 7 days (T3) and 14 days (T4) of immersion in the aforementioned solutions. Results were submitted to statistical analysis with ANOVA and Bonferroni correction, as well as to a multivariate profile analysis for independent and paired samples with significance level set at 5%. RESULTS: The duration of the immersion period influenced the color alteration of all tested brackets, even though these changes could not always be visually observed. Different behaviors were observed for each immersion solution; however, brackets immersed in a given solution progressed similarly despite minor variations. CONCLUSIONS: Staining became more intense over time and all brackets underwent color alterations when immersed in the aforementioned solutions. PMID:26352842

  11. Color stability of ceramic brackets immersed in potentially staining solutions.

    PubMed

    Guignone, Bruna Coser; Silva, Ludimila Karsbergen; Soares, Rodrigo Villamarim; Akaki, Emilio; Goiato, Marcelo Coelho; Pithon, Matheus Melo; Oliveira, Dauro Douglas

    2015-01-01

To assess the color stability of five types of ceramic brackets after immersion in potentially staining solutions. Ninety brackets were divided into 5 groups (n = 18) according to the brackets' commercial brands and the solutions in which they were immersed (coffee, red wine, coke and artificial saliva). The brackets assessed were Transcend (3M/Unitek, Monrovia, CA, USA), Radiance (American Orthodontics, Sheboygan, WI, USA), Mystique (GAC International Inc., Bohemia, NY, USA) and Luxi II (Rocky Mountain Orthodontics, Denver, CO, USA). Chromatic changes were analyzed with the aid of a reflectance spectrophotometer and by visual inspection at five specific time intervals. Assessment periods were as received from the manufacturer (T0), 24 hours (T1), 72 hours (T2), as well as 7 days (T3) and 14 days (T4) of immersion in the aforementioned solutions. Results were submitted to statistical analysis with ANOVA and Bonferroni correction, as well as to a multivariate profile analysis for independent and paired samples with significance level set at 5%. The duration of the immersion period influenced the color alteration of all tested brackets, even though these changes could not always be visually observed. Different behaviors were observed for each immersion solution; however, brackets immersed in a given solution progressed similarly despite minor variations. Staining became more intense over time and all brackets underwent color alterations when immersed in the aforementioned solutions.
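Spectrophotometric color change in studies of this kind is conventionally reported as the CIELAB color difference ΔE*ab. The abstract does not state which ΔE formula was used, and the readings below are hypothetical, so this is only a sketch of the computation:

```python
import math

# CIELAB color difference (Delta E*ab), the metric a reflectance
# spectrophotometer typically yields in staining studies. This is the
# classic Euclidean formula; the paper's exact variant is not stated.
def delta_e_ab(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical (L*, a*, b*) readings for one bracket at T0 and T4.
t0 = (78.2, 1.4, 5.1)
t4 = (74.9, 2.6, 9.0)
dE = delta_e_ab(t0, t4)
print(round(dE, 2))  # thresholds around ~3.3 are often cited as clinically perceptible
```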

  12. Alternative Audio Solution to Enhance Immersion in Deployable Synthetic Environments

    DTIC Science & Technology

    2003-09-01

    sense of presence. For example, the musical score of a movie increases the viewers’ emotional involvement in a cinematic feature. The character...photo-realistic way can make mental immersion difficult, because any flaw in the realism will spoil the effect [SHER 03].” One way to overcome spoiling...the visual realism is to reinforce visual clues with those from other modalities. 3. Aural Modality a. General Aural displays can be

  13. Wind Tunnel Data Fusion and Immersive Visualization: A Case Study

    NASA Technical Reports Server (NTRS)

    Severance, Kurt; Brewster, Paul; Lazos, Barry; Keefe, Daniel

    2001-01-01

    This case study describes the process of fusing the data from several wind tunnel experiments into a single coherent visualization. Each experiment was conducted independently and was designed to explore different flow features around airplane landing gear. In the past, it would have been very difficult to correlate results from the different experiments. However, with a single 3-D visualization representing the fusion of the three experiments, significant insight into the composite flowfield was observed that would have been extremely difficult to obtain by studying its component parts. The results are even more compelling when viewed in an immersive environment.

  14. Semi-Immersive Virtual Turbine Engine Simulation System

    NASA Astrophysics Data System (ADS)

    Abidi, Mustufa H.; Al-Ahmari, Abdulrahman M.; Ahmad, Ali; Darmoul, Saber; Ameen, Wadea

    2018-05-01

The design and verification of assembly operations are essential for planning production operations. Recently, virtual prototyping has witnessed tremendous progress, and has reached a stage where current environments enable rich and multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. The benefits of building and using Virtual Reality (VR) models in assembly process verification are discussed in this paper. We present the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system. The system enables stereoscopic visuals, surround sound, and intuitive interaction with the developed models. A special software architecture is suggested to describe the assembly parts and assembly sequence in VR. A collision detection mechanism is employed that provides visual feedback to check the interference between components. The system is tested for virtual prototyping and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, and tactile, as well as force feedback. The system is shown to be effective and efficient for validating the design of assemblies, part design, and operations planning.

  15. Using CLIPS to represent knowledge in a VR simulation

    NASA Technical Reports Server (NTRS)

    Engelberg, Mark L.

    1994-01-01

    Virtual reality (VR) is an exciting use of advanced hardware and software technologies to achieve an immersive simulation. Until recently, the majority of virtual environments were merely 'fly-throughs' in which a user could freely explore a 3-dimensional world or a visualized dataset. Now that the underlying technologies are reaching a level of maturity, programmers are seeking ways to increase the complexity and interactivity of immersive simulations. In most cases, interactivity in a virtual environment can be specified in the form 'whenever such-and-such happens to object X, it reacts in the following manner.' CLIPS and COOL provide a simple and elegant framework for representing this knowledge-base in an efficient manner that can be extended incrementally. The complexity of a detailed simulation becomes more manageable when the control flow is governed by CLIPS' rule-based inference engine as opposed to by traditional procedural mechanisms. Examples in this paper will illustrate an effective way to represent VR information in CLIPS, and to tie this knowledge base to the input and output C routines of a typical virtual environment.

  16. Simultaneous neural and movement recording in large-scale immersive virtual environments.

    PubMed

    Snider, Joseph; Plank, Markus; Lee, Dongpyo; Poizner, Howard

    2013-10-01

Virtual reality (VR) allows precise control and manipulation of rich, dynamic stimuli that, when coupled with on-line motion capture and neural monitoring, can provide a powerful means both of understanding brain-behavioral relations in the high-dimensional world and of assessing and treating a variety of neural disorders. Here we present a system that combines state-of-the-art, fully immersive, 3D, multi-modal VR with temporally aligned electroencephalographic (EEG) recordings. The VR system is dynamic and interactive across visual, auditory, and haptic interactions, providing sight, sound, touch, and force. Crucially, it does so with simultaneous EEG recordings while subjects actively move about a 20 ft × 20 ft space. The overall end-to-end latency between a real movement and its simulated counterpart in the VR is approximately 40 ms. Spatial precision of the various devices is on the order of millimeters. The temporal alignment with the neural recordings is accurate to within approximately 1 ms. This powerful combination of systems opens up a new window into brain-behavioral relations and a new means of assessment and rehabilitation of individuals with motor and other disorders.

  17. Exploring 4D Flow Data in an Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Butkiewicz, T.

    2017-12-01

    Ocean models help us to understand and predict a wide range of intricate physical processes which comprise the atmospheric and oceanic systems of the Earth. Because these models output an abundance of complex time-varying three-dimensional (i.e., 4D) data, effectively conveying the myriad information from a given model poses a significant visualization challenge. The majority of the research effort into this problem has concentrated around synthesizing and examining methods for representing the data itself; by comparison, relatively few studies have looked into the potential merits of various viewing conditions and virtual environments. We seek to improve our understanding of the benefits offered by current consumer-grade virtual reality (VR) systems through an immersive, interactive 4D flow visualization system. Our dataset is a Regional Ocean Modeling System (ROMS) model representing a 12-hour tidal cycle of the currents within New Hampshire's Great Bay estuary. The model data was loaded into a custom VR particle system application using the OpenVR software library and the HTC Vive hardware, which tracks a headset and two six-degree-of-freedom (6DOF) controllers within a 5m-by-5m area. The resulting visualization system allows the user to coexist in the same virtual space as the data, enabling rapid and intuitive analysis of the flow model through natural interactions with the dataset and within the virtual environment. Whereas a traditional computer screen typically requires the user to reposition a virtual camera in the scene to obtain the desired view of the data, in virtual reality the user can simply move their head to the desired viewpoint, completely eliminating the mental context switches from data exploration/analysis to view adjustment and back. 
The tracked controllers become tools to quickly manipulate (reposition, reorient, and rescale) the dataset and to interrogate it by, e.g., releasing dye particles into the flow field, probing scalar velocities, placing a cutting plane through a region of interest, etc. It is hypothesized that the advantages afforded by head-tracked viewing and 6DOF interaction devices will lead to faster and more efficient examination of 4D flow data. A human factors study is currently being prepared to empirically evaluate this method of visualization and interaction.
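The dye-particle interrogation described above amounts to advecting passive tracers through the sampled velocity field. A minimal sketch (illustrative only; the actual system interpolates a 4D ROMS grid and renders in VR, and the integrator choice here is an assumption):

```python
import numpy as np

# Advect passive "dye" particles through a velocity field with forward-Euler
# steps. velocity_at maps an (n, 2) array of positions to (n, 2) velocities.
def advect(positions, velocity_at, dt, steps):
    p = np.asarray(positions, dtype=float)
    for _ in range(steps):
        p = p + dt * velocity_at(p)   # forward Euler; RK4 would be more accurate
    return p

# Toy analytic field: solid-body rotation about the origin, v = (-y, x).
rotation = lambda p: np.stack([-p[:, 1], p[:, 0]], axis=1)

seeds = np.array([[1.0, 0.0], [0.5, 0.5]])
out = advect(seeds, rotation, dt=0.001, steps=1000)
# after t = 1 radian of rotation, the first particle should be near (cos 1, sin 1)
print(out[0])
```

A real flow visualization would replace the analytic `rotation` field with trilinear (plus temporal) interpolation of the model's gridded currents.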

  18. Immersive Virtual Reality Technologies as a New Platform for Science, Scholarship, and Education

    NASA Astrophysics Data System (ADS)

    Djorgovski, Stanislav G.; Hut, P.; McMillan, S.; Knop, R.; Vesperini, E.; Graham, M.; Portegies Zwart, S.; Farr, W.; Mahabal, A.; Donalek, C.; Longo, G.

    2010-01-01

Immersive virtual reality (VR) and virtual worlds (VWs) are an emerging set of technologies which likely represent the next evolutionary step in the ways we use information technology to interact with the world of information and with other people, the roles now generally fulfilled by the Web and other common Internet applications. Currently, these technologies are mainly accessed through various VWs, e.g., Second Life (SL), which are general platforms for a broad range of user activities. As an experiment in the utilization of these technologies for science, scholarship, education, and public outreach, we have formed the Meta-Institute for Computational Astrophysics (MICA; http://mica-vw.org), the first professional scientific organization based exclusively in VWs. The goals of MICA are: (1) exploration, development and promotion of VWs and VR technologies for professional research in astronomy and related fields; (2) providing and developing novel social networking venues and mechanisms for scientific collaboration and communications, including professional meetings, effective telepresence, etc.; (3) use of VWs and VR technologies for education and public outreach; and (4) exchange of ideas and joint efforts with other scientific disciplines in promoting these goals for science and scholarship in general. To this effect, we have a regular schedule of professional and public outreach events in SL, including technical seminars, workshops, a journal club, collaboration meetings, public lectures, etc. We find that these technologies are already remarkably effective as a telepresence platform for scientific and scholarly discussions and meetings. They can offer substantial savings of time and resources, and eliminate a lot of unnecessary travel. They are equally effective as a public outreach platform, reaching a world-wide audience. 
On the pure research front, we are currently exploring the use of these technologies as a venue for numerical simulations and their visualization, as well as the immersive and interactive visualization of highly-dimensional data sets.

  19. Eye Movement Analysis and Cognitive Assessment. The Use of Comparative Visual Search Tasks in a Non-immersive VR Application.

    PubMed

    Rosa, Pedro J; Gamito, Pedro; Oliveira, Jorge; Morais, Diogo; Pavlovic, Matthew; Smyth, Olivia; Maia, Inês; Gomes, Tiago

    2017-03-23

An adequate behavioral response depends on attentional and mnesic processes. When these basic cognitive functions are impaired, the use of non-immersive Virtual Reality Applications (VRAs) can be a reliable technique for assessing the level of impairment. However, most non-immersive VRAs use indirect measures to make inferences about visual attention and mnesic processes (e.g., time to task completion, error rate). Our aim was to examine whether eye movement analysis through eye tracking (ET) can be a reliable method to probe more effectively where and how attention is deployed, and how it is linked with visual working memory, during comparative visual search tasks (CVSTs) in non-immersive VRAs. The eye movements of 50 healthy participants were continuously recorded while CVSTs, selected from a set of cognitive tasks in the Systemic Lisbon Battery (SLB), a VRA designed to assess cognitive impairments, were randomly presented. The total fixation duration, the number of visits to the areas of interest and to the interstimulus space, and the total execution time differed significantly as a function of Mini Mental State Examination (MMSE) scores. The present study demonstrates that CVSTs in the SLB, when combined with ET, can be a reliable and unobtrusive method for assessing cognitive abilities in healthy individuals, opening it to potential use in clinical samples.

  20. KML Tours: A New Platform for Exploring and Sharing Geospatial Data

    NASA Astrophysics Data System (ADS)

    Barcay, D. P.; Weiss-Malik, M.

    2009-12-01

Google Earth and other virtual globes have allowed millions of people to explore the world from their own homes. This technology has also raised the bar for professional visualizations: enabling interactive 3D visualizations to be created from massive datasets and shared using the KML language. For academics and professionals alike, an engaging presentation of your geospatial data is generally expected and can be the most effective form of advertisement. To that end, we released 'Touring' in Google Earth 5.0: a new medium for cinematic expression, visualized in Google Earth and written as extensions to the KML language. In a KML tour, the author has fine-grained control over the entire visual experience: precisely moving the virtual camera through the world while dynamically modifying the content, style, position, and visibility of the displayed data. An author can synchronize audio to this experience, bringing further immersion to a visualization. KML tours can help engage a broad user base and convey subtle concepts that aren't immediately apparent in traditional geospatial content. Unlike a pre-rendered video, a KML tour maintains the rich interactivity of Google Earth, allowing users to continue exploring your content and to mash up other content with your visualization. This session will include conceptual explanations of the Touring feature in Google Earth and the structure of the touring KML extensions, as well as examples of compelling tours.
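Tours are expressed through Google's gx KML extension namespace: a gx:Tour element holds a gx:Playlist of primitives such as gx:FlyTo and gx:Wait that script the camera. A minimal sketch, emitted here as a Python string (the coordinates and durations are placeholder values):

```python
# Minimal KML tour skeleton using the gx extension namespace. One gx:FlyTo
# moves the camera smoothly over 5 seconds, then the tour pauses for 2 seconds.
kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:gx="http://www.google.com/kml/ext/2.2">
  <gx:Tour>
    <name>Example tour</name>
    <gx:Playlist>
      <gx:FlyTo>
        <gx:duration>5.0</gx:duration>
        <gx:flyToMode>smooth</gx:flyToMode>
        <LookAt>
          <longitude>-122.08</longitude>
          <latitude>37.42</latitude>
          <range>1000</range>
        </LookAt>
      </gx:FlyTo>
      <gx:Wait><gx:duration>2.0</gx:duration></gx:Wait>
    </gx:Playlist>
  </gx:Tour>
</kml>"""
print(kml)
```

Loading such a file in Google Earth adds the tour to the Places panel, where playing it drives the camera while the scene remains fully interactive.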

  1. Influence of Immersive Human Scale Architectural Representation on Design Judgment

    NASA Astrophysics Data System (ADS)

    Elder, Rebecca L.

Unrealistic visual representations of architecture within our existing environments have lost all reference to the human senses. As a design tool, visual and auditory stimuli can be utilized to determine humans' perception of design. This experiment renders varying building inputs within different sites, simulated with corresponding immersive visual and audio sensory cues. Introducing audio has been shown to influence the way a person perceives a space, yet most inhabitants rely strictly on their sense of vision to make design judgments. Though it is not as apparent, users prefer spaces that have a better quality of sound and comfort. Through a series of questions, we can begin to analyze whether a design is fit for both its acoustic and visual environment.

  2. Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project

    NASA Astrophysics Data System (ADS)

    Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.

    2016-12-01

    Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allow developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons"—a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. 
We are supported by NSF DUE 1323419 and a Google Geo Curriculum Award.

  3. Social Interaction Development through Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2014-01-01

    The purpose of this pilot study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity…

  4. Evaluating the Effects of Immersive Embodied Interaction on Cognition in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Parmar, Dhaval

Virtual reality is on the verge of becoming mainstream household technology, as technologies such as head-mounted displays, trackers, and interaction devices become affordable and easily available. Virtual reality (VR) has immense potential for enhancing the fields of education and training, and its power can be used to spark interest and enthusiasm among learners. It is, therefore, imperative to evaluate the risks and benefits that immersive virtual reality poses to the field of education. Research suggests that learning is an embodied process. Learning depends on grounded aspects of the body including action, perception, and interactions with the environment. This research aims to study whether immersive embodiment through the means of virtual reality facilitates embodied cognition. A pedagogical VR solution which takes advantage of embodied cognition can lead to enhanced learning benefits. Towards achieving this goal, this research presents a linear continuum for immersive embodied interaction within virtual reality. This research evaluates the effects of three levels of immersive embodied interaction on cognitive thinking, presence, usability, and satisfaction among users in the fields of science, technology, engineering, and mathematics (STEM) education. Results from the presented experiments show that immersive virtual reality is greatly effective in knowledge acquisition and retention, and highly enhances user satisfaction, interest and enthusiasm. Users experience high levels of presence and are profoundly engaged in the learning activities within the immersive virtual environments. The studies presented in this research evaluate pedagogical VR software to train and motivate students in STEM education, and provide an empirical analysis comparing desktop VR (DVR), immersive VR (IVR), and immersive embodied VR (IEVR) conditions for learning. 
This research also proposes a fully immersive embodied interaction metaphor (IEIVR) for learning of computational concepts as a future direction, and presents the challenges faced in implementing the IEIVR metaphor due to extended periods of immersion. Results from the conducted studies help in formulating guidelines for virtual reality and education researchers working in STEM education and training, and for educators and curriculum developers seeking to improve student engagement in the STEM fields.

  5. Research on Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Lobeck, William E.

    2002-01-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  6. Research on Intelligent Synthesis Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.; Loftin, R. Bowen

    2002-12-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  7. CAVE2: a hybrid reality environment for immersive simulation and information analysis

    NASA Astrophysics Data System (ADS)

    Febretti, Alessandro; Nishimoto, Arthur; Thigpen, Terrance; Talandis, Jonas; Long, Lance; Pirtle, J. D.; Peterka, Tom; Verlo, Alan; Brown, Maxine; Plepys, Dana; Sandin, Dan; Renambot, Luc; Johnson, Andrew; Leigh, Jason

    2013-03-01

Hybrid Reality Environments represent a new kind of visualization space that blurs the line between virtual environments and high-resolution tiled display walls. This paper outlines the design and implementation of the CAVE2™ Hybrid Reality Environment. CAVE2 is the world's first near-seamless flat-panel-based, surround-screen immersive system. Unique to CAVE2 is that it enables users to simultaneously view both 2D and 3D information, providing more flexibility for mixed-media applications. CAVE2 is a cylindrical system 24 feet in diameter and 8 feet tall, and consists of 72 near-seamless, off-axis-optimized passive stereo LCD panels, creating an approximately 320-degree panoramic environment for displaying information at 37 megapixels (in stereoscopic 3D) or 74 megapixels in 2D, at a horizontal visual acuity of 20/20. Custom LCD panels with shifted polarizers were built so the images in the top and bottom rows of LCDs are optimized for vertical off-center viewing, allowing viewers to come closer to the displays while minimizing ghosting. CAVE2 is designed to support multiple operating modes. In the Fully Immersive mode, the entire room can be dedicated to one virtual simulation. In 2D mode, the room can operate like a traditional tiled display wall, enabling users to work with large numbers of documents at the same time. In the Hybrid mode, a mixture of both 2D and 3D applications can be supported simultaneously. The ability to treat immersive work spaces in this hybrid way has never been achieved before, and it leverages the special abilities of CAVE2 to enable researchers to seamlessly interact with large collections of 2D and 3D data. To realize this hybrid ability, we merged the Scalable Adaptive Graphics Environment (SAGE), a system for supporting 2D tiled displays, with Omegalib, a virtual reality middleware supporting OpenGL, OpenSceneGraph and Vtk applications.

  8. Eye movements, visual search and scene memory, in an immersive virtual environment.

    PubMed

    Kit, Dmitry; Katz, Leor; Sullivan, Brian; Snyder, Kat; Ballard, Dana; Hayhoe, Mary

    2014-01-01

    Visual memory has been demonstrated to play a role in both visual search and attentional prioritization in natural scenes. However, it has been studied predominantly in experimental paradigms using multiple two-dimensional images. Natural experience, however, entails prolonged immersion in a limited number of three-dimensional environments. The goal of the present experiment was to recreate circumstances comparable to natural visual experience in order to evaluate the role of scene memory in guiding eye movements in a natural environment. Subjects performed a continuous visual-search task within an immersive virtual-reality environment over three days. We found that, similar to two-dimensional contexts, viewers rapidly learn the location of objects in the environment over time, and use spatial memory to guide search. Incidental fixations did not provide obvious benefit to subsequent search, suggesting that semantic contextual cues may often be just as efficient, or that many incidentally fixated items are not held in memory in the absence of a specific task. On the third day of the experience in the environment, previous search items changed in color. These items were fixated upon with increased probability relative to control objects, suggesting that memory-guided prioritization (or Surprise) may be a robust mechanism for attracting gaze to novel features of natural environments, in addition to task factors and simple spatial saliency.

  9. Immersive realities: articulating the shift from VR to mobile AR through artistic practice

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy; Berry, Rodney; DeFanti, Thomas A.

    2012-03-01

    Our contemporary imaginings of technological engagement with digital environments have transitioned from flying through Virtual Reality to mobile interactions with the physical world through personal media devices. Experiences technologically mediated through social interactivity within physical environments are now preferred over isolated environments such as CAVEs or HMDs. Examples of this trend can be seen in early tele-collaborative artworks, which strove to use advanced networking to join multiple participants in shared virtual environments. Recent developments in mobile AR allow untethered access to such shared realities in places far removed from labs and home entertainment environments, and without the bulky and expensive technologies attached to our bodies that accompany most VR. This paper addresses the emerging trend favoring socially immersive artworks via mobile Augmented Reality rather than sensorially immersive Virtual Reality installations. With particular focus on AR as a mobile, locative technology, we discuss how concepts of immersion and interactivity are evolving with this new medium. Immersion in the context of mobile AR can be redefined to describe socially interactive experiences. Having distinctly different sensory, spatial, and situational properties, mobile AR offers a new form for remixing elements of traditional virtual reality with physically based social experiences. This type of immersion offers a wide array of potential for mobile AR art forms. We are beginning to see examples of how artists can use mobile AR to create socially immersive and interactive experiences.

  10. Visual Immersion for Cultural Understanding and Multimodal Literacy

    ERIC Educational Resources Information Center

    Smilan, Cathy

    2017-01-01

    When considering inclusive art curriculum that accommodates all learners, including English language learners, two distinct yet inseparable issues come to mind. The first is that English language learner students can use visual language and visual literacy skills inherent in visual arts curriculum to scaffold learning in and through the arts.…

  11. Touching proteins with virtual bare hands - Visualizing protein-drug complexes and their dynamics in self-made virtual reality using gaming hardware

    NASA Astrophysics Data System (ADS)

    Ratamero, Erick Martins; Bellini, Dom; Dowson, Christopher G.; Römer, Rudolf A.

    2018-06-01

    The ability to precisely visualize the atomic geometry of the interactions between a drug and its protein target in structural models is critical in predicting the correct modifications in previously identified inhibitors to create more effective next generation drugs. It is currently common practice among medicinal chemists while attempting the above to access the information contained in three-dimensional structures by using two-dimensional projections, which can preclude disclosure of useful features. A more accessible and intuitive visualization of the three-dimensional configuration of the atomic geometry in the models can be achieved through the implementation of immersive virtual reality (VR). While bespoke commercial VR suites are available, in this work, we present a freely available software pipeline for visualising protein structures through VR. New consumer hardware, such as the HTC Vive and the Oculus Rift utilized in this study, are available at reasonable prices. As an instructive example, we have combined VR visualization with fast algorithms for simulating intramolecular motions of protein flexibility, in an effort to further improve structure-led drug design by exposing molecular interactions that might be hidden in the less informative static models. This is a paradigmatic test case scenario for many similar applications in computer-aided molecular studies and design.

  12. Touching proteins with virtual bare hands : Visualizing protein-drug complexes and their dynamics in self-made virtual reality using gaming hardware.

    PubMed

    Ratamero, Erick Martins; Bellini, Dom; Dowson, Christopher G; Römer, Rudolf A

    2018-06-07

    The ability to precisely visualize the atomic geometry of the interactions between a drug and its protein target in structural models is critical in predicting the correct modifications in previously identified inhibitors to create more effective next generation drugs. It is currently common practice among medicinal chemists while attempting the above to access the information contained in three-dimensional structures by using two-dimensional projections, which can preclude disclosure of useful features. A more accessible and intuitive visualization of the three-dimensional configuration of the atomic geometry in the models can be achieved through the implementation of immersive virtual reality (VR). While bespoke commercial VR suites are available, in this work, we present a freely available software pipeline for visualising protein structures through VR. New consumer hardware, such as the HTC Vive and the Oculus Rift utilized in this study, are available at reasonable prices. As an instructive example, we have combined VR visualization with fast algorithms for simulating intramolecular motions of protein flexibility, in an effort to further improve structure-led drug design by exposing molecular interactions that might be hidden in the less informative static models. This is a paradigmatic test case scenario for many similar applications in computer-aided molecular studies and design.

  13. Immersive Visualization of the Solid Earth

    NASA Astrophysics Data System (ADS)

    Kreylos, O.; Kellogg, L. H.

    2017-12-01

    Immersive visualization using virtual reality (VR) display technology offers unique benefits for the visual analysis of complex three-dimensional data such as tomographic images of the mantle and higher-dimensional data such as computational geodynamics models of mantle convection or even planetary dynamos. Unlike "traditional" visualization, which has to project 3D scalar data or vectors onto a 2D screen for display, VR can display 3D data in a pseudo-holographic (head-tracked stereoscopic) form, and therefore does not suffer the distortions of relative positions, sizes, distances, and angles that are inherent in 2D projection and interfere with interpretation. As a result, researchers can apply their spatial reasoning skills to 3D data in the same way they can to real objects or environments, as well as to complex objects like vector fields. 3D Visualizer is an application to visualize 3D volumetric data, such as results from mantle convection simulations or seismic tomography reconstructions, using VR display technology and a strong focus on interactive exploration. Unlike other visualization software, 3D Visualizer does not present static visualizations, such as a set of cross-sections at pre-selected positions and orientations, but instead lets users ask questions of their data, for example by dragging a cross-section through the data's domain with their hands and seeing data mapped onto that cross-section in real time, or by touching a point inside the data domain, and immediately seeing an isosurface connecting all points having the same data value as the touched point. Combined with tools allowing 3D measurements of positions, distances, and angles, and with annotation tools that allow free-hand sketching directly in 3D data space, the outcome of using 3D Visualizer is not primarily a set of pictures, but derived data to be used for subsequent analysis.
3D Visualizer works best in virtual reality, either in high-end facility-scale environments such as CAVEs, or using commodity low-cost virtual reality headsets such as HTC's Vive. The recent emergence of high-quality commodity VR means that researchers can buy a complete VR system off the shelf, install it and the 3D Visualizer software themselves, and start using it for data analysis immediately.
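    The touch-to-isosurface interaction described above (touch a point inside the volume, then immediately see the level set through that point) can be illustrated with a minimal sketch. This is an invented toy example, not the 3D Visualizer code; the grid and the function name are assumptions made for illustration:

```python
import numpy as np

def isosurface_mask(volume, touched_index, tol=1e-6):
    """Select all voxels whose value matches the touched voxel's value.

    A real renderer would extract a polygonal isosurface (e.g. via
    marching cubes) at this level; here we just return the matching mask.
    """
    level = volume[touched_index]          # data value at the touched point
    return np.abs(volume - level) <= tol, level

# Synthetic scalar field (stand-in for a tomography grid) on a 4x4x4 grid.
x, y, z = np.mgrid[0:4, 0:4, 0:4]
volume = (x + y + z).astype(float)

mask, level = isosurface_mask(volume, (1, 1, 1))
print(level)            # value at the touched voxel
print(int(mask.sum()))  # number of voxels on that level set
```

A production system would hand the selected level to a mesh extractor and render the resulting surface in the VR scene in real time.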

  14. Immersive Planetarium Visualizations for Teaching Solar System Moon Concepts to Undergraduates

    ERIC Educational Resources Information Center

    Yu, Ka Chun; Sahami, Kamran; Denn, Grant; Sahami, Victoria; Sessions, Larry C.

    2016-01-01

    Digital video fulldome has long been heralded as a revolutionary educational technology; yet the discipline-based astronomy education research literature showing planetarium effectiveness has been sparse. In order to help understand to what extent immersion impacts learning and the effect of the "narrative journey" model of presentation,…

  15. Immersive cyberspace system

    NASA Technical Reports Server (NTRS)

    Park, Brian V. (Inventor)

    1997-01-01

    An immersive cyberspace system is presented which provides visual, audible, and vibrational inputs to a subject remaining in neutral immersion, and also provides for subject control input. The immersive cyberspace system includes a relaxation chair and a neutral immersion display hood. The relaxation chair supports a subject positioned thereupon, and places the subject in a position which merges the neutral body position (the position a body naturally assumes in zero gravity) with a savasana yoga position. The display hood, which covers the subject's head, is configured to produce light images and sounds. An image projection subsystem provides either external or internal image projection. The display hood includes a projection screen moveably attached to an opaque shroud. A motion base supports the relaxation chair and produces vibrational inputs over a range of about 0-30 Hz. The motion base also produces limited translational and rotational movements of the relaxation chair. These limited translational and rotational movements, when properly coordinated with visual stimuli, constitute motion cues which create sensations of pitch, yaw, and roll movements. Vibration transducers produce vibrational inputs from about 20 Hz to about 150 Hz. An external computer, coupled to various components of the immersive cyberspace system, executes a software program and creates the cyberspace environment. One or more neutral hand posture controllers may be coupled to the external computer system and used to control various aspects of the cyberspace environment, or to enter data during the cyberspace experience.
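    The two overlapping vibration ranges quoted above (roughly 0-30 Hz for the motion base and 20-150 Hz for the transducers) suggest a simple band-splitting scheme for routing one vibration cue to the two actuators. The sketch below is a hypothetical illustration of that idea, not the patented implementation; the sample rate and band edges are assumptions:

```python
import numpy as np

FS = 1000  # sample rate in Hz (an assumption for this sketch)

def split_bands(signal, fs=FS, base_band=(0, 30), shaker_band=(20, 150)):
    """Route one vibration cue into motion-base and transducer bands.

    The bands deliberately overlap (20-30 Hz), mirroring the overlapping
    ranges quoted for the motion base and the vibration transducers.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    base = spectrum * ((freqs >= base_band[0]) & (freqs <= base_band[1]))
    shaker = spectrum * ((freqs >= shaker_band[0]) & (freqs <= shaker_band[1]))
    return np.fft.irfft(base, len(signal)), np.fft.irfft(shaker, len(signal))

t = np.arange(FS) / FS                               # one second of signal
cue = np.sin(2*np.pi*10*t) + np.sin(2*np.pi*100*t)   # 10 Hz + 100 Hz mix
base, shaker = split_bands(cue)                      # low band to the chair,
                                                     # high band to the shakers
```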

  16. Comparative study on collaborative interaction in non-immersive and immersive systems

    NASA Astrophysics Data System (ADS)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki

    2007-09-01

    This research studies Virtual Reality simulation for collaborative interaction so that different people in different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where the object's behavior is determined by the combination of the multiple inputs. The issues addressed in this research are: 1) the effects of using haptics on collaborative interaction, and 2) the possibilities of collaboration between users in different environments. We conducted user tests on our system in several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. The case studies cover the interaction of users in two settings: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse from existing building blocks under gravity effects. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.
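    The abstract states that the object's behavior is determined by the combination of the multiple users' inputs, but does not specify the combination rule. One plausible policy, shown purely as an assumption, is to resolve the concurrent force vectors by vector summation:

```python
import numpy as np

def combine_inputs(forces):
    """Resolve concurrent user inputs into one object response.

    Policy (an assumption; the paper does not state its rule): the shared
    object accelerates along the vector sum of all users' applied forces.
    """
    return np.sum(np.asarray(forces, dtype=float), axis=0)

# Two users push the virtual stretcher in different directions.
users = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(combine_inputs(users).tolist())   # [1.0, 1.0, 0.0]
```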

  17. A Proposed Treatment for Visual Field Loss caused by Traumatic Brain Injury using Interactive Visuotactile Virtual Environment

    NASA Astrophysics Data System (ADS)

    Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella

    In this paper, we propose a novel approach that uses interactive virtual environment technology in Vision Restoration Therapy for visual field loss caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds the promise of expanding the scope of existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures and can last up to six months before any modest improvement can be seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion-tracking system and an eye-tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand, eye, and body movements and coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.

  18. Comparing perceived auditory width to the visual image of a performing ensemble in contrasting bi-modal environments

    PubMed Central

    Valente, Daniel L.; Braasch, Jonas; Myrbeck, Shane A.

    2012-01-01

    Despite many studies investigating auditory spatial impressions in rooms, few have addressed the impact of simultaneous visual cues on localization and the perception of spaciousness. The current research presents an immersive audiovisual environment in which participants were instructed to make auditory width judgments in dynamic bi-modal settings. The results of these psychophysical tests suggest the importance of congruent audiovisual presentation to the ecological interpretation of an auditory scene. Supporting data were accumulated in five rooms of ascending volumes and varying reverberation times. Participants were given an audiovisual matching test in which they were instructed to pan the auditory width of a performing ensemble to a varying set of audio and visual cues in rooms. Results show that both auditory and visual factors affect the collected responses and that the two sensory modalities interact in distinct ways. The greatest differences between the panned audio stimuli given a fixed visual width were found in the physical space with the largest volume and the greatest source distance. These results suggest, in this specific instance, a predominance of auditory cues in the spatial analysis of the bi-modal scene. PMID:22280585

  19. Virtual reality and telerobotics applications of an Address Recalculation Pipeline

    NASA Technical Reports Server (NTRS)

    Regan, Matthew; Pose, Ronald

    1994-01-01

    The technology described in this paper was designed to reduce latency in user interactions in immersive virtual reality environments. It is also ideally suited to telerobotic applications, such as interaction with remote robotic manipulators in space or in deep-sea operations. In such circumstances, the significant latency in response to user stimulus caused by communication delays, and the disturbing jerkiness caused by low and unpredictable frame rates of compressed video feedback or computationally limited virtual worlds, can be masked by our techniques. The user is provided with highly responsive visual feedback independent of the communication or computational delays involved in providing physical video feedback or in rendering virtual world images. Virtual and physical environments can be combined seamlessly using these techniques.
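    The latency-masking idea behind an Address Recalculation Pipeline (render a wider field of view than the display needs, then re-address which pixels to scan out using the latest head orientation) can be sketched as follows. This is an illustrative reconstruction of the concept, not the authors' hardware design; the panorama layout and parameters are assumptions:

```python
import numpy as np

def latest_view(wide_image, yaw_deg, view_w, deg_per_px):
    """Pick the display window out of a pre-rendered wide panorama.

    Mimics the address-recalculation idea: the (slow) renderer produced
    `wide_image` some time ago; at display time we only re-address pixels
    using the most recent head yaw, so head turns feel instantaneous.
    """
    h, w = wide_image.shape[:2]
    center = w // 2 + int(round(yaw_deg / deg_per_px))
    left = np.clip(center - view_w // 2, 0, w - view_w)
    return wide_image[:, left:left + view_w]

panorama = np.arange(360).reshape(1, 360)     # 1-px-tall 360-degree strip
view = latest_view(panorama, yaw_deg=10, view_w=90, deg_per_px=1.0)
print(view[0, 0], view[0, -1])                # window shifted right by the yaw
```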

  20. Expeditious illustration of layer-cake models on and above a tactile surface

    NASA Astrophysics Data System (ADS)

    Lopes, Daniel Simões; Mendes, Daniel; Sousa, Maurício; Jorge, Joaquim

    2016-05-01

    Too often, the illustration and visualization of 3D geological concepts are performed by sketching in 2D media, which may limit the drawing of initial concepts. Here, the potential of expeditious geological modeling through hand gestures is explored. A spatial interaction system was developed to enable rapid modeling, editing, and exploration of 3D layer-cake objects. User interactions are acquired with motion capture and touch-screen technologies. Virtual immersion is guaranteed by using stereoscopic technology. The novelty consists in performing expeditious modeling of coarse geological features with only a limited set of hand gestures. Results from usability studies show that the proposed system is more efficient than a windows-icons-menus-pointer (WIMP) modeling application.

  1. Immersive volume rendering of blood vessels

    NASA Astrophysics Data System (ADS)

    Long, Gregory; Kim, Han Suk; Marsden, Alison; Bazilevs, Yuri; Schulze, Jürgen P.

    2012-03-01

    In this paper, we present a novel method of visualizing flow in blood vessels. Our approach reads unstructured tetrahedral data, resamples it, and uses slice-based 3D texture volume rendering. Due to the sparse structure of blood vessels, we utilize an octree to efficiently store the resampled data by discarding empty regions of the volume. We use animation to convey time-series data, a wireframe surface to convey structure, and the StarCAVE, a 3D virtual reality environment, to add a fully immersive element to the visualization. Our tool has great value in interdisciplinary work, helping scientists collaborate with clinicians by improving the understanding of blood flow simulations. Full immersion in the flow field allows for a more intuitive understanding of the flow phenomena and can be a great help to medical experts for treatment planning.
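    The sparse-storage step mentioned above (an octree that discards empty regions of the resampled volume) can be sketched as follows; this is an illustrative reimplementation of the general technique, not the authors' code:

```python
import numpy as np

def build_octree(vol, origin=(0, 0, 0)):
    """Recursively subdivide a cubic volume, discarding all-empty octants.

    Returns a nested dict of only the non-empty regions, so that sparse
    vessel data costs far less memory than the full resampled grid.
    """
    if not vol.any():                      # empty region: store nothing
        return None
    n = vol.shape[0]
    if n == 1:                             # non-empty leaf voxel
        return {"origin": origin, "data": vol.copy()}
    h = n // 2
    children = {}
    for dz in (0, h):
        for dy in (0, h):
            for dx in (0, h):
                sub = vol[dz:dz+h, dy:dy+h, dx:dx+h]
                child = build_octree(
                    sub, (origin[0]+dz, origin[1]+dy, origin[2]+dx))
                if child is not None:      # keep only occupied octants
                    children[(dz, dy, dx)] = child
    return {"origin": origin, "children": children}

vol = np.zeros((4, 4, 4))
vol[0, 0, 0] = 1.0                         # one occupied voxel in a corner
tree = build_octree(vol)
print(len(tree["children"]))               # only the occupied octant is kept
```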

  2. Manipulating the fidelity of lower extremity visual feedback to identify obstacle negotiation strategies in immersive virtual reality.

    PubMed

    Kim, Aram; Zhou, Zixuan; Kretch, Kari S; Finley, James M

    2017-07-01

    The ability to successfully navigate obstacles in our environment requires integration of visual information about the environment with estimates of our body's state. Previous studies have used partial occlusion of the visual field to explore how information about the body and impending obstacles is integrated to mediate a successful clearance strategy. However, because these manipulations often remove information about both the body and obstacle, it remains to be seen how information about the lower extremities alone is utilized during obstacle crossing. Here, we used an immersive virtual reality (VR) interface to explore how visual feedback of the lower extremities influences obstacle crossing performance. Participants wore a head-mounted display while walking on a treadmill and were instructed to step over obstacles in a virtual corridor in four different feedback trials. The trials involved: (1) no visual feedback of the lower extremities, (2) an endpoint-only model, (3) a link-segment model, and (4) a volumetric multi-segment model. We found that, compared with no model, the volumetric model improved the success rate and led participants to place their trailing foot before crossing and their leading foot after crossing more consistently, and to place their leading foot closer to the obstacle after crossing. This knowledge is critical for the design of obstacle negotiation tasks in immersive virtual environments, as it may provide information about the fidelity necessary to reproduce ecologically valid practice environments.

  3. Eye Movements, Visual Search and Scene Memory, in an Immersive Virtual Environment

    PubMed Central

    Sullivan, Brian; Snyder, Kat; Ballard, Dana; Hayhoe, Mary

    2014-01-01

    Visual memory has been demonstrated to play a role in both visual search and attentional prioritization in natural scenes. However, it has been studied predominantly in experimental paradigms using multiple two-dimensional images. Natural experience, however, entails prolonged immersion in a limited number of three-dimensional environments. The goal of the present experiment was to recreate circumstances comparable to natural visual experience in order to evaluate the role of scene memory in guiding eye movements in a natural environment. Subjects performed a continuous visual-search task within an immersive virtual-reality environment over three days. We found that, similar to two-dimensional contexts, viewers rapidly learn the location of objects in the environment over time, and use spatial memory to guide search. Incidental fixations did not provide obvious benefit to subsequent search, suggesting that semantic contextual cues may often be just as efficient, or that many incidentally fixated items are not held in memory in the absence of a specific task. On the third day of the experience in the environment, previous search items changed in color. These items were fixated upon with increased probability relative to control objects, suggesting that memory-guided prioritization (or Surprise) may be a robust mechanism for attracting gaze to novel features of natural environments, in addition to task factors and simple spatial saliency. PMID:24759905

  4. Intelligent Visualization of Geo-Information on the Future Web

    NASA Astrophysics Data System (ADS)

    Slusallek, P.; Jochem, R.; Sons, K.; Hoffmann, H.

    2012-04-01

    Visualization is a key component of the "Observation Web" and will become even more important in the future as geo data becomes more widely accessible. The common statement that "Data that cannot be seen, does not exist" is especially true for non-experts, like most citizens. The Web provides the most interesting platform for making data easily and widely available. However, today's Web is not well suited for the interactive visualization and exploration that is often needed for geo data. Support for 3D data was added only recently and at an extremely low level (WebGL), and even the 2D visualization capabilities of HTML (e.g., images, canvas, SVG) are rather limited, especially regarding interactivity. We have developed XML3D as an extension to HTML5. It allows for compactly describing 2D and 3D data directly as elements of an HTML5 document. All graphics elements are part of the Document Object Model (DOM) and can be manipulated via the same set of DOM events and methods that millions of Web developers use on a daily basis. Thus, XML3D makes highly interactive 2D and 3D visualization easily usable, not only for geo data. XML3D is supported by any WebGL-capable browser, and we also provide native implementations in Firefox and Chromium. As an example, we show how OpenStreetMap data can be mapped directly to XML3D and visualized interactively in any Web page. We show how this data can be easily augmented with additional data from the Web via a few lines of Javascript. We also show how embedded semantic data (via RDFa) allows for linking the visualization back to the data's origin, thus providing an immersive interface for interacting with and modifying the original data. XML3D is used as key input for standardization within the W3C Community Group on "Declarative 3D for the Web", chaired by the DFKI, and has recently been selected as one of the Generic Enablers for the EU Future Internet initiative.
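    The central claim (graphics elements live in the document tree and are scripted with ordinary tree operations) can be mimicked with Python's ElementTree on a simplified XML3D-style fragment. The tag and attribute names below are illustrative assumptions and may differ from the real XML3D schema:

```python
import xml.etree.ElementTree as ET

# A simplified XML3D-style fragment (tag/attribute names are illustrative;
# the real XML3D schema may differ in detail).
doc = ET.fromstring(
    "<div>"
    "<xml3d><group id='osm-buildings'>"
    "<mesh src='#building-42'/>"
    "</group></xml3d>"
    "</div>"
)

# Because the scene lives in the document tree, ordinary tree traversal
# and attribute edits are all that is needed to script the visualization.
group = doc.find(".//group[@id='osm-buildings']")
group.set("style", "transform: translate3d(10, 0, 0)")
group.append(ET.Element("mesh", {"src": "#building-43"}))
print(len(group.findall("mesh")))   # the group now holds two meshes
```

In a browser, the same edits would be made with `document.querySelector` and DOM methods, which is precisely the accessibility argument the abstract makes.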

  5. Image-Based Virtual Tours and 3d Modeling of Past and Current Ages for the Enhancement of Archaeological Parks: the Visualversilia 3d Project

    NASA Astrophysics Data System (ADS)

    Castagnetti, C.; Giannini, M.; Rivola, R.

    2017-05-01

    The research project VisualVersilia 3D aims at offering a new way to promote the territory and its heritage by matching the traditional reading of the document with the potential of modern communication technologies for cultural tourism. Recently, research on new technologies applied to cultural heritage has turned mainly to technologies that reconstruct and narrate the complexity of the territory and its heritage, including 3D scanning, 3D printing, and augmented reality. Some museums and archaeological sites already exploit the potential of digital tools to preserve and spread their heritage, but interactive services involving tourists in an immersive and more modern experience are still rare. The innovation of the project consists in the development of a methodology for documenting current and past historical ages and integrating their 3D visualizations with rendering capable of returning an immersive virtual reality for a successful enhancement of the heritage. The project implements the methodology in the archaeological complex of Massaciuccoli, one of the best-preserved Roman sites of the Versilia area (Tuscany, Italy). The activities of the project briefly consist of developing: 1. the virtual tour of the site in its current configuration on the basis of spherical images, enhanced by texts, graphics, and audio guides in order to enable both an immersive and a remote tourist experience; 2. the 3D reconstruction of the remains and buildings in their current condition, for documentation and conservation purposes, on the basis of a complete metric survey carried out through laser scanning; 3. 3D virtual reconstructions through the main historical periods, on the basis of historical investigation and the analysis of the data acquired.

  6. Immersive Virtual Reality Therapy with Myoelectric Control for Treatment-resistant Phantom Limb Pain: Case Report.

    PubMed

    Chau, Brian; Phelan, Ivan; Ta, Phillip; Humbert, Sarah; Hata, Justin; Tran, Duc

    2017-01-01

    Objective: Phantom limb pain is a condition frequently experienced after amputation. One treatment for phantom limb pain is traditional mirror therapy, yet some patients do not respond to this intervention, and immersive virtual reality mirror therapy offers some potential advantages. We report the case of a patient with severe phantom limb pain following an upper limb amputation and successful treatment with therapy in a custom virtual reality environment. Methods: An interactive 3-D kitchen environment was developed based on the principles of mirror therapy to allow for control of virtual hands while wearing a motion-tracked, head-mounted virtual reality display. The patient used myoelectric control of a virtual hand as well as motion-tracking control in this setting for five therapy sessions. Pain scale measurements and subjective feedback were elicited at each session. Results: Analysis of the measured pain scales showed statistically significant decreases per session [Visual Analog Scale, Short Form McGill Pain Questionnaire, and Wong-Baker FACES pain scores decreased by 55 percent (p=0.0143), 60 percent (p=0.023), and 90 percent (p=0.0024), respectively]. Significant subjective pain relief persisting between sessions was also reported, as well as marked immersion within the virtual environments. On follow-up at six weeks, the patient noted continued decrease in phantom limb pain symptoms. Conclusions: Currently available immersive virtual reality technology with myoelectric and motion-tracking control may represent a possible therapy option for treatment-resistant phantom limb pain.

  7. Immersive Virtual Reality Therapy with Myoelectric Control for Treatment-resistant Phantom Limb Pain: Case Report

    PubMed Central

    Phelan, Ivan; Ta, Phillip; Humbert, Sarah; Hata, Justin; Tran, Duc

    2017-01-01

    Objective: Phantom limb pain is a condition frequently experienced after amputation. One treatment for phantom limb pain is traditional mirror therapy, yet some patients do not respond to this intervention, and immersive virtual reality mirror therapy offers some potential advantages. We report the case of a patient with severe phantom limb pain following an upper limb amputation and successful treatment with therapy in a custom virtual reality environment. Methods: An interactive 3-D kitchen environment was developed based on the principles of mirror therapy to allow for control of virtual hands while wearing a motion-tracked, head-mounted virtual reality display. The patient used myoelectric control of a virtual hand as well as motion-tracking control in this setting for five therapy sessions. Pain scale measurements and subjective feedback were elicited at each session. Results: Analysis of the measured pain scales showed statistically significant decreases per session [Visual Analog Scale, Short Form McGill Pain Questionnaire, and Wong-Baker FACES pain scores decreased by 55 percent (p=0.0143), 60 percent (p=0.023), and 90 percent (p=0.0024), respectively]. Significant subjective pain relief persisting between sessions was also reported, as well as marked immersion within the virtual environments. On follow-up at six weeks, the patient noted continued decrease in phantom limb pain symptoms. Conclusions: Currently available immersive virtual reality technology with myoelectric and motion-tracking control may represent a possible therapy option for treatment-resistant phantom limb pain. PMID:29616149

  8. Immersive visualization for navigation and control of the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Hartman, Frank R.; Cooper, Brian; Maxwell, Scott; Wright, John; Yen, Jeng

    2004-01-01

    The Rover Sequencing and Visualization Program (RSVP) is a suite of tools for the sequencing of planetary rovers, which are subject to significant light-time delay and are thus unsuitable for teleoperation.

  9. Venus Quadrangle Geological Mapping: Use of Geoscience Data Visualization Systems in Mapping and Training

    NASA Technical Reports Server (NTRS)

    Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil

    2008-01-01

    We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].

  10. How incorporation of scents could enhance immersive virtual experiences

    PubMed Central

    Ischer, Matthieu; Baron, Naëm; Mermoud, Christophe; Cayeux, Isabelle; Porcherot, Christelle; Sander, David; Delplanque, Sylvain

    2014-01-01

    Under normal everyday conditions, the senses all work together to create the experiences that fill a typical person's life. Unfortunately for behavioral and cognitive researchers who investigate such experiences, standard laboratory tests are usually conducted in a nondescript room in front of a computer screen. They are very far from replicating the complexity of real-world experiences. Recently, immersive virtual reality (IVR) environments have become a promising method for immersing people in an almost-real environment that involves more of the senses. IVR environments share many similarities with the complexity of the real world and at the same time allow experimenters to constrain experimental parameters to obtain empirical data. This can eventually lead to better treatment options and/or new mechanistic hypotheses. The idea that increasing sensory modalities improves the realism of IVR environments has been empirically supported, but the senses used have not usually included olfaction. In this technology report, we present an odor delivery system applied to a state-of-the-art IVR technology. The platform provides a three-dimensional, immersive, and fully interactive visualization environment called “Brain and Behavioral Laboratory—Immersive System” (BBL-IS). The solution we propose can reliably deliver various complex scents during different virtual scenarios, at a precise time and place and without contamination of the environment. The main features of this platform are: (i) limited cross-contamination between odorant streams with fast odor delivery (< 500 ms), (ii) ease of use and control, and (iii) the possibility to synchronize the delivery of the odorant with pictures, videos or sounds. How this unique technology could be used to investigate typical research questions in olfaction (e.g., emotional elicitation, memory encoding or attentional capture by scents) is also addressed. PMID:25101017

  11. Globe Browsing: Contextualized Spatio-Temporal Planetary Surface Visualization.

    PubMed

    Bladin, Karl; Axelsson, Emil; Broberg, Erik; Emmart, Carter; Ljung, Patric; Bock, Alexander; Ynnerman, Anders

    2017-08-29

    Results of planetary mapping are often shared openly for use in scientific research and mission planning. In its raw format, however, the data is not accessible to non-experts due to the difficulty in grasping the context and the intricate acquisition process. We present work on tailoring and integration of multiple data processing and visualization methods to interactively contextualize geospatial surface data of celestial bodies for use in science communication. As our approach handles dynamic data sources, streamed from online repositories, we are significantly shortening the time between discovery and dissemination of data and results. We describe the image acquisition pipeline, the pre-processing steps to derive a 2.5D terrain, and a chunked level-of-detail, out-of-core rendering approach to enable interactive exploration of global maps and high-resolution digital terrain models. The results are demonstrated for three different celestial bodies. The first case addresses high-resolution map data on the surface of Mars. A second case is showing dynamic processes, such as concurrent weather conditions on Earth that require temporal datasets. As a final example we use data from the New Horizons spacecraft which acquired images during a single flyby of Pluto. We visualize the acquisition process as well as the resulting surface data. Our work has been implemented in the OpenSpace software [8], which enables interactive presentations in a range of environments such as immersive dome theaters, interactive touch tables, and virtual reality headsets.
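
The chunked level-of-detail approach described above typically refines a tile only while its geometric error, projected onto the screen, exceeds a pixel tolerance. A minimal sketch of that selection loop (the `Chunk` class, parameter values, and single per-chunk distance are simplifying assumptions, not OpenSpace's actual implementation):

```python
import math

class Chunk:
    """A quadtree tile of the globe with a precomputed geometric error (meters)."""
    def __init__(self, level, geometric_error, children=None):
        self.level = level
        self.geometric_error = geometric_error
        self.children = children or []

def screen_space_error(chunk, distance, fov_y=math.radians(60), viewport_h=1080):
    # Project the chunk's geometric error onto the viewport, in pixels.
    return chunk.geometric_error * viewport_h / (2.0 * distance * math.tan(fov_y / 2.0))

def select_chunks(chunk, distance, max_error_px=2.0):
    """Refine recursively until each rendered chunk's error is acceptable."""
    if screen_space_error(chunk, distance) <= max_error_px or not chunk.children:
        return [chunk]
    selected = []
    for child in chunk.children:
        # A real renderer would recompute the camera distance per child
        # and stream out-of-core tile data on demand.
        selected += select_chunks(child, distance, max_error_px)
    return selected
```

Far from the surface the root tile suffices; as the camera approaches, progressively finer children are selected, which is what keeps global maps and high-resolution terrain models interactive.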

  12. Along the Virtuality Continuum - Two Showcases on how xR Technologies Transform Geoscience Research and Education

    NASA Astrophysics Data System (ADS)

    Klippel, A.; Zhao, J.; Masrur, A.; Wallgruen, J. O.; La Femina, P. C.

    2017-12-01

    We present work along the virtuality continuum showcasing both AR and VR environments for geoscience applications and research. The AR/VR project focuses on one of the most prominent landmarks on the Penn State campus which, at the same time, is a representation of the geology of Pennsylvania. The Penn State Obelisk is a 32-foot-high, 51-ton monument composed of 281 rocks collected from across Pennsylvania. While information about its origins and composition is scattered across articles and some web databases, we compiled all the available data from the web and archives and curated them as a basis for an immersive xR experience. Tabular data was amended by xR data such as 360° photos, videos, and 3D models (e.g., the Obelisk). Our xR (both AR and VR) prototype provides an immersive analytical environment that supports interactive data visualization and virtual navigation in a natural environment (a campus model of today and of 1896, the year of the Obelisk's installation). This work-in-progress project can provide an interactive immersive learning platform (specifically, for K-12 and introductory-level geoscience students) where the learning process is enhanced through seamless navigation between 3D data space and physical space. The second, VR-focused application creates and empirically evaluates virtual reality (VR) experiences for geoscience research, specifically an interactive volcano experience based on LiDAR and image data of Iceland's Thrihnukar volcano. The prototype addresses the lack of content and tools for immersive virtual reality (iVR) in geoscience education and research, and how to make it easier to integrate iVR into research and classroom experiences. It makes use of environmentally sensed data such that interaction and linked content can be integrated into a single experience. We discuss our workflows as well as methods and authoring tools for iVR analysis and the creation of virtual experiences.
These methods and tools aim to enhance the utility of geospatial data from repositories such as OpenTopography.org by unlocking troves of such data for VR applications. Improved accessibility in education and research, for the geosciences and beyond, will benefit geoscientists and educators who cannot be expected to be VR and 3D application experts.

  13. A GPU-accelerated immersive audio-visual framework for interaction with molecular dynamics using consumer depth sensors.

    PubMed

    Glowacki, David R; O'Connor, Michael; Calabró, Gaetano; Price, James; Tew, Philip; Mitchell, Thomas; Hyde, Joseph; Tew, David P; Coughtrie, David J; McIntosh-Smith, Simon

    2014-01-01

    With advances in computational power, the rapidly growing role of computational/simulation methodologies in the physical sciences, and the development of new human-computer interaction technologies, the field of interactive molecular dynamics seems destined to expand. In this paper, we describe and benchmark the software algorithms and hardware setup for carrying out interactive molecular dynamics utilizing an array of consumer depth sensors. The system works by interpreting the human form as an energy landscape, and superimposing this landscape on a molecular dynamics simulation to chaperone the motion of the simulated atoms, affecting both graphics and sonified simulation data. GPU acceleration has been key to achieving our target of 60 frames per second (FPS), giving an extremely fluid interactive experience. GPU acceleration has also allowed us to scale the system for use in immersive 360° spaces with an array of up to ten depth sensors, allowing several users to simultaneously chaperone the dynamics. The flexibility of our platform for carrying out molecular dynamics simulations has been considerably enhanced by wrappers that facilitate fast communication with a portable selection of GPU-accelerated molecular force evaluation routines. In this paper, we describe a 360° atmospheric molecular dynamics simulation we have run in a chemistry/physics education context. We also describe initial tests in which users have been able to chaperone the dynamics of 10-alanine peptide embedded in an explicit water solvent. Using this system, both expert and novice users have been able to accelerate peptide rare event dynamics by 3-4 orders of magnitude.
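
The idea of interpreting the human form as an energy landscape can be sketched as a sum of repulsive Gaussian potentials centred on depth-sensor points, whose gradients push simulated atoms away from the sensed body. This is an illustrative approximation, not the authors' GPU implementation:

```python
import math

def repulsive_force(atom_pos, cloud_points, height=5.0, sigma=0.5):
    """Force on one atom from Gaussian potentials at each sensed point.

    Each point p contributes V(r) = height * exp(-|r - p|^2 / (2 sigma^2));
    the force F = -dV/dr points away from p, so the atom is "chaperoned"
    around the human form captured by the depth sensor.
    """
    fx = fy = fz = 0.0
    for px, py, pz in cloud_points:
        dx, dy, dz = atom_pos[0] - px, atom_pos[1] - py, atom_pos[2] - pz
        r2 = dx * dx + dy * dy + dz * dz
        w = height * math.exp(-r2 / (2.0 * sigma * sigma)) / (sigma * sigma)
        fx += w * dx
        fy += w * dy
        fz += w * dz
    return (fx, fy, fz)
```

In the real system this evaluation runs on the GPU over thousands of points per frame to sustain 60 FPS; the per-point loop above only conveys the physics.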

  14. Systematic distortions of perceptual stability investigated using immersive virtual reality

    PubMed Central

    Tcheang, Lili; Gilson, Stuart J.; Glennerster, Andrew

    2010-01-01

    Using an immersive virtual reality system, we measured the ability of observers to detect the rotation of an object when its movement was yoked to the observer's own translation. Most subjects had a large bias such that a static object appeared to rotate away from them as they moved. Thresholds for detecting target rotation were similar to those for an equivalent speed discrimination task carried out by static observers, suggesting that visual discrimination is the predominant limiting factor in detecting target rotation. Adding a stable visual reference frame almost eliminated the bias. Varying the viewing distance of the target had little effect, consistent with observers under-estimating distance walked. However, accuracy of walking to a briefly presented visual target was high and not consistent with an under-estimation of distance walked. We discuss implications for theories of a task-independent representation of visual space. PMID:15845248

  15. Foreign language learning in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Chang, Benjamin; Sheldon, Lee; Si, Mei; Hand, Anton

    2012-03-01

    Virtual reality has long been used for training simulations in fields from medicine to welding to vehicular operation, but simulations involving more complex cognitive skills present new design challenges. Foreign language learning, for example, is increasingly vital in the global economy, but computer-assisted education is still in its early stages. Immersive virtual reality is a promising avenue for language learning as a way of dynamically creating believable scenes for conversational training and role-play simulation. Visual immersion alone, however, only provides a starting point. We suggest that the addition of social interactions and motivated engagement through narrative gameplay can lead to truly effective language learning in virtual environments. In this paper, we describe the development of a novel application for teaching Mandarin using CAVE-like VR, physical props, human actors and intelligent virtual agents, all within a semester-long multiplayer mystery game. Students travel (virtually) to China on a class field trip, which soon becomes complicated with intrigue and mystery surrounding the lost manuscript of an early Chinese literary classic. Virtual reality environments such as the Forbidden City and a Beijing teahouse provide the setting for learning language, cultural traditions, and social customs, as well as the discovery of clues through conversation in Mandarin with characters in the game.

  16. The impact of contextualization on immersion in healthcare simulation.

    PubMed

    Engström, Henrik; Andersson Hagiwara, Magnus; Backlund, Per; Lebram, Mikael; Lundberg, Lars; Johannesson, Mikael; Sterner, Anders; Maurin Söderholm, Hanna

    2016-01-01

    The aim of this paper is to explore how contextualization of healthcare simulation scenarios impacts immersion, using a novel objective instrument, the Immersion Score Rating Instrument. This instrument consists of 10 triggers that indicate reduced or enhanced immersion among participants in a simulation scenario. Triggers refer to events such as jumps in time or space (a sign of reduced immersion) and natural interaction with the manikin (a sign of enhanced immersion) and can be used to calculate an immersion score. An experiment using a randomized controlled crossover design was conducted to compare immersion between two simulation training conditions for prehospital care: one basic and one contextualized. The Immersion Score Rating Instrument was used to compare the total immersion score for the whole scenario and the immersion score for individual mission phases, and to analyze differences in trigger occurrences. A paired t test was used to test for significance. The comparison shows that the overall immersion score for the simulation was higher in the contextualized condition. The average immersion score was 2.17 (sd = 1.67) in the contextualized condition and -0.77 (sd = 2.01) in the basic condition (p < .001). The immersion score was significantly higher in the contextualized condition in five out of six mission phases. Events that might be disruptive for the simulation participants' immersion, such as interventions of the instructor and illogical jumps in time or space, are present to a higher degree in the basic scenario condition, while events that signal enhanced immersion, such as natural interaction with the manikin, are more frequently observed in the contextualized condition. The results suggest that contextualization of simulation training with respect to increased equipment and environmental fidelity as well as functional task alignment might affect immersion positively and thus contribute to an improved training experience.
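
The instrument's scoring scheme lends itself to a short sketch: each observed trigger event carries a signed weight, and the scenario score is their sum. The trigger names and ±1 weights below are hypothetical stand-ins (the actual ISRI defines ten specific triggers and its own scoring rules):

```python
# Hypothetical weights: +1 for immersion-enhancing events, -1 for
# immersion-reducing events. The real instrument defines ten triggers.
WEIGHTS = {
    "natural_manikin_interaction": +1,
    "jump_in_time_or_space": -1,
    "instructor_intervention": -1,
}

def immersion_score(observed_events):
    """Sum the signed weights of all trigger events seen in a scenario."""
    return sum(WEIGHTS[event] for event in observed_events)

# A scenario with five natural interactions and two time jumps.
events = ["natural_manikin_interaction"] * 5 + ["jump_in_time_or_space"] * 2
print(immersion_score(events))  # 5 - 2 = 3
```

Scores computed this way per participant and condition are what the paired t test in the study compares.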

  17. Look At That! Video Chat and Joint Visual Attention Development Among Babies and Toddlers.

    PubMed

    McClure, Elisabeth R; Chentsova-Dutton, Yulia E; Holochwost, Steven J; Parrott, W G; Barr, Rachel

    2018-01-01

    Although many relatives use video chat to keep in touch with toddlers, key features of adult-toddler interaction like joint visual attention (JVA) may be compromised in this context. In this study, 25 families with a child between 6 and 24 months were observed using video chat at home with geographically separated grandparents. We define two types of screen-mediated JVA (across- and within-screen) and report age-related increases in the babies' across-screen JVA initiations, and that family JVA usage was positively related to babies' overall attention during video calls. Babies today are immersed in a digital world where formative relationships are often mediated by a screen. Implications for both infant social development and developmental research are discussed. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.

  18. Research on three-dimensional visualization based on virtual reality and Internet

    NASA Astrophysics Data System (ADS)

    Wang, Zongmin; Yang, Haibo; Zhao, Hongling; Li, Jiren; Zhu, Qiang; Zhang, Xiaohong; Sun, Kai

    2007-06-01

    To disclose and display water information, a three-dimensional visualization system based on Virtual Reality (VR) and the Internet was developed, both to demonstrate a "digital water conservancy" application and to support routine reservoir management. To explore and mine in-depth information, after building a high-resolution DEM of reliable quality, topographical analysis, visibility analysis and reservoir volume computation were studied. In addition, parameters including slope, water level and NDVI were selected to classify landslide-prone zones in the water-level-fluctuation zone of the reservoir area. To establish the virtual reservoir scene, two methods were used to convey immersion, interaction and imagination (3I). The first virtual scene contains more detailed textures to increase realism and runs on a graphical workstation with the virtual reality engine Open Scene Graph (OSG). The second virtual scene targets Internet users, with fewer details to ensure fluent rendering speed.

  19. Haptics-based immersive telerobotic system for improvised explosive device disposal: Are two hands better than one?

    NASA Astrophysics Data System (ADS)

    Erickson, David; Lacheray, Hervé; Lambert, Jason Michel; Mantegh, Iraj; Crymble, Derry; Daly, John; Zhao, Yan

    2012-06-01

    State-of-the-art explosive ordnance disposal robots have, in general, not adopted recent advances in control technology and man-machine interfaces and lag many years behind academia. This paper describes the Haptics-based Immersive Telerobotic System project, which investigates an immersive telepresence environment incorporating advanced vehicle control systems, augmented immersive sensory feedback, dynamic 3D visual information, and haptic feedback for explosive ordnance disposal operators. The project's aim is to give operators a more sophisticated interface and expanded sensory input to perform the complex tasks needed to defeat improvised explosive devices successfully. The introduction of haptics and immersive telepresence has the potential to shift the way telepresence systems work for explosive ordnance disposal tasks, and more widely for first-responder scenarios involving remote unmanned ground vehicles.

  20. Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.

    1999-01-01

    We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated on Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display, which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to interactively perform measurements from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.
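
Generating 3-D terrain from stereo images rests on the standard rectified-stereo relation Z = f·B/d. A minimal sketch with illustrative camera parameters (not the actual Mars Pathfinder imager values):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth of a matched point from a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: a 1000-pixel focal length and a 0.5 m stereo
# baseline; a feature with 10 px of disparity then lies 50 m away.
print(stereo_depth(10.0, 1000.0, 0.5))
```

Running this over every matched pixel yields a depth map, which is then triangulated and texture-mapped to produce the photorealistic models the record describes.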

  1. Can walking motions improve visually induced rotational self-motion illusions in virtual reality?

    PubMed

    Riecke, Bernhard E; Freiberg, Jacob B; Grechkin, Timofey Y

    2015-02-04

    Illusions of self-motion (vection) can provide compelling sensations of moving through virtual environments without the need for complex motion simulators or large tracked physical walking spaces. Here we explore the interaction between biomechanical cues (stepping along a rotating circular treadmill) and visual cues (viewing simulated self-rotation) for providing stationary users a compelling sensation of rotational self-motion (circular vection). When tested individually, biomechanical and visual cues were similarly effective in eliciting self-motion illusions. However, in combination they yielded significantly more intense self-motion illusions. These findings provide the first compelling evidence that walking motions can be used to significantly enhance visually induced rotational self-motion perception in virtual environments (and vice versa) without having to provide for physical self-motion or motion platforms. This is noteworthy, as linear treadmills have been found to actually impair visually induced translational self-motion perception (Ash, Palmisano, Apthorp, & Allison, 2013). Given the predominant focus on linear walking interfaces for virtual-reality locomotion, our findings suggest that investigating circular and curvilinear walking interfaces offers a promising direction for future research and development and can help to enhance self-motion illusions, presence and immersion in virtual-reality systems. © 2015 ARVO.

  2. Surgical planning for radical prostatectomies using three-dimensional visualization and a virtual reality display system

    NASA Astrophysics Data System (ADS)

    Kay, Paul A.; Robb, Richard A.; King, Bernard F.; Myers, R. P.; Camp, Jon J.

    1995-04-01

    Thousands of radical prostatectomies for prostate cancer are performed each year. Radical prostatectomy is a challenging procedure due to anatomical variability and the adjacency of critical structures, including the external urinary sphincter and neurovascular bundles that subserve erectile function. Because of this, there are significant risks of urinary incontinence and impotence following this procedure. Preoperative interaction with three-dimensional visualization of the important anatomical structures might allow the surgeon to understand important individual anatomical relationships of patients. Such understanding might decrease the rate of morbidities, especially for surgeons in training. Patient-specific anatomic data can be obtained from preoperative 3D MRI diagnostic imaging examinations of the prostate gland utilizing endorectal coils and phased array multicoils. The volumes of the important structures can then be segmented using interactive image editing tools and then displayed using 3-D surface rendering algorithms on standard workstations. Anatomic relationships can be visualized using surface displays and 3-D colorwash and transparency to allow internal visualization of hidden structures. Preoperatively a surgeon and radiologist can interactively manipulate the 3-D visualizations. Important anatomical relationships can better be visualized and used to plan the surgery. Postoperatively the 3-D displays can be compared to actual surgical experience and pathologic data. Patients can then be followed to assess the incidence of morbidities. More advanced approaches to visualize these anatomical structures in support of surgical planning will be implemented on virtual reality (VR) display systems. Such realistic displays are `immersive,' and allow surgeons to simultaneously see and manipulate the anatomy, to plan the procedure and to rehearse it in a realistic way. 
Ultimately the VR systems will be implemented in the operating room (OR) to assist the surgeon in conducting the surgery. Such an implementation will bring to the OR all of the pre-surgical planning data and rehearsal experience in synchrony with the actual patient and operation to optimize the effectiveness and outcome of the procedure.

  3. 'Putting it on the table': direct-manipulative interaction and multi-user display technologies for semi-immersive environments and augmented reality applications.

    PubMed

    Encarnação, L Miguel; Bimber, Oliver

    2002-01-01

    Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches promised to provide valuable means for the involved interactive data analysis, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical employment. This paper addresses two of the shortcomings of such technology: Intuitive interaction with multi-dimensional data in immersive and semi-immersive environments as well as stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.

  4. Proof of concept : examining characteristics of roadway infrastructure in various 3D visualization modes.

    DOT National Transportation Integrated Search

    2015-02-01

    Utilizing enhanced visualization in transportation planning and design gained popularity in the last decade. This work aimed at : demonstrating the concept of utilizing a highly immersive, virtual reality simulation engine for creating dynamic, inter...

  5. Modulation of Excitability in the Temporoparietal Junction Relieves Virtual Reality Sickness.

    PubMed

    Takeuchi, Naoyuki; Mori, Takayuki; Suzukamo, Yoshimi; Izumi, Shin-Ichi

    2018-06-01

    Virtual reality (VR) immersion often provokes subjective discomfort and postural instability, so-called VR sickness. The neural mechanism of VR sickness is speculated to be related to visual-vestibular information mismatch and/or postural instability. However, approaches to relieving VR sickness by modulating brain activity remain poorly understood. Using transcranial direct current stimulation (tDCS), we aimed to investigate whether VR sickness could be relieved by the modulation of cortical excitability in the temporoparietal junction (TPJ), which is known to be involved in processing of both vestibular and visual information. Twenty healthy subjects received tDCS over the right TPJ before VR immersion. The order of the three types of tDCS (anodal, cathodal, and sham) was counterbalanced across subjects. We evaluated the subjective symptoms, heart rate, and center of pressure at baseline, after tDCS, and after VR immersion. VR immersion using head-mounted displays provoked subjective discomfort and postural instability. However, anodal tDCS over the right TPJ ameliorated subjective disorientation symptoms and postural instability induced by VR immersion compared with the sham condition. The amelioration of VR sickness by anodal tDCS over the right TPJ might result from relief of the sensory conflict and/or facilitation of vestibular function. Our result not only has potential clinical implications for the neuromodulation approach to VR sickness but also implies a causal role of the TPJ in VR sickness.

  6. The big picture: effects of surround on immersion and size perception.

    PubMed

    Baranowski, Andreas M; Hecht, Heiko

    2014-01-01

    Despite the fear of the entertainment industry that illegal downloads of films might ruin their business, going to the movies continues to be a popular leisure activity. One reason why people prefer to watch movies in cinemas may be the surround of the movie screen or its physically huge size. To disentangle the factors that might contribute to the size impression, we tested several measures of subjective size and immersion in different viewing environments. For this purpose we built a model cinema that provided visual angle information comparable with that of a real cinema. Subjects watched identical movie clips in a real cinema, a model cinema, and on a display monitor in isolation. Whereas the isolated display monitor was inferior, the addition of a contextual model improved the viewing immersion to the extent that it was comparable with the movie theater experience, provided the viewing angle remained the same. In a further study we built an identical but even smaller model cinema to unconfound visual angle and viewing distance. Both model cinemas produced similar results. There was a trend for the larger screen to be more immersive; however, viewing angle did not play a role in how the movie was evaluated.

  7. Virtually There.

    ERIC Educational Resources Information Center

    Lanier, Jaron

    2001-01-01

    Describes tele-immersion, a new medium for human interaction enabled by digital technologies. It combines the display and interaction techniques of virtual reality with new vision technologies that transcend the traditional limitations of a camera. Tele-immersion stations observe people as moving sculptures without favoring a single point of view.…

  8. Using virtual reality to analyze sports performance.

    PubMed

    Bideau, Benoit; Kulpa, Richard; Vignais, Nicolas; Brault, Sébastien; Multon, Franck; Craig, Cathy

    2010-01-01

    Improving performance in sports can be difficult because many biomechanical, physiological, and psychological factors come into play during competition. A better understanding of the perception-action loop employed by athletes is necessary. This requires isolating contributing factors to determine their role in player performance. Because of its inherent limitations, video playback doesn't permit such in-depth analysis. Interactive, immersive virtual reality (VR) can overcome these limitations and foster a better understanding of sports performance from a behavioral-neuroscience perspective. Two case studies using VR technology and a sophisticated animation engine demonstrate how to use information from visual displays to inform a player's future course of action.

  9. CROSS DRIVE: A New Interactive and Immersive Approach for Exploring 3D Time-Dependent Mars Atmospheric Data in Distributed Teams

    NASA Astrophysics Data System (ADS)

    Gerndt, Andreas M.; Engelke, Wito; Giuranna, Marco; Vandaele, Ann C.; Neary, Lori; Aoki, Shohei; Kasaba, Yasumasa; Garcia, Arturo; Fernando, Terrence; Roberts, David; CROSS DRIVE Team

    2016-10-01

    Atmospheric phenomena of Mars can be highly dynamic and have daily and seasonal variations. Planetary-scale wavelike disturbances, for example, are frequently observed in Mars' polar winter atmosphere. Possible sources of the wave activity were suggested to be dynamical instabilities and quasi-stationary planetary waves, i.e. waves that arise predominantly via zonally asymmetric surface properties. For a comprehensive understanding of these phenomena, single altitude layers have to be analyzed carefully, and relations between different atmospheric quantities and interactions with the surface of Mars have to be considered. The CROSS DRIVE project addresses the presentation of these data from a global view by means of virtual reality techniques. Complex orbiter spectrometer data and observation data from Earth are combined with global circulation models and the high-resolution terrain data and images available from Mars Express or MRO instruments. Scientists can interactively extract features from these datasets and can change visualization parameters in real time in order to emphasize findings. Stereoscopic views allow for perception of the actual 3D behavior of Mars' atmosphere. A very important feature of the visualization system is the possibility to connect distributed workspaces together. This enables discussions between distributed working groups. The workspace can scale from virtual reality systems to expert desktop applications to web-based project portals. If multiple virtual environments are connected, the 3D position of each individual user is captured and used to depict the scientist as an avatar in the virtual world. The appearance of the avatar can also scale from simple annotations to complex avatars using tele-presence technology to reconstruct the users in 3D. Any change of the feature set (annotations, cutplanes, volume rendering, etc.) within the VR is immediately exchanged between all connected users.
This ensures that everybody is always aware of what is visible and under discussion. The discussion is supported by audio, and interaction is controlled by a moderator managing turn-taking presentations. A use-case execution proved successful and showed the potential of this immersive approach.

  10. Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd

    2005-01-01

    Modern NASA planetary exploration missions employ complex systems of hardware and software managed by large teams of engineers and scientists in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover (MER) mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo Pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo Pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.

  11. Virtual Reality Used to Serve the Glenn Engineering Community

    NASA Technical Reports Server (NTRS)

    Carney, Dorothy V.

    2001-01-01

    There are a variety of innovative new visualization tools available to scientists and engineers for the display and analysis of their models. At the NASA Glenn Research Center, we have an ImmersaDesk, a large, single-panel, semi-immersive display device. This versatile unit can interactively display three-dimensional images in visual stereo. Our challenge is to make this virtual reality platform accessible and useful to researchers. An example of a successful application of this computer technology is the display of blade-out simulations. NASA Glenn structural dynamicists, Dr. Kelly Carney and Dr. Charles Lawrence, funded by the Ultra Safe Propulsion Project under Base R&T, are researching blade-out events, in which turbine engines lose a fan blade during operation. Key objectives of this research include minimizing danger to the aircraft via effective blade containment, predicting destructive loads due to the imbalance following a blade loss, and identifying safe, cost-effective designs and materials for future engines.

  12. The Virtual Pelvic Floor, a tele-immersive educational environment.

    PubMed Central

    Pearl, R. K.; Evenhouse, R.; Rasmussen, M.; Dech, F.; Silverstein, J. C.; Prokasy, S.; Panko, W. B.

    1999-01-01

    This paper describes the development of the Virtual Pelvic Floor, a new method of teaching the complex anatomy of the pelvic region utilizing virtual reality and advanced networking technology. Virtual reality technology allows improved visualization of three-dimensional structures over conventional media because it supports stereo vision, viewer-centered perspective, large angles of view, and interactivity. Two or more ImmersaDesk systems, drafting table format virtual reality displays, are networked together providing an environment where teacher and students share a high quality three-dimensional anatomical model, and are able to converse, see each other, and to point in three dimensions to indicate areas of interest. This project was realized by the teamwork of surgeons, medical artists and sculptors, computer scientists, and computer visualization experts. It demonstrates the future of virtual reality for surgical education and applications for the Next Generation Internet. PMID:10566378

  13. The interplays among technology and content, immersant and VE

    NASA Astrophysics Data System (ADS)

    Song, Meehae; Gromala, Diane; Shaw, Chris; Barnes, Steven J.

    2010-01-01

    The research program aims to explore and examine the fine balance necessary for maintaining the interplays between technology and the immersant, including identifying qualities that contribute to creating and maintaining a sense of "presence" and "immersion" in an immersive virtual reality (IVR) experience. Building upon and extending previous work, we compare sitting meditation with walking meditation in a virtual environment (VE). The Virtual Meditative Walk, a new work-in-progress, integrates VR and biofeedback technologies with a self-directed, uni-directional treadmill. As immersants learn how to meditate while walking, robust, real-time biofeedback technology continuously measures breathing, skin conductance and heart rate. The physiological states of the immersant will in turn affect the audio and the stereoscopic visual media viewed through shutter glasses. We plan to test the potential benefits and limitations of this physically active form of meditation against data from a sitting form of meditation. A mixed-methods approach to testing user outcomes parallels the knowledge bases of the collaborative team: a physician, computer scientists and artists.

  14. Data Visualization in Information Retrieval and Data Mining (SIG VIS).

    ERIC Educational Resources Information Center

    Efthimiadis, Efthimis

    2000-01-01

    Presents abstracts that discuss using data visualization for information retrieval and data mining, including immersive information space and spatial metaphors; spatial data using multi-dimensional matrices with maps; TREC (Text Retrieval Conference) experiments; users' information needs in cartographic information retrieval; and users' relevance…

  15. Digital Immersive Virtual Environments and Instructional Computing

    ERIC Educational Resources Information Center

    Blascovich, Jim; Beall, Andrew C.

    2010-01-01

    This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…

  16. Story Immersion of Videogames for Youth Health Promotion: A Review of Literature

    PubMed Central

    Baranowski, Tom; Thompson, Debbe; Buday, Richard

    2012-01-01

    This article reviews research in the fields of psychology, literature, communication, human–computer interaction, public health, and consumer behavior on narrative and its potential relationships with videogames and story immersion. It also reviews a narrative's role in complementing behavioral change theories and the potential of story immersion for health promotion through videogames. Videogames have potential for health promotion and may be especially promising when attempting to reach youth. An understudied characteristic of videogames is that many contain a narrative, or story. Story immersion (transportation) is a mechanism through which a narrative influences players' cognition, affect, and, potentially, health behavior. Immersion promotes the suspension of disbelief and the reduction of counterarguments, enables the story experience as a personal experience, and creates the player's deep affection for narrative protagonists. Story immersion complements behavioral change theories, including the Theory of Planned Behavior, Social Cognitive Theory, and Self-Determination Theory. Systematic investigations are needed to realize the powerful potential of interactive narratives within theory-driven research. PMID:24416639

  17. Story Immersion of Videogames for Youth Health Promotion: A Review of Literature.

    PubMed

    Lu, Amy Shirong; Baranowski, Tom; Thompson, Debbe; Buday, Richard

    2012-06-01

    This article reviews research in the fields of psychology, literature, communication, human-computer interaction, public health, and consumer behavior on narrative and its potential relationships with videogames and story immersion. It also reviews a narrative's role in complementing behavioral change theories and the potential of story immersion for health promotion through videogames. Videogames have potential for health promotion and may be especially promising when attempting to reach youth. An understudied characteristic of videogames is that many contain a narrative, or story. Story immersion (transportation) is a mechanism through which a narrative influences players' cognition, affect, and, potentially, health behavior. Immersion promotes the suspension of disbelief and the reduction of counterarguments, enables the story experience as a personal experience, and creates the player's deep affection for narrative protagonists. Story immersion complements behavioral change theories, including the Theory of Planned Behavior, Social Cognitive Theory, and Self-Determination Theory. Systematic investigations are needed to realize the powerful potential of interactive narratives within theory-driven research.

  18. Streamlining Simulation Development using a Commercial Game Engine

    DTIC Science & Technology

    2009-10-01

    few years. The realism is stunning and the Commercial Game Industry fuels the fire of cutting edge advances in hardware and immersive experiences... Technology applies to Military training in more than just the obvious upgrades in game engines and hardware. The increased visual realism and performance... elaborate storytelling and cinematic effects provide a more immersive and compelling experience to the player. The underlying game engine technology

  19. Tackling the challenges of fully immersive head-mounted AR devices

    NASA Astrophysics Data System (ADS)

    Singer, Wolfgang; Hillenbrand, Matthias; Münz, Holger

    2017-11-01

    The optical requirements of fully immersive head-mounted AR devices are inherently determined by the human visual system. The etendue of the visual system is large. As a consequence, the requirements for fully immersive head-mounted AR devices exceed those of almost any high-end optical system. Two promising solutions to achieve the large etendue, and their challenges, are discussed. Head-mounted augmented reality devices have been developed for decades, mostly for application within aircraft and in combination with a heavy and bulky helmet. The established head-up displays for applications within automotive vehicles typically utilize similar techniques. Recently, there is the vision of eyeglasses with included augmentation, offering a large field of view, and being unobtrusively all-day wearable. There seems to be no simple solution that reaches the functional performance requirements. Some known technical solution paths seem to be dead ends; others seem to offer promising perspectives, though with severe limitations. As an alternative, unobtrusively all-day wearable devices with a significantly smaller field of view are already possible.
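
    The etendue argument can be made concrete with a rough back-of-envelope sketch. All numbers below are illustrative assumptions, not values from the record: the eye's pupil is approximated as a 4 mm circular aperture, and the field of view as a cone with a 60° half-angle. The product of aperture area and solid angle then lands in the tens of mm²·sr, far beyond typical projection optics.

```python
import math

# Crude etendue estimate G = A * Omega (illustrative assumed values only).
pupil_diameter = 4e-3              # m, assumed pupil diameter
half_angle = math.radians(60)      # assumed half-angle of the field-of-view cone

area = math.pi * (pupil_diameter / 2) ** 2              # aperture area, m^2
solid_angle = 2 * math.pi * (1 - math.cos(half_angle))  # sr, cone solid angle

etendue = area * solid_angle       # m^2 * sr
print(f"etendue ~ {etendue * 1e6:.1f} mm^2 sr")  # ~ 39.5 mm^2 sr
```

    Because etendue is conserved through an optical system, any display optics trying to fill this field of view from a small emitter must support a comparable area-times-angle product, which is the core difficulty the abstract describes.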

  20. The World at Your Feet: Immersive Interactive Displays Might Have a Bright Future in Education

    ERIC Educational Resources Information Center

    Simkins, Michael

    2006-01-01

    A reactor is an example of an immersive interactive play in which animated images are projected onto the floor. A reactor allows people to walk on images and interact with them using their feet. With reactors, people can stomp on kernels of popcorn, shoot a pool using their big toes, or wade through a shallow surf on pristine beaches. This…

  1. Interaction between a laminar starting immersed micro-jet and a parallel wall

    NASA Astrophysics Data System (ADS)

    Cabaleiro, Juan Martin; Laborde, Cecilia; Artana, Guillermo

    2015-01-01

    In the present work, we study the starting transient of an immersed micro-jet in close vicinity to a solid wall parallel to its axis. The experiments concern laminar jets (Re < 200) issuing from a glass micro-pipette with a 100 μm internal tip diameter. The effect of the confinement was studied by placing the micro-pipette at different distances from the wall. The characterization of the jet was carried out by flow visualizations, from which the morphology of the vortex head and its trajectory were analyzed. Numerical simulations were used as a complementary tool for the analysis. The jet remains stable for very long distances away from the tip, allowing for a similarity analysis. The self-similar behavior of the starting jet has been studied in terms of the frontline position with time. A symmetric regime and a wall-dominated regime could be identified. The starting jet in the wall-type regime, and in the symmetric regime as well, develops a self-similar behavior with a relatively rapid loss of memory of the preceding condition of the flow. Scalings for both regimes are those that correspond to viscous-dominated flows.

  2. Saliency in VR: How Do People Explore Virtual Environments?

    PubMed

    Sitzmann, Vincent; Serrano, Ana; Pavel, Amy; Agrawala, Maneesh; Gutierrez, Diego; Masia, Belen; Wetzstein, Gordon

    2018-04-01

    Understanding how people explore immersive virtual environments is crucial for many applications, such as designing virtual reality (VR) content, developing new compression algorithms, or learning computational models of saliency or visual attention. Whereas a body of recent work has focused on modeling saliency in desktop viewing conditions, VR is very different from these conditions in that viewing behavior is governed by stereoscopic vision and by the complex interaction of head orientation, gaze, and other kinematic constraints. To further our understanding of viewing behavior and saliency in VR, we capture and analyze gaze and head orientation data of 169 users exploring stereoscopic, static omni-directional panoramas, for a total of 1980 head and gaze trajectories for three different viewing conditions. We provide a thorough analysis of our data, which leads to several important insights, such as the existence of a particular fixation bias, which we then use to adapt existing saliency predictors to immersive VR conditions. In addition, we explore other applications of our data and analysis, including automatic alignment of VR video cuts, panorama thumbnails, panorama video synopsis, and saliency-based compression.
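
    Analyses of this kind have to map recorded viewing directions onto panorama pixels. A minimal sketch of that mapping for an equirectangular image follows; the conventions (yaw zero at the image center, increasing to the right; pitch +90° at the top edge) are assumptions for illustration, not the paper's definitions.

```python
def gaze_to_pixel(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction (yaw in [-180, 180), pitch in [-90, 90],
    degrees) to pixel coordinates in an equirectangular panorama.
    The direction (0, 0) maps to the image center."""
    u = (yaw_deg + 180.0) / 360.0    # longitude normalized to [0, 1)
    v = (90.0 - pitch_deg) / 180.0   # latitude normalized, top row = +90 deg
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return x, y
```

    Accumulating such pixel positions over many users' trajectories yields the fixation maps from which saliency statistics, like the fixation bias mentioned above, can be computed.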

  3. Simultsonic: A Simulation Tool for Ultrasonic Inspection

    NASA Astrophysics Data System (ADS)

    Krishnamurthy, Adarsh; Karthikeyan, Soumya; Krishnamurthy, C. V.; Balasubramaniam, Krishnan

    2006-03-01

    A simulation program, SIMULTSONIC, is under development at CNDE to help determine and/or optimize ultrasonic probe locations for the inspection of complex components. SIMULTSONIC provides a ray-trace-based assessment initially, followed by a displacement- or pressure-field-based assessment for user-specified probe positions and a user-selected component. Immersion and contact modes of inspection are available in SIMULTSONIC. The code, written in Visual C++ and operating in the Microsoft Windows environment, provides an interactive user interface. In this paper, the application of SIMULTSONIC to the inspection of very thin-walled pipes (with 450 μm wall thickness) is described. Ray-trace-based assessment was done using SIMULTSONIC to determine the standoff distance and the angle of oblique incidence for an immersion-mode focused transducer. A 3-cycle Hanning window pulse was chosen for simulations. Experiments were carried out to validate the simulations. The A-scans and the associated B-scan images obtained through simulations show good correlation with experimental results, both in the arrival time of the signal and in the signal amplitudes. The scope of SIMULTSONIC to deal with parametrically represented surfaces will also be discussed.
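
    The excitation used above, a 3-cycle Hanning-windowed pulse, is straightforward to synthesize. The center frequency and sampling rate in this sketch are illustrative assumptions, not values taken from the record.

```python
import numpy as np

def hanning_toneburst(freq, cycles=3, fs=100e6):
    """Generate an n-cycle sine toneburst shaped by a Hanning window.
    freq: center frequency (Hz); fs: sampling rate (Hz), both assumed."""
    n = int(round(cycles / freq * fs))       # samples spanning `cycles` periods
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * freq * t) * np.hanning(n)

# Example: a 3-cycle burst at an assumed 5 MHz center frequency.
pulse = hanning_toneburst(5e6)
```

    The window tapers the burst smoothly to zero at both ends, which narrows the pulse bandwidth around the center frequency compared with a rectangular-gated burst.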

  4. Virtual Reality in Neurointervention.

    PubMed

    Ong, Chin Siang; Deib, Gerard; Yesantharao, Pooja; Qiao, Ye; Pakpoor, Jina; Hibino, Narutoshi; Hui, Ferdinand; Garcia, Juan R

    2018-06-01

    Virtual reality (VR) allows users to experience realistic, immersive 3D virtual environments with the depth perception and binocular field of view of real 3D settings. Newer VR technology has now allowed for interaction with 3D objects within these virtual environments through the use of VR controllers. This technical note describes our preliminary experience with VR as an adjunct tool to traditional angiographic imaging in the preprocedural workup of a patient with a complex pseudoaneurysm. Angiographic MRI data was imported and segmented to create 3D meshes of bilateral carotid vasculature. The 3D meshes were then projected into VR space, allowing the operator to inspect the carotid vasculature using a 3D VR headset as well as interact with the pseudoaneurysm (handling, rotation, magnification, and sectioning) using two VR controllers. 3D segmentation of a complex pseudoaneurysm in the distal cervical segment of the right internal carotid artery was successfully performed and projected into VR. Conventional and VR visualization modes were equally effective in identifying and classifying the pathology. VR visualization allowed the operators to manipulate the dataset to achieve a greater understanding of the anatomy of the parent vessel, the angioarchitecture of the pseudoaneurysm, and the surface contours of all visualized structures. This preliminary study demonstrates the feasibility of utilizing VR for preprocedural evaluation in patients with anatomically complex neurovascular disorders. This novel visualization approach may serve as a valuable adjunct tool in deciding patient-specific treatment plans and selection of devices prior to intervention.

  5. VERS: a virtual environment for reconstructive surgery planning

    NASA Astrophysics Data System (ADS)

    Montgomery, Kevin N.

    1997-05-01

    The virtual environment for reconstructive surgery (VERS) project at the NASA Ames Biocomputation Center is applying virtual reality technology to aid surgeons in planning surgeries. We are working with a craniofacial surgeon at Stanford to assemble and visualize the bone structure of patients requiring reconstructive surgery due to developmental abnormalities or trauma. This project is an extension of our previous work in 3D reconstruction, mesh generation, and immersive visualization. The current VR system, consisting of an SGI Onyx RE2, a FakeSpace BOOM and ImmersiveWorkbench, a Virtual Technologies CyberGlove, and an Ascension Technologies tracker, is in development and has already been used to visualize defects preoperatively. In the near future it will be used to more fully plan the surgery and compute the projected result on soft tissue structure. This paper presents the work in progress and details the production of a high-performance, collaborative, and networked virtual environment.

  6. Realistic realtime illumination of complex environment for immersive systems. A case study: the Parthenon

    NASA Astrophysics Data System (ADS)

    Callieri, M.; Debevec, P.; Pair, J.; Scopigno, R.

    2005-06-01

    Offline rendering techniques have nowadays reached an astonishing level of realism, but at the cost of a long computational time. The new generation of programmable graphics hardware, on the other hand, gives the possibility to implement in realtime some of the visual effects previously available only for cinematographic production. In a collaboration between the Visual Computing Lab (ISTI-CNR) and the Institute for Creative Technologies of the University of Southern California, a realtime demo has been developed that replicates a sequence from the short movie "The Parthenon" presented at Siggraph 2004. The application is designed to run on an immersive reality system, making it possible for a user to perceive the virtual environment with a cinematographic visual quality. In this paper we present the principal ideas of the project, discussing design issues and the technical solutions used for the realtime demo.

  7. The IQ-wall and IQ-station -- harnessing our collective intelligence to realize the potential of ultra-resolution and immersive visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eric A. Wernert; William R. Sherman; Chris Eller

    2012-03-01

    We present a pair of open-recipe, affordably-priced, easy-to-integrate, and easy-to-use visualization systems. The IQ-wall is an ultra-resolution tiled display wall that scales up to 24 screens with a single PC. The IQ-station is a semi-immersive display system that utilizes commodity stereoscopic displays, lower cost tracking systems, and touch overlays. These systems have been designed to support a wide range of research, education, creative activities, and information presentations. They were designed to work equally well as stand-alone installations or as part of a larger distributed visualization ecosystem. We detail the hardware and software components of these systems, describe our deployments and experiences in a variety of research lab and university environments, and share our insights for effective support and community development.

  8. Immersive Earth Science: Data Visualization in Virtual Reality

    NASA Astrophysics Data System (ADS)

    Skolnik, S.; Ramirez-Linan, R.

    2017-12-01

    Utilizing next generation technology, Navteca's exploration of 3D and volumetric temporal data in Virtual Reality (VR) takes advantage of immersive user experiences where stakeholders are literally inside the data. No longer restricted by the edges of a screen, VR provides an innovative way of viewing spatially distributed 2D and 3D data that leverages a 360° field of view and positional-tracking input, allowing users to see and experience data differently. These concepts are relevant to many sectors, industries, and fields of study, as real-time collaboration in VR can enhance understanding and mission with VR visualizations that display temporally-aware 3D, meteorological, and other volumetric datasets. The ability to view data that is traditionally "difficult" to visualize, such as subsurface features or air columns, is a particularly compelling use of the technology. Various development iterations have resulted in Navteca's proof of concept that imports and renders volumetric point-cloud data in the virtual reality environment by interfacing PC-based VR hardware to a back-end server and popular GIS software. The integration of the geo-located data in VR and subsequent display of changeable basemaps, overlaid datasets, and the ability to zoom, navigate, and select specific areas show the potential for immersive VR to revolutionize the way Earth data is viewed, analyzed, and communicated.

  9. Altered Perspectives: Immersive Environments

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Webley, P. W.

    2016-12-01

    Immersive environments provide an exciting experiential technology to visualize the natural world. Given the increasing accessibility of 360° cameras and virtual reality headsets we are now able to visualize artistic principles and scientific concepts in a fully immersive environment. The technology has become popular for photographers as well as designers, industry, educational groups, and museums. Here we show a sci-art perspective on the use of optics and light in the capture and manipulation of 360° images and video of geologic phenomena and cultural heritage sites in Alaska, England, and France. Additionally, we will generate intentionally altered perspectives to lend a surrealistic quality to the landscapes. Locations include the Catacombs of Paris, the Palace of Versailles, and the Northern Lights over Fairbanks, Alaska. Some 360° view cameras now use small portable dual lens technology extending beyond the 180° fisheye lens previously used, providing better coverage and image quality. Virtual reality headsets range in level of sophistication and cost, with the most affordable versions using smart phones and Google Cardboard viewers. The equipment used in this presentation includes a Ricoh Theta S spherical imaging camera. Here we will demonstrate the use of 360° imaging with attendees being able to be part of the immersive environment and experience our locations as if they were visiting themselves.

  10. Using Globe Browsing Systems in Planetariums to Take Audiences to Other Worlds.

    NASA Astrophysics Data System (ADS)

    Emmart, C. B.

    2014-12-01

    For the last decade planetariums have been adding "full dome video" capability for both movie playback and interactive display. True scientific data visualization has now come to planetarium audiences as a means to display the actual three-dimensional layout of the universe, the time-based array of planets, minor bodies, and spacecraft across the solar system, and now globe browsing systems to examine planetary bodies to the limits of the resolutions acquired. Additionally, such planetarium facilities can be networked for simultaneous display across the world, widening audience reach and providing access to authoritative scientist description and commentary. Data repositories such as NASA's Lunar Mapping and Modeling Project (LMMP), NASA GSFC's LANCE-MODIS, and others conforming to the Open Geospatial Consortium (OGC) standard Web Map Service (WMS) protocols make geospatial data available for a growing number of dome-supporting globe visualization systems. The immersive surround graphics of full dome video replicates our visual system, creating authentic virtual scenes that effectively place audiences on location, in some cases on other worlds mapped only robotically.
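
    A globe browsing client pulls imagery from such repositories via standard OGC WMS GetMap requests. The sketch below builds such a request URL; the server URL and layer name are hypothetical placeholders, not endpoints named in the record.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build an OGC WMS 1.3.0 GetMap request URL for a single layer.
    bbox is (min_x, min_y, max_x, max_y) in the given CRS; note that
    WMS 1.3.0 with EPSG:4326 interprets axes in latitude/longitude order."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer, purely for illustration.
url = wms_getmap_url("https://example.org/wms", "global_mosaic",
                     (-90, -180, 90, 180), 2048, 1024)
```

    A dome system would issue many such requests per frame (one per visible tile) and texture the returned images onto its globe geometry.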

  11. An Immersed Boundary Method for Solving the Compressible Navier-Stokes Equations with Fluid Structure Interaction

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    An immersed boundary method for the compressible Navier-Stokes equations, and the additional infrastructure that is needed to solve moving boundary problems and fully coupled fluid-structure interaction, is described. All the methods described in this paper were implemented in NASA's LAVA solver framework. The underlying immersed boundary method is based on the locally stabilized immersed boundary method that was previously introduced by the authors. In the present paper this method is extended to account for all aspects that are involved in fluid-structure interaction simulations, such as fast geometry queries and stencil computations, the treatment of freshly cleared cells, and the coupling of the computational fluid dynamics solver with a linear structural finite element method. The current approach is validated for moving boundary problems with prescribed body motion and fully coupled fluid-structure interaction problems in 2D and 3D. As part of the validation procedure, results from the second AIAA aeroelastic prediction workshop are also presented. The current paper is regarded as a proof-of-concept study, while more advanced methods for fluid-structure interaction are currently being investigated, such as geometric and material nonlinearities, and advanced coupling approaches.
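
    One building block of a Cartesian-grid immersed boundary scheme is classifying grid cells as fluid or solid against the immersed geometry. The sketch below does this for a circular body via the signed distance at cell centers, as an illustrative stand-in for the geometry queries mentioned above (it is not the LAVA implementation). Comparing masks from two successive body positions also identifies the "freshly cleared" cells the abstract refers to: cells that were inside the body at the old position and emerge into the fluid as the body moves.

```python
import numpy as np

def fluid_mask(nx, ny, dx, center, radius):
    """Mark cells of a uniform Cartesian grid as fluid (True) or solid
    (False) against an immersed circular body, using the signed distance
    evaluated at cell centers."""
    xc = (np.arange(nx) + 0.5) * dx
    yc = (np.arange(ny) + 0.5) * dx
    X, Y = np.meshgrid(xc, yc, indexing="ij")
    signed_dist = np.hypot(X - center[0], Y - center[1]) - radius
    return signed_dist > 0.0  # positive distance = outside the body = fluid

# Freshly cleared cells: solid at the old body position, fluid at the new one.
old = fluid_mask(64, 64, 1 / 64, center=(0.4, 0.5), radius=0.15)
new = fluid_mask(64, 64, 1 / 64, center=(0.5, 0.5), radius=0.15)
freshly_cleared = new & ~old
```

    In a real solver, freshly cleared cells have no valid flow history and must be initialized from neighboring fluid cells before the next time step, which is the special treatment the abstract alludes to.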

  12. Using Virtual Reality to Help Students with Social Interaction Skills

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2015-01-01

    The purpose of this study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity to…

  13. Avatars mirroring the actual self versus projecting the ideal self: the effects of self-priming on interactivity and immersion in an exergame, Wii Fit.

    PubMed

    Jin, Seung-A Annie

    2009-12-01

    As exergames are increasingly being used as an interventional tool to fight the obesity epidemic in clinical studies, society is absorbing their impact to a more intense degree. Interactivity and immersion are key factors that attract exergame consumers. This research asks, What are the effects of priming the actual self versus the ideal self on users' perceived interactivity and immersion in avatar-based exergame playing? and What are important moderators that play a role in exergame users' self-concept perception? To answer these research questions, this study leveraged the Wii's avatar-creating function (Mii Channel) and exergame feature (Wii Fit) in a controlled, randomized experimental design (N = 126). The results of a 2 x 2 factorial design experiment demonstrated the significant main effect of self-priming on interactivity and the moderating role of the actual-ideal self-concept discrepancy in influencing immersion during exergame playing. Game players who created an avatar reflecting the ideal self reported greater perceived interactivity than those who created a replica avatar mirroring the actual self. A two-way ANOVA demonstrated the moderating role of the actual-ideal self-concept discrepancy in determining the effects of the primed regulatory focus on immersion in the exergame play. The underlying theoretical mechanism is derived from and explained by Higgins's self-concept discrepancy perspective. Practical implications for game developers and managerial implications for the exergame industry are discussed.

  14. Motion parallax in immersive cylindrical display systems

    NASA Astrophysics Data System (ADS)

    Filliard, N.; Reymond, G.; Kemeny, A.; Berthoz, A.

    2012-03-01

    Motion parallax is a crucial visual cue produced by translations of the observer for the perception of depth and self-motion. Therefore, tracking the observer viewpoint has become inevitable in immersive virtual reality (VR) systems (cylindrical screens, CAVEs, head-mounted displays) used e.g. in the automotive industry (style reviews, architecture design, ergonomics studies) or in scientific studies of visual perception. The perception of a stable and rigid world requires that this visual cue be coherent with other extra-retinal (e.g. vestibular, kinesthetic) cues signaling ego-motion. Although world stability is never questioned in the real world, rendering a head-coupled viewpoint in VR can lead to an illusory perception of unstable environments, unless a non-unity scale factor is applied to recorded head movements. Besides, cylindrical screens are usually used with static observers, due to image distortions when rendering images for viewpoints different from a sweet spot. We developed a technique to compensate for these non-linear visual distortions in real time, in an industrial VR setup based on a cylindrical screen projection system. Additionally, to evaluate the amount of discrepancy between visual and extra-retinal cues tolerated without perceptual distortions, a "motion parallax gain" between the velocity of the observer's head and that of the virtual camera was introduced in this system. The influence of this artificial gain was measured on the gait stability of free-standing participants. Results indicate that gains below unity significantly alter postural control. Conversely, the influence of higher gains remains limited, suggesting a certain tolerance of observers to these conditions. Parallax gain amplification is therefore proposed as a possible solution to provide a wider exploration of space to users of immersive virtual reality systems.
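
    The "motion parallax gain" idea reduces to scaling tracked head displacements about a reference point before driving the virtual camera. A minimal sketch follows; the function name and the reference-point convention are illustrative assumptions, not the authors' implementation.

```python
def camera_position(head_pos, reference_pos, gain):
    """Virtual camera position from a tracked head position: the head's
    displacement from a reference (sweet-spot) point is scaled by a
    motion-parallax gain. gain = 1.0 reproduces real head motion exactly;
    gain > 1.0 amplifies parallax, gain < 1.0 attenuates it."""
    return tuple(r + gain * (h - r) for h, r in zip(head_pos, reference_pos))

# A 10 cm lateral head movement rendered with a gain of 2.0 moves the
# virtual camera 20 cm from the reference point.
cam = camera_position((0.1, 1.7, 0.0), (0.0, 1.7, 0.0), gain=2.0)
```

    With such amplification, a user standing in a small tracked area can visually explore a larger region of the virtual scene, which is the application the abstract proposes.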

  15. Immersive 3D geovisualisation in higher education

    NASA Astrophysics Data System (ADS)

    Philips, Andrea; Walz, Ariane; Bergner, Andreas; Graeff, Thomas; Heistermann, Maik; Kienzler, Sarah; Korup, Oliver; Lipp, Torsten; Schwanghart, Wolfgang; Zeilinger, Gerold

    2014-05-01

    Through geovisualisation we explore spatial data, analyse it with respect to specific questions, synthesise the results, and present and communicate them to a specific audience (MacEachren & Kraak 1997). After centuries of paper maps, the means of representing and visualising our physical environment and its abstract qualities have changed dramatically since the 1990s - and so have the methods for using geovisualisation in teaching. Whereas some might still consider the traditional classroom the ideal setting for teaching and learning geographic relationships and their mapping, we used a 3D CAVE (computer-animated virtual environment) as the setting for a problem-oriented learning project called "GEOSimulator". Focussing on this project, we empirically investigated whether a technological advance such as the CAVE makes 3D visualisation, including 3D geovisualisation, an important tool not only for businesses (Abulrub et al. 2012) and the public (Wissen et al. 2008), but also for educational purposes, for which it has hardly been used yet. The 3D CAVE is a three-sided visualisation platform that allows for immersive and stereoscopic visualisation of observed and simulated spatial data. We examined the benefits of immersive 3D visualisation for geographic research and education and synthesised three fundamental technology-based visual aspects: First, the conception and comprehension of space and location need not be generated, but are instantaneously and intuitively present through stereoscopy. Second, optical immersion into virtual reality strengthens this spatial perception, which is particularly important for complex 3D geometries. And third, a significant benefit is interactivity, which is enhanced through immersion and allows for multi-discursive and dynamic data exploration and knowledge transfer.
Based on our problem-oriented learning project, which centres on a case study of flood risk management on the Wilde Weisseritz in Germany, a river that contributed significantly to the hundred-year flood in Dresden in 2002, we empirically evaluated the usefulness of this immersive 3D technology for learning success. Results show that, through the benefits mentioned above, immersive 3D geovisualisation has educational and content-related advantages over 2D geovisualisation. This innovative way of geovisualisation is thus not only entertaining and motivating for students, but can also be constructive for research by, for instance, facilitating the study of complex environments or decision-making processes.

  16. Declarative Knowledge Acquisition in Immersive Virtual Learning Environments

    ERIC Educational Resources Information Center

    Webster, Rustin

    2016-01-01

    The author investigated the interaction effect of immersive virtual reality (VR) in the classroom. The objective of the project was to develop and provide a low-cost, scalable, and portable VR system containing purposely designed and developed immersive virtual learning environments for the US Army. The purpose of the mixed design experiment was…

  17. Active Learning through the Use of Virtual Environments

    ERIC Educational Resources Information Center

    Mayrose, James

    2012-01-01

    Immersive Virtual Reality (VR) has seen explosive growth over the last decade. Immersive VR attempts to give users the sensation of being fully immersed in a synthetic environment by providing them with 3D hardware, and allowing them to interact with objects in virtual worlds. The technology is extremely effective for learning and exploration, and…

  18. "To Improve Language, You Have to Mix": Teachers' Perceptions of Language Learning in an Overseas Immersion Environment

    ERIC Educational Resources Information Center

    Roskvist, Annelies; Harvey, Sharon; Corder, Deborah; Stacey, Karen

    2014-01-01

    The overseas immersion environment has long been considered a superior context for language learning, supposedly providing unlimited exposure to target language (TL) input and countless opportunities for authentic interaction with expert users. This article focuses on immersion programmes (IPs) for in-service language teachers--a relatively…

  19. Immersion versus interactivity and analytic field.

    PubMed

    Civitarese, Giuseppe

    2008-04-01

    Losing oneself in a story, a film or a picture is nothing but another step in the suspension of disbelief that permits one to become immersed in the 'novel' of reality. It is not by chance that the text-world metaphor informs classical aesthetics, which, more than anything else, emphasizes emotional involvement. On the contrary, as in much of modern art, self-reflexivity and metafictional attention to the rhetoric of the real, to the framework, to the conventions and to the processes of meaning production all involve a disenchanted, detached and sceptical vision--in short, an aesthetics of the text as game. By analogy, any analytic style or model that aims to produce a transformative experience must satisfactorily resolve the conflict between immersion (the analyst's emotional participation and adherence to the dreamlike or fictional climate of the session, dreaming while knowing it is a dream) and interactivity (for the most part, interpretation as an anti-immersive device that 'wakes' one from the fiction and demystifies consciousness). In analytic field theory, the setting can be defined--because of the weight given to the performativity of language, to the sensory matrix of the transference and to the transparency of the medium--as the place where an ideal balance is sought between immersion and interaction.

  20. Interactive 3D Mars Visualization

    NASA Technical Reports Server (NTRS)

    Powell, Mark W.

    2012-01-01

    The Interactive 3D Mars Visualization system provides high-performance, immersive visualization of satellite and surface-vehicle imagery of Mars. The software can be used in mission operations to provide the most accurate position information for the Mars rovers to date. When integrated into the mission data pipeline, this system allows mission planners to view the location of the rover on Mars to 0.01-meter accuracy with respect to satellite imagery, with dynamic updates to incorporate the latest position information. Given this information so early in the planning process, rover drivers can plan more accurate drive activities for the rover than ever before, significantly increasing the number of science activities executed. Scientifically, this 3D mapping information puts all of the science analyses to date into geologic context within a day instead of the weeks or months that were the norm prior to this contribution. This allows the science planners to judge the efficacy of their previously executed science observations much more efficiently, and to achieve greater science return as a result. The Interactive 3D Mars surface view is a Mars terrain-browsing software interface that encompasses the entire region of exploration for a Mars surface exploration mission. The view is interactive, allowing the user to pan in any direction by clicking and dragging, or to zoom in or out by scrolling the mouse or touchpad. The current tool set includes a tool for selecting a point of interest and a ruler tool for displaying the distance between, and positions of, two points of interest. The mapping information can be harvested and shared through ubiquitous online mapping tools like Google Mars, NASA WorldWind, and Worldwide Telescope.
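
    In its simplest form, the ruler tool described above reduces to a Euclidean distance between two selected 3D points. A minimal sketch follows; the function name is hypothetical and not part of the actual software:

```python
import math

def ruler_distance(p1, p2):
    """Straight-line distance between two 3D points of interest.

    With map-registered coordinates in metres, this supports the
    centimetre-level position comparisons described above.
    """
    return math.dist(p1, p2)
```

For example, two points of interest at (0, 0, 0) and (3, 4, 0) metres are 5 metres apart.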

  1. Using Auditory Cues to Perceptually Extract Visual Data in Collaborative, Immersive Big-Data Display Systems

    NASA Astrophysics Data System (ADS)

    Lee, Wendy

    The advent of multisensory display systems, such as virtual and augmented reality, has fostered a new relationship between humans and space. Not only can these systems mimic real-world environments, they can also create a new space typology made solely of data. In these spaces, two-dimensional information is displayed in three dimensions, requiring the human senses to interpret virtual, attention-based elements. Studies in the field of big data have predominantly focused on visual representations and extractions of information, with little attention to sound. The goal of this research is to evaluate the most efficient methods of perceptually extracting visual data using auditory stimuli in immersive environments. Using Rensselaer's CRAIVE-Lab, a virtual reality space with 360-degree panoramic visuals and an array of 128 loudspeakers, participants were asked questions about complex visual displays accompanied by a variety of auditory cues ranging from sine tones to camera-shutter sounds. Analysis of the speed and accuracy of participant responses revealed that auditory cues that favored localization and were positively perceived were best for data extraction, and could help create more user-friendly systems in the future.

  2. Interactive modeling and simulation of peripheral nerve cords in virtual environments

    NASA Astrophysics Data System (ADS)

    Ullrich, Sebastian; Frommen, Thorsten; Eckert, Jan; Schütz, Astrid; Liao, Wei; Deserno, Thomas M.; Ntouba, Alexandre; Rossaint, Rolf; Prescher, Andreas; Kuhlen, Torsten

    2008-03-01

    This paper contributes to the modeling, simulation and visualization of peripheral nerve cords. Until now, only sparse datasets of nerve cords have been available. In addition, these data have not yet been used in simulators, because they are static. To build a more flexible anatomical structure for peripheral nerve cords, we propose a hierarchical tree data structure in which each node represents a nerve branch. The shape of the nerve segments themselves is approximated by spline curves. Interactive modeling allows the creation and editing of control points, which are used for branching nerve sections, calculating spline curves and editing spline representations via cross sections. Furthermore, control points can be attached to different anatomical structures. Through this approach, nerve cords deform in accordance with the movement of the connected structures, e.g., muscles or bones. As a result, we have developed an intuitive modeling system that runs on desktop computers and in immersive environments. It allows anatomical experts to create movable peripheral nerve cords for articulated virtual humanoids. Direct feedback on changes induced by movement or deformation is provided by real-time visualization. The techniques and the resulting data are already used in medical simulators.
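
    The hierarchical structure described above, a tree whose nodes are nerve branches carrying spline control points that can be attached to other anatomical structures, could be sketched as follows. Class and field names are illustrative assumptions, not the paper's actual data model:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ControlPoint:
    position: Tuple[float, float, float]
    attached_to: Optional[str] = None  # e.g. a muscle or bone of the virtual humanoid

@dataclass
class NerveBranch:
    name: str
    control_points: List[ControlPoint] = field(default_factory=list)
    children: List["NerveBranch"] = field(default_factory=list)

    def add_branch(self, child: "NerveBranch") -> "NerveBranch":
        """Attach a child branch, forming the hierarchical tree."""
        self.children.append(child)
        return child

def count_branches(node: NerveBranch) -> int:
    """Total number of branches in the subtree rooted at `node`."""
    return 1 + sum(count_branches(c) for c in node.children)
```

With `attached_to` referring to a moving structure, updating the control-point positions from that structure and re-evaluating the splines would give the movement-coupled deformation the abstract describes.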

  3. Correcting Distance Estimates by Interacting With Immersive Virtual Environments: Effects of Task and Available Sensory Information

    ERIC Educational Resources Information Center

    Waller, David; Richardson, Adam R.

    2008-01-01

    The tendency to underestimate egocentric distances in immersive virtual environments (VEs) is not well understood. However, previous research (A. R. Richardson & D. Waller, 2007) has demonstrated that a brief period of interaction with the VE prior to making distance judgments can effectively eliminate subsequent underestimation. Here the authors…

  4. The Effects of Instructor-Avatar Immediacy in Second Life, an Immersive and Interactive Three-Dimensional Virtual Environment

    ERIC Educational Resources Information Center

    Lawless-Reljic, Sabine Karine

    2010-01-01

    Growing interest of educational institutions in desktop 3D graphic virtual environments for hybrid and distance education prompts questions about the efficacy of such tools. Virtual worlds, such as Second Life[R], enable computer-mediated immersion and interactions encompassing multimodal communication channels including audio, video, and text.…

  5. Computer-Assisted Culture Learning in an Online Augmented Reality Environment Based on Free-Hand Gesture Interaction

    ERIC Educational Resources Information Center

    Yang, Mau-Tsuen; Liao, Wan-Che

    2014-01-01

    Physical-virtual immersion and real-time interaction play an essential role in cultural and language learning. Augmented reality (AR) technology can be used to seamlessly merge virtual objects with real-world images to realize immersion. Additionally, computer vision (CV) technology can recognize free-hand gestures from live images to enable…

  6. Banque de strategies de production orale pour un enseignement explicite de la 1re a la 7e annee: Expose et interaction. Document d'appui pour les programmes d'etudes de francais langue premiere et de francais langue seconde--immersion de 1998 (Interactive Oral Presentation Strategies for Specific Instruction in Grades 1 through 7. Guide in Support of Programs Teaching French as a First Language and French as a Second Language--Immersion 1998).

    ERIC Educational Resources Information Center

    Alberta Learning, Edmonton. Direction de l'education francaise.

    This teacher's guide, intended for the instruction of both French as a first language and French as a second language in an immersion setting, provides a host of strategies for teaching interactive oral presentation skills in the classroom (Grades 1 through 7). Section 1 is designed to bring the teacher's awareness to the training procedure,…

  7. Banque de strategies de production orale pour un enseignement explicite de la 6e a la 12e annee: Expose et interaction. Document d'appui pour les programmes d'etudes de francais langue premiere et de francais langue seconde--immersion de 1998 (Interactive Oral Presentation Strategies for Specific Instruction in Grades 6 through 12. Guide in Support of Programs Teaching French as a First Language and French as a Second Language--Immersion 1998).

    ERIC Educational Resources Information Center

    Northwest Territories Dept. of Education, Culture and Employment, Yellowknife.

    This teacher's guide, intended for the instruction of both French as a first language and French as a second language in an immersion setting, provides a host of strategies for teaching interactive oral presentation skills in the classroom (Grades 6 through 12). Section 1 is designed to bring the teacher's awareness to the training procedure,…

  8. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution

    PubMed Central

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

    Graphical virtual environments are currently far from accessible to blind users, as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility, but there is still a long way to go. Visual-to-audio Sensory Substitution Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment, and they offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually draws on skills similar to those used in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training, and new insights into multisensory interaction and the visually deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception of, and interaction within, them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1, task 1) and their surroundings (Experiment 1, task 2), and walk through them; these tasks were accomplished with 95% and 97% success rates, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to crosswalks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like, noted their potential for complex training, and suggested many future environments they wished to experience. PMID:26882473
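
    Visual-to-audio sensory substitution of the kind described above typically scans an image column by column, mapping each lit pixel to a tone whose pitch encodes its row. A schematic sketch follows; the scan direction, function name and mapping constants are illustrative assumptions, not EyeMusic's actual parameters:

```python
import numpy as np

def sonify(image, base_freq=220.0, semitones_per_row=2, col_duration=0.05):
    """Map a binary image to (onset_time, frequency) tone events.

    Columns are scanned left to right (the time axis); higher rows in the
    image map to higher pitches. Constants are illustrative only.
    """
    n_rows, n_cols = image.shape
    events = []
    for col in range(n_cols):
        onset = col * col_duration
        for row in range(n_rows):
            if image[row, col]:
                # row 0 is the top of the image -> highest pitch
                steps = (n_rows - 1 - row) * semitones_per_row
                freq = base_freq * 2 ** (steps / 12.0)
                events.append((onset, freq))
    return events
```

Feeding such events to a synthesizer yields the kind of whole-scene soundscape from which users in the study above recovered doors, houses and crosswalks.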

  9. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution.

    PubMed

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

    Graphical virtual environments are currently far from accessible to blind users, as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility, but there is still a long way to go. Visual-to-audio Sensory Substitution Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment, and they offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually draws on skills similar to those used in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training, and new insights into multisensory interaction and the visually deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception of, and interaction within, them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1, task 1) and their surroundings (Experiment 1, task 2), and walk through them; these tasks were accomplished with 95% and 97% success rates, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to crosswalks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. They highlighted the potential usefulness of such environments for understanding what visual scenes are supposed to look like, noted their potential for complex training, and suggested many future environments they wished to experience.

  10. Multivariate Gradient Analysis for Evaluating and Visualizing a Learning System Platform for Computer Programming

    ERIC Educational Resources Information Center

    Mather, Richard

    2015-01-01

    This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses "Ceebot," an animated and immersive game-like development environment. Multivariate…

  11. Learning about the scale of the solar system using digital planetarium visualizations

    NASA Astrophysics Data System (ADS)

    Yu, Ka Chun; Sahami, Kamran; Dove, James

    2017-07-01

    We studied the use of a digital planetarium for teaching relative distances and sizes in introductory undergraduate astronomy classes. Inspired in part by the classic short film The Powers of Ten and by large physical scale models of the Solar System that can be explored on foot, we created lectures using virtual versions of these two pedagogical approaches for classes that saw either an immersive treatment in the planetarium or a non-immersive version in the regular classroom (with N = 973 students participating in total). Students who visited the planetarium not only had the greatest learning gains, but their performance increased with time, whereas students who saw the same visuals projected onto a flat display in their classroom showed less retention over time. The gains seen in the students who visited the planetarium reveal that this medium is a powerful tool for visualizing scale over multiple orders of magnitude. However, the modest gains for the students in the regular classroom also show the utility of these visualization approaches for the broader category of classroom physics simulations.

  12. LVC interaction within a mixed-reality training system

    NASA Astrophysics Data System (ADS)

    Pollock, Brice; Winer, Eliot; Gilbert, Stephen; de la Cruz, Julio

    2012-03-01

    The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. Combining the advantages of realistic training environments and virtual worlds, mixed-reality LVC training systems can enable live and virtual trainees to interact as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to enable LVC interaction in a reconfigurable, mixed-reality environment. This system was developed and tested in an immersive, reconfigurable, mixed-reality LVC training system for the dismounted warfighter at ISU, known as the Veldt, both to overcome LVC interaction challenges and to serve as a test bed for cutting-edge technology to meet future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and custom game engines. Evaluation involving military-trained personnel found this system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real time. This system achieves rare LVC interaction within multiple physical and virtual immersive environments for real-time training across many distributed systems.

  13. Virtual environment display for a 3D audio room simulation

    NASA Astrophysics Data System (ADS)

    Chapin, William L.; Foster, Scott

    1992-06-01

    Recent developments in virtual 3D audio and synthetic aural environments have produced a complex acoustical room simulation. The acoustical simulation models a room with walls, ceiling, and floor of selected sound-reflecting/absorbing characteristics and unlimited independent localizable sound sources. This non-visual acoustic simulation, implemented with 4 audio ConvolvotronsTM by Crystal River Engineering and coupled to the listener with a Polhemus IsotrakTM tracking the listener's head position and orientation, and stereo headphones returning binaural sound, is quite compelling to most listeners with eyes closed. This immersive effect should be reinforced when properly integrated into a full, multi-sensory virtual environment presentation. This paper discusses the design of an interactive, visual virtual environment, complementing the acoustic model and specified to: 1) allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; 2) reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; 3) enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and 4) serve as a research testbed and technology-transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, is discussed through the development of four iterative configurations. The installed system implements a head-coupled, wide-angle, stereo-optic tracker/viewer and multi-computer simulation control. The portable demonstration system implements a head-mounted, wide-angle, stereo-optic display, separate head and pointer electromagnetic position trackers, a heterogeneous parallel graphics processing system, and object-oriented C++ program code.

  14. Stereoscopic displays for virtual reality in the car manufacturing industry: application to design review and ergonomic studies

    NASA Astrophysics Data System (ADS)

    Moreau, Guillaume; Fuchs, Philippe

    2002-05-01

    In the car manufacturing industry the trend is to drastically reduce time-to-market by increasing the use of the digital mock-up instead of physical prototypes. Design review and ergonomic studies are specific tasks because they involve qualitative or even subjective judgements. In this paper, we present IMAVE (IMmersion Adapted to a VEhicle), designed for immersive styling review, gap visualization and simple ergonomic studies. We show that stereoscopic displays are necessary and must fulfill several constraints due to the proximity and size of the car dashboard. The duration of the work sessions forces us to eliminate all vertical parallax, and 1:1 scale is obviously required for a valid immersion. Two demonstrators were realized, allowing us to draw on a large set of testers (over 100). More than 80% of the testers saw an immediate use for the IMAVE system. We discuss the good and bad marks awarded to the system. Future work includes the use of several rear-projected stereo screens for door and central-console visualization, but without the parallax presently visible in some CAVE-like environments.

  15. Interactive Immersive Virtualmuseum: Digital Documentation for Virtual Interaction

    NASA Astrophysics Data System (ADS)

    Clini, P.; Ruggeri, L.; Angeloni, R.; Sasso, M.

    2018-05-01

    Thanks to their playful and educational approach, Virtual Museum systems are very effective for the communication of Cultural Heritage. Among the latest technologies, Immersive Virtual Reality is probably the most appealing and potentially effective for this purpose; nevertheless, owing to poor user-system interaction caused by the incomplete maturity of technology specific to museum applications, immersive installations are still quite uncommon in museums. This paper explores the possibilities offered by this technology and presents a workflow that, starting from digital documentation, makes it possible to interact with archaeological finds or any other cultural heritage inside different kinds of immersive virtual reality spaces. Two case studies are presented: the National Archaeological Museum of Marche in Ancona and the 3D reconstruction of the Roman Forum of Fanum Fortunae. The two approaches differ not only conceptually but also in content: while the Archaeological Museum is represented in the application simply using spherical panoramas to give the perception of the third dimension, the Roman Forum is a 3D model that allows visitors to move in the virtual space as in the real one. In both cases, the acquisition phase of the artefacts is central; artefacts are digitized with the photogrammetric Structure from Motion technique and then integrated inside the immersive virtual space using a PC with an HTC Vive system that allows the user to interact with the 3D models, turning the manipulation of objects into a fun and exciting experience. The challenge, taking advantage of the latest opportunities made available by photogrammetry and ICT, is to enrich visitors' experience in the real museum by making possible the interaction with perishable, damaged or lost objects and public access to inaccessible or no longer existing places, promoting in this way the preservation of fragile sites.

  16. Virtual reality in anxiety disorders: the past and the future.

    PubMed

    Gorini, Alessandra; Riva, Giuseppe

    2008-02-01

    One of the most effective treatments for anxiety is exposure therapy: a person is exposed to specific feared situations or objects that trigger anxiety. This exposure may be done through actual exposure, visualization, imagination, or virtual reality (VR), which provides users with computer-simulated environments with which, and within which, they can interact. VR is made possible by the capability of computers to synthesize a 3D graphical environment from numerical data. Furthermore, because input devices sense the subject's reactions and motions, the computer can modify the synthetic environment accordingly, creating the illusion of interacting with, and thus being immersed within, the environment. Since 1995, various experimental studies have been conducted to investigate the effect of VR exposure in the treatment of subclinical fears and anxiety disorders. This review discusses their outcomes and provides guidelines for the use of VR exposure in the treatment of anxious patients.

  17. Virtual Solar Energy Center: A Case Study of the Use of Advanced Visualization Techniques for the Comprehension of Complex Engineering Products and Processes

    NASA Astrophysics Data System (ADS)

    Ritter, Kenneth August, III

    Industry has a continuing need to train its workforce on recent engineering developments, but many engineering products and processes are hard to explain because of limitations of size, visibility, time scale, cost, and safety. The product or process might be difficult to see because it is either very large or very small, because it is enclosed within an opaque container, or because it happens very fast or very slowly. Some engineering products and processes are also costly or unsafe to use for training purposes, and sometimes the domain expert is not physically available at the training location. All these limitations can potentially be addressed using advanced visualization techniques such as virtual reality. This dissertation describes the development, using the Six Sigma DMADV process, of an immersive virtual reality application that explains the main equipment and processes used in a concentrating solar power plant. The virtual solar energy center (VEC) application was initially developed and tested in a Cave Automatic Virtual Environment (CAVE) during 2013 and 2014. The software programs used for development were SolidWorks, 3ds Max Design, and Unity 3D. Current hardware and software technologies that could complement this research were analyzed. The NVIDIA GRID Visual Computing Appliance (VCA) was chosen as the rendering solution for animating complex CAD models in this application, and the MiddleVR software toolkit was selected for VR interactions and CAVE display. A non-immersive 3D version of the VEC application was tested and shown to be an effective training tool in late 2015. An immersive networked version of the VEC allows the user to receive live instruction from a trainer projected via depth-camera imagery from a remote location. Four comparative analysis studies were performed, using the average normalized gain from pre-test scores to determine the effectiveness of the various training methods. With the DMADV approach, solutions were identified and verified during each iteration of development, which saved valuable time and yielded better results with each revision of the application; the final version received 88% positive responses and proved as effective as the other methods assessed.

  18. A new 3D immersed boundary method for non-Newtonian fluid-structure-interaction with application

    NASA Astrophysics Data System (ADS)

    Zhu, Luoding

    2017-11-01

    Motivated by fluid-structure-interaction (FSI) phenomena in the life sciences (e.g., motions of sperm and the cytoskeleton in complex fluids), we introduce a new immersed boundary method for FSI problems involving non-Newtonian fluids in three dimensions. The non-Newtonian fluids are modelled by the FENE-P model (which includes the Oldroyd-B model as a special case) and numerically solved by a lattice Boltzmann scheme (the D3Q7 model). The fluid flow is modelled by the lattice Boltzmann equations and numerically solved by the D3Q19 model. The deformable structure and the fluid-structure interaction are handled by the immersed boundary method. As an application, we study an FSI toy problem: the interaction of an elastic plate (flapped at its leading edge and restricted nowhere else) with a non-Newtonian fluid in a 3D flow. This work was supported by NSF-DMS research Grant 1522554.
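The D3Q19 model named above refers to a standard lattice Boltzmann discretization: 19 discrete velocities per 3D lattice node with fixed weights. A minimal sketch of that velocity set and its weights (textbook values, not the authors' solver):

```python
import itertools

# D3Q19 velocity set: 1 rest vector, 6 axis (face) neighbors, 12 diagonal (edge) neighbors
velocities = [(0, 0, 0)]
velocities += [v for v in itertools.product((-1, 0, 1), repeat=3)
               if sum(abs(c) for c in v) == 1]   # 6 axis directions
velocities += [v for v in itertools.product((-1, 0, 1), repeat=3)
               if sum(abs(c) for c in v) == 2]   # 12 diagonal directions

# Standard D3Q19 lattice weights: 1/3 (rest), 1/18 (axis), 1/36 (diagonal)
weights = [1/3] + [1/18] * 6 + [1/36] * 12

assert len(velocities) == 19
assert abs(sum(weights) - 1.0) < 1e-12   # weights must sum to one
```

The D3Q7 lattice used for the FENE-P constitutive equations is the smaller analogue (rest vector plus the 6 axis neighbors only).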

  19. Overview of Human-Centric Space Situational Awareness (SSA) Science and Technology (S&T)

    NASA Astrophysics Data System (ADS)

    Ianni, J.; Aleva, D.; Ellis, S.

    2012-09-01

    A number of organizations within government, industry, and academia are researching ways to help humans understand and react to events in space. The problem is both helped and complicated by the fact that there are numerous data sources that need to be planned (i.e., tasked), collected, processed, analyzed, and disseminated. A large part of the research is in support of the Joint Space Operations Center (JSpOC), National Air and Space Intelligence Center (NASIC), and similar organizations. Much recent research has specifically targeted the JSpOC Mission System (JMS), which has provided a unifying software architecture. This paper will first outline areas of science and technology (S&T) related to human-centric space situational awareness (SSA) and space command and control (C2), including: 1. Object visualization - especially data fused from disparate sources, and satellite catalog visualizations that convey the physical relationships between space objects. 2. Data visualization - improved data trend analysis as in visual analytics and interactive visualization; e.g., satellite anomaly trends over time, space weather visualization, dynamic visualizations. 3. Workflow support - human-computer interfaces that encapsulate multiple computer services (i.e., algorithms, programs, applications) into a single workflow. 4. Command and control - e.g., tools that support course of action (COA) development and selection, tasking for satellites and sensors, etc. 5. Collaboration - improving an individual's or team's ability to work with others; e.g., video teleconferencing, shared virtual spaces, file sharing, virtual white-boards, chat, and knowledge search. 6. Hardware/facilities - e.g., optimal layouts for operations centers, ergonomic workstations, immersive displays, interaction technologies, and mobile computing. Secondly, we will provide a survey of organizations working in these areas and suggest where more attention may be needed.
Although no detailed master plan exists for human-centric SSA and C2, we see little redundancy among the groups supporting SSA human factors at this point.

  20. A Low-Cost and Lightweight 3D Interactive Real Estate-Purposed Indoor Virtual Reality Application

    NASA Astrophysics Data System (ADS)

    Ozacar, K.; Ortakci, Y.; Kahraman, I.; Durgut, R.; Karas, I. R.

    2017-11-01

    Interactive 3D architectural indoor design has become more popular as it has benefited from Virtual Reality (VR) technologies. VR brings computer-generated 3D content to real-life scale and enables users to observe and directly modify immersive indoor environments. This opportunity lets buyers purchase a property off-the-plan more cheaply through virtual models. Instead of showing the property through 2D plans or renders, the interior architecture of an unbuilt, on-sale property is visualized beforehand so that investors have an impression as if they were in the physical building. However, current applications either use highly resource-consuming software, are non-interactive, or require specialists to create such environments. In this study, we created a real estate-purposed, low-cost, high-quality, fully interactive VR application that provides a realistic interior architecture of the property using free and lightweight software: Sweet Home 3D and Unity. A preliminary study showed that participants generally liked the proposed real estate-purposed VR application and that it satisfied the expectations of property buyers.

  1. Image-guided surgery.

    PubMed

    Wagner, A; Ploder, O; Enislidis, G; Truppe, M; Ewers, R

    1996-04-01

    Interventional video tomography (IVT), a new imaging modality, achieves virtual visualization of anatomic structures in three dimensions for intraoperative stereotactic navigation. Partial immersion into a virtual data space, which is orthotopically coregistered to the surgical field, enhances, by means of a see-through head-mounted display (HMD), the surgeon's visual perception and technique by providing visual access to nonvisual data of anatomy, physiology, and function. The presented cases document the potential of augmented reality environments in maxillofacial surgery.

  2. Narrow band imaging combined with water immersion technique in the diagnosis of celiac disease.

    PubMed

    Valitutti, Francesco; Oliva, Salvatore; Iorfida, Donatella; Aloi, Marina; Gatti, Silvia; Trovato, Chiara Maria; Montuori, Monica; Tiberti, Antonio; Cucchiara, Salvatore; Di Nardo, Giovanni

    2014-12-01

    The "multiple-biopsy" approach both in the duodenum and bulb is the best strategy to confirm the diagnosis of celiac disease; however, this increases the invasiveness of the procedure itself and is time-consuming. To evaluate the diagnostic yield of a single biopsy guided by narrow-band imaging combined with water immersion technique in paediatric patients. Prospective assessment of the diagnostic accuracy of narrow-band imaging/water immersion technique-driven biopsy approach versus standard protocol in suspected celiac disease. The experimental approach correctly diagnosed 35/40 children with celiac disease, with an overall diagnostic sensitivity of 87.5% (95% CI: 77.3-97.7). An altered pattern of narrow-band imaging/water immersion technique endoscopic visualization was significantly associated with villous atrophy at guided biopsy (Spearman's rho 0.637, p<0.001). Concordance of narrow-band imaging/water immersion technique endoscopic assessments was high between two operators (K: 0.884). The experimental protocol was highly time-saving compared to the standard protocol. An altered narrow-band imaging/water immersion technique pattern coupled with high anti-transglutaminase antibodies could allow a single guided biopsy to diagnose celiac disease. When no altered mucosal pattern is visible even by narrow-band imaging/water immersion technique, multiple bulbar and duodenal biopsies should be obtained. Copyright © 2014. Published by Elsevier Ltd.
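The reported sensitivity and its 95% confidence interval can be reproduced with a normal-approximation (Wald) interval for a proportion; a sketch, not the authors' analysis code:

```python
import math

def sensitivity_ci(true_pos, total_pos, z=1.96):
    """Wald 95% CI for a proportion such as diagnostic sensitivity."""
    p = true_pos / total_pos
    se = math.sqrt(p * (1 - p) / total_pos)   # standard error of the proportion
    return p, p - z * se, p + z * se

# 35 of 40 celiac children correctly diagnosed
p, ci_lo, ci_hi = sensitivity_ci(35, 40)
print(f"{p:.1%} (95% CI: {ci_lo:.1%}-{ci_hi:.1%})")   # → 87.5% (95% CI: 77.3%-97.7%)
```

With small samples or proportions near 0 or 1, a Wilson or exact (Clopper-Pearson) interval is usually preferred over the Wald form, but here the Wald interval matches the figures in the abstract.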

  3. Ground reaction forces in shallow water running are affected by immersion level, running speed and gender.

    PubMed

    Haupenthal, Alessandro; Fontana, Heiliane de Brito; Ruschel, Caroline; dos Santos, Daniela Pacheco; Roesler, Helio

    2013-07-01

    To analyze the effect of depth of immersion, running speed and gender on ground reaction forces during water running. Controlled laboratory study. Twenty adults (ten male and ten female) participated by running at two levels of immersion (hip and chest) and two speed conditions (slow and fast). Data were collected using an underwater force platform. The following variables were analyzed: vertical force peak (Fy), loading rate (LR) and anterior force peak (Fx anterior). Three-factor mixed ANOVA was used to analyze data. Significant effects of immersion level, speed and gender on Fy were observed, without interaction between factors. Fy was greater when females ran fast at the hip level. There was a significant increase in LR with a reduction in the level of immersion regardless of speed and gender. No effect of speed or gender on LR was observed. Regarding Fx anterior, a significant interaction between speed and immersion level was found: in the slow condition, participants presented greater values at chest immersion, whereas, during the fast running condition, greater values were observed at hip level. The effect of gender was only significant during fast water running, with Fx anterior being greater in the male group. Increasing speed raised Fx anterior significantly irrespective of the level of immersion and gender. The magnitude of ground reaction forces during shallow water running is affected by immersion level, running speed and gender and, for this reason, these factors should be taken into account during exercise prescription. Copyright © 2012 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  4. Researching the Ethical Dimensions of Mobile, Ubiquitous and Immersive Technology Enhanced Learning (MUITEL): A Thematic Review and Dialogue

    ERIC Educational Resources Information Center

    Lally, Vic; Sharples, Mike; Tracy, Frances; Bertram, Neil; Masters, Sherriden

    2012-01-01

    In this article, we examine the ethical dimensions of researching the mobile, ubiquitous and immersive technology enhanced learning (MUITEL), with a particular focus on learning in informal settings. We begin with an analysis of the interactions between mobile, ubiquitous and immersive technologies and the wider context of the digital economy. In…

  5. Research and Construction of a Lunar Stereoscopic Visualization System Based on Chang'E Data

    NASA Astrophysics Data System (ADS)

    Gao, Xingye; Zeng, Xingguo; Zhang, Guihua; Zuo, Wei; Li, ChunLai

    2017-04-01

    With the lunar exploration activities carried out by the Chang'E-1, Chang'E-2 and Chang'E-3 lunar probes, a large amount of lunar data has been obtained, including topographical and image data covering the whole Moon, as well as panoramic image data of the spot close to the Chang'E-3 landing point. In this paper, we constructed an immersive virtual moon system based on the acquired lunar exploration data using advanced stereoscopic visualization technology, which will help scholars carry out research on lunar topography, assist further exploration in lunar science, and facilitate lunar science outreach to the public. We focus on building the lunar stereoscopic visualization system as a combination of software and hardware, using binocular stereoscopic display technology, a real-time rendering algorithm for massive terrain data, and panorama-based virtual scene construction, to achieve an immersive virtual tour of the whole Moon and of the local moonscape at the Chang'E-3 landing point.

  6. Analysis of brain activity and response during monoscopic and stereoscopic visualization

    NASA Astrophysics Data System (ADS)

    Calore, Enrico; Folgieri, Raffaella; Gadia, Davide; Marini, Daniele

    2012-03-01

    Stereoscopic visualization in cinematography and Virtual Reality (VR) creates an illusion of depth by means of two bidimensional images corresponding to different views of a scene. This perceptual trick is used to enhance the emotional response and the sense of presence and immersion of the observers. An interesting question is whether and how it is possible to measure and analyze the level of emotional involvement and attention of observers during a stereoscopic visualization of a movie or of a virtual environment. These research aims represent a challenge due to the large number of sensory, physiological and cognitive stimuli involved. In this paper we begin this research by analyzing possible differences in the brain activity of subjects during the viewing of monoscopic or stereoscopic contents. To this aim, we have performed some preliminary experiments collecting electroencephalographic (EEG) data from a group of users with a Brain-Computer Interface (BCI) during the viewing of stereoscopic and monoscopic short movies in a VR immersive installation.

  7. Scientific & Intelligence Exascale Visualization Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Money, James H.

    SIEVAS provides an immersive visualization framework for connecting multiple systems in real time for data science. SIEVAS provides the ability to connect multiple COTS and GOTS products in a seamless fashion for data fusion, data analysis, and viewing. It provides this capability by using a combination of microservices, real-time messaging, and a web-service-compliant back-end system.

  8. Postural Hypo-Reactivity in Autism Is Contingent on Development and Visual Environment: A Fully Immersive Virtual Reality Study

    ERIC Educational Resources Information Center

    Greffou, Selma; Bertone, Armando; Hahler, Eva-Maria; Hanssens, Jean-Marie; Mottron, Laurent; Faubert, Jocelyn

    2012-01-01

    Although atypical motor behaviors have been associated with autism, investigations regarding their possible origins are scarce. This study assessed the visual and vestibular components involved in atypical postural reactivity in autism. Postural reactivity and stability were measured for younger (12-15 years) and older (16-33 years) autistic…

  9. Virtual reality training improves balance function.

    PubMed

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-09-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function.

  10. Virtual reality training improves balance function

    PubMed Central

    Mao, Yurong; Chen, Peiming; Li, Le; Huang, Dongfeng

    2014-01-01

    Virtual reality is a new technology that simulates a three-dimensional virtual world on a computer and enables the generation of visual, audio, and haptic feedback for the full immersion of users. Users can interact with and observe objects in three-dimensional visual space without limitation. At present, virtual reality training has been widely used in rehabilitation therapy for balance dysfunction. This paper summarizes related articles and other articles suggesting that virtual reality training can improve balance dysfunction in patients after neurological diseases. When patients perform virtual reality training, the prefrontal, parietal cortical areas and other motor cortical networks are activated. These activations may be involved in the reconstruction of neurons in the cerebral cortex. Growing evidence from clinical studies reveals that virtual reality training improves the neurological function of patients with spinal cord injury, cerebral palsy and other neurological impairments. These findings suggest that virtual reality training can activate the cerebral cortex and improve the spatial orientation capacity of patients, thus facilitating the cortex to control balance and increase motion function. PMID:25368651

  11. Heats of immersion of titania powders in primer solutions

    NASA Technical Reports Server (NTRS)

    Siriwardane, R.; Wightman, J. P.

    1983-01-01

    The oxide layer present on titanium alloys can play an important role in determining the strength and durability of adhesive bonds. Here, three titania powders in different crystalline phases, rutile-R1, anatase-A1, and anatase-A2, are characterized by several techniques. These include microelectrophoresis, X-ray diffractometry, surface area pore volume analysis, X-ray photoelectron spectroscopy, and measurements of the heats of immersion. Of the three powders, R1 has the highest heat of immersion in water, while the interaction between water and A1 powder is low. Experimental data also suggest a specific preferential interaction of polyphenylquinoxaline with anatase.

  12. Medical Student Bias and Care Recommendations for an Obese versus Non-Obese Virtual Patient

    PubMed Central

    Persky, Susan; Eccleston, Collette P.

    2010-01-01

    Objective This study examined the independent effect of a patient's weight on medical students' attitudes, beliefs, and interpersonal behavior toward the patient, in addition to the clinical recommendations they make for her care. Design Seventy-six clinical-level medical students were randomly assigned to interact with a digital, virtual female patient who was visibly either obese or non-obese. Methods Interactions with the patient took place in an immersive virtual clinical environment (i.e., virtual reality) which allowed standardization of all patient behaviors and characteristics except for weight. Visual contact behavior was automatically recorded during the interaction. Afterward, participants filled out a battery of self-report questionnaires. Results Analyses revealed more negative stereotyping, less anticipated patient adherence, worse perceived health, more responsibility attributed for potentially weight-related presenting complaints, and less visual contact directed toward the obese version of a virtual patient than the non-obese version of the patient. In contrast, there was no clear evidence of bias in clinical recommendations made for the patient's care. Conclusion Biases in attitudes, beliefs, and interpersonal behavior have important implications because they can influence the tone of clinical encounters and rapport in the patient-provider relationship, which can have important downstream consequences. Gaining a clear understanding of the nature and source of weight bias in the clinical encounter is an important first step toward development of strategies to address it. PMID:20820169

  13. Sculpting 3D worlds with music: advanced texturing techniques

    NASA Astrophysics Data System (ADS)

    Greuel, Christian; Bolas, Mark T.; Bolas, Niko; McDowall, Ian E.

    1996-04-01

    Sound within the virtual environment is often considered to be secondary to the graphics. In a typical scenario, either audio cues are locally associated with specific 3D objects or a general aural ambiance is supplied in order to alleviate the sterility of an artificial experience. This paper discusses a completely different approach, in which cues are extracted from live or recorded music in order to create geometry and control object behaviors within a computer-generated environment. Advanced texturing techniques used to generate complex stereoscopic images are also discussed. By analyzing music for standard audio characteristics such as rhythm and frequency, information is extracted and repackaged for processing. With the Soundsculpt Toolkit, this data is mapped onto individual objects within the virtual environment, along with one or more predetermined behaviors. Mapping decisions are implemented with a user-definable schedule and are based on the aesthetic requirements of directors and designers. This provides for visually active, immersive environments in which virtual objects behave in real-time correlation with the music. The resulting music-driven virtual reality opens up several possibilities for new types of artistic and entertainment experiences, such as fully immersive 3D "music videos" and interactive landscapes for live performance.
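The analysis step described above, extracting a frequency characteristic from an audio frame and repackaging it as an object parameter, can be sketched as follows. This is a minimal illustration, not the Soundsculpt Toolkit's actual pipeline; the mapping function and its range are hypothetical.

```python
import math

def dominant_frequency(samples, sample_rate):
    """Return the strongest frequency component of an audio frame (naive DFT)."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):   # skip DC; positive frequencies only
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = re * re + im * im
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * sample_rate / n

def map_to_scale(freq, lo=20.0, hi=2000.0):
    """Map a frequency into a [0.5, 2.0] object-scale factor (hypothetical mapping)."""
    t = min(max((freq - lo) / (hi - lo), 0.0), 1.0)
    return 0.5 + 1.5 * t

# One frame of a 440 Hz test tone
sr, n = 8000, 256
frame = [math.sin(2 * math.pi * 440.0 * i / sr) for i in range(n)]
f = dominant_frequency(frame, sr)   # near 440 Hz (bin resolution is sr/n = 31.25 Hz)
scale = map_to_scale(f)             # drives the size of some virtual object
```

A production system would use an FFT and run per-frame in real time; the O(n²) DFT here just keeps the sketch dependency-free.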

  14. A succinct overview of virtual reality technology use in Alzheimer's disease.

    PubMed

    García-Betances, Rebeca I; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer's disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers' education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments.

  15. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enter the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques are being adopted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. 
Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them to better understand the nature of complex, multiscale geoscience data.

  16. Effect of visual distortion on postural balance in a full immersion stereoscopic environment

    NASA Astrophysics Data System (ADS)

    Faubert, Jocelyn; Allard, Remy

    2004-05-01

    This study attempted to determine the influence of non-linear visual movements on our capacity to maintain postural control. An 8x8x8 foot CAVE immersive virtual environment was used. Body sway recordings were obtained for both head and lower back (lumbar 2-3) positions. The subjects were presented with visual stimuli for periods of 62.5 seconds. Subjects were asked to stand still on one foot while viewing stimuli consisting of multiplied sine waves generating movement undulation of a textured surface (waves moving in a checkerboard pattern). Three wave amplitudes were tested: 4 feet, 2 feet, and 1 foot. Two viewing conditions were also used: observers looking 36 inches in front of their feet, and observers looking at a distance near the horizon. The results were compiled using an instability index, and the data showed a profound and consistent effect of visual disturbances on postural balance, in particular for the x (side-to-side) movement. We have demonstrated that non-linear visual distortions, similar to those generated by progressive ophthalmic lenses of the kind used for presbyopia corrections, can generate significant postural instability. This instability is particularly evident for side-to-side body movement and is most evident in the near viewing condition.

  17. VRML metabolic network visualizer.

    PubMed

    Rojdestvenski, Igor

    2003-03-01

    A successful data collection visualization should satisfy many requirements: unification of diverse data formats, support for serendipity research, support of hierarchical structures, algorithmizability, vast information density, Internet-readiness, and others. Recently, virtual reality has made significant progress in engineering, architectural design, entertainment and communication. We experiment with the possibility of using immersive abstract three-dimensional visualizations of metabolic networks. We present the trial Metabolic Network Visualizer software, which produces a graphical representation of a metabolic network as a VRML world from a formal description written in a simple SGML-type scripting language.
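Generating a VRML world from a formal network description amounts to emitting text in VRML97 syntax. A minimal sketch of that idea, where the node types (Transform, Shape, Sphere) are standard VRML97 but the `metabolite_node` helper and its mapping from network entities to geometry are hypothetical, not the Visualizer's actual code:

```python
def metabolite_node(name, position, radius=0.5):
    """Emit a VRML97 Transform containing a sphere for one metabolite."""
    x, y, z = position
    return (f"Transform {{\n"
            f"  translation {x} {y} {z}\n"
            f"  children Shape {{\n"
            f"    appearance Appearance {{ material Material {{ }} }}\n"
            f"    geometry Sphere {{ radius {radius} }}\n"
            f"  }}\n"
            f"}} # {name}\n")

# A VRML97 file starts with this exact header line
world = "#VRML V2.0 utf8\n" + metabolite_node("pyruvate", (0, 1, 2))
print(world)
```

A real generator would also emit Cylinder or IndexedLineSet nodes for the reactions connecting metabolites, and Anchor or TouchSensor nodes for interactivity.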

  18. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  19. Eye height scaling of absolute size in immersive and nonimmersive displays

    NASA Technical Reports Server (NTRS)

    Dixon, M. W.; Wraga, M.; Proffitt, D. R.; Williams, G. C.; Kaiser, M. K. (Principal Investigator)

    2000-01-01

    Eye-height (EH) scaling of absolute height was investigated in three experiments. In Experiment 1, standing observers viewed cubes in an immersive virtual environment. Observers' center of projection was placed at actual EH and at 0.7 times actual EH. Observers' size judgments revealed that the EH manipulation was 76.8% effective. In Experiment 2, seated observers viewed the same cubes on an interactive desktop display; however, no effect of EH was found in response to the simulated EH manipulation. Experiment 3 tested standing observers in the immersive environment with the field of view reduced to match that of the desktop. Comparable to Experiment 1, the effect of EH was 77%. These results suggest that EH scaling is not generally used when people view an interactive desktop display because the altitude of the center of projection is indeterminate. EH scaling is spontaneously evoked, however, in immersive environments.

  20. A novel shape-changing haptic table-top display

    NASA Astrophysics Data System (ADS)

    Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi

    2018-01-01

    A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices are developed as accurate displays with regulated haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI) consisting of an image processing unit and a transformation unit is proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit is capable of extracting texture data from 2D pictures for rendering the shape-changing surface and realizing 3D modeling. A preliminary evaluation proved the feasibility of the proposed system.

  1. Stability and activity of lactate dehydrogenase on biofunctional layers deposited by activated vapor silanization (AVS) and immersion silanization (IS)

    NASA Astrophysics Data System (ADS)

    Calvo, Jorge Nieto-Márquez; Elices, Manuel; Guinea, Gustavo V.; Pérez-Rigueiro, José; Arroyo-Hernández, María

    2017-09-01

    The interaction between surfaces and biological elements, in particular, proteins is critical for the performance of biomaterials and biosensors. This interaction can be controlled by modifying the surface in a process known as biofunctionalization. In this work, the enzyme lactate dehydrogenase (LDH) is used to study the stability of the interaction between a functional protein and amine-functionalized surfaces. Two different functionalization procedures were compared: Activated Vapor Silanization (AVS) and Immersion Silanization (IS). Adsorption kinetics is shown to follow the Langmuir model for AVS-functionalized samples, while IS-functionalized samples show a certain instability if immersed in an aqueous medium for several hours. In turn, the enzymatic activity of LDH is preserved for longer times by using glutaraldehyde as crosslinker between the AVS biofunctional surface and the enzyme.
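The Langmuir adsorption kinetics mentioned above have a closed form: coverage relaxes exponentially toward the Langmuir equilibrium value. A minimal sketch with illustrative rate constants (the actual fitted parameters for LDH on AVS surfaces are not given here):

```python
import math

def langmuir_coverage(t, c, k_a=1.0, k_d=0.1):
    """Fractional surface coverage θ(t) under Langmuir adsorption kinetics.

    Solves dθ/dt = k_a·c·(1-θ) - k_d·θ from a bare surface (θ(0) = 0):
        θ(t) = θ_eq · (1 - exp(-(k_a·c + k_d)·t)),  θ_eq = k_a·c / (k_a·c + k_d)
    Rate constants k_a, k_d and concentration c are illustrative values.
    """
    k_obs = k_a * c + k_d
    theta_eq = k_a * c / k_obs
    return theta_eq * (1.0 - math.exp(-k_obs * t))

# Coverage approaches the Langmuir equilibrium value at long times:
# for c = 0.5, θ_eq = 0.5 / 0.6 ≈ 0.833
coverage_late = langmuir_coverage(100.0, 0.5)
```

Fitting measured adsorption-versus-time data to this form (e.g. by nonlinear least squares) yields k_a and k_d, and a poor fit, as for the IS samples here, signals that the simple Langmuir picture does not hold.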

  2. [Water immersion as an anti-g protection for pilot. Pro et contra].

    PubMed

    Barer, A S

    2007-01-01

    In the period of 1988-1990, the ZVEZDA Aerospace Medicine Department carried out comprehensive physiological investigations to assess the prospects of water immersion as anti-g protection for pilots of high-maneuver aircraft. Both dry and open water immersion methods were used. More than 150 centrifuge runs were conducted to define limits on the acceleration value and duration of 9-g tolerance. Volunteer subjects in the pilot's posture were inclined at 35 and 55 degrees to the total inertial force vector. The obvious subjective discomfort felt during acceleration together with the absence of clinical aftereffects was qualified as a positive outcome. The subjects were ready for repeated runs even after a very brief repose. The main impediment to professional anti-g use of immersion is visual disorders, which in this case are not predictors of imminent loss of consciousness and are attributed to alterations in regional hemodynamics. The authors assert that there is good reason to continue the search for implementation of the immersion principle in g-protection of pilots, to reduce the rate of professional pathologies and to intensify flights.

  3. Substantive hemodynamic and thermal strain upon completing lower-limb hot-water immersion; comparisons with treadmill running.

    PubMed

    Thomas, Kate N; van Rij, André M; Lucas, Samuel J E; Gray, Andrew R; Cotter, James D

    2016-01-01

    Exercise induces arterial flow patterns that promote functional and structural adaptations, improving functional capacity and reducing cardiovascular risk. While heat is produced by exercise, local and whole-body passive heating have recently been shown to generate favorable flow profiles and associated vascular adaptations in the upper limb. Flow responses to acute heating in the lower limbs have not yet been assessed, or directly compared to exercise, and other cardiovascular effects of lower-limb heating have not been fully characterized. Lower-limb heating by hot-water immersion (30 min at 42°C, to the waist) was compared to matched-duration treadmill running (65-75% age-predicted heart rate maximum) in 10 healthy, young adult volunteers. Superficial femoral artery shear rate assessed immediately upon completion was increased to a greater extent following immersion (mean ± SD: immersion +252 ± 137% vs. exercise +155 ± 69%, interaction: p = 0.032), while superficial femoral artery flow-mediated dilation was unchanged in either intervention. Immersion increased heart rate to a lower peak than during exercise (immersion +38 ± 3 beats·min⁻¹ vs. exercise +87 ± 3 beats·min⁻¹, interaction: p < 0.001), whereas only immersion reduced mean arterial pressure after exposure (-8 ± 3 mmHg, p = 0.012). Core temperature increased twice as much during immersion as exercise (+1.3 ± 0.4°C vs. +0.6 ± 0.4°C, p < 0.001). These data indicate that acute lower-limb hot-water immersion has potential to induce favorable shear stress patterns and cardiovascular responses within vessels prone to atherosclerosis. Whether repetition of lower-limb heating has long-term beneficial effects in such vasculature remains unexplored.

  4. Effects of virtual reality immersion and audiovisual distraction techniques for patients with pruritus

    PubMed Central

    Leibovici, Vera; Magora, Florella; Cohen, Sarale; Ingber, Arieh

    2009-01-01

    BACKGROUND: Virtual reality immersion (VRI), an advanced computer-generated technique, decreased subjective reports of pain in experimental and procedural medical therapies. Furthermore, VRI significantly reduced pain-related brain activity as measured by functional magnetic resonance imaging. Resemblance between anatomical and neuroendocrine pathways of pain and pruritus may prove VRI to be a suitable adjunct for basic and clinical studies of the complex aspects of pruritus. OBJECTIVES: To compare effects of VRI with audiovisual distraction (AVD) techniques for attenuation of pruritus in patients with atopic dermatitis and psoriasis vulgaris. METHODS: Twenty-four patients suffering from chronic pruritus – 16 due to atopic dermatitis and eight due to psoriasis vulgaris – were randomly assigned to play an interactive computer game using a special visor or a computer screen. Pruritus intensity was self-rated before, during and 10 min after exposure using a visual analogue scale ranging from 0 to 10. The interviewer rated observed scratching on a three-point scale during each distraction program. RESULTS: Student’s t tests were significant for reduction of pruritus intensity before and during VRI and AVD (P=0.0002 and P=0.01, respectively) and were significant only between ratings before and after VRI (P=0.017). Scratching was mostly absent or mild during both programs. CONCLUSIONS: VRI and AVD techniques demonstrated the ability to diminish itching sensations temporarily. Further studies on the immediate and late effects of interactive computer distraction techniques to interrupt itching episodes will open potential paths for future pruritus research. PMID:19714267

  5. Habituation of the cold shock response may include a significant perceptual component.

    PubMed

    Barwood, Martin J; Corbett, Jo; Wagstaff, Christopher R D

    2014-02-01

    Accidental immersion in cold water is a risk factor in many occupations. Habituation to cold-water immersion (CWI) is one practical means of reducing the cold shock response (CSR) on immersion. We investigated whether repeated thermoneutral water immersion (TWI) induced a perceptual habituation (i.e., could lessen perceived threat and anxiety) and consequently reduce the CSR on subsequent CWI. Twelve subjects completed seven 7-min head-out immersions. Immersions one and seven were CWIs [15.0 (0.1) degrees C], and immersions two to six were TWIs [34.9 (0.10) degrees C]. Anxiety (20-cm visual analogue scale) and the cardiorespiratory responses [heart rate (f(C)), respiratory frequency (f(R)), tidal volume (V(T)), and minute ventilation (V(E))] to immersion were measured throughout. Data were compared within subject between conditions using ANOVA at an alpha level of 0.05. Acute anxiety was significantly reduced after repeated exposure to the immersion scenario (i.e., TWI): CWI-1: 6.3 (4.4) cm; CWI-2: 4.5 (4.0) cm [condition mean (SD)]. These differences did not influence the peak of the CSR. The f(C), f(R), and V(E) responses were similar between CWI-1 and CWI-2. The V(T) response was significantly lower in CWI-2; mean (SD) across the immersion: CWI-1 1.27 (0.17) vs. CWI-2 1.11 (0.21) L. Repeated TWI lessened the anxiety associated with CWI (perceptual habituation). This had a negligible effect on the primary components of the CSR, but did lower V(T), which may reduce the volume of any aspirated water in an emergency situation. Reducing the threat appraisal of an environmental stressor may be a useful byproduct of survival training, thereby minimizing psychophysiological strain.

  6. Why Does the Buddha Laugh? Exploring Ethnic Visual Culture

    ERIC Educational Resources Information Center

    Shin, Ryan

    2010-01-01

    As an art educator and a native Korean immersed in Asian culture until 30 years of age, and one who has gained some insights into the two cultures of East Asia and America, the author is constantly thinking of what students will learn from embracing Asian visuals and objects in art curriculum. He asks if their history, identity, form and function,…

  7. Coercive Narratives, Motivation and Role Playing in Virtual Worlds

    DTIC Science & Technology

    2002-01-01

    resource for making immersive virtual environments highly engaging. Interaction also appeals to our natural desire to discover. Reading a book contains...participation in an open-ended Virtual Environment (VE). I intend to take advantage of a participant's natural tendency to prefer interaction when possible...I hope this work will expand the potential of experience within virtual worlds. Keywords: Immersive Environments, Virtual Environments

  8. Immersive 3D Visualization of Astronomical Data

    NASA Astrophysics Data System (ADS)

    Schaaff, A.; Berthier, J.; Da Rocha, J.; Deparis, N.; Derriere, S.; Gaultier, P.; Houpin, R.; Normand, J.; Ocvirk, P.

    2015-09-01

    The immersive-3D visualization, or Virtual Reality in our study, was previously dedicated to specific uses (research, flight simulators, etc.), and the investment in infrastructure and its cost were reserved to large laboratories or companies. Lately we have seen the development of immersive-3D headsets intended for wide distribution, for example the Oculus Rift and the Sony Morpheus projects. The usual reaction is to say that these tools are primarily intended for games, since it is easy to imagine a player in a virtual environment and the added value over conventional 2D screens. Yet it is likely that there are many applications in the professional field if these tools become common. Introducing this technology into existing applications or new developments makes sense only if the interest is properly evaluated. The use in astronomy is clear for education: it is easy to imagine mobile, lightweight planetariums, or reproducing poorly accessible environments (e.g., large instruments). In contrast, in professional astronomy the use is probably less obvious, and studies are required to determine the most appropriate applications and to assess the contributions compared to other display modes.

  9. Computed microtomography visualization and quantification of mouse ischemic brain lesion by nonionic radio contrast agents

    PubMed Central

    Dobrivojević, Marina; Bohaček, Ivan; Erjavec, Igor; Gorup, Dunja; Gajović, Srećko

    2013-01-01

    Aim To explore the possibility of brain imaging by microcomputed tomography (microCT) using x-ray contrasting methods to visualize mouse brain ischemic lesions after middle cerebral artery occlusion (MCAO). Methods Isolated brains were immersed in ionic or nonionic radio contrast agent (RCA) for 5 days and subsequently scanned using microCT scanner. To verify whether ex-vivo microCT brain images can be used to characterize ischemic lesions, they were compared to Nissl stained serial histological sections of the same brains. To verify if brains immersed in RCA may be used afterwards for other methods, subsequent immunofluorescent labeling with anti-NeuN was performed. Results Nonionic RCA showed better gray to white matter contrast in the brain, and therefore was selected for further studies. MicroCT measurement of ischemic lesion size and cerebral edema significantly correlated with the values determined by Nissl staining (ischemic lesion size: P=0.0005; cerebral edema: P=0.0002). Brain immersion in nonionic RCA did not affect subsequent immunofluorescent analysis and NeuN immunoreactivity. Conclusion MicroCT method was proven to be suitable for delineation of the ischemic lesion from the non-infarcted tissue, and quantification of lesion volume and cerebral edema. PMID:23444240

  11. Software for math and science education for the deaf.

    PubMed

    Adamo-Villani, Nicoletta; Wilbur, Ronnie

    2010-01-01

    In this article, we describe the development of two novel approaches to teaching math and science concepts to deaf children using 3D animated interactive software. One approach, Mathsigner, is non-immersive and the other, SMILE, is a virtual reality immersive environment. The content is curriculum-based, and the animated signing characters are constructed with state-of-the-art technology and design. We report preliminary development findings of usability and appeal based on programme features (e.g. 2D/3D, immersiveness, interaction type, avatar and interface design) and subject features (hearing status, gender and age). Programme features of 2D/3D, immersiveness and interaction type were very much affected by subject features. Among subject features, we find significant effects of hearing status (deaf children take longer and make more mistakes than hearing children) and gender (girls take longer than boys; girls prefer immersive environments over desktop presentation; girls are more interested in content than technology compared to boys). For avatar type, we found a preference for seamless, deformable characters over segmented ones. For interface comparisons, there were no subject effects, but an animated interface resulted in reduced time to task completion compared to static interfaces with and without sound and highlighting. These findings identify numerous features that affect software design and appeal and suggest that designers must be careful in their assumptions during programme development.

  12. Tactile Radar: experimenting a computer game with visually disabled.

    PubMed

    Kastrup, Virgínia; Cassinelli, Alvaro; Quérette, Paulo; Bergstrom, Niklas; Sampaio, Eliana

    2017-09-18

    Visually disabled people increasingly use computers in everyday life, thanks to novel assistive technologies better tailored to their cognitive functioning. Like sighted people, many are interested in computer games - videogames and audio-games. Tactile-games are beginning to emerge. The Tactile Radar is a device through which a visually disabled person is able to detect distal obstacles. In this study, it was connected to a computer running a tactile-game. The game consists in finding and collecting randomly arranged coins in a virtual room. The study was conducted with nine congenitally blind people of both sexes, aged 20-64 years. Complementary first- and third-person methods were used: the debriefing interview and a quasi-experimental design. The results indicate that the Tactile Radar is suitable for the creation of computer games specifically tailored for visually disabled people. Furthermore, the device seems capable of eliciting a powerful immersive experience. Methodologically, this research contributes to the consolidation and development of complementary first- and third-person methods, which are particularly useful in research with disabled people, including the users' evaluation of the Tactile Radar's effectiveness in a virtual reality context. Implications for rehabilitation: Despite the growing interest in virtual games for visually disabled people, they still find barriers to accessing such games. Through the development of assistive technologies such as the Tactile Radar, applied in virtual games, we can create new opportunities for leisure, socialization and education for visually disabled people. The results of our study indicate that the Tactile Radar is well suited to the creation of video games for visually disabled people, providing playful interaction for the players.

  13. New Visualization Techniques to Analyze Ultra-High Resolution Four-dimensional Surface Deformation Imagery Collected With Ground-based Tripod LiDAR

    NASA Astrophysics Data System (ADS)

    Kreylos, O.; Bawden, G. W.; Kellogg, L. H.

    2005-12-01

    We are developing a visualization application to display and interact with very large (tens of millions of points) four-dimensional point position datasets in an immersive environment, such that point groups from repeated Tripod LiDAR (Light Detection And Ranging) surveys can be selected, measured, and analyzed for land surface change using 3D interactions. Ground-based tripod or terrestrial LiDAR (T-LiDAR) can remotely collect ultra-high resolution (centimeter to subcentimeter) and accurate (± 4 mm) digital imagery of the scanned target; at scanning rates of 2,000 (x, y, z, i) (3D position + intensity) points per second, over 7 million points can be collected for a given target in an hour. We developed a multiresolution point set data representation based on octrees to display large T-LiDAR point cloud datasets at the frame rates required for immersive display (between 60 Hz and 120 Hz). Data inside an observer's region of interest is shown in full detail, whereas data outside the field of view or far away from the observer is shown at reduced resolution to provide context. Using 3D input devices at the University of California Davis KeckCAVES, users can navigate large point sets, accurately select related point groups in two or more point sets by sweeping regions of space, and guide the software in deriving positional information from point groups to compute their displacements between surveys. We used this new software application in the KeckCAVES to analyze 4D T-LiDAR imagery from the June 1, 2005 Blue Bird Canyon landslide in Laguna Beach, southern California. Over 50 million (x, y, z, i) data points were collected between 10 and 21 days after the landslide to evaluate T-LiDAR as a natural hazards response tool. The visualization of the T-LiDAR scans within the immediate landslide area showed minor readjustments in the weeks following the primary landslide, with no observable continued motion on the primary landslide. Recovery and demolition efforts across the landslide, such as the building of new roads and removal of unstable structures, are easily identified and assessed with the new software through the differencing of aligned imagery.
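    The octree-based multiresolution scheme described above (full detail near the observer, coarse subsamples for distant data) can be sketched roughly as follows. This is a minimal illustration of the general technique, not the KeckCAVES implementation; the node capacity, distance heuristic, and test data are assumed values.

```python
import numpy as np

class OctreeNode:
    """One cube of an octree over a point cloud; keeps a small representative
    subsample so distant nodes can be drawn without visiting their children."""
    def __init__(self, points, center, half, max_points=64, depth=0, max_depth=8):
        self.center, self.half = center, half
        self.sample = points[:max_points]            # coarse stand-in for this cell
        self.children = []
        if len(points) > max_points and depth < max_depth:
            bits = (points >= center).astype(int)    # side of each split plane
            codes = bits[:, 0] + 2 * bits[:, 1] + 4 * bits[:, 2]
            for octant in range(8):
                mask = codes == octant
                if mask.any():
                    offs = np.array([octant & 1, (octant >> 1) & 1,
                                     (octant >> 2) & 1]) - 0.5
                    self.children.append(OctreeNode(points[mask],
                                                    center + offs * half,
                                                    half / 2, max_points,
                                                    depth + 1, max_depth))

def collect_visible(node, eye, detail=4.0, out=None):
    """Gather points for one frame: recurse while a node is large relative to
    its distance from the eye, otherwise emit its coarse sample."""
    if out is None:
        out = []
    dist = np.linalg.norm(node.center - eye)
    if node.children and node.half * detail > dist:
        for child in node.children:
            collect_visible(child, eye, detail, out)
    else:
        out.append(node.sample)
    return out

# Hypothetical data: 500 random points in the unit cube (not real LiDAR).
pts = np.random.RandomState(0).rand(500, 3)
root = OctreeNode(pts, center=np.array([0.5, 0.5, 0.5]), half=0.5)
coarse = np.concatenate(collect_visible(root, eye=np.array([100.0, 100.0, 100.0])))
fine = np.concatenate(collect_visible(root, eye=np.array([0.5, 0.5, 0.5]), detail=1e6))
```

    A distant eye position yields only the root's subsample, while a nearby one descends to the leaves, which together partition the full point set; this is how frame rate stays bounded regardless of dataset size.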

  14. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

    Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.

  15. Direct manipulation of virtual objects

    NASA Astrophysics Data System (ADS)

    Nguyen, Long K.

    Interacting with a Virtual Environment (VE) generally requires the user to correctly perceive the relative position and orientation of virtual objects. For applications requiring interaction in personal space, the user may also need to accurately judge the position of the virtual object relative to that of a real object, for example, a virtual button and the user's real hand. This is difficult since VEs generally only provide a subset of the cues experienced in the real world. Complicating matters further, VEs presented by currently available visual displays may be inaccurate or distorted due to technological limitations. Fundamental physiological and psychological aspects of vision as they pertain to the task of object manipulation were thoroughly reviewed. Other sensory modalities -- proprioception, haptics, and audition -- and their cross-interactions with each other and with vision are briefly discussed. Visual display technologies, the primary component of any VE, were canvassed and compared. Current applications and research were gathered and categorized by different VE types and object interaction techniques. While object interaction research abounds in the literature, pockets of research gaps remain. Direct, dexterous, manual interaction with virtual objects in Mixed Reality (MR), where the real, seen hand accurately and effectively interacts with virtual objects, has not yet been fully quantified. An experimental test bed was designed to provide the highest accuracy attainable for salient visual cues in personal space. Optical alignment and user calibration were carefully performed. The test bed accommodated the full continuum of VE types and sensory modalities for comprehensive comparison studies. Experimental designs included two sets, each measuring depth perception and object interaction. The first set addressed the extreme end points of the Reality-Virtuality (R-V) continuum -- Immersive Virtual Environment (IVE) and Reality Environment (RE). 
This validated, linked, and extended several previous research findings, using one common test bed and participant pool. The results provided a proven method and solid reference points for further research. The second set of experiments leveraged the first to explore the full R-V spectrum and included additional, relevant sensory modalities. It consisted of two full-factorial experiments providing for rich data and key insights into the effect of each type of environment and each modality on accuracy and timeliness of virtual object interaction. The empirical results clearly showed that mean depth perception error in personal space was less than four millimeters whether the stimuli presented were real, virtual, or mixed. Likewise, mean error for the simple task of pushing a button was less than four millimeters whether the button was real or virtual. Mean task completion time was less than one second. Key to the high accuracy and quick task performance time observed was the correct presentation of the visual cues, including occlusion, stereoscopy, accommodation, and convergence. With performance results already near optimal level with accurate visual cues presented, adding proprioception, audio, and haptic cues did not significantly improve performance. Recommendations for future research include enhancement of the visual display and further experiments with more complex tasks and additional control variables.

  16. User Directed Tools for Exploiting Expert Knowledge in an Immersive Segmentation and Visualization Environment

    NASA Technical Reports Server (NTRS)

    Senger, Steven O.

    1998-01-01

    Volumetric data sets have become common in medicine and many sciences through technologies such as computed x-ray tomography (CT), magnetic resonance (MR), positron emission tomography (PET), confocal microscopy and 3D ultrasound. When presented with 2D images humans immediately and unconsciously begin a visual analysis of the scene. The viewer surveys the scene, identifying significant landmarks and building an internal mental model of the presented information. The identification of features is strongly influenced by the viewer's expectations based upon their expert knowledge of what the image should contain. While not a conscious activity, the viewer makes a series of choices about how to interpret the scene. These choices occur in parallel with viewing the scene and effectively change the way the viewer sees the image. It is this interaction of viewing and choice which is the basis of many familiar visual illusions. This is especially important in the interpretation of medical images, where it is the expert knowledge of the radiologist that interprets the image. For 3D data sets this interaction of view and choice is frustrated because choices must precede the visualization of the data set. It is not possible to visualize the data set without making some initial choices that determine how the volume of data is presented to the eye. These choices include viewpoint orientation, region identification, and color and opacity assignments. Further compounding the problem is the fact that these visualization choices are defined in the terms of computer graphics rather than the language of the expert's knowledge. The long term goal of this project is to develop an environment where the user can interact with volumetric data sets using tools which promote the utilization of expert knowledge by incorporating visualization and choice into a tight computational loop. The tools will support activities involving the segmentation of structures, construction of surface meshes, and local filtering of the data set. To conform to this environment, tools should have several key attributes. First, they should rely only on computations over a local neighborhood of the probe position. Second, they should operate iteratively over time, converging towards a limit behavior. Third, they should adapt to user input, modifying their operational parameters over time.
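    The three attributes listed above (local computation around the probe, iterative convergence, adaptation over time) can be illustrated with a minimal probe-seeded region-growing step that does a bounded amount of work per call. This is a generic sketch under assumed parameters, not the project's actual toolset; the volume, seed, and tolerance are hypothetical.

```python
import numpy as np
from collections import deque

def region_grow_step(volume, segmented, frontier, tol):
    """One incremental step of probe-seeded region growing: examine a single
    frontier voxel and absorb 6-connected neighbors whose intensity lies
    within `tol` of it. Each call touches only a local neighborhood, and
    repeated calls converge: the frontier eventually empties."""
    if not frontier:
        return frontier                       # limit behavior reached
    x, y, z = frontier.popleft()
    for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                       (0, -1, 0), (0, 0, 1), (0, 0, -1)):
        n = (x + dx, y + dy, z + dz)
        if (all(0 <= n[i] < volume.shape[i] for i in range(3))
                and not segmented[n]
                and abs(float(volume[n]) - float(volume[x, y, z])) <= tol):
            segmented[n] = True               # absorb; each voxel enters once
            frontier.append(n)
    return frontier

# Hypothetical synthetic volume: two homogeneous regions split at x = 4.
vol = np.zeros((8, 8, 8))
vol[4:] = 100.0
seg = np.zeros(vol.shape, dtype=bool)
seg[0, 0, 0] = True                           # probe / seed position
frontier = deque([(0, 0, 0)])
while frontier:                               # one step per display frame in the tool
    frontier = region_grow_step(vol, seg, frontier, tol=1.0)
```

    In an immersive setting the tolerance `tol` is the kind of parameter that could adapt to user input while the loop runs, satisfying the third attribute.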

  17. Can Simulator Immersion Change Cognitive Style? Results from a Cross-Sectional Study of Field-Dependence--Independence in Air Traffic Control Students

    ERIC Educational Resources Information Center

    Van Eck, Richard N.; Fu, Hongxia; Drechsel, Paul V. J.

    2015-01-01

    Air traffic control (ATC) operations are critical to the U.S. aviation infrastructure, making ATC training a critical area of study. Because ATC performance is heavily dependent on visual processing, it is important to understand how to screen for or promote relevant visual processing abilities. While conventional wisdom has maintained that such…

  18. A Succinct Overview of Virtual Reality Technology Use in Alzheimer’s Disease

    PubMed Central

    García-Betances, Rebeca I.; Arredondo Waldmeyer, María Teresa; Fico, Giuseppe; Cabrera-Umpiérrez, María Fernanda

    2015-01-01

    We provide a brief review and appraisal of recent and current virtual reality (VR) technology for Alzheimer’s disease (AD) applications. We categorize them according to their intended purpose (e.g., diagnosis, patient cognitive training, caregivers’ education, etc.), focus feature (e.g., spatial impairment, memory deficit, etc.), methodology employed (e.g., tasks, games, etc.), immersion level, and passive or active interaction. Critical assessment indicates that most of them do not yet take full advantage of virtual environments with high levels of immersion and interaction. Many still rely on conventional 2D graphic displays to create non-immersive or semi-immersive VR scenarios. Important improvements are needed to make VR a better and more versatile assessment and training tool for AD. The use of the latest display technologies available, such as emerging head-mounted displays and 3D smart TV technologies, together with realistic multi-sensorial interaction devices, and neuro-physiological feedback capacity, are some of the most beneficial improvements this mini-review suggests. Additionally, it would be desirable that such VR applications for AD be easily and affordably transferable to in-home and nursing home environments. PMID:26029101

  19. Subjective experiences of watching stereoscopic Avatar and U2 3D in a cinema

    NASA Astrophysics Data System (ADS)

    Pölönen, Monika; Salmimaa, Marja; Takatalo, Jari; Häkkinen, Jukka

    2012-01-01

    A stereoscopic 3-D version of the film Avatar was shown to 85 people, who subsequently answered questions related to sickness, visual strain, stereoscopic image quality, and sense of presence. Viewing Avatar for 165 min induced some symptoms of visual strain and sickness, but symptom levels remained low. A comparison between Avatar and previously published results for the film U2 3D showed that sickness and visual strain levels were similar despite the difference in the films' runtimes. The genre of the film had a significant effect on the viewers' opinions and sense of presence. Avatar, which has been described as a combination of the action, adventure, and sci-fi genres, was experienced as more immersive and engaging than the music documentary U2 3D. However, participants in both studies were immersed, focused, and absorbed in watching the stereoscopic 3-D (S3-D) film and were pleased with the film environments. The results also showed that previous stereoscopic 3-D experience significantly reduced the amount of reported eye strain and complaints about the weight of the viewing glasses.

  20. Visible Geology - Interactive online geologic block modelling

    NASA Astrophysics Data System (ADS)

    Cockett, R.

    2012-12-01

    Geology is a highly visual science, and many disciplines require spatial awareness and manipulation. For example, interpreting cross-sections, geologic maps, or plotting data on a stereonet all require various levels of spatial abilities. These skills are often not focused on in undergraduate geoscience curricula, and many students struggle with spatial relations, manipulations, and penetrative abilities (e.g. Titus & Horsman, 2009). A newly developed program, Visible Geology, allows students to be introduced to many geologic concepts and spatial skills in a virtual environment. Visible Geology is a web-based, three-dimensional environment where students can create and interrogate their own geologic block models. The program begins with a blank model; users then add geologic beds (with custom thickness and color) and can add geologic deformation events like tilting, folding, and faulting. Additionally, simple intrusive dikes can be modelled, as well as unconformities. Students can also explore the interaction of geology with topography by drawing elevation contours to produce their own topographic models. Students can not only spatially manipulate their model, but can create cross-sections and boreholes to practice their visual penetrative abilities. Visible Geology is easy to access and use, with no downloads required, so it can be incorporated into current paper-based lab activities. Sample learning activities are being developed that target introductory and structural geology curricula, with learning objectives such as relative geologic history, fault characterization, apparent dip and thickness, interference folding, and stereonet interpretation. Visible Geology provides a richly interactive and immersive environment for students to explore geologic concepts and practice their spatial skills. [Figure: Screenshot of Visible Geology showing folding and faulting interactions on a ridge topography.]

  1. Effect of cold-water immersion on skeletal muscle contractile properties in soccer players.

    PubMed

    García-Manso, Juan Manuel; Rodríguez-Matoso, Darío; Rodríguez-Ruiz, David; Sarmiento, Samuel; de Saa, Yves; Calderón, Javier

    2011-05-01

    This study was designed to analyze changes in muscle response after cold-water immersion. The vastus lateralis of the dominant leg was analyzed in 12 professional soccer players from the Spanish 2nd Division B using tensiomyography, before and after four cold-water immersions at 4°C lasting 4 min each. Core temperature, skin temperature, and heart rate were monitored. A significant interaction (P ≤ 0.05) was found in muscle deformation between the control condition (5.12 ± 2.27 mm) and (1) immersion 3 (3.64 ± 2.27 mm) and (2) immersion 4 (3.38 ± 1.34 mm). A steady decrease was also observed in response velocity (immersion 1, -7.3%; immersion 2, -25.9%; immersion 3, -30.0%; immersion 4, -36.6%) and contraction velocity (immersion 1, -11.5%; immersion 2, -22.1%; immersion 3, -35.0%; immersion 4, -41.9%), with statistically significant differences (P ≤ 0.05) relative to the reference values from the third immersion onwards. No significant differences from the control condition were found in subsequent exposures to cold water for the values of response time and contraction time. Sustained time and reaction time increased with repeated exposures and longer exposure time, although the increase was not statistically significant. This study shows that repeated cold-water immersions (4 × 4 min at 4°C) cause considerable alterations to muscle behavior. These alterations significantly affect the state of the muscles and their response capacity, particularly muscle stiffness and muscle contraction velocity.

  2. 3d visualization of atomistic simulations on every desktop

    NASA Astrophysics Data System (ADS)

    Peled, Dan; Silverman, Amihai; Adler, Joan

    2013-08-01

Once upon a time, after making simulations, one had to go to a visualization center with fancy SGI machines to run a GL visualization and make a movie. More recently, OpenGL and its Mesa clone have let us create 3D on simple desktops (or laptops), whether or not a Z-buffer card is present. Today, 3D a la Avatar is a commodity technique, presented in cinemas and sold for home TV. However, only a few special research centers have systems large enough for entire classes to view 3D, or special immersive facilities such as visualization CAVEs or walls, and not everyone finds 3D immersion easy to view. For maximum physics with minimum effort, a 3D system must come to each researcher and student. So how do we create 3D visualization cheaply on every desktop for atomistic simulations? After several months of attempts to select commodity equipment for a whole-room system, we settled on an approach that goes back a long time, even predating GL. The old concept of anaglyphic stereo relies on two slightly displaced images viewed through colored glasses (or two squares of cellophane) on a regular screen, projector, or poster. We have added this capability to our AViz atomistic visualization code in its new version, 6.1, which is compatible with RedHat, CentOS and Ubuntu. Examples using data from our own research and that of other groups will be given.
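The anaglyph composition described above can be sketched in a few lines of NumPy: the red channel is taken from the left-eye render and the green/blue channels from the right-eye render. The function name and the synthetic test frames are illustrative assumptions, not part of the AViz code.

```python
import numpy as np

def make_anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Combine two slightly displaced RGB renders into a red-cyan
    anaglyph: red from the left eye, green/blue from the right eye."""
    if left.shape != right.shape:
        raise ValueError("left and right images must have the same shape")
    out = right.copy()           # keep green and blue from the right eye
    out[..., 0] = left[..., 0]   # red channel carries the left-eye view
    return out

# Two synthetic 4x4 RGB frames standing in for displaced renders
left = np.zeros((4, 4, 3), dtype=np.uint8); left[..., 0] = 200
right = np.zeros((4, 4, 3), dtype=np.uint8); right[..., 1] = 150
print(make_anaglyph(left, right)[0, 0])  # red from left, green from right
```

Viewed through red-cyan glasses, each eye then sees only "its" image, which is all the depth cue anaglyphic stereo needs.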

  3. The Effects of Vision-Related Aspects on Noise Perception of Wind Turbines in Quiet Areas

    PubMed Central

    Maffei, Luigi; Iachini, Tina; Masullo, Massimiliano; Aletta, Francesco; Sorrentino, Francesco; Senese, Vincenzo Paolo; Ruotolo, Francesco

    2013-01-01

Preserving the soundscape and geographic extension of quiet areas is a great challenge against the spread of environmental noise. The E.U. Environmental Noise Directive underlines the need to preserve quiet areas as a new aim for the management of noise in European countries. At the same time, due to their low population density, rural areas with suitable wind are considered appropriate locations for installing wind farms. However, despite the fact that wind farms are presented as environmentally friendly projects, these plants are often perceived as visual and audible intruders that spoil the landscape and generate noise. Even though the correlations are still unclear, the visual impact of wind farms can increase with their size and their coherence with the rural/quiet environment. In this paper, using the Immersive Virtual Reality technique, some visual and acoustical aspects of the impact of a wind farm on a sample of subjects were assessed and analyzed. The subjects were immersed in a virtual scenario reproducing a typical rural outdoor setting, which they experienced at different distances from the wind turbines. The influence of the number and colour of wind turbines on global, visual and auditory judgments was investigated. The main results showed that, regarding the number of wind turbines, the visual component has a weak effect on individual reactions, while colour influences both visual and auditory individual reactions, although in different ways. PMID:23624578

  4. The effects of vision-related aspects on noise perception of wind turbines in quiet areas.

    PubMed

    Maffei, Luigi; Iachini, Tina; Masullo, Massimiliano; Aletta, Francesco; Sorrentino, Francesco; Senese, Vincenzo Paolo; Ruotolo, Francesco

    2013-04-26

Preserving the soundscape and geographic extension of quiet areas is a great challenge against the spread of environmental noise. The E.U. Environmental Noise Directive underlines the need to preserve quiet areas as a new aim for the management of noise in European countries. At the same time, due to their low population density, rural areas with suitable wind are considered appropriate locations for installing wind farms. However, despite the fact that wind farms are presented as environmentally friendly projects, these plants are often perceived as visual and audible intruders that spoil the landscape and generate noise. Even though the correlations are still unclear, the visual impact of wind farms can increase with their size and their coherence with the rural/quiet environment. In this paper, using the Immersive Virtual Reality technique, some visual and acoustical aspects of the impact of a wind farm on a sample of subjects were assessed and analyzed. The subjects were immersed in a virtual scenario reproducing a typical rural outdoor setting, which they experienced at different distances from the wind turbines. The influence of the number and colour of wind turbines on global, visual and auditory judgments was investigated. The main results showed that, regarding the number of wind turbines, the visual component has a weak effect on individual reactions, while colour influences both visual and auditory individual reactions, although in different ways.

  5. Impact of immersion oils and mounting media on the confocal imaging of dendritic spines

    PubMed Central

    Peterson, Brittni M.; Mermelstein, Paul G.; Meisel, Robert L.

    2015-01-01

Background Structural plasticity, such as changes in dendritic spine morphology and density, reflects changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. New Method Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Results Both the choice of mounting medium and of immersion oil affected the visualization of dendritic spines, with the choice of immersion oil being the more critical. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Comparison with Existing Method Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can affect the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Conclusion Collectively, these data suggest that choosing the appropriate immersion oil and mounting medium is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. PMID:25601477

  6. Impact of immersion oils and mounting media on the confocal imaging of dendritic spines.

    PubMed

    Peterson, Brittni M; Mermelstein, Paul G; Meisel, Robert L

    2015-03-15

Structural plasticity, such as changes in dendritic spine morphology and density, reflects changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Both the choice of mounting medium and of immersion oil affected the visualization of dendritic spines, with the choice of immersion oil being the more critical. These biologic data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can affect the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Collectively, these data suggest that choosing the appropriate immersion oil and mounting medium is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Interactive 3D visualization of structural changes in the brain of a person with corticobasal syndrome

    PubMed Central

    Hänel, Claudia; Pieperhoff, Peter; Hentschel, Bernd; Amunts, Katrin; Kuhlen, Torsten

    2014-01-01

The visualization of the progression of brain tissue loss in neurodegenerative diseases like corticobasal syndrome (CBS) can not only provide information about the localization and distribution of the volume loss, but also help in understanding the course and the causes of this neurodegenerative disorder. The visualization of such medical imaging data is often based on 2D sections, because they show both internal and external structures in one image; spatial information, however, is lost. 3D visualization of imaging data can solve this problem, but it faces the difficulty that structures located deeper inside the brain may be occluded by structures near the surface. Here, we present an application with two designs for the 3D visualization of the human brain to address these challenges. In the first design, the brain anatomy is displayed semi-transparently; it is supplemented by an anatomical section and cortical areas for spatial orientation, and by the volumetric data of the volume loss. The second design is guided by the principle of importance-driven volume rendering: a direct line of sight to the relevant structures in the deeper parts of the brain is provided by cutting out a frustum-like piece of brain tissue. The application was developed to run both in standard desktop environments and in immersive virtual reality environments with stereoscopic viewing to improve depth perception. We conclude that the presented application facilitates the perception of the extent of brain degeneration with respect to its localization and affected regions. PMID:24847243

  8. Story immersion of videogames for youth health promotion: A review of literature

    USDA-ARS?s Scientific Manuscript database

    This article reviews research in the fields of psychology, literature, communication, human–computer interaction, public health, and consumer behavior on narrative and its potential relationships with videogames and story immersion. It also reviews a narrative's role in complementing behavioral chan...

  9. Game engines and immersive displays

    NASA Astrophysics Data System (ADS)

    Chang, Benjamin; Destefano, Marc

    2014-02-01

    While virtual reality and digital games share many core technologies, the programming environments, toolkits, and workflows for developing games and VR environments are often distinct. VR toolkits designed for applications in visualization and simulation often have a different feature set or design philosophy than game engines, while popular game engines often lack support for VR hardware. Extending a game engine to support systems such as the CAVE gives developers a unified development environment and the ability to easily port projects, but involves challenges beyond just adding stereo 3D visuals. In this paper we outline the issues involved in adapting a game engine for use with an immersive display system including stereoscopy, tracking, and clustering, and present example implementation details using Unity3D. We discuss application development and workflow approaches including camera management, rendering synchronization, GUI design, and issues specific to Unity3D, and present examples of projects created for a multi-wall, clustered, stereoscopic display.
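One core piece of adapting a game engine to a head-tracked, multi-wall display, touched on above under camera management, is computing an off-axis (asymmetric-frustum) projection per wall from the tracked eye position. A minimal sketch of the well-known generalized perspective projection approach, written with NumPy and hypothetical names (this is not Unity3D's actual API):

```python
import numpy as np

def wall_projection(pa, pb, pc, pe, near=0.1, far=100.0):
    """Off-axis projection for one display wall. pa, pb, pc are the
    wall's lower-left, lower-right and upper-left corners in tracker
    space; pe is the tracked eye position."""
    pa, pb, pc, pe = (np.asarray(p, float) for p in (pa, pb, pc, pe))
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal
    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -vn @ va                                      # eye-to-screen distance
    l = (vr @ va) * near / d                          # frustum extents on
    r = (vr @ vb) * near / d                          # the near plane
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d
    # Standard asymmetric frustum matrix (OpenGL convention)
    P = np.array([
        [2*near/(r-l), 0.0, (r+l)/(r-l), 0.0],
        [0.0, 2*near/(t-b), (t+b)/(t-b), 0.0],
        [0.0, 0.0, -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0, 0.0, -1.0, 0.0]])
    # Rotate world into screen space and translate the eye to the origin
    M = np.eye(4); M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4); T[:3, 3] = -pe
    return P @ M @ T

# Eye centered in front of a unit wall: the frustum comes out symmetric
P = wall_projection((-1, -1, -1), (1, -1, -1), (-1, 1, -1), (0, 0, 0))
```

Evaluating this each frame for each eye and wall is what keeps straight lines straight across screen corners as the tracked viewer moves.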

  10. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    PubMed Central

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed around an Arduino-based sensor architecture to enable a variety of tactile sensations at low cost, and is equipped with a portable wristband. As a system designed for tactile feedback, it first identifies the left and right hands and then delivers tactile sensations (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, achieving interaction that enhances immersion. A VR application was designed to test the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison with the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in virtual reality. PMID:28513545

  11. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality.

    PubMed

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-05-17

This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed around an Arduino-based sensor architecture to enable a variety of tactile sensations at low cost, and is equipped with a portable wristband. As a system designed for tactile feedback, it first identifies the left and right hands and then delivers tactile sensations (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, achieving interaction that enhances immersion. A VR application was designed to test the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison with the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in virtual reality.

  12. Applied virtual reality at the Research Triangle Institute

    NASA Technical Reports Server (NTRS)

    Montoya, R. Jorge

    1994-01-01

Virtual Reality (VR) is a way for humans to use computers to visualize, manipulate and interact with large geometric databases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. The VR integration techniques used in these applications are based on a uniform approach that promotes portability and reusability of the developed modules. For each problem, a 3D object database is created using data captured by hand or electronically. The objects' realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the database using software tools that also support interaction with, and immersion in, the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and a maintenance training simulator for the National Guard.

  13. Fish in the matrix: motor learning in a virtual world.

    PubMed

    Engert, Florian

    2012-01-01

One of the large remaining challenges in the field of zebrafish neuroscience is the establishment of techniques and preparations that permit the recording and perturbation of neural activity in animals that can interact meaningfully with the environment. Since it is very difficult to do this in freely behaving zebrafish, I describe here two alternative approaches that meet this goal via tethered preparations. The first uses head-fixation in agarose in combination with online imaging and analysis of tail motion. In the second method, paralyzed fish are suspended with suction pipettes in mid-water and nerve root recordings serve as indicators for intended locomotion. In both cases, fish can be immersed into a virtual environment and allowed to interact with this virtual world via real or fictive tail motions. The specific examples given in this review focus primarily on the role of visual feedback, but the general principles certainly extend to other modalities, including proprioception, hearing, balance, and somatosensation.
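The closed-loop principle described above (a measured real or fictive tail signal driving the virtual world) can be sketched as a toy integrator in which backward visual flow is opposed by thrust proportional to the motor signal. The gain, flow speed, and units here are illustrative assumptions, not values from the preparations described.

```python
def closed_loop_position(motor_signal, flow_speed=1.0, gain=2.0, dt=0.1):
    """Integrate the fish's virtual position: constant backward visual
    flow is opposed by forward thrust proportional to the measured
    motor (tail or nerve-root) signal. Illustrative gain and units."""
    position = 0.0
    trajectory = []
    for s in motor_signal:
        position += (gain * s - flow_speed) * dt
        trajectory.append(position)
    return trajectory

# Steady fictive swimming whose thrust exactly cancels the visual flow:
# the virtual fish holds station
bouts = [0.5] * 10
print(closed_loop_position(bouts)[-1])  # -> 0.0
```

In a real rig the motor signal would come from the imaging or nerve-root recording pipeline each frame, and the updated position would drive the visual display, closing the loop.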

  14. Fish in the matrix: motor learning in a virtual world

    PubMed Central

    Engert, Florian

    2013-01-01

One of the large remaining challenges in the field of zebrafish neuroscience is the establishment of techniques and preparations that permit the recording and perturbation of neural activity in animals that can interact meaningfully with the environment. Since it is very difficult to do this in freely behaving zebrafish, I describe here two alternative approaches that meet this goal via tethered preparations. The first uses head-fixation in agarose in combination with online imaging and analysis of tail motion. In the second method, paralyzed fish are suspended with suction pipettes in mid-water and nerve root recordings serve as indicators for intended locomotion. In both cases, fish can be immersed into a virtual environment and allowed to interact with this virtual world via real or fictive tail motions. The specific examples given in this review focus primarily on the role of visual feedback, but the general principles certainly extend to other modalities, including proprioception, hearing, balance, and somatosensation. PMID:23355810

  15. A software system for evaluation and training of spatial reasoning and neuroanatomical knowledge in a virtual environment.

    PubMed

    Armstrong, Ryan; de Ribaupierre, Sandrine; Eagleson, Roy

    2014-04-01

This paper describes the design and development of a software tool for the evaluation and training of surgical residents using an interactive, immersive virtual environment. Our objective was to develop a tool to evaluate users' spatial reasoning skills and knowledge in a neuroanatomical context, as well as to augment their performance through interactivity. In the visualization, anatomical surfaces manually segmented from MRI scans of the brain were rendered using a stereo display to improve depth cues. A magnetically tracked wand was used as a 3D input device for localization tasks within the brain. The movement of the wand was made to correspond to the movement of a spherical cursor within the rendered scene, providing a reference for localization. Users can be tested on their ability to localize structures within the 3D scene, and on their ability to place anatomical features at the appropriate locations within the rendering. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
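A localization task of the kind described is naturally scored as the Euclidean distance between the wand-driven cursor and the target structure's centroid. A minimal sketch with hypothetical names (the abstract does not specify the scoring metric actually used):

```python
import math

def localization_error(cursor, target):
    """Euclidean distance (in scene units) between the 3D cursor
    position placed by the user and the target structure's centroid."""
    return math.dist(cursor, target)

# User places the cursor 3 units off along one axis and 4 along another
print(localization_error((10.0, 20.0, 30.0), (13.0, 24.0, 30.0)))  # -> 5.0
```

Averaging this error over a battery of target structures gives a single per-user score that can be compared before and after training.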

  16. Effect of sodium hypochlorite and peracetic acid on the surface roughness of acrylic resin polymerized by heated water for short and long cycles.

    PubMed

    Sczepanski, Felipe; Sczepanski, Claudia Roberta Brunnquell; Berger, Sandrine Bittencourt; Consani, Rafael Leonardo Xediek; Gonini-Júnior, Alcides; Guiraldo, Ricardo Danil

    2014-10-01

To evaluate the surface roughness of acrylic resin submitted to chemical disinfection via 1% sodium hypochlorite (NaClO) or 1% peracetic acid (C2H4O3). Disc-shaped resin specimens (30 mm diameter × 4 mm height) were polymerized by heated water using two cycles (short cycle: 1 h at 74°C and 30 min at 100°C; conventional long cycle: 9 h at 74°C). The release of substances by these specimens into water was also quantified. Specimens were fabricated and divided into four groups (n = 10) according to polymerization cycle and disinfectant. After polishing, the specimens were stored in distilled deionized water. Specimens were immersed in 1% NaClO or 1% C2H4O3 for 30 min, and then immersed in distilled deionized water for 20 min. The release of C2H4O3 and NaClO was measured via visual colorimetric analysis. Roughness was measured before and after disinfection. Roughness data were subjected to two-way ANOVA and Tukey's test. There was no interaction between polymerization cycle and disinfectant in influencing the average surface roughness (Ra, P = 0.957). Considering these factors independently, there were significant differences between the short and conventional long cycles (P = 0.012), but no significant difference between the disinfectants NaClO and C2H4O3 (P = 0.366). Visual colorimetric analysis did not detect the release of substances. It was concluded that surface roughness differed between the short and conventional long cycles, and that disinfection of acrylic resins polymerized by heated water using the short cycle modified their roughness.

  17. Effect of sodium hypochlorite and peracetic acid on the surface roughness of acrylic resin polymerized by heated water for short and long cycles

    PubMed Central

    Sczepanski, Felipe; Sczepanski, Claudia Roberta Brunnquell; Berger, Sandrine Bittencourt; Consani, Rafael Leonardo Xediek; Gonini-Júnior, Alcides; Guiraldo, Ricardo Danil

    2014-01-01

Objective: To evaluate the surface roughness of acrylic resin submitted to chemical disinfection via 1% sodium hypochlorite (NaClO) or 1% peracetic acid (C2H4O3). Materials and Methods: Disc-shaped resin specimens (30 mm diameter × 4 mm height) were polymerized by heated water using two cycles (short cycle: 1 h at 74°C and 30 min at 100°C; conventional long cycle: 9 h at 74°C). The release of substances by these specimens into water was also quantified. Specimens were fabricated and divided into four groups (n = 10) according to polymerization cycle and disinfectant. After polishing, the specimens were stored in distilled deionized water. Specimens were immersed in 1% NaClO or 1% C2H4O3 for 30 min, and then immersed in distilled deionized water for 20 min. The release of C2H4O3 and NaClO was measured via visual colorimetric analysis. Roughness was measured before and after disinfection. Roughness data were subjected to two-way ANOVA and Tukey's test. Results: There was no interaction between polymerization cycle and disinfectant in influencing the average surface roughness (Ra, P = 0.957). Considering these factors independently, there were significant differences between the short and conventional long cycles (P = 0.012), but no significant difference between the disinfectants NaClO and C2H4O3 (P = 0.366). Visual colorimetric analysis did not detect the release of substances. Conclusion: It was concluded that surface roughness differed between the short and conventional long cycles, and that disinfection of acrylic resins polymerized by heated water using the short cycle modified their roughness. PMID:25512737

  18. Immersive Visual Data Analysis For Geoscience Using Commodity VR Hardware

    NASA Astrophysics Data System (ADS)

    Kreylos, O.; Kellogg, L. H.

    2017-12-01

Immersive visualization using virtual reality (VR) display technology offers tremendous benefits for the visual analysis of complex three-dimensional data like those commonly obtained from geophysical and geological observations and models. Unlike "traditional" visualization, which has to project 3D data onto a 2D screen for display, VR can side-step this projection and display 3D data directly, in a pseudo-holographic (head-tracked stereoscopic) form, and therefore does not suffer the distortions of relative positions, sizes, distances, and angles that are inherent in 2D projection. As a result, researchers can apply their spatial reasoning skills to virtual data in the same way they can to real objects or environments. The UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES, http://keckcaves.org) has been developing VR methods for data analysis since 2005, but the high cost of VR displays has prevented large-scale deployment and adoption of KeckCAVES technology. The recent emergence of high-quality commodity VR, spearheaded by the Oculus Rift and HTC Vive, has fundamentally changed the field. With KeckCAVES' foundational VR operating system, Vrui, now running natively on the HTC Vive, all KeckCAVES visualization software, including 3D Visualizer, LiDAR Viewer, Crusta, Nanotech Construction Kit, and ProtoShop, is now available to small labs, single researchers, and even home users. LiDAR Viewer and Crusta have been used for rapid response to geologic events including earthquakes and landslides, to visualize the impacts of sea-level rise, to investigate reconstructed paleoceanographic masses, and for exploration of the surface of Mars. The Nanotech Construction Kit is being used to explore the phases of carbon in Earth's deep interior, while ProtoShop can be used to construct and investigate protein structures.

  19. Numerical investigation of nonlinear fluid-structure interaction dynamic behaviors under a general Immersed Boundary-Lattice Boltzmann-Finite Element method

    NASA Astrophysics Data System (ADS)

    Gong, Chun-Lin; Fang, Zhe; Chen, Gang

A numerical approach based on the immersed boundary (IB), lattice Boltzmann and nonlinear finite element methods (FEM) is proposed to simulate the hydrodynamic interactions of very flexible objects. In the present simulation framework, the motion of the fluid is obtained by solving the discrete lattice Boltzmann equations on an Eulerian grid, the behavior of the flexible objects is calculated through a nonlinear dynamic finite element method, and the interaction forces between them are obtained implicitly using a velocity-correction IB method that satisfies the no-slip condition at the boundary points. The efficiency and accuracy of the proposed Immersed Boundary-Lattice Boltzmann-Finite Element method are first validated with a fluid-structure interaction (FSI) benchmark case, in which a flexible filament flaps behind a cylinder in channel flow; the nonlinear vibration mechanism of the cylinder-filament system is then investigated by altering the Reynolds number of the flow and the material properties of the filament. The interactions between two tandem and two side-by-side identical objects in a uniform flow are also investigated, and the in-phase and out-of-phase flapping behaviors are captured by the proposed method.
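The fluid side of the coupled scheme described above rests on discrete lattice Boltzmann updates on an Eulerian grid. A minimal single-phase D2Q9 BGK sketch (collision plus periodic streaming only; the paper's velocity-correction immersed-boundary coupling and the FEM structure solver are not included):

```python
import numpy as np

# D2Q9 lattice: discrete velocities and their weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """BGK equilibrium distribution for each of the 9 directions."""
    cu = u @ c.T.astype(float)                 # (nx, ny, 9) projections u.c_i
    usq = (u**2).sum(axis=-1, keepdims=True)   # |u|^2 per cell
    return w * rho[..., None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def lbm_step(f, tau=0.6):
    """One collide-and-stream update on a fully periodic grid.
    f has shape (nx, ny, 9); tau is the BGK relaxation time."""
    rho = f.sum(axis=-1)                                   # density
    u = (f[..., None] * c).sum(axis=2) / rho[..., None]    # velocity
    f = f + (equilibrium(rho, u) - f) / tau                # BGK collision
    for i, (cx, cy) in enumerate(c):                       # streaming
        f[..., i] = np.roll(f[..., i], (cx, cy), axis=(0, 1))
    return f

# Start from a uniform fluid at rest; mass is conserved by the update
f = equilibrium(np.ones((8, 8)), np.zeros((8, 8, 2)))
f = lbm_step(f)
print(round(f.sum(), 6))  # -> 64.0 (total mass of the 8x8 grid)
```

In an IB-coupled solver, a body-force term computed from the Lagrangian boundary markers would be added at the collision step to enforce no-slip on the immersed structure.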

  20. Development of 3D interactive visual objects using the Scripps Institution of Oceanography's Visualization Center

    NASA Astrophysics Data System (ADS)

    Kilb, D.; Reif, C.; Peach, C.; Keen, C. S.; Smith, B.; Mellors, R. J.

    2003-12-01

Within the last year scientists and educators at the Scripps Institution of Oceanography (SIO), the Birch Aquarium at Scripps and San Diego State University have collaborated with education specialists to develop 3D interactive graphic teaching modules for use in the classroom and in teacher workshops at the SIO Visualization center (http://siovizcenter.ucsd.edu). The unique aspect of the SIO Visualization center is its design around a 120-degree curved Panoram floor-to-ceiling screen (8'6" by 28'4") that immerses viewers in a virtual environment. The center is powered by an SGI 3400 Onyx computer that is more powerful, by an order of magnitude in both speed and memory, than typical base systems currently used for education and outreach presentations. This technology allows us to display multiple 3D data layers (e.g., seismicity, high resolution topography, seismic reflectivity, draped interferometric synthetic aperture radar (InSAR) images, etc.) simultaneously, render them in 3D stereo, and take a virtual flight through the data as dictated on the spot by the user. This system can also render snapshots, images and movies that are too big for other systems, and then export smaller-size end products to more commonly used computer systems. Since early 2002, we have explored various ways to provide informal education and outreach focusing on current research presented directly by the researchers doing the work. The Center currently provides a centerpiece for instruction on southern California seismology for K-12 students and teachers for various Scripps education endeavors. Future plans are in place to use the Visualization Center at Scripps for extended K-12 and college educational programs.
In particular, we will be identifying K-12 curriculum needs, assisting with teacher education, developing assessments of our programs and products, producing web-accessible teaching modules and facilitating the development of appropriate teaching tools to be used directly by classroom teachers.

  1. Headlines: Planet Earth: Improving Climate Literacy with Short Format News Videos

    NASA Astrophysics Data System (ADS)

    Tenenbaum, L. F.; Kulikov, A.; Jackson, R.

    2012-12-01

One of the challenges of communicating climate science is the sense that climate change is remote and unconnected to daily life, something that is happening to someone else or in the future. To help face this challenge, NASA's Global Climate Change website http://climate.nasa.gov has launched a new video series, "Headlines: Planet Earth," which focuses on current climate news events. This rapid-response video series uses 3D video visualization technology combined with real-time satellite data and images to throw a spotlight on real-world events. The "Headlines: Planet Earth" news video products will be deployed frequently, ensuring timeliness. NASA's Global Climate Change website makes extensive use of interactive media, immersive visualizations, ground-based and remote images, narrated and time-lapse videos, time-series animations, and real-time scientific data, plus maps and user-friendly graphics that make the scientific content both accessible and engaging to the public. The site has also won two consecutive Webby Awards for Best Science Website. Connecting climate science to current real-world events will contribute to improving climate literacy by making climate science relevant to everyday life.

  2. Defense applications of the CAVE (CAVE automatic virtual environment)

    NASA Astrophysics Data System (ADS)

    Isabelle, Scott K.; Gilkey, Robert H.; Kenyon, Robert V.; Valentino, George; Flach, John M.; Spenny, Curtis H.; Anderson, Timothy R.

    1997-07-01

The CAVE is a multi-person, room-sized, high-resolution, 3D video and auditory environment, which can be used to present very immersive virtual environment experiences. This paper describes the CAVE technology and the capability of the CAVE system as originally developed at the Electronic Visualization Laboratory of the University of Illinois at Chicago and as more recently implemented by Wright State University (WSU) in the Armstrong Laboratory at Wright-Patterson Air Force Base (WPAFB). One planned use of the WSU/WPAFB CAVE is research addressing the appropriate design of display and control interfaces for controlling uninhabited aerial vehicles. The WSU/WPAFB CAVE has a number of features that make it well-suited to this work: (1) 360 degrees surround, plus floor, high-resolution visual displays, (2) virtual spatialized audio, (3) the ability to integrate real and virtual objects, and (4) rapid and flexible reconfiguration. However, even though the CAVE is likely to have broad utility for military applications, it does have certain limitations that may make it less well-suited to applications that require 'natural' haptic feedback, vestibular stimulation, or an ability to interact with close detailed objects.

  3. Habituation of the cold shock response is inhibited by repeated anxiety: Implications for safety behaviour on accidental cold water immersions.

    PubMed

    Barwood, Martin J; Corbett, Jo; Tipton, Mike; Wagstaff, Christopher; Massey, Heather

    2017-05-15

Accidental cold-water immersion (CWI) triggers the life-threatening cold shock response (CSR), which is a precursor to sudden death on immersion. One practical means of reducing the CSR is to induce a habituation by undergoing repeated short CWIs. Habituation of the CSR is known to be partially reversed by the concomitant experience of acute anxiety, raising the possibility that repeated anxiety could prevent CSR habituation; we tested this hypothesis. Sixteen participants (12 male, 4 female) completed seven, seven-minute immersions into cold water (15°C). Immersion one acted as a control (CON1). During immersions two to five, which would ordinarily induce a habituation, anxiety levels were repeatedly increased (CWI-ANXrep) by deception and a demanding mathematical task. Immersions six and seven were counter-balanced with another high-anxiety condition (CWI-ANXrep) or a further control (CON2). Anxiety (20-cm visual analogue scale) and cardiorespiratory responses (cardiac frequency [fc], respiratory frequency [fR], tidal volume [VT], minute ventilation [V̇E]) were measured. Comparisons were made between experimental immersions (CON1, final CWI-ANXrep, CON2), across habituation immersions, and with data from a previous study. Anxiety levels were sustained at a similar level throughout the experimental and habituation immersions (mean [SD] CON1: 7.0 [4.0] cm; CON2: 5.8 [5.2] cm cf. CWI-ANXrep: 7.3 [5.5] cm; p>0.05). This culminated in failure of the CSR to habituate even when anxiety levels were not manipulated (i.e. CON2). These data differed (p<0.05) from previous studies in which anxiety levels were allowed to fall across habituation immersions and the CSR consequently habituated. Repeated anxiety prevented CSR habituation. A protective strategy that includes inducing habituation for those at risk should include techniques to lower anxiety associated with the immersion event, or habituation may not be beneficial in the emergency scenario.

  4. Cultural immersion alters emotion perception: Neurophysiological evidence from Chinese immigrants to Canada.

    PubMed

    Liu, Pan; Rigoulot, Simon; Pell, Marc D

    2017-12-01

To explore how cultural immersion modulates emotion processing, this study examined how Chinese immigrants to Canada process multisensory emotional expressions, which were compared to existing data from two groups, Chinese and North Americans. Stroop and Oddball paradigms were employed to examine different stages of emotion processing. The Stroop task presented face-voice pairs expressing congruent/incongruent emotions and participants actively judged the emotion of one modality while ignoring the other. A significant effect of cultural immersion was observed in the immigrants' behavioral performance, which showed greater interference from to-be-ignored faces, comparable with what was observed in North Americans. However, this effect was absent in their N400 data, which retained the same pattern as the Chinese. In the Oddball task, where immigrants passively viewed facial expressions with/without simultaneous vocal emotions, they exhibited a larger visual MMN for faces accompanied by voices, again mirroring patterns observed in Chinese. Correlation analyses indicated that the immigrants' living duration in Canada was associated with neural patterns (N400 and visual mismatch negativity) more closely resembling North Americans. Our data suggest that in multisensory emotion processing, adapting to a new culture first leads to behavioral accommodation followed by alterations in brain activities, providing new evidence of humans' neurocognitive plasticity in communication.

  5. Evaluating an immersive virtual environment prototyping and simulation system

    NASA Astrophysics Data System (ADS)

    Nemire, Kenneth

    1997-05-01

An immersive virtual environment (IVE) modeling and simulation tool is being developed for designing advanced weapon and training systems. One unique feature of the tool is that the design itself, and not just visualization of the design, is accomplished with the IVE tool. Acceptance of IVE tools requires comparisons with current commercial applications. In this pilot study, expert users of a popular desktop 3D graphics application performed identical modeling and simulation tasks using both the desktop and IVE applications. The IVE tool consisted of a head-mounted display, 3D spatialized sound, spatial trackers on head and hands, instrumented gloves, and a simulated speech recognition system. The results are preliminary because performance from only four users has been examined. When using the IVE system, users completed the tasks to criteria in less time than when using the desktop application. Subjective ratings of the visual displays in each system were similar. Ratings for the desktop controls were higher than for the IVE controls. Ratings of immersion and user enjoyment were higher for the IVE than for the desktop application. These results are particularly remarkable because participants had used the desktop application regularly for three to five years and the prototype IVE tool for only three to six hours.

  6. Immersive viewing engine

    NASA Astrophysics Data System (ADS)

    Schonlau, William J.

    2006-05-01

    An immersive viewing engine providing basic telepresence functionality for a variety of application types is presented. Augmented reality, teleoperation and virtual reality applications all benefit from the use of head mounted display devices that present imagery appropriate to the user's head orientation at full frame rates. Our primary application is the viewing of remote environments, as with a camera equipped teleoperated vehicle. The conventional approach where imagery from a narrow field camera onboard the vehicle is presented to the user on a small rectangular screen is contrasted with an immersive viewing system where a cylindrical or spherical format image is received from a panoramic camera on the vehicle, resampled in response to sensed user head orientation and presented via wide field eyewear display, approaching 180 degrees of horizontal field. Of primary interest is the user's enhanced ability to perceive and understand image content, even when image resolution parameters are poor, due to the innate visual integration and 3-D model generation capabilities of the human visual system. A mathematical model for tracking user head position and resampling the panoramic image to attain distortion free viewing of the region appropriate to the user's current head pose is presented and consideration is given to providing the user with stereo viewing generated from depth map information derived using stereo from motion algorithms.
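The resampling step described above, mapping the user's sensed head orientation to a region of the panoramic image, can be sketched for the equirectangular (spherical) case. This is a minimal illustration under assumed conventions (the projection choice and function name are not from the paper):

```python
import math

def panorama_pixel(yaw, pitch, width, height):
    """Map a view-ray direction (radians) to equirectangular pixel coords.

    Assumes a full 360 x 180 degree equirectangular panorama; the abstract
    does not specify the panorama format, so this is an illustrative choice.
    """
    # Longitude (yaw) spans [-pi, pi) across the image width, wrapping around.
    u = (yaw + math.pi) / (2.0 * math.pi) * width
    # Latitude (pitch) spans [-pi/2, pi/2] from top to bottom of the image.
    v = (math.pi / 2.0 - pitch) / math.pi * height
    return u % width, min(max(v, 0.0), height - 1)
```

In a full viewer, this lookup would run per display pixel after rotating each eyewear ray by the tracked head orientation, with interpolated sampling to avoid aliasing.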

  7. "Immersive Education" Submerges Students in Online Worlds Made for Learning

    ERIC Educational Resources Information Center

    Foster, Andrea L.

    2007-01-01

Immersive Education is a multimillion-dollar project devoted to building virtual-reality software exclusively for education within commercial and nonprofit fantasy spaces like Second Life. The project combines interactive three-dimensional graphics, Web cameras, Internet-based telephony, and other digital media. Some critics have complained that…

  8. High-immersion three-dimensional display of the numerical computer model

    NASA Astrophysics Data System (ADS)

    Xing, Shujun; Yu, Xunbo; Zhao, Tianqi; Cai, Yuanfa; Chen, Duo; Chen, Zhidong; Sang, Xinzhu

    2013-08-01

High-immersion three-dimensional (3D) displays are valuable tools for many applications, such as designing and constructing buildings and houses, industrial architecture design, aeronautics, scientific research, entertainment, media advertisement, military areas and so on. However, most technologies provide 3D display in front of screens that are parallel with the walls, and the sense of immersion is decreased. To get the correct multi-view stereo ground image, the cameras' photosensitive surfaces should be parallel to the common focus plane, and the cameras' optical axes should be offset toward the center of the common focus plane in both the vertical and horizontal directions. It is very common to use virtual cameras, which are ideal pinhole cameras, to display a 3D model in a computer system. We can use virtual cameras to simulate the shooting method of multi-view ground-based stereo images. Here, two virtual shooting methods for ground-based high-immersion 3D display are presented. The position of the virtual camera is determined by the position of the viewer's eyes in the real world. When the observer stands within the circumcircle of the 3D ground display, offset perspective projection virtual cameras are used. If the observer stands outside the circumcircle of the 3D ground display, offset perspective projection virtual cameras and orthogonal projection virtual cameras are adopted. In this paper, we mainly discuss the parameter settings of the virtual cameras. The near clip plane setting is the main point in the first method, while the rotation angle of the virtual cameras is the main point in the second method. In order to validate the results, we use D3D and OpenGL to render scenes from different viewpoints and generate a stereoscopic image. A realistic visualization system for 3D models is constructed and demonstrated for viewing horizontally, which provides high-immersion 3D visualization. The displayed 3D scenes are compared with the real objects in the real world.
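The offset perspective projection mentioned in this record corresponds to an asymmetric (off-axis) viewing frustum. A minimal sketch, assuming the OpenGL-style glFrustum parameterization (the function name and row-major layout are illustrative, not from the paper):

```python
def offset_frustum(left, right, bottom, top, near, far):
    """Build an OpenGL-style asymmetric perspective matrix (row-major).

    Choosing left != -right (or bottom != -top) offsets the camera's
    optical axis relative to the view direction, which is the geometry
    needed for multi-view ground-based stereo rendering.
    """
    return [
        [2 * near / (right - left), 0.0, (right + left) / (right - left), 0.0],
        [0.0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]
```

With a symmetric frustum the third-column offset terms vanish and the matrix reduces to an ordinary centered perspective projection.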

  9. Do you see what I hear: experiments in multi-channel sound and 3D visualization for network monitoring?

    NASA Astrophysics Data System (ADS)

    Ballora, Mark; Hall, David L.

    2010-04-01

    Detection of intrusions is a continuing problem in network security. Due to the large volumes of data recorded in Web server logs, analysis is typically forensic, taking place only after a problem has occurred. This paper describes a novel method of representing Web log information through multi-channel sound, while simultaneously visualizing network activity using a 3-D immersive environment. We are exploring the detection of intrusion signatures and patterns, utilizing human aural and visual pattern recognition ability to detect intrusions as they occur. IP addresses and return codes are mapped to an informative and unobtrusive listening environment to act as a situational sound track of Web traffic. Web log data is parsed and formatted using Python, then read as a data array by the synthesis language SuperCollider [1], which renders it as a sonification. This can be done either for the study of pre-existing data sets or in monitoring Web traffic in real time. Components rendered aurally include IP address, geographical information, and server Return Codes. Users can interact with the data, speeding or slowing the speed of representation (for pre-existing data sets) or "mixing" sound components to optimize intelligibility for tracking suspicious activity.
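The parse-and-map stage described above can be roughly illustrated in Python. This is a sketch only: the log-line pattern and the pitch mapping below are invented for the example, and the actual sonification design (rendered in SuperCollider) belongs to the authors:

```python
import re

# Apache-style combined log line: IP, identity, user, timestamp, request, status, size.
LOG_RE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (?P<code>\d{3}) \S+')

# Illustrative mapping (not the authors' actual design): HTTP status
# classes map to MIDI pitches; the first IP octet selects an octave offset.
STATUS_PITCH = {'2': 60, '3': 64, '4': 67, '5': 72}

def log_line_to_midi(line):
    """Return a MIDI note number for one Web log line, or None if unparsable."""
    m = LOG_RE.match(line)
    if not m:
        return None
    base = STATUS_PITCH.get(m.group('code')[0], 48)
    octave = int(m.group('ip').split('.')[0]) % 3  # 0, 1, or 2 octaves up
    return base + 12 * octave
```

A stream of such note numbers (plus geolocation-derived panning, say) is the kind of data array that a synthesis language can then render in real time or at user-controlled playback speed.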

  10. Cognitive Aspects of Collaboration in 3d Virtual Environments

    NASA Astrophysics Data System (ADS)

    Juřík, V.; Herman, L.; Kubíček, P.; Stachoň, Z.; Šašinka, Č.

    2016-06-01

Human-computer interaction has entered the 3D era. The most important models representing spatial information — maps — are being transferred into 3D versions with respect to the specific content to be displayed. Virtual worlds (VW) have become a promising area of interest because of the possibility of dynamically modifying content and of multi-user cooperation when solving tasks, regardless of physical presence. They can be used for sharing and elaborating information via virtual images or avatars. The attractiveness of VWs is emphasized also by the possibility of measuring operators' actions and complex strategies. Collaboration in 3D environments is a crucial issue in many areas where visualizations are important for group cooperation. Within a specific 3D user interface, the operators' ability to manipulate the displayed content is explored with regard to such phenomena as situation awareness, cognitive workload, and human error. For this purpose, VWs offer a great number of tools for measuring operators' responses, such as recording virtual movement or spots of interest in the visual field. The study focuses on the methodological issues of measuring the usability of 3D VWs and comparing them with the existing principles of 2D maps. We explore operators' strategies to reach and interpret information with regard to the specific type of visualization and different levels of immersion.

  11. Effects of immersion depth on super-resolution properties of index-different microsphere-assisted nanoimaging

    NASA Astrophysics Data System (ADS)

    Zhou, Yi; Tang, Yan; He, Yu; Liu, Xi; Hu, Song

    2018-03-01

    In related applications of microsphere-assisted super-resolution imaging in biomedical visualization and microfluidic detection, liquids are widely used as background media. For the first time, we quantitatively demonstrate that the maximum irradiances, focal lengths, and waists of photonic nanojets (PNJs) will logically vary with different immersion depths (IMDs). The experimental observations also numerically illustrate the trends of the lateral magnification and field of view (FOV) with the gradual evaporation of ethyl alcohol. This work can provide exact quantitative information for the proper selection of microspheres and IMD for the high-quality discernment of nanostructures.

  12. Three-Dimensional User Interfaces for Immersive Virtual Reality

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1997-01-01

    The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.

  13. Story Immersion in a Health Videogame for Childhood Obesity Prevention.

    PubMed

    Lu, Amy Shirong; Thompson, Debbe; Baranowski, Janice; Buday, Richard; Baranowski, Tom

    2012-02-15

Stories can serve as powerful tools for health interventions. Story immersion refers to the experience of being absorbed in a story. This is among the first studies to analyze story immersion's role in health videogames among children by addressing two main questions: Will children be more immersed when the main characters are similar to them? Do increased levels of immersion relate to more positive health outcomes? Eighty-seven 10-12-year-old African-American, Caucasian, and Hispanic children from Houston, TX, played a health videogame, "Escape from Diab" (Archimage, Houston, TX), featuring a protagonist with both African-American and Hispanic phenotypic features. Children's demographic information, immersion, and health outcomes (i.e., preference, motivation, and self-efficacy) were recorded and then correlated and analyzed. African-American and Hispanic participants reported higher immersion scores than Caucasian participants (P = 0.01). Story immersion correlated positively (P values < 0.03) with an increase in Fruit and Vegetable Preference (r = 0.27), Intrinsic Motivation for Water (r = 0.29), Vegetable Self-Efficacy (r = 0.24), and Physical Activity Self-Efficacy (r = 0.32). Ethnic similarity between videogame characters and players enhanced immersion and several health outcomes. Effectively embedding characters with phenotypic features similar to the target population in interactive health videogame narratives may be important when motivating children to adopt obesity prevention behaviors.

  14. A Huygens immersed-finite-element particle-in-cell method for modeling plasma-surface interactions with moving interface

    NASA Astrophysics Data System (ADS)

    Cao, Huijun; Cao, Yong; Chu, Yuchuan; He, Xiaoming; Lin, Tao

    2018-06-01

    Surface evolution is an unavoidable issue in engineering plasma applications. In this article an iterative method for modeling plasma-surface interactions with moving interface is proposed and validated. In this method, the plasma dynamics is simulated by an immersed finite element particle-in-cell (IFE-PIC) method, and the surface evolution is modeled by the Huygens wavelet method which is coupled with the iteration of the IFE-PIC method. Numerical experiments, including prototypical engineering applications, such as the erosion of Hall thruster channel wall, are presented to demonstrate features of this Huygens IFE-PIC method for simulating the dynamic plasma-surface interactions.

  15. Top coat or no top coat for immersion lithography?

    NASA Astrophysics Data System (ADS)

    Stepanenko, N.; Kim, Hyun-Woo; Kishimura, S.; Van Den Heuvel, D.; Vandenbroeck, N.; Kocsis, M.; Foubert, P.; Maenhoudt, M.; Ercken, M.; Van Roey, F.; Gronheid, R.; Pollentier, I.; Vangoidsenhoven, D.; Delvaux, C.; Baerts, C.; O'Brien, S.; Fyen, W.; Wells, G.

    2006-03-01

Since the moment immersion lithography appeared in the roadmaps of IC manufacturers, the question of whether to use top coats has become one of the important topics of discussion. The top coats used in immersion lithography have proved to serve as good protectors from leaching of the resist components (PAGs, bases) into the water. However, their application complicates the process and may lead to two side effects. First, top coats can affect the process window and resist profile depending on the material's refractive index, thickness, acidity, chemical interaction with the resist, and the soaking time. Second, the top coat application may increase the total amount of defects on the wafer. Having an immersion resist which could work without the top coat would be a preferable solution. Still, it is quite challenging to make such a resist, as direct water/resist interaction may also result in process window changes, CD variations, and generation of additional defects. We have performed a systematic evaluation of a large number of immersion resist and top coat combinations, using the ASML XT:1250Di scanner at IMEC. The samples for the experiments were provided by all the leading resist and top coat suppliers. Particular attention was paid to how the resist and top coat materials from different vendors interacted with each other. Among the factors which could influence the total amount of defects or CD variations on the wafer were: the material's dynamic contact angle and its interaction with the scanner stage speed, top coat thickness and intermixing layer formation, and water uptake and leaching. We have examined the importance of all mentioned factors, using such analytical techniques as Resist Development Analyser (RDA), Quartz Crystal Microbalance (QCM), Mass Spectroscopy (MS), and scatterometry. We have also evaluated the influence of the pre- and post-exposure rinse processes on the defectivity.
In this paper we will present the data on imaging and defectivity performance of the resists with and without the use of top coats. So far we can conclude that the top coat/resist approach used in immersion lithography needs some more improvements (i.e. process, materials properties) in order to be implemented in high-volume manufacturing.

  16. Effect of dehydration on thirst and drinking during immersion in men

    NASA Technical Reports Server (NTRS)

    Sagawa, S.; Miki, K.; Tajima, F.; Tanaka, H.; Choi, J. K.; Keil, L. C.; Shiraki, K.; Greenleaf, J. E.

    1992-01-01

The effect of water immersion on voluntary water intake, subjective evaluations of thirst and gastrointestinal state, and associated fluid-electrolyte and hormonal interactions was investigated. Eight men (19-25 yrs of age) were immersed to the neck while sitting for three hours at 34.5 C or in air at 28 C when euhydrated and when hypohydrated by 3.6 percent body weight loss. Within the first ten minutes of immersion, the significant reduction in drinking in the hypo-H2O experiment was associated with unchanged plasma Na(+), plasma osmolality, heart rates, and mean arterial pressures. Responses that did change included increases in cardiac output, plasma volume, and atrial natriuretic peptide and decreases in plasma renin activity and arginine vasopressin. It is concluded that the extracellular pathway, as opposed to the osmotic pathway, appears to be the major mechanism for immersion-induced suppression of drinking.

  17. Testing geoscience data visualization systems for geological mapping and training

    NASA Astrophysics Data System (ADS)

    Head, J. W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Senthil Kumar, P.

    2008-09-01

Traditional methods of planetary geological mapping have relied on photographic hard copy and light-table tracing and mapping. In the last several decades this has given way to the availability and analysis of multiple digital data sets, and programs and platforms that permit the viewing and manipulation of multiple annotated layers of relevant information. This has revolutionized the ability to incorporate important new data into the planetary mapping process at all scales. Information on these developments and approaches can be obtained at http://astrogeology.usgs.gov/Technology/. The process is aided by Geographic Information Systems (GIS) (see http://astrogeology.usgs.gov/Technology/) and excellent analysis packages (such as ArcGIS) that permit co-registration, rapid viewing, and analysis of multiple data sets on desktop displays (see http://astrogeology.usgs.gov/Projects/webgis/). We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations, and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping.
We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment", or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks. There is still much to learn and understand, however, about how the varying degrees of immersive displays affect task performance. For example, in using a 1280x1024 desktop monitor to explore an image, the mapper wastes a lot of time in image zooming/panning to balance the analysis-driven need for both detail and context. Therefore, we have spent a considerable amount of time exploring higher-resolution media, such as an IBM "Bertha" display (3840x2400) or a tiled wall with multiple projectors. We have found through over a year of weekly meetings and assessment that they definitely improve the efficiency of analysis and mapping. Here we outline briefly the nature of the major systems and our initial assessment of these in the 1:5M-scale NASA-USGS Venus Geological Mapping Program (http://astrogeology.usgs.gov/Projects/PlanetaryMapping/MapStatus/VenusStatus/Venus_Status.html). 1. Immersive Virtual Reality (Cave): ADVISER System Description: Our Cave system is an 8'x8'x8' cube with four projection surfaces (three walls and the floor). Four Linux machines (identical in performance to the desktop machine) provide data for the Cave. Users utilize a handheld 3D-tracked input device to navigate. Our 3D input device has a joystick and is simple to use. To navigate, the user simply points in the direction he/she wants to fly and pushes the joystick forward or backward to move relative to that direction. The user can push the joystick to the left and right to rotate his/her position in the virtual world.
A collision detection algorithm is used to prevent the user from going underneath the surface. We have developed ADVISER (ADvanced VIsualization for Solar system Exploration) [1,2] as a tool for taking planetary geologists virtually "into the field" in the IVR Cave environment in support of several scientific themes and have assessed its application to geological mapping of Venus. ADVISER aims to create a field experience by integrating multiple data sources and presenting them as a unified environment to the scientist. Additionally, we have developed a virtual field kit, tailored to supporting research tasks dictated by scientific and mapping themes. Technically, ADVISER renders high-resolution topographic and image datasets (8192x8192 samples) in stereo at interactive frame-rates (25+ frames-per-second). The system is based on a state-of-the-art terrain rendering system and is highly interactive; for example, vertical exaggeration, lighting geometry, image contrast, and contour lines can be modified by the user in real time. High-resolution image data can be overlaid on the terrain and other data can be rendered in this context. A detailed description and case studies of ADVISER are available.
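The collision constraint that keeps the flying user above the terrain can be sketched as a simple altitude clamp. This is a minimal illustration only, since ADVISER's actual algorithm is not detailed here; the `height_at` callable and clearance value are assumptions:

```python
def clamp_above_terrain(pos, height_at, clearance=2.0):
    """Keep a fly-through camera above the terrain surface.

    `pos` is an (x, y, z) camera position; `height_at` is any callable
    returning terrain height at (x, y). A real system would sample its
    high-resolution topography grid, typically with bilinear interpolation.
    """
    x, y, z = pos
    floor = height_at(x, y) + clearance  # lowest permitted altitude here
    return (x, y, max(z, floor))
```

Applied every frame after the joystick-driven position update, the clamp prevents the navigation step from carrying the viewpoint beneath the rendered surface.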

  18. Combining photorealistic immersive geovisualization and high-resolution geospatial data to enhance human-scale viewshed modelling

    NASA Astrophysics Data System (ADS)

    Tabrizian, P.; Petrasova, A.; Baran, P.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2017-12-01

Viewshed modelling, the process of defining, parsing, and analysing the structure of a landscape's visual space within GIS, has been commonly used in applications ranging from landscape planning and ecosystem services assessment to geography and archaeology. However, less effort has been made to understand whether and to what extent these objective analyses predict the actual on-the-ground perception of a human observer. Moreover, viewshed modelling at the human-scale level requires incorporation of fine-grained landscape structure (e.g., vegetation) and patterns (e.g., land cover) that are typically omitted from visibility calculations or unrealistically simulated, leading to significant errors in predicting visual attributes. This poster illustrates how photorealistic immersive virtual environments and high-resolution geospatial data can be used to integrate objective and subjective assessments of visual characteristics at the human-scale level. We performed viewshed modelling for a systematically sampled set of viewpoints (N=340) across an urban park using open-source GIS (GRASS GIS). For each point a binary viewshed was computed on a 3D surface model derived from high-density leaf-off LIDAR (QL2) points. The viewshed map was combined with high-resolution land cover (0.5 m) derived through fusion of orthoimagery, lidar vegetation, and vector data. Geostatistics and landscape structure analysis were performed to compute topological and compositional metrics for visual scale (e.g., openness), complexity (pattern, shape and object diversity), and naturalness. Based on the viewshed model output, a sample of 24 viewpoints representing the variation of visual characteristics was selected and geolocated. For each location, 360° imagery was captured using a DSLR camera mounted on a GigaPan robot.
We programmed a virtual reality application through which human subjects (N=100) immersively experienced a random selection of the chosen environments via a head-mounted display (Oculus Rift CV1), and rated each location on perceived openness, naturalness, and complexity. Regression models were used to correlate model outputs with participants' responses. The results indicated strong, significant correlations for openness and naturalness, and a moderate correlation for complexity estimates.
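At its core, the binary viewshed reduces, per target cell, to a line-of-sight test against the terrain. A toy sketch on a gridded DEM follows (the actual analysis used GRASS GIS on a LIDAR-derived surface; the nearest-cell sampling and unit cell size here are simplifications):

```python
def visible(dem, observer, target, eye_height=1.7):
    """Line-of-sight test between two cells of a square-gridded DEM.

    `dem` is a 2D list of elevations; `observer` and `target` are
    (row, col) cells. Returns True if no sampled terrain point along
    the straight line rises above the sight line.
    """
    (r0, c0), (r1, c1) = observer, target
    z0 = dem[r0][c0] + eye_height
    steps = max(abs(r1 - r0), abs(c1 - c0))
    if steps == 0:
        return True
    dz = (dem[r1][c1] - z0) / steps  # sight-line slope per step
    for i in range(1, steps):
        # Sample terrain along the straight line between the two cells.
        r = r0 + (r1 - r0) * i / steps
        c = c0 + (c1 - c0) * i / steps
        terrain = dem[round(r)][round(c)]  # nearest-cell sample
        if terrain > z0 + dz * i:
            return False  # terrain blocks the sight line
    return True
```

Running this test from one observer cell against every other cell yields the binary viewshed map; production tools add interpolated sampling, earth curvature, and refraction corrections.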

  19. Using Immersive Visualizations to Improve Decision Making and Enhancing Public Understanding of Earth Resource and Climate Issues

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Raynolds, R. G.; Dechesne, M.

    2008-12-01

    New visualization technologies, from ArcGIS to Google Earth, have allowed for the integration of complex, disparate data sets to produce visually rich and compelling three-dimensional models of sub-surface and surface resource distribution patterns. The rendering of these models allows the public to quickly understand complicated geospatial relationships that would otherwise take much longer to explain using traditional media. We have impacted the community through topical policy presentations at both state and city levels, adult education classes at the Denver Museum of Nature and Science (DMNS), and public lectures at DMNS. We have constructed three-dimensional models from well data and surface observations which allow policy makers to better understand the distribution of groundwater in sandstone aquifers of the Denver Basin. Our presentations to local governments in the Denver metro area have allowed resource managers to better project future ground water depletion patterns, and to encourage development of alternative sources. DMNS adult education classes on water resources, geography, and regional geology, as well as public lectures on global issues such as earthquakes, tsunamis, and resource depletion, have utilized the visualizations developed from these research models. In addition to presenting GIS models in traditional lectures, we have also made use of the immersive display capabilities of the digital "fulldome" Gates Planetarium at DMNS. The real-time Uniview visualization application installed at Gates was designed for teaching astronomy, but it can be re-purposed for displaying our model datasets in the context of the Earth's surface. The 17-meter diameter dome of the Gates Planetarium allows an audience to have an immersive experience---similar to virtual reality CAVEs employed by the oil exploration industry---that would otherwise not be available to the general public. 
Public lectures in the dome allow audiences of over 100 people to comprehend dynamically changing geospatial datasets in an exciting and engaging fashion. In our presentation, we will demonstrate how new software tools like Uniview can be used to dramatically enhance and accelerate public comprehension of complex, multi-scale geospatial phenomena.

  20. Designing for Learning Conversations: How Parents Support Children's Science Learning within an Immersive Simulation

    ERIC Educational Resources Information Center

    Tscholl, Michael; Lindgren, Robb

    2016-01-01

    This research investigates the social learning affordances of a room-sized, immersive, and interactive augmented reality simulation environment designed to support children's understanding of basic physics concepts in a science center. Conversations between 97 parent-child pairs were analyzed in relation to categories of talk through which…

  1. Influence of Sound Immersion and Communicative Interaction on the Lombard Effect

    ERIC Educational Resources Information Center

    Garnier, Maeva; Henrich, Nathalie; Dubois, Daniele

    2010-01-01

    Purpose: To examine the influence of sound immersion techniques and speech production tasks on speech adaptation in noise. Method: In Experiment 1, we compared the modification of speakers' perception and speech production in noise when noise is played into headphones (with and without additional self-monitoring feedback) or over loudspeakers. We…

  2. Examining Peer Language Use and Investment in a Distinct North American Immersion Context

    ERIC Educational Resources Information Center

    Ballinger, Susan

    2017-01-01

    Previous studies have shown that immersion students tend to speak the majority language during peer interactions, regardless of the language of instruction or their dominant language. Researchers have argued that the societal status of the majority language presents an obstacle to providing equitable support for both languages of instruction. To…

  3. Measuring Flow Experience in an Immersive Virtual Environment for Collaborative Learning

    ERIC Educational Resources Information Center

    van Schaik, P.; Martin, S.; Vallance, M.

    2012-01-01

    In contexts other than immersive virtual environments, theoretical and empirical work has identified flow experience as a major factor in learning and human-computer interaction. Flow is defined as a "holistic sensation that people feel when they act with total involvement". We applied the concept of flow to modeling the experience of…

  4. Comparisons of ice packs, hot water immersion, and analgesia injection for the treatment of centipede envenomations in Taiwan.

    PubMed

    Chaou, Chung-Hsien; Chen, Chian-Kuang; Chen, Jih-Chang; Chiu, Te-Fa; Lin, Chih-Chuan

    2009-08-01

To compare the effectiveness of ice packs and hot water immersion for the treatment of centipede envenomations. Sixty patients envenomated by centipedes were randomized into three groups and were treated with ice packs, hot water immersion, or analgesia injection. The visual analog score (VAS) for pain was measured before treatment and 15 min afterward. Demographic data and data on local and systemic effects after centipede bites were collected. The VAS scores and the pain decrease (ΔVAS) were compared between the three groups. All patients suffered from pain at the affected sites; other local effects included redness (n = 49, 81.7%), swelling (n = 32, 53.3%), heat (n = 14, 23.3%), itchiness (n = 5, 8.3%), and bullae formation (n = 3, 5.0%). Systemic effects were rarely reported. All three groups had similar VAS scores before and after treatment. They also had similar effectiveness in reducing pain caused by centipede bites (ΔVAS = 2.55 ± 1.88, 2.33 ± 1.78, and 1.55 ± 1.68 with ice packs, analgesia, and hot water immersion, respectively; p = 0.165). Ice packs, hot water immersion, and analgesics all improved the pain from centipede envenomation. Ice pack treatment is a safe, inexpensive, and non-invasive method for pre-hospital management of patients with centipede envenomation.

  5. Usage of stereoscopic visualization in the learning contents of rotational motion.

    PubMed

    Matsuura, Shu

    2013-01-01

Rotational motion plays an essential role in physics even at an introductory level. The stereoscopic display of three-dimensional graphics is advantageous for presenting rotational motions, particularly for depth recognition. However, immersive visualization of rotational motion is known to cause dizziness and even nausea in some viewers. The purpose of this study is therefore to examine the onset of nausea and visual fatigue when learning rotational motion through a stereoscopic display. The findings show that an instruction method using intermittent exposure to the stereoscopic display, together with a simplification of its visual components, reduced the onset of nausea and visual fatigue for viewers while maintaining the overall benefit of instantaneous spatial recognition.

  6. Bending it like Beckham: how to visually fool the goalkeeper.

    PubMed

    Dessing, Joost C; Craig, Cathy M

    2010-10-06

As bending free-kicks become the norm in modern-day soccer, the implications for goalkeepers have largely been ignored. Although it has been reported that poor sensitivity to visual acceleration makes it harder for expert goalkeepers to perceptually judge where curved free-kicks will cross the goal line, it is unknown how this affects the goalkeeper's actual movements. Here, an in-depth analysis of goalkeepers' hand movements in immersive, interactive virtual reality shows that they do not fully account for spin-induced lateral ball acceleration. Hand movements were found to be biased in the direction of initial ball heading, and for curved free-kicks this resulted in biases in a direction opposite to those necessary to save the free-kick. These movement errors leave less time to cover a now greater distance to stop the ball entering the goal. These and other details of the interceptive behaviour are explained using a simple mathematical model which shows how the goalkeeper controls his movements online with respect to the ball's current heading direction. Furthermore, our results and model suggest how visual landmarks, such as the goalposts in this instance, may constrain the extent of the movement biases. While it has previously been shown that humans can internalize the effects of gravitational acceleration, these results show that it is much more difficult for goalkeepers to account for spin-induced visual acceleration, which varies from situation to situation. The limited sensitivity of the human visual system for detecting acceleration suggests that curved free-kicks are an important goal-scoring opportunity in the game of soccer.

  7. Bending It Like Beckham: How to Visually Fool the Goalkeeper

    PubMed Central

    2010-01-01

Background As bending free-kicks become the norm in modern-day soccer, the implications for goalkeepers have largely been ignored. Although it has been reported that poor sensitivity to visual acceleration makes it harder for expert goalkeepers to perceptually judge where curved free-kicks will cross the goal line, it is unknown how this affects the goalkeeper's actual movements. Methodology/Principal Findings Here, an in-depth analysis of goalkeepers' hand movements in immersive, interactive virtual reality shows that they do not fully account for spin-induced lateral ball acceleration. Hand movements were found to be biased in the direction of initial ball heading, and for curved free-kicks this resulted in biases in a direction opposite to those necessary to save the free-kick. These movement errors leave less time to cover a now greater distance to stop the ball entering the goal. These and other details of the interceptive behaviour are explained using a simple mathematical model which shows how the goalkeeper controls his movements online with respect to the ball's current heading direction. Furthermore, our results and model suggest how visual landmarks, such as the goalposts in this instance, may constrain the extent of the movement biases. Conclusions While it has previously been shown that humans can internalize the effects of gravitational acceleration, these results show that it is much more difficult for goalkeepers to account for spin-induced visual acceleration, which varies from situation to situation. The limited sensitivity of the human visual system for detecting acceleration suggests that curved free-kicks are an important goal-scoring opportunity in the game of soccer. PMID:20949130
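The bias described in this abstract can be illustrated with a toy calculation (not the authors' model): a ball under constant spin-induced lateral acceleration arrives away from the point predicted by extrapolating its initial heading, and the gap is the extra distance a goalkeeper moving on initial heading would have to cover. All numbers below are hypothetical.

```python
# Illustrative sketch: spin-induced lateral acceleration vs. a straight-line
# extrapolation of the ball's initial heading. Values are made up.

def final_lateral_position(vy, a_lat, t_flight):
    """True lateral arrival position under constant lateral acceleration."""
    return vy * t_flight + 0.5 * a_lat * t_flight**2

def straight_line_prediction(vy, t_flight):
    """Prediction from the initial heading only (ignores the acceleration)."""
    return vy * t_flight

# Hypothetical free-kick: 1 s flight, 2 m/s initial lateral speed,
# 3 m/s^2 spin-induced lateral acceleration.
t, vy, a = 1.0, 2.0, 3.0
true_pos = final_lateral_position(vy, a, t)   # 3.5 m
naive_pos = straight_line_prediction(vy, t)   # 2.0 m
bias = true_pos - naive_pos                   # 1.5 m of uncovered distance
print(f"true {true_pos:.1f} m, predicted {naive_pos:.1f} m, bias {bias:.1f} m")
```

The quadratic term is what the abstract argues goalkeepers fail to internalize, unlike the fixed acceleration of gravity.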

  8. 3D movies for teaching seafloor bathymetry, plate tectonics, and ocean circulation in large undergraduate classes

    NASA Astrophysics Data System (ADS)

    Peterson, C. D.; Lisiecki, L. E.; Gebbie, G.; Hamann, B.; Kellogg, L. H.; Kreylos, O.; Kronenberger, M.; Spero, H. J.; Streletz, G. J.; Weber, C.

    2015-12-01

Geologic problems and datasets are often 3D or 4D in nature, yet projected onto a 2D surface such as a piece of paper or a projection screen. Reducing the dimensionality of data forces the reader to "fill in" that collapsed dimension in their minds, creating a cognitive challenge, especially for new learners. Scientists and students can visualize and manipulate 3D datasets using the virtual reality software developed for the immersive, real-time interactive 3D environment at the KeckCAVES at UC Davis. The 3DVisualizer software (Billen et al., 2008) can also operate on a desktop machine to produce interactive 3D maps of earthquake epicenter locations and 3D bathymetric maps of the seafloor. With 3D projections of seafloor bathymetry and ocean circulation proxy datasets in a virtual reality environment, we can create visualizations of carbon isotope (δ13C) records for academic research and to aid in demonstrating thermohaline circulation in the classroom. Additionally, 3D visualization of seafloor bathymetry allows students to see features of the seafloor that most people cannot observe first-hand. To enhance lessons on mid-ocean ridges and ocean basin genesis, we have created movies of seafloor bathymetry for a large-enrollment undergraduate-level class, Introduction to Oceanography. In the past four quarters, students have enjoyed watching 3D movies, and in the fall quarter (2015), we will assess how well 3D movies enhance learning. The class will be split into two groups: one that learns about the Mid-Atlantic Ridge from diagrams and lecture, and another that learns with a supplemental 3D visualization. Both groups will be asked "what does the seafloor look like?" before and after the Mid-Atlantic Ridge lesson. Then the whole class will watch the 3D movie and respond to an additional question, "did the 3D visualization enhance your understanding of the Mid-Atlantic Ridge?", with the opportunity to further elaborate on the effectiveness of the visualization.

  9. Optic variables used to judge future ball arrival position in expert and novice soccer players.

    PubMed

    Craig, Cathy M; Goulon, Cédric; Berton, Eric; Rao, Guillaume; Fernandez, Laure; Bootsma, Reinoud J

    2009-04-01

    Although many studies have looked at the perceptual-cognitive strategies used to make anticipatory judgments in sport, few have examined the informational invariants that our visual system may be attuned to. Using immersive interactive virtual reality to simulate the aerodynamics of the trajectory of a ball with and without sidespin, the present study examined the ability of expert and novice soccer players to make judgments about the ball's future arrival position. An analysis of their judgment responses showed how participants were strongly influenced by the ball's trajectory. The changes in trajectory caused by sidespin led to erroneous predictions about the ball's future arrival position. An analysis of potential informational variables that could explain these results points to the use of a first-order compound variable combining optical expansion and optical displacement.

  10. Advanced telepresence surgery system development.

    PubMed

    Jensen, J F; Hill, J W

    1996-01-01

    SRI International is currently developing a prototype remote telepresence surgery system, for the Advanced Research Projects Agency (ARPA), that will bring life-saving surgical care to wounded soldiers in the zone of combat. Remote surgery also has potentially important applications in civilian medicine. In addition, telepresence will find wide medical use in local surgery, in endoscopic, laparoscopic, and microsurgery applications. Key elements of the telepresence technology now being developed for ARPA, including the telepresence surgeon's workstation (TSW) and associated servo control systems, will have direct application to these areas of minimally invasive surgery. The TSW technology will also find use in surgical training, where it will provide an immersive visual and haptic interface for interaction with computer-based anatomical models. In this paper, we discuss our ongoing development of the MEDFAST telesurgery system, focusing on the TSW man-machine interface and its associated servo control electronics.

  11. Emotion Telepresence: Emotion Augmentation through Affective Haptics and Visual Stimuli

    NASA Astrophysics Data System (ADS)

    Tsetserukou, D.; Neviarouskaya, A.

    2012-03-01

The paper focuses on a novel concept of emotional telepresence. The iFeel_IM! system, which is in the vanguard of this technology, integrates the 3D virtual world Second Life, an intelligent component for automatic emotion recognition from text messages, and innovative affective haptic interfaces providing additional nonverbal communication channels through simulation of emotional feedback and social touch (physical co-presence). Users can not only exchange messages but also emotionally and physically feel the presence of the communication partner (e.g., a family member, friend, or beloved person). The next prototype of the system will include a tablet computer. The user will be able to interact haptically with the avatar, and thus influence its mood and the emotion of the partner. A finger-gesture language will be designed for communication with the avatar. This will bring a new level of immersion to online communication.

  12. Visualization Center Dedicated

    NASA Image and Video Library

    2003-10-17

    The dedication ceremony of the University of Southern Mississippi Center of Higher Learning (CHL) High-Performance Visualization Center at SSC was held Oct. 17. The center's RAVE II 3-D visualization system, available to both on- and off-site scientists, turns data into a fully immersive environment for the user. Cutting the ribbon are, from left, Rear Adm. Thomas Donaldson, commander of the Naval Meteorology and Oceanography Command; Jim Meredith, former director of the CHL; USM President Dr. Shelby Thames; Lt. Gov. Amy Tuck; Dr. Peter Ranelli, director of the CHL; Dewey Herring, chairman of the policy board for the CHL; and former Sen. Cecil Burge.

  13. Visualization Center Dedicated

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The dedication ceremony of the University of Southern Mississippi Center of Higher Learning (CHL) High-Performance Visualization Center at SSC was held Oct. 17. The center's RAVE II 3-D visualization system, available to both on- and off-site scientists, turns data into a fully immersive environment for the user. Cutting the ribbon are, from left, Rear Adm. Thomas Donaldson, commander of the Naval Meteorology and Oceanography Command; Jim Meredith, former director of the CHL; USM President Dr. Shelby Thames; Lt. Gov. Amy Tuck; Dr. Peter Ranelli, director of the CHL; Dewey Herring, chairman of the policy board for the CHL; and former Sen. Cecil Burge.

  14. Lubricated immersed boundary method in two dimensions

    NASA Astrophysics Data System (ADS)

    Fai, Thomas G.; Rycroft, Chris H.

    2018-03-01

    Many biological examples of fluid-structure interaction, including the transit of red blood cells through the narrow slits in the spleen and the intracellular trafficking of vesicles into dendritic spines, involve the near-contact of elastic structures separated by thin layers of fluid. Motivated by such problems, we introduce an immersed boundary method that uses elements of lubrication theory to resolve thin fluid layers between immersed boundaries. We demonstrate 2nd-order accurate convergence for simple two-dimensional flows with known exact solutions to showcase the increased accuracy of this method compared to the standard immersed boundary method. Motivated by the phenomenon of wall-induced migration, we apply the lubricated immersed boundary method to simulate an elastic vesicle near a wall in shear flow. We also simulate the dynamics of a vesicle traveling through a narrow channel and observe the ability of the lubricated method to capture the vesicle motion on relatively coarse fluid grids.
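For readers unfamiliar with the baseline the lubricated variant improves on, the standard immersed boundary method couples Lagrangian structure points to the Eulerian fluid grid through a regularized delta function. Below is a minimal 1D sketch of the force-spreading step using Peskin's classic 4-point delta; this is an illustrative assumption, not necessarily the discretization used in the paper.

```python
import math

def peskin_delta(r):
    """Peskin's 4-point regularized delta function (argument in grid units)."""
    r = abs(r)
    if r < 1.0:
        return (3 - 2*r + math.sqrt(1 + 4*r - 4*r*r)) / 8
    if r < 2.0:
        return (5 - 2*r - math.sqrt(-7 + 12*r - 4*r*r)) / 8
    return 0.0

def spread_force(x_lag, f_lag, nx, h=1.0):
    """Spread a single Lagrangian point force onto a 1D Eulerian grid."""
    f_grid = [0.0] * nx
    for i in range(nx):
        f_grid[i] += f_lag * peskin_delta((i * h - x_lag) / h) / h
    return f_grid

f = spread_force(x_lag=4.3, f_lag=1.0, nx=10)
# The 4-point delta is a partition of unity, so the total spread force is
# conserved regardless of where the point sits relative to the grid.
print(f"total spread force: {sum(f):.6f}")
```

Velocity interpolation back to the structure uses the same delta weights, which is what guarantees energy consistency in the classical scheme.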

  15. The influence of visual characteristics of barriers on railway noise perception.

    PubMed

    Maffei, Luigi; Masullo, Massimiliano; Aletta, Francesco; Di Gabriele, Maria

    2013-02-15

Noise annoyance is considered the main effect of noise; it is a complex and multifaceted psychological concept involving immediate behavioral and evaluative aspects. In recent decades research has intensively investigated the correlation between noise exposure and noise annoyance; nevertheless, recent studies confirm that non-auditory factors influence individuals' perception of noise. In particular, audio-visual interaction can play a fundamental role. Today, Immersive Virtual Reality (IVR) systems allow laboratory tests that provide realistic experiences of the surrounding environment, yielding more accurate information about the reactions of the local population. Barriers represent the main intervention for environmental noise control; however, some aspects related to their visual characteristics have to be further investigated. This paper presents a case study in which a sample of residents living close to a railway line assessed noise-related aspects of several barriers with different visual characteristics in an IVR laboratory test. Three main factors were analyzed: the barrier type (the visibility of the noise source through the screen), the visual aspect of the barrier (aesthetic issues), and the noise level at the receiver (the acoustic performance of the barrier and the magnitude of the sound source). The main results of the ANOVA showed that Perceived Loudness and Noise Annoyance were judged lower for transparent barriers than for opaque barriers; this difference increased as the noise level increased. Copyright © 2012. Published by Elsevier B.V.

  16. Story Immersion in a Health Videogame for Childhood Obesity Prevention

    PubMed Central

    Thompson, Debbe; Baranowski, Janice; Buday, Richard; Baranowski, Tom

    2012-01-01

    Abstract Objective Stories can serve as powerful tools for health interventions. Story immersion refers to the experience of being absorbed in a story. This is among the first studies to analyze story immersion's role in health videogames among children by addressing two main questions: Will children be more immersed when the main characters are similar to them? Do increased levels of immersion relate to more positive health outcomes? Subjects and Methods Eighty-seven 10–12-year-old African-American, Caucasian, and Hispanic children from Houston, TX, played a health videogame, “Escape from Diab” (Archimage, Houston, TX), featuring a protagonist with both African-American and Hispanic phenotypic features. Children's demographic information, immersion, and health outcomes (i.e., preference, motivation, and self-efficacy) were recorded and then correlated and analyzed. Results African-American and Hispanic participants reported higher immersion scores than Caucasian participants (P=0.01). Story immersion correlated positively (P values<0.03) with an increase in Fruit and Vegetable Preference (r=0.27), Intrinsic Motivation for Water (r=0.29), Vegetable Self-Efficacy (r=0.24), and Physical Activity Self-Efficacy (r=0.32). Conclusion Ethnic similarity between videogame characters and players enhanced immersion and several health outcomes. Effectively embedding characters with similar phenotypic features to the target population in interactive health videogame narratives may be important when motivating children to adopt obesity prevention behaviors. PMID:24066276
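The study above reports its immersion-outcome relationships as Pearson correlations (e.g., r=0.27 for Fruit and Vegetable Preference). As a reminder of what those r values measure, here is a small self-contained sketch of the Pearson coefficient; the scores below are invented for illustration and are not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-child scores: story immersion vs. a self-efficacy scale.
immersion = [2.1, 3.4, 2.8, 4.0, 3.1, 4.5, 2.5, 3.8]
efficacy  = [2.0, 3.0, 3.1, 3.9, 2.8, 4.2, 2.9, 3.5]
print(f"r = {pearson_r(immersion, efficacy):.2f}")
```

Values near 0.3, as reported in the study, indicate a modest but consistent positive association rather than a strong linear relationship.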

  17. Object Creation and Human Factors Evaluation for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1998-01-01

The main objective of this project is to provide test objects for simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, AL. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments, along with visualization and manipulation methods, for the purpose of training and testing. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using a data glove, hand controller, or mouse. These simulated objects are solid or surfaced three-dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.

  18. In situ real-time monitoring of biomolecular interactions based on resonating microcantilevers immersed in a viscous fluid

    NASA Astrophysics Data System (ADS)

    Kwon, Tae Yun; Eom, Kilho; Park, Jae Hong; Yoon, Dae Sung; Kim, Tae Song; Lee, Hong Lim

    2007-05-01

The authors report precise (noise-free) in situ real-time monitoring of a specific protein antigen-antibody interaction using a resonating microcantilever immersed in a viscous fluid. In this work, they utilized a resonating piezoelectric thick-film microcantilever that exhibits a high quality factor (e.g., Q = 15) in a viscous liquid at a viscosity comparable to that of human blood serum. This implies great potential of the resonating microcantilever for in situ biosensor applications. It is shown that the microcantilever enables them to monitor C-reactive protein antigen-antibody interactions in real time, providing insight into the protein binding kinetics.

  19. Effects of immersion media and repolishing on color stability and superficial morphology of nanofilled composite resin.

    PubMed

    de Oliveira, Ana Luísa Botta Martins; Botta, Ana Carolina; Campos, Juliana Álvares Duarte Bonini; Garcia, Patrícia Petromilli Nordi Sasso

    2014-08-01

This study evaluated the influence of fluoride mouth rinses and repolishing on the superficial morphology and color stability of a nanofilled resin. A total of 150 specimens were prepared and polished using aluminum oxide discs for 15 s under a pressure of 2 kg. The experimental groups were divided according to the immersion medium (artificial saliva, 0.5% sodium fluoride, Fluordent Reach, Oral B, Fluorgard) and repolishing procedure (without and with). The specimens were continuously immersed for 1 week, after which half of each sample was repolished. Color readings were performed after 24 h of immersion in artificial saliva (baseline), after continuous immersion, and after repolishing. The superficial morphology was examined qualitatively using scanning electron microscopy (SEM). Color change (∆E) data were submitted to a mixed analysis of variance using a Shapiro-Wilk test (p>0.05 for the different immersion media) and Sidak's test (p<0.05 for the differences between groups). In the interaction between repolishing and immersion media, Fluorgard showed a statistical difference between the ∆E values with and without repolishing (p<0.0001). In the SEM observations, both Fluordent Reach and Fluorgard caused degradation of the superficial resinous matrix of the composite after continuous immersion; this degraded matrix was removed by repolishing.

  20. Depth Camera-Based 3D Hand Gesture Controls with Immersive Tactile Feedback for Natural Mid-Air Gesture Interactions

    PubMed Central

    Kim, Kwangtaek; Kim, Joongrock; Choi, Jaesung; Kim, Junghyun; Lee, Sangyoun

    2015-01-01

    Vision-based hand gesture interactions are natural and intuitive when interacting with computers, since we naturally exploit gestures to communicate with other people. However, it is agreed that users suffer from discomfort and fatigue when using gesture-controlled interfaces, due to the lack of physical feedback. To solve the problem, we propose a novel complete solution of a hand gesture control system employing immersive tactile feedback to the user's hand. For this goal, we first developed a fast and accurate hand-tracking algorithm with a Kinect sensor using the proposed MLBP (modified local binary pattern) that can efficiently analyze 3D shapes in depth images. The superiority of our tracking method was verified in terms of tracking accuracy and speed by comparing with existing methods, Natural Interaction Technology for End-user (NITE), 3D Hand Tracker and CamShift. As the second step, a new tactile feedback technology with a piezoelectric actuator has been developed and integrated into the developed hand tracking algorithm, including the DTW (dynamic time warping) gesture recognition algorithm for a complete solution of an immersive gesture control system. The quantitative and qualitative evaluations of the integrated system were conducted with human subjects, and the results demonstrate that our gesture control with tactile feedback is a promising technology compared to a vision-based gesture control system that has typically no feedback for the user's gesture inputs. Our study provides researchers and designers with informative guidelines to develop more natural gesture control systems or immersive user interfaces with haptic feedback. PMID:25580901

  1. Depth camera-based 3D hand gesture controls with immersive tactile feedback for natural mid-air gesture interactions.

    PubMed

    Kim, Kwangtaek; Kim, Joongrock; Choi, Jaesung; Kim, Junghyun; Lee, Sangyoun

    2015-01-08

    Vision-based hand gesture interactions are natural and intuitive when interacting with computers, since we naturally exploit gestures to communicate with other people. However, it is agreed that users suffer from discomfort and fatigue when using gesture-controlled interfaces, due to the lack of physical feedback. To solve the problem, we propose a novel complete solution of a hand gesture control system employing immersive tactile feedback to the user's hand. For this goal, we first developed a fast and accurate hand-tracking algorithm with a Kinect sensor using the proposed MLBP (modified local binary pattern) that can efficiently analyze 3D shapes in depth images. The superiority of our tracking method was verified in terms of tracking accuracy and speed by comparing with existing methods, Natural Interaction Technology for End-user (NITE), 3D Hand Tracker and CamShift. As the second step, a new tactile feedback technology with a piezoelectric actuator has been developed and integrated into the developed hand tracking algorithm, including the DTW (dynamic time warping) gesture recognition algorithm for a complete solution of an immersive gesture control system. The quantitative and qualitative evaluations of the integrated system were conducted with human subjects, and the results demonstrate that our gesture control with tactile feedback is a promising technology compared to a vision-based gesture control system that has typically no feedback for the user's gesture inputs. Our study provides researchers and designers with informative guidelines to develop more natural gesture control systems or immersive user interfaces with haptic feedback.
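The DTW (dynamic time warping) recognizer named in the abstract above aligns a performed gesture trace to stored templates while tolerating timing differences. A minimal sketch of the classic DTW distance on 1D traces follows; the paper's actual feature representation is not specified here, so the scalar sequences are a simplifying assumption.

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    # d[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# A time-shifted copy of a gesture trace aligns perfectly (distance 0),
# while a genuinely different trace does not.
gesture   = [0, 1, 2, 3, 2, 1, 0]
shifted   = [0, 0, 1, 2, 3, 2, 1, 0]
different = [0, 3, 0, 3, 0, 3, 0]
print(dtw_distance(gesture, shifted), dtw_distance(gesture, different))
```

Classification then amounts to picking the template with the smallest DTW distance to the observed trace.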

  2. Indium tin oxide based chip for optical and electrochemical characterization of protein-cell interaction

    NASA Astrophysics Data System (ADS)

    Choi, Yong Hyun; Min, Junhong; Cho, Sungbo

    2015-06-01

Analysis of the interaction between proteins and cells is required for understanding cellular behaviour and response. In this article, we characterized the adhesion and growth of 293/GFP cells on a fetal bovine serum (FBS)-coated indium tin oxide (ITO) electrode. Using optical and electrochemical measurements, we were able to detect the adsorption of the protein on the surface of the ITO electrode as a function of the protein concentration in the immersing solution and the immersion time. An increase in the amount of adsorbed serum protein resulted in a decrease in the anodic peak current and an increase in the charge transfer resistance extracted from equivalent-circuit fitting analysis. More cells adhered and proliferated on the ITO electrode pre-immersed in FBS medium than on the bare electrode. The effect of the FBS on cell behavior was reflected in the impedance monitoring of cells at 21.5 kHz.

  3. Diffusive interaction of multiple surface nanobubbles: shrinkage, growth, and coarsening.

    PubMed

    Zhu, Xiaojue; Verzicco, Roberto; Zhang, Xuehua; Lohse, Detlef

    2018-03-14

    Surface nanobubbles are nanoscopic spherical-cap shaped gaseous domains on immersed substrates which are stable, even for days. After the stability of a single surface nanobubble has been theoretically explained, i.e. contact line pinning and gas oversaturation are required to stabilize it against diffusive dissolution [Lohse and Zhang, Phys. Rev. E, 2015, 91, 031003(R)], here we focus on the collective diffusive interaction of multiple nanobubbles. For that purpose we develop a finite difference scheme for the diffusion equation with the appropriate boundary conditions and with the immersed boundary method used to represent the growing or shrinking bubbles. After validation of the scheme against the exact results of Epstein and Plesset for a bulk bubble [J. Chem. Phys., 1950, 18, 1505] and of Lohse and Zhang for a surface bubble, the framework of these simulations is used to describe the coarsening process of competitively growing nanobubbles. The coarsening process for such diffusively interacting nanobubbles slows down with advancing time and increasing bubble distance. The present results for surface nanobubbles are also applicable for immersed surface nanodroplets, for which better controlled experimental results of the coarsening process exist.
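The finite-difference machinery the abstract describes can be sketched in one dimension: an explicit (FTCS) update of the diffusion equation governing the dissolved-gas concentration field. This is only an illustrative sketch under Dirichlet boundaries, not the authors' scheme, which works in higher dimensions with immersed-boundary conditions at the bubble surfaces.

```python
def diffuse_1d(c, D, dx, dt, steps):
    """Explicit (FTCS) finite-difference solution of the 1D diffusion
    equation dc/dt = D d2c/dx2 with fixed-value (Dirichlet) boundaries."""
    c = list(c)
    r = D * dt / dx**2  # stability requires r <= 0.5 for this scheme
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
        c = new
    return c

# A concentration spike between two zero-concentration boundaries spreads
# out and slowly loses mass through the boundaries.
c0 = [0.0] * 20 + [1.0] + [0.0] * 20
c = diffuse_1d(c0, D=1.0, dx=1.0, dt=0.25, steps=200)
print(f"mass remaining in domain: {sum(c):.3f} of {sum(c0):.3f}")
```

In the paper's setting, the boundary flux at each pinned bubble is what drives the competitive growth and shrinkage; the immersed boundary method imposes those conditions without a body-fitted grid.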

  4. An immersed-boundary method for flow–structure interaction in biological systems with application to phonation

    PubMed Central

    Luo, Haoxiang; Mittal, Rajat; Zheng, Xudong; Bielamowicz, Steven A.; Walsh, Raymond J.; Hahn, James K.

    2008-01-01

    A new numerical approach for modeling a class of flow–structure interaction problems typically encountered in biological systems is presented. In this approach, a previously developed, sharp-interface, immersed-boundary method for incompressible flows is used to model the fluid flow and a new, sharp-interface Cartesian grid, immersed boundary method is devised to solve the equations of linear viscoelasticity that governs the solid. The two solvers are coupled to model flow–structure interaction. This coupled solver has the advantage of simple grid generation and efficient computation on simple, single-block structured grids. The accuracy of the solid-mechanics solver is examined by applying it to a canonical problem. The solution methodology is then applied to the problem of laryngeal aerodynamics and vocal fold vibration during human phonation. This includes a three-dimensional eigen analysis for a multi-layered vocal fold prototype as well as two-dimensional, flow-induced vocal fold vibration in a modeled larynx. Several salient features of the aerodynamics as well as vocal-fold dynamics are presented. PMID:19936017

  5. Generating Contextual Descriptions of Virtual Reality (VR) Spaces

    NASA Astrophysics Data System (ADS)

    Olson, D. M.; Zaman, C. H.; Sutherland, A.

    2017-12-01

    Virtual reality holds great potential for science communication, education, and research. However, interfaces for manipulating data and environments in virtual worlds are limited and idiosyncratic. Furthermore, speech and vision are the primary modalities by which humans collect information about the world, but the linking of visual and natural language domains is a relatively new pursuit in computer vision. Machine learning techniques have been shown to be effective at image and speech classification, as well as at describing images with language (Karpathy 2016), but have not yet been used to describe potential actions. We propose a technique for creating a library of possible context-specific actions associated with 3D objects in immersive virtual worlds based on a novel dataset generated natively in virtual reality containing speech, image, gaze, and acceleration data. We will discuss the design and execution of a user study in virtual reality that enabled the collection and the development of this dataset. We will also discuss the development of a hybrid machine learning algorithm linking vision data with environmental affordances in natural language. Our findings demonstrate that it is possible to develop a model which can generate interpretable verbal descriptions of possible actions associated with recognized 3D objects within immersive VR environments. This suggests promising applications for more intuitive user interfaces through voice interaction within 3D environments. It also demonstrates the potential to apply vast bodies of embodied and semantic knowledge to enrich user interaction within VR environments. This technology would allow for applications such as expert knowledge annotation of 3D environments, complex verbal data querying and object manipulation in virtual spaces, and computer-generated, dynamic 3D object affordances and functionality during simulations.

  6. Media and Literacy: What's Good?

    ERIC Educational Resources Information Center

    Newkirk, Thomas

    2006-01-01

    For schools to effectively teach literacy, they should work with, not against, the cultural tools that students bring to school. Outside school, students' lives are immersed in visually mediated narratives. By tapping into the cultural, artistic, and linguistic resources of popular culture and multimedia, teachers can create more willing readers…

  7. 3D visualization of optical ray aberration and its broadcasting to smartphones by ray aberration generator

    NASA Astrophysics Data System (ADS)

    Hellman, Brandon; Bosset, Erica; Ender, Luke; Jafari, Naveed; McCann, Phillip; Nguyen, Chris; Summitt, Chris; Wang, Sunglin; Takashima, Yuzuru

    2017-11-01

    The ray formalism is critical to understanding light propagation, yet current pedagogy relies on inadequate 2D representations. We present a system in which real light rays are visualized through an optical system by using a collimated bundle of laser light and a fog chamber. Implementation for remote and immersive access is enabled by leveraging a commercially available 3D viewer and gesture-based remote controlling of the tool via bi-directional communication over the Internet.

  8. Analysis of isothermal and cooling rate dependent immersion freezing by a unifying stochastic ice nucleation model

    NASA Astrophysics Data System (ADS)

    Alpert, P. A.; Knopf, D. A.

    2015-05-01

    Immersion freezing is an important ice nucleation pathway involved in the formation of cirrus and mixed-phase clouds. Laboratory immersion freezing experiments are necessary to determine the range in temperature (T) and relative humidity (RH) at which ice nucleation occurs and to quantify the associated nucleation kinetics. Typically, isothermal (applying a constant temperature) and cooling rate dependent immersion freezing experiments are conducted. In these experiments it is usually assumed that the droplets containing ice nuclei (IN) all have the same IN surface area (ISA); however, the validity of this assumption, and the impact it may have on analysis and interpretation of the experimental data, is rarely questioned. A stochastic immersion freezing model based on first principles of statistics is presented, which accounts for variable ISA per droplet and uses physically observable parameters including the total number of droplets (Ntot) and the heterogeneous ice nucleation rate coefficient, Jhet(T). This model is applied to address whether (i) a time- and ISA-dependent stochastic immersion freezing process can explain laboratory immersion freezing data for different experimental methods and (ii) the assumption that all droplets contain identical ISA is a valid conjecture, with subsequent consequences for analysis and interpretation of immersion freezing. The simple stochastic model can reproduce the observed time and surface area dependence in immersion freezing experiments for a variety of methods, such as droplets on a cold stage exposed to air or surrounded by an oil matrix, wind and acoustically levitated droplets, droplets in a continuous flow diffusion chamber (CFDC), the Leipzig aerosol cloud interaction simulator (LACIS), and the aerosol interaction and dynamics in the atmosphere (AIDA) cloud chamber. Observed time dependent isothermal frozen fractions exhibiting non-exponential behavior with time can be readily explained by this model when considering variable ISA. An apparent cooling rate dependence of Jhet results from assuming identical ISA in each droplet; when ISA variability is accounted for, the cooling rate dependence of ice nucleation kinetics vanishes, as expected from classical nucleation theory. The model simulations allow for a quantitative experimental uncertainty analysis for the parameters Ntot, T, RH, and the ISA variability. In an idealized cloud parcel model applying variability in ISAs for each droplet, the model predicts enhanced immersion freezing temperatures and greater ice crystal production compared to the case of uniform ISA in each droplet. The implications of our results for experimental analysis and interpretation of the immersion freezing process are discussed.
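The core idea of such a stochastic model can be sketched compactly: each droplet freezes as a Poisson process with rate Jhet·A, so the ensemble unfrozen fraction is an ISA-weighted average of exponentials, and variability in A makes the decay non-exponential (Jensen's inequality). The parameter values and the lognormal ISA distribution below are assumptions for demonstration, not values from the paper:

```python
import math
import random

def unfrozen_fraction(t, jhet, areas):
    """Ensemble-average probability that a droplet is still liquid at time t
    for a stochastic (Poisson) freezing process with per-droplet rate jhet*A."""
    return sum(math.exp(-jhet * a * t) for a in areas) / len(areas)

random.seed(0)
# Illustrative kinetics: jhet in cm^-2 s^-1, droplet ISAs lognormal (cm^2)
jhet = 1.0e4
areas = [random.lognormvariate(math.log(1e-5), 1.0) for _ in range(10000)]
mean_area = sum(areas) / len(areas)

t = 30.0
f_variable = unfrozen_fraction(t, jhet, areas)  # variable ISA per droplet
f_uniform = math.exp(-jhet * mean_area * t)     # identical-ISA assumption
```

By Jensen's inequality `f_variable > f_uniform`: ISA variability slows the apparent freezing, which is exactly the non-exponential behavior the abstract describes.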

  9. GROTTO visualization for decision support

    NASA Astrophysics Data System (ADS)

    Lanzagorta, Marco O.; Kuo, Eddy; Uhlmann, Jeffrey K.

    1998-08-01

    In this paper we describe the GROTTO visualization projects being carried out at the Naval Research Laboratory. GROTTO is a CAVE-like system, that is, a surround-screen, surround-sound, immersive virtual reality device. We have explored GROTTO visualization in a variety of scientific areas including oceanography, meteorology, chemistry, biochemistry, computational fluid dynamics and space sciences. Research has emphasized the applications of GROTTO visualization for military, land- and sea-based command and control. Examples include the visualization of ocean current models for the simulation and study of mine drifting and, within our computational steering project, the effects of electromagnetic radiation on missile defense satellites. We discuss plans to apply this technology to decision support applications involving the deployment of autonomous vehicles into contaminated battlefield environments, firefighter control and hostage rescue operations.

  10. Wayfinding and Glaucoma: A Virtual Reality Experiment.

    PubMed

    Daga, Fábio B; Macagno, Eduardo; Stevenson, Cory; Elhosseiny, Ahmed; Diniz-Filho, Alberto; Boer, Erwin R; Schulze, Jürgen; Medeiros, Felipe A

    2017-07-01

    Wayfinding, the process of determining and following a route between an origin and a destination, is an integral part of everyday tasks. The purpose of this study was to investigate the impact of glaucomatous visual field loss on wayfinding behavior using an immersive virtual reality (VR) environment. This cross-sectional study included 31 glaucomatous patients and 20 healthy subjects without evidence of overall cognitive impairment. Wayfinding experiments were modeled after the Morris water maze navigation task and conducted in an immersive VR environment. Two rooms were built varying only in the complexity of the visual scene in order to promote allocentric-based (room A, with multiple visual cues) versus egocentric-based (room B, with single visual cue) spatial representations of the environment. Wayfinding tasks in each room consisted of revisiting previously visible targets that subsequently became invisible. For room A, glaucoma patients spent on average 35.0 seconds to perform the wayfinding task, whereas healthy subjects spent an average of 24.4 seconds (P = 0.001). For room B, no statistically significant difference was seen on average time to complete the task (26.2 seconds versus 23.4 seconds, respectively; P = 0.514). For room A, each 1-dB worse binocular mean sensitivity was associated with 3.4% (P = 0.001) increase in time to complete the task. Glaucoma patients performed significantly worse on allocentric-based wayfinding tasks conducted in a VR environment, suggesting visual field loss may affect the construction of spatial cognitive maps relevant to successful wayfinding. VR environments may represent a useful approach for assessing functional vision endpoints for clinical trials of emerging therapies in ophthalmology.

  11. Locomotive Recalibration and Prism Adaptation of Children and Teens in Immersive Virtual Environments.

    PubMed

    Adams, Haley; Narasimham, Gayathri; Rieser, John; Creem-Regehr, Sarah; Stefanucci, Jeanine; Bodenheimer, Bobby

    2018-04-01

    As virtual reality expands in popularity, an increasingly diverse audience is gaining exposure to immersive virtual environments (IVEs). A significant body of research has demonstrated how perception and action work in such environments, but most of this work has been done studying adults. Less is known about how physical and cognitive development affect perception and action in IVEs, particularly as applied to preteen and teenage children. Accordingly, in the current study we assess how preteens (children aged 8-12 years) and teenagers (children aged 15-18 years) respond to mismatches between their motor behavior and the visual information presented by an IVE. Over two experiments, we evaluate how these individuals recalibrate their actions across functionally distinct systems of movement. The first experiment analyzed forward walking recalibration after exposure to an IVE with either increased or decreased visual flow. Visual flow during normal bipedal locomotion was manipulated to be either twice or half as fast as the physical gait. The second experiment leveraged a prism throwing adaptation paradigm to test the effect of recalibration on throwing movement. In the first experiment, our results show no differences across age groups, although subjects generally experienced a post-exposure effect of shortened distance estimation after experiencing visually faster flow and longer distance estimation after experiencing visually slower flow. In the second experiment, subjects generally showed the typical prism adaptation behavior of a throwing after-effect error. The error lasted longer for preteens than older children. Our results have implications for the design of virtual systems with children as a target audience.

  12. Surface interactions between silica particles and water and organic solvents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douillard, J.M.; Elwafir, M.; Partyka, S.

    1994-04-01

    A silica sample has been studied by vapor adsorption and by microcalorimetric methods. The combination of these two methods in the case of water allows one to calculate all the thermodynamic terms related to adhesion on this silica. Adhesion between silica and miscellaneous solvents has been studied by immersion microcalorimetry. The silica is slightly hydrophobic, yet the enthalpy of immersion in water is the largest of all the solvents studied. A clear gradation of the enthalpies of immersion appears, due to the presence of delocalized electrons in the solvents studied.

  13. Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing

    PubMed Central

    Invitto, Sara; Faggiano, Chiara; Sammarco, Silvia; De Luca, Valerio; De Paolis, Lucio T.

    2016-01-01

    In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user’s hand and fingers, which are reproduced on a computer screen by the appropriate software applications. For our experiment, we employed a sample of 10 subjects matched by age and sex and chosen among university students. The subjects took part in motor imagery training and an immersive affordance condition (a virtual training with Leap Motion and a haptic training with real objects). After each training session the subjects performed a recognition task, in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training. During the Leap Motion session, latencies increased in the occipital lobes, which handle visual sensory processing; in contrast, latencies decreased in the frontal lobe, where the brain is mainly activated for attention and action planning. PMID:26999151

  14. "Eyes On The Solar System": A Real-time, 3D-Interactive Tool to Teach the Wonder of Planetary Science

    NASA Astrophysics Data System (ADS)

    Hussey, K. J.

    2011-10-01

    NASA's Jet Propulsion Laboratory is using videogame technology to immerse students, the general public and mission personnel in our solar system and beyond. "Eyes on the Solar System," a cross-platform, real-time, 3D-interactive application that runs inside a Web browser, was released worldwide late last year (solarsystem.nasa.gov/eyes). It gives users an extraordinary view of our solar system by virtually transporting them across space and time to make first-person observations of spacecraft and NASA/ESA missions in action. Key scientific results illustrated with video presentations and supporting imagery are embedded contextually into the solar system. The presentation will include a detailed demonstration of the software along with a description/discussion of how this technology can be adapted for education and public outreach, as well as a preview of coming attractions. This work is being conducted by the Visualization Technology Applications and Development Group at NASA's Jet Propulsion Laboratory, the same team responsible for "Eyes on the Earth 3D," which can be viewed at climate.nasa.gov/Eyes.html.

  15. Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing.

    PubMed

    Invitto, Sara; Faggiano, Chiara; Sammarco, Silvia; De Luca, Valerio; De Paolis, Lucio T

    2016-03-18

    In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user's hand and fingers, which are reproduced on a computer screen by the appropriate software applications. For our experiment, we employed a sample of 10 subjects matched by age and sex and chosen among university students. The subjects took part in motor imagery training and an immersive affordance condition (a virtual training with Leap Motion and a haptic training with real objects). After each training session the subjects performed a recognition task, in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training. During the Leap Motion session, latencies increased in the occipital lobes, which handle visual sensory processing; in contrast, latencies decreased in the frontal lobe, where the brain is mainly activated for attention and action planning.

  16. The relationship between resonance scattering and the formation of an acoustojet under the interaction of ultrasound with a dielectric sphere immersed in water

    NASA Astrophysics Data System (ADS)

    Minin, I. V.; Minin, O. V.; Tseplyaev, I. S.

    2017-08-01

    We demonstrate for the first time the influence of the main parameters of a dielectric spherical cavity immersed in water on the transformation of whispering gallery modes into an acoustojet (acoustic jet) under scattering of an acoustic plane wave. It is shown that the relative speed of sound in the material, the relative density of the material and the radius of the particle significantly affect the condition for the formation of WGM resonance. However, the most sensitive parameter is the relative speed of sound.

  17. Children's Perception of Gap Affordances: Bicycling Across Traffic-Filled Intersections in an Immersive Virtual Environment

    ERIC Educational Resources Information Center

    Plumert, Jodie M.; Kearney, Joseph K.; Cremer, James F.

    2004-01-01

    This study examined gap choices and crossing behavior in children and adults using an immersive, interactive bicycling simulator. Ten- and 12-year-olds and adults rode a bicycle mounted on a stationary trainer through a virtual environment consisting of a street with 6 intersections. Participants faced continuous cross traffic traveling at 25mph…

  18. A Fully Immersive Set-Up for Remote Interaction and Neurorehabilitation Based on Virtual Body Ownership

    PubMed Central

    Perez-Marcos, Daniel; Solazzi, Massimiliano; Steptoe, William; Oyekoya, Oyewole; Frisoli, Antonio; Weyrich, Tim; Steed, Anthony; Tecchia, Franco; Slater, Mel; Sanchez-Vives, Maria V.

    2012-01-01

    Although telerehabilitation systems represent one of the most technologically appealing clinical solutions for the immediate future, they still present limitations that prevent their standardization. Here we propose an integrated approach that includes three key and novel factors: (a) fully immersive virtual environments, including virtual body representation and ownership; (b) multimodal interaction with remote people and virtual objects including haptic interaction; and (c) a physical representation of the patient at the hospital through embodiment agents (e.g., as a physical robot). The importance of secure and rapid communication between the nodes is also stressed and an example implemented solution is described. Finally, we discuss the proposed approach with reference to the existing literature and systems. PMID:22787454

  19. A cognitive prosthesis for complex decision-making.

    PubMed

    Tremblay, Sébastien; Gagnon, Jean-François; Lafond, Daniel; Hodgetts, Helen M; Doiron, Maxime; Jeuniaux, Patrick P J M H

    2017-01-01

    While simple heuristics can be ecologically rational and effective in naturalistic decision making contexts, complex situations require analytical decision making strategies, hypothesis-testing and learning. Sub-optimal decision strategies - using simplified as opposed to analytic decision rules - have been reported in domains such as healthcare, military operational planning, and government policy making. We investigate the potential of a computational toolkit called "IMAGE" to improve decision-making by developing structural knowledge and increasing understanding of complex situations. IMAGE is tested within the context of a complex military convoy management task through (a) interactive simulations, and (b) visualization and knowledge representation capabilities. We assess the usefulness of two versions of IMAGE (desktop and immersive) compared to a baseline. Results suggest that the prosthesis helped analysts make better decisions, but failed to increase their structural knowledge of the situation once the cognitive prosthesis was removed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Interactive physically-based sound simulation

    NASA Astrophysics Data System (ADS)

    Raghuvanshi, Nikunj

    The realization of interactive, immersive virtual worlds requires the ability to present a realistic audio experience that convincingly complements their visual rendering. Physical simulation is a natural way to achieve such realism, enabling deeply immersive virtual worlds. However, physically-based sound simulation is very computationally expensive owing to the high-frequency, transient oscillations underlying audible sounds. The increasing computational power of desktop computers has served to reduce the gap between required and available computation, and it has become possible to bridge this gap further by using a combination of algorithmic improvements that exploit the physical as well as perceptual properties of audible sounds. My dissertation is a step in this direction: it concentrates on developing real-time techniques for both sub-problems of sound simulation, synthesis and propagation. Sound synthesis is concerned with generating the sounds produced by objects due to elastic surface vibrations upon interaction with the environment, such as collisions. I present novel techniques that exploit human auditory perception to simulate scenes with hundreds of sounding objects undergoing impact and rolling in real time. Sound propagation is the complementary problem of modeling the high-order scattering and diffraction of sound in an environment as it travels from source to listener. I discuss my work on a novel numerical acoustic simulator (ARD) that is a hundred times faster and consumes ten times less memory than a high-accuracy finite-difference technique, allowing acoustic simulations on previously intractable spaces, such as a cathedral, on a desktop computer. Lastly, I present my work on interactive sound propagation that leverages my ARD simulator to render the acoustics of arbitrary static scenes for multiple moving sources and listeners in real time, while accounting for scene-dependent effects such as low-pass filtering and smooth attenuation behind obstructions, reverberation, scattering from complex geometry, and sound focusing. This is enabled by a novel compact representation that takes a thousand times less memory than a direct scheme, thus reducing memory footprints to fit within available main memory. To the best of my knowledge, this is the only technique and system in existence to demonstrate auralization of physical wave-based effects in real time on large, complex 3D scenes.
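The sound-synthesis half described above is commonly realized as modal synthesis: each vibration mode contributes an exponentially damped sinusoid, and the impact sound is their sum. A minimal sketch (the mode frequencies, dampings, and amplitudes are invented for illustration, not taken from the dissertation):

```python
import math

def modal_impact_sound(modes, duration=1.0, rate=44100):
    """Synthesize an impact sound as a sum of damped sinusoids, one per
    vibration mode: s(t) = sum_i a_i * exp(-d_i * t) * sin(2*pi*f_i * t)."""
    n = int(duration * rate)
    return [sum(a * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
                for (f, d, a) in modes)
            for t in (i / rate for i in range(n))]

# Hypothetical modes (frequency in Hz, damping in 1/s, amplitude)
samples = modal_impact_sound([(440.0, 8.0, 1.0), (1170.0, 15.0, 0.4)],
                             duration=0.5)
```

Because damping grows with frequency in real materials, higher modes die out first, which is part of what makes such sounds perceptually convincing.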

  1. Real-time visualization of magnetic flux densities for transcranial magnetic stimulation on commodity and fully immersive VR systems

    NASA Astrophysics Data System (ADS)

    Kalivarapu, Vijay K.; Serrate, Ciro; Hadimani, Ravi L.

    2017-05-01

    Transcranial Magnetic Stimulation (TMS) is a non-invasive procedure that uses time-varying short pulses of magnetic fields to stimulate nerve cells in the brain. In this method, a magnetic field generator ("TMS coil") produces small electric fields in the region of the brain via electromagnetic induction. This technique can be used to excite or inhibit firing of neurons, which can then be used for treatment of various neurological disorders such as Parkinson's disease, stroke, migraine, and depression. It is however challenging to focus the induced electric field from TMS coils onto smaller regions of the brain. Since electric and magnetic fields are governed by the laws of electromagnetism, it is possible to numerically simulate and visualize these fields to accurately determine the site of maximum stimulation and also to develop TMS coils that can focus the fields on the targeted regions. However, current software to compute and visualize these fields is not real-time and works for only one position/orientation of the TMS coil, severely limiting its usage. This paper describes the development of an application that computes magnetic flux densities (h-fields) and visualizes their distribution for different TMS coil positions/orientations in real time using GPU shaders. The application is developed for desktop, commodity VR (HTC Vive), and fully immersive CAVE™ VR systems, for use by researchers, scientists, and medical professionals to quickly and effectively view the distribution of h-fields from MRI brain scans.
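The field computation underlying such tools is the Biot–Savart law. A minimal sketch for a single circular coil (the coil geometry, current, and discretization are illustrative assumptions, much simpler than a real TMS figure-eight coil), which can be checked against the closed-form on-axis field B_z = mu0*I*R^2 / (2*(R^2 + z^2)^(3/2)):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def loop_b_field(point, radius, current, segments=360):
    """Magnetic flux density of a circular current loop (z = 0 plane,
    centered at the origin) via the Biot-Savart law, summed over short
    straight wire segments: B = (mu0*I / 4*pi) * sum(dl x r / |r|^3)."""
    bx = by = bz = 0.0
    pref = MU0 * current / (4.0 * math.pi)
    for k in range(segments):
        th0 = 2.0 * math.pi * k / segments
        th1 = 2.0 * math.pi * (k + 1) / segments
        thm = 0.5 * (th0 + th1)
        # segment vector dl and vector r from segment midpoint to field point
        dlx = radius * (math.cos(th1) - math.cos(th0))
        dly = radius * (math.sin(th1) - math.sin(th0))
        rx = point[0] - radius * math.cos(thm)
        ry = point[1] - radius * math.sin(thm)
        rz = point[2]
        r3 = (rx * rx + ry * ry + rz * rz) ** 1.5
        bx += pref * (dly * rz) / r3            # (dl x r)_x with dl_z = 0
        by += pref * (-dlx * rz) / r3           # (dl x r)_y
        bz += pref * (dlx * ry - dly * rx) / r3  # (dl x r)_z
    return bx, by, bz

# On-axis check: 5 cm coil radius, 1 A current, field point 5 cm above center
bx, by, bz = loop_b_field((0.0, 0.0, 0.05), radius=0.05, current=1.0)
bz_exact = MU0 * 1.0 * 0.05 ** 2 / (2.0 * (0.05 ** 2 + 0.05 ** 2) ** 1.5)
```

Real-time GPU implementations evaluate essentially this sum per voxel in a shader; the physics is unchanged, only the parallelization differs.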

  2. Long-Term Audience Impacts of Live Fulldome Planetarium Lectures for Earth Science and Global Change Education

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Champlin, D. M.; Goldsworth, D. A.; Raynolds, R. G.; Dechesne, M.

    2011-09-01

    Digital Earth visualization technologies, from ArcGIS to Google Earth, have allowed for the integration of complex, disparate data sets to produce visually rich and compelling three-dimensional models of sub-surface and surface resource distribution patterns. The rendering of these models allows the public to quickly understand complicated geospatial relationships that would otherwise take much longer to explain using traditional media. At the Denver Museum of Nature & Science (DMNS), we have used such visualization technologies, including real-time virtual reality software running in the immersive digital "fulldome" Gates Planetarium, to impact the community through topical policy presentations. DMNS public lectures have covered regional issues like water resources, as well as global topics such as earthquakes, tsunamis, and resource depletion. The Gates Planetarium allows an audience to have an immersive experience, similar to the virtual reality "CAVE" environments found in academia, that would otherwise not be available to the general public. Public lectures in the dome allow audiences of over 100 people to comprehend dynamically changing geospatial datasets in an exciting and engaging fashion. Surveys and interviews show that these talks are effective in heightening visitor interest in the subjects weeks or months after the presentation. Many visitors take additional steps to learn more, while one was so inspired that she actively worked to bring the same programming to her children's school. These preliminary findings suggest that fulldome real-time visualizations can have a substantial long-term impact on an audience's engagement and interest in science topics.

  3. Virtual Reality to Train Diagnostic Skills in Eating Disorders. Comparison of two Low Cost Systems.

    PubMed

    Gutiérrez-Maldonado, José; Ferrer-García, Marta; Plasanjuanelo, Joana; Andrés-Pueyo, Antonio; Talarn-Caparrós, Antoni

    2015-01-01

    Enhancing the ability to perform differential diagnosis and psychopathological exploration is important for students who wish to work in the clinical field, as well as for professionals already working in this area. Virtual reality (VR) simulations can immerse students totally in educational experiences in a way that is not possible using other methods. Learning in a VR environment can also be more effective and motivating than usual classroom practices. Traditionally, immersion has been considered central to the quality of a VR system; immersive VR is considered a special and unique experience that cannot be achieved by three-dimensional (3D) interactions on desktop PCs. However, some authors have suggested that if the content design is emotionally engaging, immersive systems are not always necessary. The main purpose of this study is to compare the efficacy and usability of two low-cost VR systems, offering different levels of immersion, in order to develop the ability to perform diagnostic interviews in eating disorders by means of simulations of psychopathological explorations.

  4. A Quantitative Visual Mapping and Visualization Approach for Deep Ocean Floor Research

    NASA Astrophysics Data System (ADS)

    Hansteen, T. H.; Kwasnitschka, T.

    2013-12-01

    Geological fieldwork on the sea floor is still impaired by our inability to resolve features on a sub-meter scale resolution in a quantifiable reference frame and over an area large enough to reveal the context of local observations. In order to overcome these issues, we have developed an integrated workflow of visual mapping techniques leading to georeferenced data sets which we examine using state-of-the-art visualization technology to recreate an effective working style of field geology. We demonstrate a microbathymetrical workflow, which is based on photogrammetric reconstruction of ROV imagery referenced to the acoustic vehicle track. The advantage over established acoustical systems lies in the true three-dimensionality of the data as opposed to the perspective projection from above produced by downward looking mapping methods. A full color texture mosaic derived from the imagery allows studies at resolutions beyond the resolved geometry (usually one order of magnitude below the image resolution) while color gives additional clues, which can only be partly resolved in acoustic backscatter. The creation of a three-dimensional model changes the working style from the temporal domain of a video recording back to the spatial domain of a map. We examine these datasets using a custom developed immersive virtual visualization environment. The ARENA (Artificial Research Environment for Networked Analysis) features a (lower) hemispherical screen at a diameter of six meters, accommodating up to four scientists at once, thus providing the ability to browse data interactively among a group of researchers. This environment facilitates (1) the development of spatial understanding analogous to on-land outcrop studies, (2) quantitative observations of seafloor morphology and physical parameters of its deposits, and (3) more effective formulation and communication of working hypotheses.

  5. Wave interactions with multiple semi-immersed Jarlan-type perforated breakwaters

    NASA Astrophysics Data System (ADS)

    Elbisy, Moussa S.

    2017-06-01

    This study examines wave interactions with multiple semi-immersed Jarlan-type perforated breakwaters. A numerical model based on linear wave theory and an eigenfunction expansion method has been developed to study the hydrodynamic characteristics of breakwaters. The numerical results show a good agreement with previous analytical results and experimental data for limiting cases of double partially immersed impermeable walls and double and triple Jarlan-type breakwaters. The wave transmission coefficient CT, reflection coefficient CR, and energy dissipation coefficient CE, together with the horizontal wave forces exerted on the front and rear walls, are examined. The results show that CR reaches its maximum when B/L = 0.46n and its minimum when B/L = 0.46n + 0.24 (n = 0, 1, 2, ...). An economical triple semi-immersed Jarlan-type perforated breakwater can be designed with B/L = 0.25 and CR and CT ranging from 0.25 to 0.32 by choosing a relative draft d/h of 0.35 and a permeability parameter of the perforated front walls of 0.5 for an incident wave number kh close to 2.0. Triple semi-immersed Jarlan-type perforated breakwaters, with significantly reduced CR, enhance the structure's wave absorption ability and lead to smaller wave forces than the double configuration. The proposed model may be used to predict the response of a structure in the preliminary design stage for practical engineering.
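Under one common convention in linear wave theory (the paper's exact definition of CE may differ), the three coefficients are linked by energy conservation: the incident wave energy splits into reflected (CR²), transmitted (CT²), and dissipated fractions. A minimal sketch of that balance:

```python
def dissipation_coefficient(cr, ct):
    """Energy balance for linear waves at a breakwater: incident energy splits
    into reflected (CR^2), transmitted (CT^2), and dissipated parts, so under
    this convention CE = 1 - CR**2 - CT**2."""
    ce = 1.0 - cr ** 2 - ct ** 2
    if ce < 0.0:
        raise ValueError("CR and CT together exceed the incident wave energy")
    return ce

# Illustrative values at the low end of the economical design range above
ce = dissipation_coefficient(0.25, 0.25)
```

Because CR and CT enter quadratically, even moderate reflection and transmission coefficients leave most of the incident energy (here 87.5%) to be dissipated by the perforated walls.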

  6. Immersive Environment Technologies for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Wright, John R.; Hartman, Frank

    2000-01-01

    JPL's charter includes the unmanned exploration of the Solar System. One of the tools for exploring other planets is the rover, as exemplified by Sojourner on the Mars Pathfinder mission. The light-speed turnaround time between Earth and the outer planets precludes the use of teleoperated rovers, so autonomous operations are built into the current and upcoming generations of devices. As the level of autonomy increases, the mode of operations shifts from low-level specification of activities to a higher-level specification of goals. To support this higher-level activity, it is necessary to provide the operator with an effective understanding of the in-situ environment and also the tools needed to specify the higher-level goals. Immersive environments provide the needed sense of presence to achieve this goal. Use of immersive environments at JPL has two main thrusts that will be discussed in this talk. One is the generation of 3D models of the in-situ environment, in particular the merging of models from different sensors, different modes (orbital, descent, and lander), and even different missions. The other is the use of various tools to visualize the environment within which the rover will be operating, to maximize the understanding by the operator. A suite of tools is under development that provides an integrated view into the environment while offering a variety of visualization modes. This allows the operator to smoothly switch from one mode to another depending on the information and presentation desired.

  7. Analysis of isothermal and cooling-rate-dependent immersion freezing by a unifying stochastic ice nucleation model

    NASA Astrophysics Data System (ADS)

    Alpert, Peter A.; Knopf, Daniel A.

    2016-02-01

    Immersion freezing is an important ice nucleation pathway involved in the formation of cirrus and mixed-phase clouds. Laboratory immersion freezing experiments are necessary to determine the range in temperature, T, and relative humidity, RH, at which ice nucleation occurs and to quantify the associated nucleation kinetics. Typically, isothermal (constant-temperature) and cooling-rate-dependent immersion freezing experiments are conducted. In these experiments it is usually assumed that the droplets containing ice nucleating particles (INPs) all have the same INP surface area (ISA); however, the validity of this assumption, and the impact it may have on analysis and interpretation of the experimental data, is rarely questioned. Descriptions of ice active sites and of contact-angle variability have been successfully formulated to describe experimental ice nucleation data in previous research; here, however, we consider the ability of a stochastic freezing model founded on classical nucleation theory to reproduce previous results and to explain experimental uncertainties and data scatter. A stochastic immersion freezing model based on first principles of statistics is presented, which accounts for variable ISA per droplet and uses parameters including the total number of droplets, Ntot, and the heterogeneous ice nucleation rate coefficient, Jhet(T). This model is applied to address whether (i) a time- and ISA-dependent stochastic immersion freezing process can explain laboratory immersion freezing data for different experimental methods and (ii) the assumption that all droplets contain identical ISA is a valid conjecture, with subsequent consequences for analysis and interpretation of immersion freezing.
The simple stochastic model can reproduce the observed time and surface-area dependence in immersion freezing experiments for a variety of methods, such as droplets on a cold stage exposed to air or surrounded by an oil matrix, wind- and acoustically levitated droplets, droplets in a continuous-flow diffusion chamber (CFDC), the Leipzig aerosol cloud interaction simulator (LACIS), and the aerosol interaction and dynamics in the atmosphere (AIDA) cloud chamber. Observed time-dependent isothermal frozen fractions exhibiting non-exponential behavior are readily explained by this model when varying ISA is taken into account. An apparent cooling-rate dependence of Jhet arises when identical ISA is assumed for each droplet; when ISA variability is accounted for, the cooling-rate dependence of the ice nucleation kinetics vanishes, as expected from classical nucleation theory. The model simulations allow a quantitative experimental uncertainty analysis for the parameters Ntot, T, RH, and the ISA variability. The implications of our results for experimental analysis and interpretation of the immersion freezing process are discussed.
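As a concrete illustration of the abstract's premise: under a purely stochastic, CNT-based process, a droplet with ice-nucleating surface area A survives unfrozen for time t with probability exp(-Jhet·A·t), so variable ISA alone makes the ensemble frozen fraction non-exponential. A minimal sketch (the Jhet value and the lognormal ISA distribution are illustrative assumptions, not the paper's fit):

```python
import math
import random

def frozen_fraction(t, areas, j_het):
    """Ensemble frozen fraction at time t (s) for droplets whose ice-nucleating
    surface areas are listed in `areas` (cm^2), with a heterogeneous nucleation
    rate coefficient j_het (cm^-2 s^-1): each droplet stays unfrozen with
    probability exp(-j_het * A * t)."""
    unfrozen = sum(math.exp(-j_het * a * t) for a in areas) / len(areas)
    return 1.0 - unfrozen

random.seed(0)
identical = [1e-5] * 1000                      # every droplet has the same ISA
variable = [random.lognormvariate(math.log(1e-5), 1.5) for _ in range(1000)]

j_het = 1.0e4  # illustrative value only
for t in (1.0, 10.0, 100.0):
    # identical ISA decays exponentially; variable ISA does not
    print(t, frozen_fraction(t, identical, j_het),
          frozen_fraction(t, variable, j_het))
```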

  8. Enhancing the immersive reality of virtual simulators for easily accessible laparoscopic surgical training

    NASA Astrophysics Data System (ADS)

    McKenna, Kyra; McMenemy, Karen; Ferguson, R. S.; Dick, Alistair; Potts, Stephen

    2008-02-01

    Computer simulators are a popular method of training surgeons in the techniques of laparoscopy. However, for the trainee to feel fully immersed in the process, the graphical display should be as lifelike as possible, and two-handed force-feedback interaction is required. This paper reports on how a compelling immersive experience can be delivered at low cost using commonly available hardware components. Three specific themes are brought together. Firstly, programmable shaders executing on a standard PC graphics adapter deliver the appearance of anatomical realism, including effects of translucent tissue surfaces, semi-transparent membranes, multilayer image texturing, and real-time shadowing. Secondly, relatively inexpensive 'off the shelf' force-feedback devices contribute to a holistic immersive experience. The final element described is the custom software that brings these together with hierarchically organized and optimized polygonal models of abdominal anatomy.

  9. Multi-arm multilateral haptics-based immersive tele-robotic system (HITS) for improvised explosive device disposal

    NASA Astrophysics Data System (ADS)

    Erickson, David; Lacheray, Hervé; Lai, Gilbert; Haddadi, Amir

    2014-06-01

    This paper presents the latest advancements of the Haptics-based Immersive Tele-robotic System (HITS) project, a next generation Improvised Explosive Device (IED) disposal (IEDD) robotic interface containing an immersive telepresence environment for a remotely-controlled three-articulated-robotic-arm system. While the haptic feedback enhances the operator's perception of the remote environment, a third teleoperated dexterous arm, equipped with multiple vision sensors and cameras, provides stereo vision with proper visual cues, and a 3D photo-realistic model of the potential IED. This decentralized system combines various capabilities including stable and scaled motion, singularity avoidance, cross-coupled hybrid control, active collision detection and avoidance, compliance control and constrained motion to provide a safe and intuitive control environment for the operators. Experimental results and validation of the current system are presented through various essential IEDD tasks. This project demonstrates that a two-armed anthropomorphic Explosive Ordnance Disposal (EOD) robot interface can achieve complex neutralization techniques against realistic IEDs without the operator approaching at any time.

  10. iVFTs - immersive virtual field trips for interactive learning about Earth's environment.

    NASA Astrophysics Data System (ADS)

    Bruce, G.; Anbar, A. D.; Semken, S. C.; Summons, R. E.; Oliver, C.; Buxner, S.

    2014-12-01

    Innovations in immersive interactive technologies are changing the way students explore Earth and its environment. State-of-the-art hardware has given developers the tools needed to capture high-resolution spherical content, 360° panoramic video, giga-pixel imagery, and unique viewpoints via unmanned aerial vehicles as they explore remote and physically challenging regions of our planet. Advanced software enables integration of these data into seamless, dynamic, immersive, interactive, content-rich, and learner-driven virtual field explorations, experienced online via HTML5. These surpass conventional online exercises that use 2-D static imagery and enable the student to engage in these virtual environments that are more like games than like lectures. Grounded in the active learning of exploration, inquiry, and application of knowledge as it is acquired, users interact non-linearly in conjunction with an intelligent tutoring system (ITS). The integration of this system allows the educational experience to be adapted to each individual student as they interact within the program. Such explorations, which we term "immersive virtual field trips" (iVFTs), are being integrated into cyber-learning allowing science teachers to take students to scientifically significant but inaccessible environments. Our team and collaborators are producing a diverse suite of freely accessible, iVFTs to teach key concepts in geology, astrobiology, ecology, and anthropology. Topics include Early Life, Biodiversity, Impact craters, Photosynthesis, Geologic Time, Stratigraphy, Tectonics, Volcanism, Surface Processes, The Rise of Oxygen, Origin of Water, Early Civilizations, Early Multicellular Organisms, and Bioarcheology. 
These diverse topics allow students to experience field sites all over the world, including Grand Canyon (USA), Flinders Ranges (Australia), Shark Bay (Australia), rainforests (Panama), Teotihuacan (Mexico), Upheaval Dome (USA), Pilbara (Australia), Mid-Atlantic Ridge (Iceland), and Mauna Kea (Hawaii). iVFTs are being beta-tested and used at ASU in several large-enrollment courses to assess their usability and effectiveness in meeting specific learning objectives. We invite geoscience educators to partake of this resource and find new applications in their own teaching.

  11. Sign Language Conversational Interaction between Chimpanzees.

    ERIC Educational Resources Information Center

    Fouts, Roger S.; And Others

    1984-01-01

    Systematic sampling was done of signing between five home-reared chimpanzees who had had 4-7 years of complete immersion in integrating their signing interaction into their nonverbal communication. Eighty-eight percent of all signs reported fell into the social categories of reassurance, social interaction, and play. (SL)

  12. Constructing Image-Based Culture Definitions Using Metaphors: Impact of a Cross-Cultural Immersive Experience

    ERIC Educational Resources Information Center

    Tuleja, Elizabeth A.

    2017-01-01

    This study provides an approach to teaching and learning in the international business (IB) classroom about cultural values, beliefs, attitudes, and norms through the study of cultural metaphor. The methodology is based on established qualitative methods by using participants' visual pictures and written explanations--representative of their…

  13. The Flatworld Simulation Control Architecture (FSCA): A Framework for Scalable Immersive Visualization Systems

    DTIC Science & Technology

    2004-12-01

    handling using the X10 home automation protocol. Each 3D graphics client renders its scene according to an assigned virtual camera position. By having...control protocol. DMX is a versatile and robust framework which overcomes limitations of the X10 home automation protocol which we are currently using

  14. Compensating Scientism through "The Black Hole."

    ERIC Educational Resources Information Center

    Roth, Lane

    The focal image of the film "The Black Hole" functions as a visual metaphor for the sacred, order, unity, and eternal time. The black hole is a symbol that unites the antinomic pairs of conscious/unconscious, water/fire, immersion/emersion, death/rebirth, and hell/heaven. The black hole is further associated with the quest for…

  15. Evidence of Blocking with Geometric Cues in a Virtual Watermaze

    ERIC Educational Resources Information Center

    Redhead, Edward S.; Hamilton, Derek A.

    2009-01-01

    Three computer based experiments, testing human participants in a non-immersive virtual watermaze task, used a blocking design to assess whether two sets of geometric cues would compete in a manner described by associative models of learning. In stage 1, participants were required to discriminate between visually distinct platforms. In stage 2,…

  16. Visual Perspectives within Educational Computer Games: Effects on Presence and Flow within Virtual Immersive Learning Environments

    ERIC Educational Resources Information Center

    Scoresby, Jon; Shelton, Brett E.

    2011-01-01

    The mis-categorizing of cognitive states involved in learning within virtual environments has complicated instructional technology research. Further, most educational computer game research does not account for how learning activity is influenced by factors of game content and differences in viewing perspectives. This study is a qualitative…

  17. Depletion interaction between colloids mediated by an athermal polymer blend

    NASA Astrophysics Data System (ADS)

    Chervanyov, A. I.

    2018-03-01

    We calculate the immersion energy of a colloid and the potential of the depletion interaction (DI) acting between colloids immersed in an athermal polymer blend. The developed theory has no limitations with respect to the polymer-to-colloid size ratios and polymer densities, covering, in particular, dense polymer blends. We demonstrate that in addition to the standard compressibility-induced mechanism of the DI, there exists a mechanism relying on the correlations between compositional fluctuations specific to polymer blends. We quantitatively investigate this "compositional" mechanism of the DI and demonstrate that it contributes significantly to the effective force acting between colloids. Further, we show that the relative significance of the contributions to the colloid immersion energy and the depletion potential caused by the above compositional mechanism depends strongly on the mass fractions of the polymer species and their size ratio. We find that these contributions strongly affect the range of the DI, causing a significant increase in the absolute value of the second virial coefficient of the effective potential acting between colloids.
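The second virial coefficient mentioned above summarizes the net strength and range of an effective pair potential via B2 = -2π ∫ (e^(-U(r)/kT) - 1) r² dr. A minimal numerical sketch of that textbook relation (generic statistical mechanics, not the paper's theory; the function names are mine):

```python
import math

def second_virial(u, r_max, n=100000):
    """Riemann-sum estimate of the second virial coefficient
    B2 = -2*pi * integral_0^r_max (exp(-u(r)) - 1) * r^2 dr
    for a pair potential u(r) expressed in units of kT."""
    dr = r_max / n
    total = 0.0
    for k in range(1, n):
        r = k * dr
        total += (math.exp(-u(r)) - 1.0) * r * r
    return -2.0 * math.pi * total * dr

# Sanity check: hard spheres of diameter 1 give B2 = 2*pi/3 ~ 2.094
hard_sphere = lambda r: float('inf') if r < 1.0 else 0.0
print(round(second_virial(hard_sphere, 5.0), 3))  # 2.094
```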

  18. IB2d: a Python and MATLAB implementation of the immersed boundary method.

    PubMed

    Battista, Nicholas A; Strickland, W Christopher; Miller, Laura A

    2017-03-29

    The development of fluid-structure interaction (FSI) software involves trade-offs between ease of use, generality, performance, and cost. Typically, there are steep learning curves when using low-level software to model the interaction of an elastic structure immersed in a uniform-density fluid. Many existing codes are not publicly available, and the commercial software that exists usually requires expensive licenses and may not be as robust, or allow the flexibility, that in-house codes can provide. We present an open-source immersed boundary software package, IB2d, with full implementations in both MATLAB and Python, that is capable of running a vast range of biomechanics models and is accessible to scientists who have experience in high-level programming environments. IB2d contains multiple options for constructing the material properties of the fiber structure, as well as the advection-diffusion of a chemical gradient, muscle mechanics models, and artificial forcing to drive boundaries with a preferred motion.
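The immersed boundary formulation that IB2d implements couples a Lagrangian fiber structure to an Eulerian fluid grid by spreading boundary forces through a regularized delta function. A generic sketch of that force-spreading step (this is not IB2d's API; the periodic wrap and the Peskin 4-point kernel are standard choices, but all names here are illustrative):

```python
import math

def peskin_delta(r):
    """Peskin's 4-point regularized delta function (1D factor)."""
    r = abs(r)
    if r < 1.0:
        return (3 - 2 * r + math.sqrt(1 + 4 * r - 4 * r * r)) / 8.0
    if r < 2.0:
        return (5 - 2 * r - math.sqrt(-7 + 12 * r - 4 * r * r)) / 8.0
    return 0.0

def spread_force(points, forces, nx, ny, h):
    """Spread scalar point forces at Lagrangian positions onto an nx-by-ny
    periodic grid with spacing h; returns the Eulerian force density field."""
    f_grid = [[0.0] * nx for _ in range(ny)]
    for (x, y), f in zip(points, forces):
        i0, j0 = int(x / h), int(y / h)
        for j in range(j0 - 2, j0 + 3):
            for i in range(i0 - 2, i0 + 3):
                w = peskin_delta((x - i * h) / h) * peskin_delta((y - j * h) / h)
                f_grid[j % ny][i % nx] += f * w / (h * h)  # periodic wrap
    return f_grid
```

The same kernel, evaluated at the fluid grid points around each boundary node, is also used for the reverse interpolation of fluid velocity back onto the structure.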

  19. Live-cell imaging of Salmonella Typhimurium interaction with zebrafish larvae after injection and immersion delivery methods.

    PubMed

    Varas, Macarena; Fariña, Alonso; Díaz-Pascual, Francisco; Ortíz-Severín, Javiera; Marcoleta, Andrés E; Allende, Miguel L; Santiviago, Carlos A; Chávez, Francisco P

    2017-04-01

    The zebrafish model has been used to determine the role of vertebrate innate immunity during bacterial infections. Here, we compare the in vivo immune response induced by GFP-tagged Salmonella Typhimurium inoculated by immersion and microinjection in transgenic zebrafish larvae. Our novel infection protocols in zebrafish allow live-cell imaging of Salmonella colonization. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Harnessing Neuroplasticity to Promote Rehabilitation: CI Therapy for TBI

    DTIC Science & Technology

    2016-10-01

    scheduled plus 33 to be enrolled, because we assume that the proportion of withdrawals will be the same as experienced to date, i.e., 24%. This plan will...period? Victor Mark, Investigator Interactive Immersive Virtual Reality Walking for SCI Neuropathic Pain (Trost) 0.24 calendar months Kim Cerise...Direct Costs: $149,999 This project designs and tests an immersive virtual reality treatment method to control neuropathic pain following traumatic spinal

  1. An immersive surgery training system with live streaming capability.

    PubMed

    Yang, Yang; Guo, Xinqing; Yu, Zhan; Steiner, Karl V; Barner, Kenneth E; Bauer, Thomas L; Yu, Jingyi

    2014-01-01

    Providing real-time, interactive, immersive surgical training has been a key research area in telemedicine. Earlier approaches mainly adopted videotaped training that can only show imagery from a fixed viewpoint. Recent advances in commodity 3D imaging have enabled a new paradigm for immersive surgical training by acquiring nearly complete 3D reconstructions of actual surgical procedures. However, unlike 2D videotaping, which can easily stream data in real time, 3D-imaging-based solutions have so far required pre-capturing and processing the data; surgical training using the data has to be conducted offline after the acquisition. In this paper, we present a new real-time immersive 3D surgical training system. Our solution builds upon the recent multi-Kinect-based surgical training system [1] that can acquire and display high-fidelity 3D surgical procedures using only a small number of Microsoft Kinect sensors. On top of that system we build a client-server model for real-time streaming. On the server front, we efficiently fuse the multiple Kinect streams acquired from different viewpoints, then compress and stream the data to the client. On the client front, we build an interactive space-time navigator to allow remote users (e.g., trainees) to witness the surgical procedure in real time as if they were present in the room.

  2. Silver nanoparticles enhance wound healing in zebrafish (Danio rerio).

    PubMed

    Seo, Seung Beom; Dananjaya, S H S; Nikapitiya, Chamilani; Park, Bae Keun; Gooneratne, Ravi; Kim, Tae-Yoon; Lee, Jehee; Kim, Cheol-Hee; De Zoysa, Mahanama

    2017-09-01

    Silver nanoparticles (AgNPs) were successfully synthesized by a chemical reduction method, physico-chemically characterized, and their effect on wound-healing activity in zebrafish was investigated. The prepared AgNPs were circular in shape and water soluble, with an average diameter of 72.66 nm and a zeta potential of -0.45 mV. Following the creation of a laser skin wound on zebrafish, the effect of AgNPs on wound-healing activity was tested by two methods: direct skin application (2 μg/wound) and immersion in a solution of AgNPs in water (50 μg/L). The zebrafish were followed for 20 days post-wounding (dpw) by visual observation of wound size, calculation of the wound healing percentage (WHP), and histological examination. Visually, both the direct-application and immersion AgNP treatments displayed clear and faster wound closure at 5, 10, and 20 dpw compared with the controls, which was confirmed by 5 dpw histology data. At 5 dpw, WHP was highest in the AgNP immersion group (36.6%) > AgNP direct-application group (23.7%) > controls (18.2%), showing that WHP was most effective in fish immersed in AgNP solution. In general, exposure to AgNPs induced expression of selected wound-healing-related genes, namely transforming growth factor (TGF-β), matrix metalloproteinases (MMP)-9 and -13, pro-inflammatory cytokines (IL-1β and TNF-α), and antioxidant enzymes (superoxide dismutase and catalase), with differential expression observed at 12 and 24 h relative to the control; but the results were not consistently significant, and many either returned to basal levels or were downregulated at 5 dpw in the wounded muscle. These results suggest that AgNPs are effective in accelerating wound healing and altered the expression of some wound-healing-related genes. However, the detailed mechanism of enhanced wound healing in fish remains to be investigated. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Visualizing preparation using asymmetrical choline-like ionic liquids for scanning electron microscope observation of non-conductive biological samples.

    PubMed

    Abe, Shigeaki; Hyono, Atsushi; Kawai, Koji; Yonezawa, Tetsu

    2014-03-01

    In this study, we investigated a conductivity preparation for scanning electron microscope (SEM) observation that uses novel asymmetrical choline-type room-temperature ionic liquids (RTILs). By immersion in an RTIL solution alone, clear SEM images of several types of biological samples were successfully obtained. In addition, we could visualize protozoans using RTILs without any dilution. These results suggest that the asymmetrical choline-type RTILs used in this study are suitable for visualizing biological samples by SEM. Treatment without the need for dilution obviates adjusting the RTIL concentration and provides a rapid and easy conductivity treatment for insulating samples.

  4. Exploring Design Requirements for Repurposing Dental Virtual Patients From the Web to Second Life: A Focus Group Study

    PubMed Central

    Antoniou, Panagiotis E; Athanasopoulou, Christina A; Dafli, Eleni

    2014-01-01

    Background Since their inception, virtual patients have provided health care educators with a way to engage learners in an experience simulating the clinician’s environment without danger to learners and patients. This has led this learning modality to be accepted as an essential component of medical education. With the advent of the visually and audio-rich 3-dimensional multi-user virtual environment (MUVE), a new deployment platform has emerged for educational content. Immersive, highly interactive, multimedia-rich, MUVEs that seamlessly foster collaboration provide a new hotbed for the deployment of medical education content. Objective This work aims to assess the suitability of the Second Life MUVE as a virtual patient deployment platform for undergraduate dental education, and to explore the requirements and specifications needed to meaningfully repurpose Web-based virtual patients in MUVEs. Methods Through the scripting capabilities and available art assets in Second Life, we repurposed an existing Web-based periodontology virtual patient into Second Life. Through a series of point-and-click interactions and multiple-choice queries, the user experienced a specific periodontology case and was asked to provide the optimal responses for each of the challenges of the case. A focus group of 9 undergraduate dentistry students experienced both the Web-based and the Second Life version of this virtual patient. The group convened 3 times and discussed relevant issues such as the group’s computer literacy, the assessment of Second Life as a virtual patient deployment platform, and compared the Web-based and MUVE-deployed virtual patients. Results A comparison between the Web-based and the Second Life virtual patient revealed the inherent advantages of the more experiential and immersive Second Life virtual environment. However, several challenges for the successful repurposing of virtual patients from the Web to the MUVE were identified. 
The identified challenges for repurposing of Web virtual patients to the MUVE platform from the focus group study were (1) increased case complexity to facilitate the user’s gaming preconception in a MUVE, (2) necessity to decrease textual narration and provide the pertinent information in a more immersive sensory way, and (3) requirement to allow the user to actuate the solutions of problems instead of describing them through narration. Conclusions For a successful systematic repurposing effort of virtual patients to MUVEs such as Second Life, the best practices of experiential and immersive game design should be organically incorporated in the repurposing workflow (automated or not). These findings are pivotal in an era in which open educational content is transferred to and shared among users, learners, and educators of various open repositories/environments. PMID:24927470

  5. A Theory of Immersion Freezing

    NASA Technical Reports Server (NTRS)

    Barahona, Donifan

    2017-01-01

    Immersion freezing is likely involved in the initiation of precipitation and determines to a large extent the phase partitioning in convective clouds. Theoretical models commonly used to describe immersion freezing in atmospheric models are based on classical nucleation theory, which, however, neglects important interactions near the immersed particle that may affect nucleation rates. This work introduces a new theory of immersion freezing based on two premises. First, immersion ice nucleation is mediated by the modification of the properties of water near the particle-liquid interface, rather than by the geometry of the ice germ. Second, the same mechanism that leads to the decrease in the work of germ formation also decreases the mobility of water molecules near the immersed particle. These two premises allow general thermodynamic constraints to be established on the ice nucleation rate. Analysis of the new theory shows that active sites likely trigger ice nucleation, but they control neither the overall nucleation rate nor the probability of freezing. It also suggests that materials with different ice nucleation efficiency may exhibit similar freezing temperatures under similar conditions but differ in their sensitivity to particle surface area and cooling rate. Predicted nucleation rates show good agreement with observations for a diverse set of materials including dust, black carbon, and bacterial ice nucleating particles. The application of the new theory within the NASA Global Earth System Model (GEOS-5) is also discussed.

  6. Towards understanding addiction factors of mobile devices: An eye tracking study on effect of screen size.

    PubMed

    Wibirama, Sunu; Nugroho, Hanung A

    2017-07-01

    Mobile device addiction has been an important research topic in cognitive science, mental health, and human-machine interaction. Previous works observed mobile device addiction by logging mobile device activity. Although immersion has been linked as a significant predictor of video game addiction, investigation of the addiction factors of mobile devices with behavioral measurement has never been done before. In this research, we demonstrate the use of eye tracking to observe the effect of screen size on the experience of immersion. We compared subjective judgment with eye movement analysis. Non-parametric analysis of the immersion scores shows that screen size affects the experience of immersion (p < 0.05). Furthermore, our experimental results suggest that fixational eye movements may be used as an indicator in future investigations of mobile device addiction. Our experimental results are also useful for developing a guideline as well as an intervention strategy to deal with smartphone addiction.
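The abstract's non-parametric comparison of immersion scores between screen sizes is the kind of analysis a rank-based test such as Mann-Whitney performs; a from-scratch sketch (the test choice and the score values are illustrative assumptions, not the paper's data):

```python
import math

def mann_whitney_u(x, y):
    """U statistic: count of pairs (xi, yj) with xi > yj; ties count 0.5."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

def z_score(u, m, n):
    """Normal approximation for the U statistic (no tie correction)."""
    mean = m * n / 2.0
    sd = math.sqrt(m * n * (m + n + 1) / 12.0)
    return (u - mean) / sd

# Illustrative (made-up) immersion scores for two screen sizes
small = [4.1, 3.8, 4.5, 3.2, 4.0]
large = [5.2, 4.9, 5.6, 4.4, 5.1]
u = mann_whitney_u(large, small)
print(u, round(z_score(u, len(large), len(small)), 2))
```

For samples this small an exact U table rather than the normal approximation would normally be consulted; the sketch only shows the statistic's construction.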

  7. The Flostation - an Immersive Cyberspace System

    NASA Technical Reports Server (NTRS)

    Park, Brian

    2006-01-01

    A flostation is a computer-controlled apparatus that, along with one or more computers and other computer-controlled equipment, is part of an immersive cyberspace system. The system is said to be immersive in two senses of the word: (1) it supports the body in a modified form of the neutral posture experienced in zero gravity, and (2) it is equipped with computer-controlled display equipment that helps to give the occupant of the chair a feeling of immersion in an environment that the system is designed to simulate. Neutral immersion was conceived during the Gemini program as a means of training astronauts for working in a zero-gravity environment. Current derivatives include neutral-buoyancy tanks and the KC-135 airplane, each of which mimics the effects of zero gravity. While these have performed well in simulating the shorter-duration flights typical of the space program to date, a training device that can take astronauts to the next level will be needed for simulating longer-duration flights such as those aboard the International Space Station. The flostation is expected to satisfy this need. The flostation could also be adapted and replicated for use in commercial ventures ranging from home entertainment to medical treatment. The use of neutral immersion in the flostation enables the occupant to recline in an optimal posture of rest and meditation. This posture combines savasana (known to practitioners of yoga) with a modified form of the neutral posture assumed by astronauts in outer space. As the occupant relaxes, awareness of the physical body is reduced. The neutral body posture, which can be maintained for hours without discomfort, is extended to the eyes, ears, and hands. The occupant can be surrounded with a full-field-of-view visual display and nearphone sound, and can be stimulated with full-body vibration and motion cueing.
Once fully immersed, the occupant can use neutral hand controllers (that is, hand-posture sensors) to control various aspects of the simulated environment.

  8. Perception of approaching and retreating floor-projected shapes in a large, immersive, multimedia learning environment.

    PubMed

    Dolgov, Igor; Birchfield, David A; McBeath, Michael K; Thornburg, Harvey; Todd, Christopher G

    2009-04-01

    Perception of floor-projected moving geometric shapes was examined in the context of the Situated Multimedia Arts Learning Laboratory (SMALLab), an immersive, mixed-reality learning environment. As predicted, the projected destinations of shapes that retreated in depth (proximal origin) were judged significantly less accurately than those of shapes that approached (distal origin). Participants maintained similar magnitudes of error throughout the session, and no effect of practice was observed. Shape perception in an immersive multimedia environment is thus comparable to that in the real world. One may conclude that systematic exploration of basic psychological phenomena in novel mediated environments is integral to an understanding of human behavior in novel human-computer interaction architectures.

  9. Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback.

    PubMed

    Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T

    2007-07-01

    Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical residents' operative exposure. There is a need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient-safety constraints. Computer-based virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic three-dimensional stereoscopic visualization and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof of concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.

  10. "Journey to the Stars": Presenting What Stars Are to Global Planetarium Audiences by Blending Astrophysical Visualizations Into a Single Immersive Production at the American Museum of Natural History

    NASA Astrophysics Data System (ADS)

    Emmart, Carter; Mac Low, M.; Oppenheimer, B. R.; Kinzler, R.; Paglione, T. A. D.; Abbott, B. P.

    2010-01-01

    "Journey to the Stars" is the latest and fourth space show based on storytelling from data visualization at the Rose Center for Earth and Space at the American Museum of Natural History. This twenty-five-minute, full-dome movie production presents to planetarium audiences what the stars are, where they come from, how they vary in type and over time, and why they are important to life on Earth. Over forty scientists from around the world contributed their research to what is visualized in roughly fifteen major scenes. Directing this production into a consolidated immersive informal science experience with learning goals is an integrative process with many inputs and concerns for scientific accuracy. The goal is a seamless merger of visualizations at varying spatial and temporal scales, with acuity toward depth perception, revealing unseen phenomena, and layering concepts together to build an understanding of stars; to blend our common experience of them in the sky with the uncommon meaning we have come to know through science. Scripted by Louise Gikow, who has worked for Children's Television Workshop, narrated by Whoopi Goldberg, and musically scored by Robert Miller, this production strives to guide audiences through challenging scientific concepts by complementing the natural beauty the subject matter presents with understandable prose and musical grandeur. "Journey to the Stars" was produced in cooperation with NASA's Science Mission Directorate, Heliophysics Division, and is in release at major planetariums worldwide.

  11. NASA GIBS Use in Live Planetarium Shows

    NASA Astrophysics Data System (ADS)

    Emmart, C. B.

    2015-12-01

    The American Museum of Natural History's Hayden Planetarium was rebuilt in 2000 as an immersive theater for scientific data visualization, showing the universe in context to our planet. Specific astrophysical movie productions provide the main daily programming, but interactive control software developed at AMNH allows immersive presentation within a data aggregation of astronomical catalogs called the Digital Universe 3D Atlas. Since 2006, WMS globe-browsing capabilities have been built into a software development collaboration with Sweden's Linkoping University (LiU). The resulting Uniview software, now a product of the company SCISS, is operated by about fifty planetariums around the world, with the ability to network among the sites for global presentations. Public presentation of NASA GIBS has allowed authoritative narratives to be presented within the range of data available, in context with other sources such as Science on a Sphere, NASA Earth Observatory, and Google Earth KML resources. Specifically, the NOAA-supported World Views Network conducted a series of presentations across the US that focused on local ecological issues, which could then be expanded in the course of presentation to national and global scales of examination. NASA support for GIBS resources in an easy-access, multi-scale streaming format like WMS has enabled remarkably facile presentations of global monitoring like never before. Global networking of theaters for distributed presentations broadens the potential impact of this medium. Archiving and refinement of these presentations has already begun to inform new types of documentary productions that examine pertinent global interdependency topics.
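
    The WMS streaming described above can be exercised directly. Below is a minimal sketch of building a WMS 1.3.0 GetMap request for a GIBS layer; the endpoint URL and the example layer name follow NASA's public GIBS documentation, and both should be treated as assumptions to verify against the current docs rather than as details from this record.

```python
from urllib.parse import urlencode

# Public GIBS WMS endpoint (EPSG:4326 "best" imagery) and an example layer
# name -- both assumptions to check against current NASA GIBS documentation.
GIBS_WMS = "https://gibs.earthdata.nasa.gov/wms/epsg4326/best/wms.cgi"

def gibs_getmap_url(layer, bbox, date, width=1024, height=512):
    """Build a WMS 1.3.0 GetMap URL for a GIBS layer on a given date.

    bbox is (min_lat, min_lon, max_lat, max_lon); WMS 1.3.0 with EPSG:4326
    orders the bounding-box axes latitude-first.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/jpeg",
        "TIME": date,  # GIBS serves a per-day time dimension
    }
    return GIBS_WMS + "?" + urlencode(params)

url = gibs_getmap_url("MODIS_Terra_CorrectedReflectance_TrueColor",
                      (-90, -180, 90, 180), "2015-12-01")
print(url)
```

The resulting URL can be dropped into any WMS client (Uniview, QGIS, a browser) to retrieve the global true-color image for that day.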

  12. Investigation of tracking systems properties in CAVE-type virtual reality systems

    NASA Astrophysics Data System (ADS)

    Szymaniak, Magda; Mazikowski, Adam; Meironke, Michał

    2017-08-01

    In recent years, many scientific and industrial centers around the world have developed virtual reality systems or laboratories. One of the most advanced solutions is the Immersive 3D Visualization Lab (I3DVL), a CAVE-type (Cave Automatic Virtual Environment) laboratory. It contains two CAVE-type installations: a six-screen installation arranged in the form of a cube, and a four-screen installation, a simplified version of the former. The user's feeling of "immersion" and interaction with the virtual world depend on many factors, in particular on the accuracy of the system tracking the user. In this paper, properties of the tracking systems applied in I3DVL were investigated. Two parameters were selected for analysis: the accuracy of the tracking system and the range of detection of markers by the tracking system in the space of the CAVE. Measurements of system accuracy were performed for the six-screen installation, equipped with four tracking cameras, along three axes: X, Y, and Z. Rotation around the Y axis was also analyzed. The measured tracking system shows good linear and rotational accuracy. The biggest issue was the range over which markers are monitored inside the CAVE: it turned out that the tracking system loses sight of the markers in the corners of the installation. For comparison, in the simplified version of the CAVE (the four-screen installation), equipped with eight tracking cameras, this problem did not occur. The obtained results will allow for improvement of CAVE quality.

  13. Design and development of physics simulations in the field of oscillations and waves suitable for K-12 and undergraduate instruction using video game technology

    NASA Astrophysics Data System (ADS)

    Tomesh, Trevor; Price, Colin

    2011-03-01

    Using UnrealScript, the scripting language of the Unreal Tournament 2004 engine, demonstrations in the field of oscillations and waves were designed and developed. Variations on Euler's method and the Runge-Kutta method were used to numerically solve the equations of motion for seven different physical systems, which were visually represented in the immersive environment of Unreal Tournament 2004. Data from each system were written to an output file, plotted, and analyzed. The overarching goal of this research is to design and develop useful teaching tools for the K-12 and undergraduate classroom which, presented in the form of a video game, are immersive, engaging, and educational.
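
    As a sketch of the integration schemes named above (with invented toy parameters, in Python rather than the authors' UnrealScript), the following compares explicit Euler with classical fourth-order Runge-Kutta on an undamped harmonic oscillator:

```python
import math

def euler_step(state, h, accel):
    """One explicit-Euler step for state (x, v) with acceleration a(x, v)."""
    x, v = state
    return (x + h * v, v + h * accel(x, v))

def rk4_step(state, h, accel):
    """One classical fourth-order Runge-Kutta step for state (x, v)."""
    def f(s):  # derivative of the state vector
        x, v = s
        return (v, accel(x, v))
    def add(s, k, c):
        return (s[0] + c * k[0], s[1] + c * k[1])
    k1 = f(state)
    k2 = f(add(state, k1, h / 2))
    k3 = f(add(state, k2, h / 2))
    k4 = f(add(state, k3, h))
    x, v = state
    return (x + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            v + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

# Undamped spring: a = -omega^2 x, exact solution x(t) = cos(omega t).
omega = 2.0
accel = lambda x, v: -omega * omega * x
h, steps = 0.01, 1000  # integrate to t = 10
e_state = rk_state = (1.0, 0.0)
for _ in range(steps):
    e_state = euler_step(e_state, h, accel)
    rk_state = rk4_step(rk_state, h, accel)
exact = math.cos(omega * h * steps)
print(abs(e_state[0] - exact), abs(rk_state[0] - exact))
```

Over ten simulated seconds the Euler amplitude drifts visibly while RK4 stays within numerical noise of the exact cosine, which is why higher-order variants matter even in game-engine scripting.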

  14. Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters

    NASA Astrophysics Data System (ADS)

    Apostolellis, Panagiotis; Daradoumis, Thanasis

    Informal learning in public spaces like museums, science centers, and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like Virtual Reality systems. An extensive literature review led us to conclude that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments presents a promising approach to support effective learning in groups of school-aged children.

  15. A coupled sharp-interface immersed boundary-finite-element method for flow-structure interaction with application to human phonation.

    PubMed

    Zheng, X; Xue, Q; Mittal, R; Beilamowicz, S

    2010-11-01

    A new flow-structure interaction method is presented, which couples a sharp-interface immersed boundary method flow solver with a finite-element method based solid dynamics solver. The coupled method provides robust and high-fidelity solutions for complex flow-structure interaction (FSI) problems such as those involving three-dimensional flow and viscoelastic solids. The FSI solver is used to simulate flow-induced vibrations of the vocal folds during phonation. Both two- and three-dimensional models have been examined, and qualitative as well as quantitative comparisons have been made with established results in order to validate the solver. The solver is used to study the onset of phonation in a two-dimensional laryngeal model and the dynamics of the glottal jet in a three-dimensional model, and results from these studies are also presented.
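
    Stripped of all fluid and solid mechanics, the idea of coupling two distinct solvers can be illustrated with a toy fixed-point iteration. The algebraic "solvers", stiffness values, and relaxation factor below are invented for illustration and are not the authors' sharp-interface IB-FEM scheme:

```python
# Toy partitioned flow-structure coupling: the "fluid" returns an interface
# load that depends on the wall displacement, the "structure" returns the
# displacement produced by that load; fixed-point iteration with
# under-relaxation converges to the coupled state.
K_FLUID = 3.0    # how strongly the fluid load drops as the wall moves (toy)
K_SOLID = 5.0    # structural stiffness (toy)
F0 = 10.0        # load on the undeformed interface (toy)

def fluid_solver(d):
    """Interface load for wall displacement d (algebraic surrogate)."""
    return F0 - K_FLUID * d

def solid_solver(f):
    """Wall displacement under interface load f (linear spring)."""
    return f / K_SOLID

def couple(omega=0.5, tol=1e-10, max_iter=200):
    d = 0.0
    for _ in range(max_iter):
        d_new = solid_solver(fluid_solver(d))
        if abs(d_new - d) < tol:
            return d_new
        d = (1 - omega) * d + omega * d_new  # under-relaxation
    raise RuntimeError("coupling did not converge")

d = couple()
print(d)  # analytic coupled solution is F0 / (K_SOLID + K_FLUID) = 1.25
```

Real partitioned FSI replaces the two one-line surrogates with full flow and structural solves per iteration, but the convergence logic at the interface is the same.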

  16. Revolutionizing Education: The Promise of Virtual Reality

    ERIC Educational Resources Information Center

    Gadelha, Rene

    2018-01-01

    Virtual reality (VR) has the potential to revolutionize education, as it immerses students in their learning more than any other available medium. By blocking out visual and auditory distractions in the classroom, it has the potential to help students deeply connect with the material they are learning in a way that has never been possible before.…

  17. Utilizing Immersive Visualization Systems: How to Dynamically Revolutionize Site-based Professional Development Experiences within Human Resources Management?

    ERIC Educational Resources Information Center

    Craft, Kirby A.

    2009-01-01

    How can we train today's workforce with innovative technologies when families are surrounded by state-of-the-art video games and high-definition televisions? Human resource managers and administrators are faced with difficult challenges to prepare beneficial and relevant professional development exercises that engage the minds of their employees.…

  18. IMMERSE: Interactive Mentoring for Multimodal Experiences in Realistic Social Encounters

    DTIC Science & Technology

    2015-08-28

    (Abstract not extracted; the available text contains only report front-matter and table-of-contents fragments, covering player locomotion, interaction with real and virtual objects, animation combinations and stage management, and recommendations on the way ahead.)

  19. Interactive Spacecraft Trajectory Design Strategies Featuring Poincare Map Topology

    NASA Astrophysics Data System (ADS)

    Schlei, Wayne R.

    Space exploration efforts are shifting towards inexpensive and more agile vehicles. Versatility regarding spacecraft trajectories refers to the agility to correct deviations from an intended path or even the ability to adapt the future path to a new destination--all with limited spaceflight resources (i.e., small ΔV budgets). Trajectory design methods for such nimble vehicles incorporate equally versatile procedures that allow for rapid and interactive decision making while attempting to reduce ΔV budgets, leading to a versatile trajectory design platform. A versatile design paradigm requires the exploitation of Poincare map topology, or the interconnected web of dynamical structures existing within the chaotic dynamics of multi-body gravitational models, to outline low-ΔV transfer options residing nearby to a current path. This investigation details an autonomous procedure to extract the periodic orbits (topology nodes) and correlated asymptotic flow structures (the invariant manifolds representing topology links). The autonomous process summarized in this investigation (termed PMATE) overcomes discontinuities on the Poincare section that arise in the applied multi-body model (the planar circular restricted three-body problem) and detects a wide variety of novel periodic orbits. New interactive capabilities deliver a visual analytics foundation for versatile spaceflight design, especially for initial guess generation and manipulation. Such interactive strategies include the selection of states and arcs from Poincare section visualizations and the capabilities to draw and drag trajectories to remove dependency on initial state input. Furthermore, immersive selection is expanded to cull invariant manifold structures, yielding low-ΔV or even ΔV-free transfers between periodic orbits.
    The application of interactive design strategies featuring a dense extraction of Poincare map topology is demonstrated for agile spaceflight with a simple spacecraft rerouting scenario incorporating a very limited ΔV budget. In the Earth-Moon system, a low-ΔV transfer from low Earth orbit (LEO) to the distant retrograde orbit (DRO) vicinity is derived with interactive topology-based design tactics. Finally, Poincare map topology is exploited in the Saturn-Enceladus system to explore a possible ballistic capture scenario around Enceladus.
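
    The raw material for such a map-based design process is the set of returns of a trajectory to a surface of section. A minimal sketch (illustrative Earth-Moon mass parameter and a hand-picked initial state; not the PMATE extraction procedure) that integrates the planar CR3BP and collects upward crossings of the section y = 0:

```python
import math

MU = 0.0121505856  # Earth-Moon mass parameter (illustrative value)

def deriv(s):
    """State derivative for the planar CR3BP in the rotating frame."""
    x, y, vx, vy = s
    r1 = math.hypot(x + MU, y)        # distance to the larger primary
    r2 = math.hypot(x - 1 + MU, y)    # distance to the smaller primary
    ax = 2 * vy + x - (1 - MU) * (x + MU) / r1**3 - MU * (x - 1 + MU) / r2**3
    ay = -2 * vx + y - (1 - MU) * y / r1**3 - MU * y / r2**3
    return (vx, vy, ax, ay)

def rk4(s, h):
    """One fourth-order Runge-Kutta step of the CR3BP state."""
    k1 = deriv(s)
    k2 = deriv(tuple(s[i] + 0.5 * h * k1[i] for i in range(4)))
    k3 = deriv(tuple(s[i] + 0.5 * h * k2[i] for i in range(4)))
    k4 = deriv(tuple(s[i] + h * k3[i] for i in range(4)))
    return tuple(s[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                 for i in range(4))

def jacobi(s):
    """Jacobi constant, conserved along CR3BP trajectories."""
    x, y, vx, vy = s
    r1 = math.hypot(x + MU, y)
    r2 = math.hypot(x - 1 + MU, y)
    return x * x + y * y + 2 * (1 - MU) / r1 + 2 * MU / r2 - vx * vx - vy * vy

def poincare_section(s0, h=1e-3, t_max=20.0):
    """Collect (x, vx) at upward crossings of y = 0; return points and final state."""
    pts, s = [], s0
    for _ in range(int(t_max / h)):
        s_new = rk4(s, h)
        if s[1] < 0.0 <= s_new[1]:          # y changed sign going upward
            a = -s[1] / (s_new[1] - s[1])   # linear interpolation fraction
            pts.append((s[0] + a * (s_new[0] - s[0]),
                        s[2] + a * (s_new[2] - s[2])))
        s = s_new
    return pts, s

s0 = (0.5, 0.0, 0.0, 0.877)  # roughly circular prograde orbit about the Earth
pts, s_end = poincare_section(s0)
print(len(pts))
```

Plotting many such (x, vx) point sets for a sweep of initial conditions at fixed Jacobi constant produces the familiar island-and-chaos portrait from which periodic orbits and manifold structures are picked out.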

  20. Analysis of isothermal and cooling-rate-dependent immersion freezing by a unifying stochastic ice nucleation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpert, Peter A.; Knopf, Daniel A.

    Immersion freezing is an important ice nucleation pathway involved in the formation of cirrus and mixed-phase clouds. Laboratory immersion freezing experiments are necessary to determine the range in temperature, T, and relative humidity, RH, at which ice nucleation occurs and to quantify the associated nucleation kinetics. Typically, isothermal (applying a constant temperature) and cooling-rate-dependent immersion freezing experiments are conducted. In these experiments it is usually assumed that the droplets containing ice nucleating particles (INPs) all have the same INP surface area (ISA); however, the validity of this assumption or the impact it may have on analysis and interpretation of the experimental data is rarely questioned. Descriptions of ice active sites and variability of contact angles have been successfully formulated to describe ice nucleation experimental data in previous research; however, we consider the ability of a stochastic freezing model founded on classical nucleation theory to reproduce previous results and to explain experimental uncertainties and data scatter. A stochastic immersion freezing model based on first principles of statistics is presented, which accounts for variable ISA per droplet and uses parameters including the total number of droplets, N_tot, and the heterogeneous ice nucleation rate coefficient, J_het(T). This model is applied to address whether (i) a time- and ISA-dependent stochastic immersion freezing process can explain laboratory immersion freezing data for different experimental methods and (ii) the assumption that all droplets contain identical ISA is a valid conjecture with subsequent consequences for analysis and interpretation of immersion freezing.
    The simple stochastic model can reproduce the observed time and surface area dependence in immersion freezing experiments for a variety of methods such as droplets on a cold stage exposed to air or surrounded by an oil matrix, wind and acoustically levitated droplets, droplets in a continuous-flow diffusion chamber (CFDC), the Leipzig aerosol cloud interaction simulator (LACIS), and the aerosol interaction and dynamics in the atmosphere (AIDA) cloud chamber. Observed time-dependent isothermal frozen fractions exhibiting non-exponential behavior can be readily explained by this model considering varying ISA. An apparent cooling-rate dependence of J_het is explained by assuming identical ISA in each droplet. When accounting for ISA variability, the cooling-rate dependence of ice nucleation kinetics vanishes as expected from classical nucleation theory. Finally, the model simulations allow for a quantitative experimental uncertainty analysis for parameters N_tot, T, RH, and the ISA variability. We discuss the implications of our results for experimental analysis and interpretation of the immersion freezing process.
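
    The distinction the authors draw between identical and variable ISA can be reproduced with a few lines of Monte Carlo. The rate coefficient, droplet count, and lognormal ISA spread below are invented illustrative values, not the paper's fitted parameters:

```python
import math, random

random.seed(42)

# Illustrative parameters (not the paper's fitted values): heterogeneous
# nucleation rate coefficient J_het in cm^-2 s^-1 and droplet ISA in cm^2.
J_HET = 1.0e4                     # per cm^2 per s, at a fixed supercooled T
MEAN_LOG_ISA = math.log(1.0e-5)   # median ISA of 1e-5 cm^2 per droplet
SIGMA_LOG_ISA = 1.0               # lognormal spread of ISA across droplets
N_TOT = 10000

def frozen_fraction(t, variable_isa=True):
    """Monte Carlo frozen fraction after time t for N_TOT droplets.

    Each droplet stays unfrozen with probability exp(-J_het * A * t);
    with variable_isa the areas A are lognormal, otherwise identical.
    """
    frozen = 0
    for _ in range(N_TOT):
        a = (math.exp(random.gauss(MEAN_LOG_ISA, SIGMA_LOG_ISA))
             if variable_isa else math.exp(MEAN_LOG_ISA))
        if random.random() > math.exp(-J_HET * a * t):
            frozen += 1
    return frozen / N_TOT

# Identical ISA gives single-exponential decay of the unfrozen fraction;
# lognormal ISA gives the flatter, non-exponential tail seen in isothermal
# experiments.
for t in (5.0, 20.0, 80.0):
    print(t, frozen_fraction(t, False), frozen_fraction(t, True))
```

At long times the variable-ISA ensemble retains a population of small-area droplets that freeze slowly, reproducing the non-exponential isothermal behavior the abstract describes.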

  1. Analysis of isothermal and cooling-rate-dependent immersion freezing by a unifying stochastic ice nucleation model

    DOE PAGES

    Alpert, Peter A.; Knopf, Daniel A.

    2016-02-24

    Immersion freezing is an important ice nucleation pathway involved in the formation of cirrus and mixed-phase clouds. Laboratory immersion freezing experiments are necessary to determine the range in temperature, T, and relative humidity, RH, at which ice nucleation occurs and to quantify the associated nucleation kinetics. Typically, isothermal (applying a constant temperature) and cooling-rate-dependent immersion freezing experiments are conducted. In these experiments it is usually assumed that the droplets containing ice nucleating particles (INPs) all have the same INP surface area (ISA); however, the validity of this assumption or the impact it may have on analysis and interpretation of the experimental data is rarely questioned. Descriptions of ice active sites and variability of contact angles have been successfully formulated to describe ice nucleation experimental data in previous research; however, we consider the ability of a stochastic freezing model founded on classical nucleation theory to reproduce previous results and to explain experimental uncertainties and data scatter. A stochastic immersion freezing model based on first principles of statistics is presented, which accounts for variable ISA per droplet and uses parameters including the total number of droplets, N_tot, and the heterogeneous ice nucleation rate coefficient, J_het(T). This model is applied to address whether (i) a time- and ISA-dependent stochastic immersion freezing process can explain laboratory immersion freezing data for different experimental methods and (ii) the assumption that all droplets contain identical ISA is a valid conjecture with subsequent consequences for analysis and interpretation of immersion freezing.
    The simple stochastic model can reproduce the observed time and surface area dependence in immersion freezing experiments for a variety of methods such as droplets on a cold stage exposed to air or surrounded by an oil matrix, wind and acoustically levitated droplets, droplets in a continuous-flow diffusion chamber (CFDC), the Leipzig aerosol cloud interaction simulator (LACIS), and the aerosol interaction and dynamics in the atmosphere (AIDA) cloud chamber. Observed time-dependent isothermal frozen fractions exhibiting non-exponential behavior can be readily explained by this model considering varying ISA. An apparent cooling-rate dependence of J_het is explained by assuming identical ISA in each droplet. When accounting for ISA variability, the cooling-rate dependence of ice nucleation kinetics vanishes as expected from classical nucleation theory. Finally, the model simulations allow for a quantitative experimental uncertainty analysis for parameters N_tot, T, RH, and the ISA variability. We discuss the implications of our results for experimental analysis and interpretation of the immersion freezing process.

  2. Sexual self-regulation and cognitive absorption as factors of sexual response toward virtual characters.

    PubMed

    Renaud, Patrice; Trottier, Dominique; Nolet, Kevin; Rouleau, Joanne L; Goyette, Mathieu; Bouchard, Stéphane

    2014-04-01

    The eye movements and penile responses of 20 male participants were recorded while they were immersed with virtual sexual stimuli. These participants were divided into two groups according to their capacity to focus their attention in immersion (high and low focus). In order to understand sexual self-regulation better, we subjected participants to three experimental conditions: (a) immersion with a preferred sexual stimulus, without sexual inhibition; (b) immersion with a preferred sexual stimulus, with sexual inhibition; and (c) immersion with a neutral stimulus. A significant difference was observed between the effects of each condition on erectile response and scanpath. The groups differed on self-regulation of their erectile responses and on their scanpath patterns. High focus participants had more difficulties than low focus participants with inhibiting their sexual responses and displayed less scattered eye movement trajectories over the critical areas of the virtual sexual stimuli. Results are interpreted in terms of sexual self-regulation and cognitive absorption in virtual immersion. In addition, the use of validated virtual sexual stimuli is presented as a methodological improvement over static and moving pictures, since it paves the way for the study of the role of social interaction in an ecologically valid and well-controlled way.

  3. Collaborative virtual environments art exhibition

    NASA Astrophysics Data System (ADS)

    Dolinsky, Margaret; Anstey, Josephine; Pape, Dave E.; Aguilera, Julieta C.; Kostis, Helen-Nicole; Tsoupikova, Daria

    2005-03-01

    This panel presentation will exhibit artwork developed in CAVEs and discuss how art methodologies enhance the science of VR through collaboration, interaction and aesthetics. Artists and scientists work alongside one another to expand scientific research and artistic expression and are motivated by exhibiting collaborative virtual environments. Looking towards the arts, such as painting and sculpture, computer graphics captures a visual tradition. Virtual reality expands this tradition to not only what we face, but to what surrounds us and even what responds to our body and its gestures. Art making that once was isolated to the static frame and an optimal point of view is now out and about, in fully immersive mode within CAVEs. Art knowledge is a guide to how the aesthetics of 2D and 3D worlds affect, transform, and influence the social, intellectual and physical condition of the human body through attention to psychology, spiritual thinking, education, and cognition. The psychological interacts with the physical in the virtual in such a way that each facilitates, enhances and extends the other, culminating in a "go together" world. Attention to sharing art experience across high-speed networks introduces a dimension of liveliness and aliveness when we "become virtual" in real time with others.

  4. NASA Sea Level Change Portal - It's not just another portal site

    NASA Astrophysics Data System (ADS)

    Huang, T.; Quach, N.; Abercrombie, S. P.; Boening, C.; Brennan, H. P.; Gill, K. M.; Greguska, F. R., III; Jackson, R.; Larour, E. Y.; Shaftel, H.; Tenenbaum, L. F.; Zlotnicki, V.; Moore, B.; Moore, J.; Boeck, A.

    2017-12-01

    The NASA Sea Level Change Portal (https://sealevel.nasa.gov) is designed as a "one-stop" source for current sea level change information, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. With increasing global temperatures warming the ocean and melting ice sheets and glaciers, there is an immediate need both for accelerating sea level change research and for making this research accessible to scientists in disparate disciplines, to the general public, and to policy makers and businesses. The immersive and innovative NASA portal, which debuted at the 2015 AGU Fall Meeting, attracts thousands of daily visitors and over 30K followers on Facebook®. Behind its intuitive interface is an extensible architecture that integrates site contents, data from various sources, visualization, horizontally scalable geospatial data analytic technology (called NEXUS), and an interactive 3D simulation platform (called the Virtual Earth System Laboratory). We will present an overview of our NASA portal and some of our architectural decisions, along with discussion of our open-source, cloud-based data analytic technology that enables on-the-fly analysis of heterogeneous data.

  5. Science Education Using a Computer Model-Virtual Puget Sound

    NASA Astrophysics Data System (ADS)

    Fruland, R.; Winn, W.; Oppenheimer, P.; Stahr, F.; Sarason, C.

    2002-12-01

    We created an interactive learning environment based on an oceanographic computer model of Puget Sound, Virtual Puget Sound (VPS), as an alternative to traditional teaching methods. Students immersed in this navigable 3-D virtual environment observed tidal movements and salinity changes, and performed tracer and buoyancy experiments. Scientific concepts were embedded in a goal-based scenario to locate a new sewage outfall in Puget Sound. Traditional science teaching methods focus on distilled representations of agreed-upon knowledge removed from real-world context and scientific debate. Our strategy leverages students' natural interest in their environment, provides meaningful context and engages students in scientific debate and knowledge creation. Results show that VPS provides a powerful learning environment, but highlights the need for research on how to most effectively represent concepts and organize interactions to support scientific inquiry and understanding. Research is also needed to ensure that new technologies and visualizations do not foster misconceptions, including the impression that the model represents reality rather than being a useful tool. In this presentation we review results from prior work with VPS and outline new work for a modeling partnership recently formed with funding from the National Ocean Partnership Program (NOPP).

  6. Nomad devices for interactions in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    George, Paul; Kemeny, Andras; Merienne, Frédéric; Chardonnet, Jean-Rémy; Thouvenin, Indira Mouttapa; Posselt, Javier; Icart, Emmanuel

    2013-03-01

    Renault is currently setting up a new CAVE™, a five-wall rear-projected virtual reality room with a combined 3D resolution of 100 Mpixels, distributed over sixteen 4k projectors and two 2k projectors, as well as an additional 3D HD collaborative powerwall. Renault's CAVE™ aims to answer the needs of the various vehicle conception steps [1]. Starting from vehicle design, through the subsequent engineering steps, ergonomic evaluation, and perceived quality control, Renault has built up a list of use-cases and carried out an early software evaluation in the four-sided CAVE™ of Institute Image, called MOVE. One goal of the project is to study interactions in a CAVE™, especially with nomad devices such as the iPhone or iPad, to manipulate virtual objects and to develop visualization possibilities. Inspired by current uses of nomad devices (multi-touch gestures, the iPhone UI look'n'feel, and AR applications), we have implemented an early feature set taking advantage of these popular input devices. In this paper, we present its performance through measurement data collected in our test platform, a four-sided homemade low-cost virtual reality room, powered by ultra-short-range and standard HD home projectors.

  7. Context matters: Anterior and posterior cortical midline responses to sad movie scenes.

    PubMed

    Schlochtermeier, L H; Pehrs, C; Bakels, J-H; Jacobs, A M; Kappelhoff, H; Kuchinke, L

    2017-04-15

    Narrative movies can create powerful emotional responses. While recent research has advanced the understanding of neural networks involved in immersive movie viewing, their modulation within a movie's dynamic context remains inconclusive. In this study, 24 healthy participants passively watched sad scene climaxes taken from 24 romantic comedies, while brain activity was measured using functional magnetic resonance imaging (fMRI). To study effects of context, the sad scene climaxes were presented with either coherent scene context, replaced non-coherent context, or without context. In a second viewing, the same clips were rated continuously for sadness. The ratings varied over time, with peaks of experienced sadness within the assumed climax intervals. Activations in anterior and posterior cortical midline regions increased when climaxes were presented with either coherent or replaced context, while activation in the temporal gyri decreased. This difference was more pronounced for the coherent context condition. Psychophysiological interaction (PPI) analyses showed a context-dependent coupling of midline regions with occipital visual and sub-cortical reward regions. Our results demonstrate the pivotal role of midline structures and their interaction with perceptual and reward areas in processing contextually embedded socio-emotional information in movies.

  8. Computation of three-dimensional multiphase flow dynamics by Fully-Coupled Immersed Flow (FCIF) solver

    NASA Astrophysics Data System (ADS)

    Miao, Sha; Hendrickson, Kelli; Liu, Yuming

    2017-12-01

    This work presents a Fully-Coupled Immersed Flow (FCIF) solver for the three-dimensional simulation of fluid-fluid interaction by coupling two distinct flow solvers using an Immersed Boundary (IB) method. The FCIF solver captures dynamic interactions between two fluids with disparate flow properties, while retaining the desirable simplicity of non-boundary-conforming grids. For illustration, we couple an IB-based unsteady Reynolds Averaged Navier Stokes (uRANS) simulator with a depth-integrated (long-wave) solver for the application of slug development with turbulent gas and laminar liquid. We perform a series of validations including turbulent/laminar flows over prescribed wavy boundaries and freely-evolving viscous fluids. These confirm the effectiveness and accuracy of both one-way and two-way coupling in the FCIF solver. Finally, we present a simulation example of the evolution from a stratified turbulent/laminar flow through the initiation of a slug that nearly bridges the channel. The results show both the interfacial wave dynamics excited by the turbulent gas forcing and the influence of the liquid on the gas turbulence. These results demonstrate that the FCIF solver effectively captures the essential physics of gas-liquid interaction and can serve as a useful tool for the mechanistic study of slug generation in two-phase gas/liquid flows in channels and pipes.

  9. Reducing Visual Discomfort with HMDs Using Dynamic Depth of Field.

    PubMed

    Carnegie, Kieran; Rhee, Taehyun

    2015-01-01

    Although head-mounted displays (HMDs) are ideal devices for personal viewing of immersive stereoscopic content, exposure to VR applications on them results in significant discomfort for the majority of people, with symptoms including eye fatigue, headaches, nausea, and sweating. A conflict between accommodation and vergence depth cues on stereoscopic displays is a significant cause of visual discomfort. This article describes an evaluation of the effectiveness of dynamic depth-of-field (DoF) blur in reducing discomfort caused by exposure to stereoscopic content on HMDs. Using a commercial game engine implementation, study participants reported a reduction of visual discomfort on a simulator sickness questionnaire when DoF blurring was enabled, with decreased symptom severity from HMD exposure indicating that dynamic DoF can effectively reduce visual discomfort.
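
    For context on how much blur such a DoF pass applies, the thin-lens circle-of-confusion relation is the standard starting point. The focal length and f-number below are arbitrary illustrative choices; nothing here is taken from the authors' game-engine implementation:

```python
def coc_diameter_mm(focus_dist_m, obj_dist_m, focal_mm=50.0, f_number=2.0):
    """Thin-lens circle-of-confusion diameter on the sensor, in mm.

    c = (A * f * |S2 - S1|) / (S2 * (S1 - f)) with aperture A = f/N;
    an object on the focal plane (S2 == S1) maps to a point (c == 0).
    """
    f = focal_mm / 1000.0          # metres
    aperture = f / f_number
    s1, s2 = focus_dist_m, obj_dist_m
    c = aperture * f * abs(s2 - s1) / (s2 * (s1 - f))
    return c * 1000.0              # back to millimetres

# Focused at 2 m: blur is zero on the focal plane and grows with distance
# from it, which is the per-pixel quantity a dynamic DoF shader modulates.
print(coc_diameter_mm(2.0, 2.0))
print(coc_diameter_mm(2.0, 3.0), coc_diameter_mm(2.0, 4.0))
```

A dynamic DoF implementation re-evaluates this with the viewer's current vergence depth as the focus distance each frame.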

  10. Being There in the Midst of the Story: How Immersive Journalism Affects Our Perceptions and Cognitions.

    PubMed

    Sundar, S Shyam; Kang, Jin; Oprean, Danielle

    2017-11-01

    Immersive journalism in the form of virtual reality (VR) headsets and 360°-video is becoming more mainstream and is much touted for inducing greater "presence" than traditional text. But, does this presence influence psychological outcomes of reading news, such as memory for story content, perceptions of credibility, and empathy felt toward story characters? We propose that two key technological affordances of VR (modality and interactivity) are responsible for triggering three presence-related cognitive heuristics (being-there, interaction, and realism), which influence news readers' memory and their perceptions of credibility, empathy, and story-sharing intentions. We report a 3 (storytelling medium: VR vs. 360°-video vs. Text) × 2 (story: "The displaced" and "The click effect") mixed-factorial experiment, in which participants (N = 129) experienced two New York Times stories (that differed in their emotional intensity) using one of the three mediums (VR, 360°-video, Text). Participants who experienced the stories using VR and 360°-video outperformed those who read the same stories using text with pictures, not only on such presence-related outcomes as being-there, interaction, and realism, but also on perceived source credibility, story-sharing intention, and feelings of empathy. Moreover, we found that senses of being-there, interaction, and realism mediated the relationship between storytelling medium and reader perceptions of credibility, story recall, and story-sharing intention. These findings have theoretical implications for the psychology of virtual reality, and practical applications for immersive journalism in particular and interactive media in general.

  11. Implementation of 3d Tools and Immersive Experience Interaction for Supporting Learning in a Library-Archive Environment. Visions and Challenges

    NASA Astrophysics Data System (ADS)

    Angeletaki, A.; Carrozzino, M.; Johansen, S.

    2013-07-01

    In this paper we present an experimental environment of 3D books combined with a game application, developed through a collaboration between the NTNU University Library at the Norwegian University of Science and Technology in Trondheim, Norway, and the Percro laboratory of Santa Anna University in Pisa, Italy. MUBIL is an international research project involving museums, libraries, and ICT academy partners that aims to develop a consistent methodology enabling the use of Virtual Environments as a metaphor to present manuscript content through the paradigms of interaction and immersion, evaluating different possible alternatives. This paper presents the results of the application of two prototypes of books augmented with the use of XVR and IL technology. We explore immersive-reality design strategies in archive and library contexts for attracting new users. Our newly established Mubil-lab has invited school classes to test the books augmented with 3D models and other multimedia content, in order to investigate whether immersion in such environments can create wider engagement and support learning. The combination of the 3D-book metaphor and game design allows the digital books to be handled through a tactile experience that substitutes for physical browsing. In this paper we present some preliminary results on the enrichment of the user experience in such an environment.

  12. The Use of Virtual Reality in Psychology: A Case Study in Visual Perception

    PubMed Central

    Wilson, Christopher J.; Soranzo, Alessandro

    2015-01-01

    Recent proliferation of available virtual reality (VR) tools has seen increased use in psychological research. This is due to a number of advantages afforded over traditional experimental apparatus, such as tighter control of the environment and the possibility of creating more ecologically valid stimulus presentation and response protocols. At the same time, the higher levels of immersion and visual fidelity afforded by VR do not necessarily evoke presence or elicit a "realistic" psychological response. This paper reviews current uses of VR environments in psychological research and discusses some ongoing questions for researchers. Finally, we focus on the area of visual perception, where both the advantages and challenges of VR are particularly salient. PMID:26339281

  13. Building Opportunities for Environmental Education Through Student Development of Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Moysey, S. M.; Boyer, D. M.; Mobley, C.; Byrd, V. L.

    2014-12-01

    It is increasingly common to utilize simulations and games in the classroom, but learning opportunities can also be created by having students construct these cyberinfrastructure resources themselves. We outline two examples of such projects completed during the summer of 2014 within the NSF ACI sponsored REU Site: Research Experiences for Undergraduates in Collaborative Data Visualization Applications at Clemson University (Award 1359223). The first project focused on the development of immersive virtual reality field trips of geologic sites using the Oculus Rift headset. This project developed a platform that allows users to navigate virtual terrains derived from real-world data obtained from the US Geological Survey and Google Earth. The system provides users with the ability to partake in an interactive first-person exploration of a region, such as the Grand Canyon, and thus makes an important educational contribution for students without access to these environmental assets in the real world. The second project focused on providing players visual feedback about the sustainability of their practices within the web-based, multiplayer watershed management game Naranpur Online. Identifying sustainability indicators that communicate meaningful information to players and finding an effective way to visualize these data were primary challenges faced by the student researcher working on this project. To solve this problem the student translated findings from the literature to the context of the game to develop a hierarchical set of relative sustainability criteria to be accessed by players within a sustainability dashboard. Though the REU focused on visualization, both projects forced the students to transform their thinking to address higher-level questions regarding the utilization and communication of environmental data or concepts, thus enhancing the educational experience for themselves and future students.

  14. Immersive virtual reality as a teaching tool for neuroanatomy.

    PubMed

    Stepan, Katelyn; Zeiger, Joshua; Hanchuk, Stephanie; Del Signore, Anthony; Shrivastava, Raj; Govindaraj, Satish; Iloreta, Alfred

    2017-10-01

    Three-dimensional (3D) computer modeling and interactive virtual reality (VR) simulation are validated teaching techniques used throughout medical disciplines. Little objective data exist supporting their use in teaching clinical anatomy. Learner motivation is thought to limit the rate of utilization of such novel technologies. The purpose of this study is to evaluate the effectiveness, satisfaction, and motivation associated with immersive VR simulation in teaching medical students neuroanatomy. Images of normal cerebral anatomy were reconstructed from human Digital Imaging and Communications in Medicine (DICOM) computed tomography (CT) imaging and magnetic resonance imaging (MRI) into 3D VR formats compatible with the Oculus Rift VR System, a head-mounted display with tracking capabilities allowing for an immersive VR experience. The ventricular system and cerebral vasculature were highlighted and labeled to create a focused interactive model. We conducted a randomized controlled study with 66 medical students (33 in each of the control and experimental groups). The control and experimental groups studied pertinent neuroanatomical structures using online textbooks or the VR interactive model, respectively. We then evaluated the students' anatomy knowledge, educational experience, and motivation (using the Instructional Materials Motivation Survey [IMMS], a previously validated assessment). There was no significant difference in anatomy knowledge between the 2 groups on preintervention, postintervention, or retention quizzes. The VR group found the learning experience to be significantly more engaging, enjoyable, and useful (all p < 0.01) and scored significantly higher on the motivation assessment (p < 0.01). Immersive VR educational tools afforded a more positive learner experience and enhanced student motivation. However, the technology was as effective as traditional textbooks in teaching neuroanatomy. © 2017 ARS-AAOA, LLC.

  15. Get immersed in the Soil Sciences: the first community of avatars in the EGU Assembly 2015!

    NASA Astrophysics Data System (ADS)

    Castillo, Sebastian; Alarcón, Purificación; Beato, Mamen; Emilio Guerrero, José; José Martínez, Juan; Pérez, Cristina; Ortiz, Leovigilda; Taguas, Encarnación V.

    2015-04-01

    Virtual reality and immersive worlds refer to artificial computer-generated environments, with which users act and interact as in a known environment by the use of figurative virtual individuals (avatars). Virtual environments will be the technology of the early twenty-first century that will most dramatically change the way we live, particularly in the areas of training and education, product development and entertainment (Schmorrow, 2009). The usefulness of immersive worlds has been proved in different fields. They reduce geographic and social barriers between different stakeholders and create virtual social spaces which can positively impact learning and discussion outcomes (Lorenzo et al. 2012). In this work we present a series of interactive meetings, held in a virtual building to celebrate the International Year of Soils, to promote the importance of soil functions and soil conservation. In a virtual room, the avatars of different senior researchers will meet young scientist avatars to talk about: 1) what remains to be done in Soil Sciences; 2) the main current limitations and difficulties; and 3) the future hot research lines. Interactive participation does not require physical attendance at the EGU Assembly 2015. In addition, this virtual building inspired by Soil Sciences can be completed with different teaching resources from different locations around the world, and will be used to improve the learning of Soil Sciences in a multicultural context. REFERENCES: Lorenzo C.M., Sicilia, M.A., Sánchez S. 2012. Studying the effectiveness of multi-user immersive environments for collaborative evaluation tasks. Computers & Education 59 (2012) 1361-1376 Schmorrow D.D. 2009. "Why virtual?" Theoretical Issues in Ergonomics Science 10(3): 279-282.

  16. Intuitive operability evaluation of surgical robot using brain activity measurement to determine immersive reality.

    PubMed

    Miura, Satoshi; Kobayashi, Yo; Kawamura, Kazuya; Seki, Masatoshi; Nakashima, Yasutaka; Noguchi, Takehiko; Kasuya, Masahiro; Yokoo, Yuki; Fujie, Masakatsu G

    2012-01-01

    Surgical robots have improved considerably in recent years, but intuitive operability, which represents user inter-operability, has not been quantitatively evaluated. Therefore, to design a robot with intuitive operability, we propose a brain-activity measurement method to evaluate intuitive operability. The objective of this paper is to determine the master configuration relative to the monitor that allows users to perceive the manipulator as part of their own body. We assume that such a master configuration produces, for the user, the immersive experience of putting his or her own arm into the monitor. In our experiments, as subjects controlled the hand controller to position the tip of the virtual slave manipulator on a target in a surgical simulator, we measured brain activity through brain-imaging devices. We performed our experiments for a variety of master manipulator configurations with the monitor position fixed. For all test subjects, we found that brain activity was stimulated significantly when the master manipulator was located behind the monitor. We conclude that this master configuration produces immersive reality through the body image, which is related to visual and somatic sense feedback.

  17. An Immersed Boundary-Lattice Boltzmann Method for Simulating Particulate Flows

    NASA Astrophysics Data System (ADS)

    Zhang, Baili; Cheng, Ming; Lou, Jing

    2013-11-01

    A two-dimensional momentum exchange-based immersed boundary-lattice Boltzmann method developed by X.D. Niu et al. (2006) has been extended to three dimensions for solving fluid-particle interaction problems. This method combines the most desirable features of the lattice Boltzmann method and the immersed boundary method by using a regular Eulerian mesh for the flow domain and a Lagrangian mesh for the moving particles in the flow field. The no-slip boundary conditions for the fluid and the particles are enforced by adding a force density term into the lattice Boltzmann equation, and the forcing term is simply calculated by the momentum exchange of the boundary particle density distribution functions, which are interpolated by Lagrangian polynomials from the underlying Eulerian mesh. This method preserves the advantages of the lattice Boltzmann method in tracking a group of particles and, at the same time, provides an alternative approach to treat solid-fluid boundary conditions. Numerical validations show that the present method is very accurate and efficient. The present method will be further developed to simulate more complex problems with particle deformation, particle-bubble and particle-droplet interactions.
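The core immersed-boundary ingredient described above, exchanging momentum between Lagrangian boundary points and the Eulerian mesh, depends on an interpolation/spreading kernel. The sketch below uses Peskin's 4-point discrete delta as a generic stand-in (the paper itself interpolates with Lagrangian polynomials); the grid size, point location, and force value are illustrative, and the check verifies that spreading conserves the total force.

```python
import numpy as np

h = 1.0  # Eulerian grid spacing (illustrative)

def delta4(r):
    """Peskin 4-point discrete delta (1D factor), support |r| < 2 grid cells."""
    r = abs(r)
    if r < 1.0:
        return (3 - 2 * r + np.sqrt(1 + 4 * r - 4 * r * r)) / 8
    if r < 2.0:
        return (5 - 2 * r - np.sqrt(-7 + 12 * r - 4 * r * r)) / 8
    return 0.0

def spread_force(xl, yl, fl, nx, ny):
    """Distribute force fl at Lagrangian point (xl, yl) onto the Eulerian grid."""
    F = np.zeros((nx, ny))
    for i in range(nx):
        for j in range(ny):
            F[i, j] = fl * delta4((i - xl) / h) * delta4((j - yl) / h) / h**2
    return F

F = spread_force(7.3, 9.6, fl=2.5, nx=16, ny=16)
print(F.sum() * h**2)  # ~2.5: the kernel's partition of unity conserves total force
```

The same kernel, used in the other direction, interpolates the fluid velocity from the grid to the boundary point, which is what makes the forcing and no-slip enforcement consistent.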

  18. Immersed boundary methods for simulating fluid-structure interaction

    NASA Astrophysics Data System (ADS)

    Sotiropoulos, Fotis; Yang, Xiaolei

    2014-02-01

    Fluid-structure interaction (FSI) problems commonly encountered in engineering and biological applications involve geometrically complex flexible or rigid bodies undergoing large deformations. Immersed boundary (IB) methods have emerged as a powerful simulation tool for tackling such flows due to their inherent ability to handle arbitrarily complex bodies without the need for expensive and cumbersome dynamic re-meshing strategies. Depending on the approach such methods adopt to satisfy boundary conditions on solid surfaces they can be broadly classified as diffused and sharp interface methods. In this review, we present an overview of the fundamentals of both classes of methods with emphasis on solution algorithms for simulating FSI problems. We summarize and juxtapose different IB approaches for imposing boundary conditions, efficient iterative algorithms for solving the incompressible Navier-Stokes equations in the presence of dynamic immersed boundaries, and strong and loose coupling FSI strategies. We also present recent results from the application of such methods to study a wide range of problems, including vortex-induced vibrations, aquatic swimming, insect flying, human walking and renewable energy. Limitations of such methods and the need for future research to mitigate them are also discussed.

  19. Effect of the pH in the adsorption and in the immersion enthalpy of monohydroxylated phenols from aqueous solutions on activated carbons.

    PubMed

    Blanco-Martínez, D A; Giraldo, L; Moreno-Piraján, J C

    2009-09-30

    An activated carbon (Carbochem PS230) was modified by chemical and thermal treatment in a flow of H(2) in order to evaluate the influence of the activated carbon's chemical surface on the adsorption of monohydroxylated phenols. The solid-solution interaction was determined by analyzing the adsorption isotherms at 298 K and pH 7, 9 and 11 for 48 h. The adsorption capacity of the activated carbons increases when the solution pH decreases. The amount adsorbed increases in the reduced carbon at the maximum adsorption pH and decreases in the oxidized carbon. In the sample of granulated activated carbon, CAG, the monohydroxylated phenol adsorption capacity diminishes in the order catechol > hydroquinone > resorcinol, at the three pH values. The experimental data are evaluated with the Freundlich and Langmuir models. The immersion enthalpies are determined and increase with the retained amount, ranging between 21.5 and 45.7 J g(-1). In addition, the immersion enthalpies show more interaction with the reduced activated carbon, which has a lower total acidity content.
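The isotherm evaluation mentioned above can be sketched as follows, here for the Langmuir model q = qm·K·C/(1 + K·C) fitted through its linearized form C/q = C/qm + 1/(K·qm). The concentrations, units, and parameter values below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Assumed "true" parameters for the synthetic data (illustrative units).
qm_true, K_true = 1.2, 0.8  # mmol/g, L/mmol
C = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0])  # equilibrium concentrations
q = qm_true * K_true * C / (1 + K_true * C)          # noise-free "measurements"

# Linearized Langmuir: C/q is linear in C, slope 1/qm, intercept 1/(K*qm).
slope, intercept = np.polyfit(C, C / q, 1)
qm_fit = 1 / slope
K_fit = slope / intercept
print(f"qm = {qm_fit:.3f}, K = {K_fit:.3f}")
```

With real data one would fit both the Freundlich and Langmuir forms and compare goodness of fit; on the noise-free synthetic data the linearization recovers the parameters exactly.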

  20. Experiencing Soil Science from your office through virtual experiences

    NASA Astrophysics Data System (ADS)

    Beato, M. Carmen; González-Merino, Ramón; Campillo, M. Carmen; Fernández-Ahumada, Elvira; Ortiz, Leovigilda; Taguas, Encarnación V.; Guerrero, José Emilio

    2017-04-01

    Currently, numerous tools based on the new information and communication technologies offer a wide range of possibilities for the implementation of interactive methodologies in Education and Science. In particular, virtual reality and immersive worlds - artificially generated computer environments where users interact through a figurative individual that represents them in that environment (their "avatar") - have been identified as the technology that will change the way we live, particularly in educational terms, product development and entertainment areas (Schmorrow, 2009). Gisbert-Cervera et al. (2011) consider that 3D worlds in education, among others, provide a unique training and knowledge-exchange environment which supports goal-oriented reflection on activities and the achievement of learning outcomes. In Soil Sciences, the experimental component is essential to acquire the knowledge necessary to understand the biogeochemical processes taking place and their interactions with time, climate, topography and the living organisms present. In this work, an immersive virtual environment which reproduces a series of soil pits has been developed for educational purposes, allowing users to evaluate and differentiate soil characteristics such as texture, structure, consistency, color and other physical-chemical and biological properties. Bibliographical material such as pictures, books and papers was collected in order to classify the information needed and to build the soil profiles into the virtual environment. The virtual recreation was developed with Unreal Engine 4 (UE4; https://www.unrealengine.com/unreal-engine-4). This engine was chosen because it provides two toolsets for programmers that can be used in tandem to accelerate development workflows. In addition, Unreal Engine 4 technology powers hundreds of games as well as real-time 3D films, training simulations and visualizations, and produces very realistic graphics.
For the evaluation of its impact and its usefulness in teaching, a series of surveys will be presented to undergraduate students and teachers. REFERENCES: Gisbert-Cervera M, Esteve-Gonzalez V., Camacho-Marti M.M. (2011). Delve into the Deep: Learning Potential in Metaverses and 3D Worlds. eLearning (25) Papers ISSN: 1887-1542 Schmorrow D.D. (2009). Why virtual? Theoretical Issues in Ergonomics Science 10(3): 279-282.

  1. A Nationwide Experimental Multi-Gigabit Network

    DTIC Science & Technology

    2003-03-01

    ...television and cinema, and to real-time interactive teleconferencing. There is another variable which affects this happy growth in network bandwidth and...render large scientific data sets with interactive frame rates on the desktop or in an immersive virtual reality (VR) environment. In our design, we...

  2. CaveCAD: a tool for architectural design in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Schulze, Jürgen P.; Hughes, Cathleen E.; Zhang, Lelin; Edelstein, Eve; Macagno, Eduardo

    2014-02-01

    Existing 3D modeling tools were designed to run on desktop computers with monitor, keyboard and mouse. To make 3D modeling possible with mouse and keyboard, many 3D interactions, such as point placement or translations of geometry, had to be mapped to the 2D parameter space of the mouse, possibly supported by mouse buttons or keyboard keys. We hypothesize that, had the designers of these existing systems been able to assume immersive virtual reality systems as their target platforms, they would have designed 3D interactions much more intuitively. In collaboration with professional architects, we created a simple but complete 3D modeling tool for virtual environments from the ground up, using direct 3D interaction wherever possible and adequate. In this publication, we present our approaches to interactions for typical 3D modeling functions, such as geometry creation, modification of existing geometry, and assignment of surface materials. We also discuss preliminary user experiences with this system.

  3. A Dynamic Mesh-Based Approach to Model Melting and Shape of an ESR Electrode

    NASA Astrophysics Data System (ADS)

    Karimi-Sibaki, E.; Kharicha, A.; Bohacek, J.; Wu, M.; Ludwig, A.

    2015-10-01

    This paper presents a numerical method to investigate the tip shape and melt rate of an electrode during the electroslag remelting process. The interactions between flow, temperature, and electromagnetic fields are taken into account. A dynamic mesh-based approach is employed to model the dynamic formation of the shape of the electrode tip. The effect of slag properties such as thermal and electrical conductivities on the melt rate and electrode immersion depth is discussed. The thermal conductivity of the slag has a dominant influence on the heat transfer in the system, and hence on the melt rate of the electrode. The melt rate decreases with increasing thermal conductivity of the slag. The electrical conductivity of the slag governs the electric current path, which in turn influences the flow and temperature fields. The melting of the electrode is quite an unstable process due to the complex interaction between the melt rate, immersion depth, and shape of the electrode tip. Therefore, a numerical adaptation of the electrode position in the slag has been implemented in order to achieve steady-state melting. In fact, the melt rate, immersion depth, and shape of the electrode tip are interdependent process parameters. The generated power in the system is found to depend on both the immersion depth and the shape of the electrode tip. In other words, the same amount of power was generated for systems in which the tip shapes and immersion depths were different. Furthermore, it was observed that the shape of the electrode tip is very similar for systems running with the same ratio of power generation to melt rate. Comparison between simulations and experimental results was made to verify the numerical model.

  4. The immersion freezing behavior of ash particles from wood and brown coal burning

    NASA Astrophysics Data System (ADS)

    Grawe, Sarah; Augustin-Bauditz, Stefanie; Hartmann, Susan; Hellner, Lisa; Pettersson, Jan B. C.; Prager, Andrea; Stratmann, Frank; Wex, Heike

    2016-11-01

    It is generally known that ash particles from coal combustion can trigger ice nucleation when they interact with water vapor and/or supercooled droplets. However, data on the ice nucleation of ash particles from different sources, including both anthropogenic and natural combustion processes, are still scarce. As fossil energy sources still fuel the largest proportion of electric power production worldwide, and biomass burning contributes significantly to the global aerosol loading, further data are needed to better assess the ice nucleating efficiency of ash particles. In the framework of this study, we found that ash particles from brown coal (i.e., lignite) burning are up to 2 orders of magnitude more ice active in the immersion mode below -32 °C than those from wood burning. Fly ash from a coal-fired power plant was shown to be the most efficient at nucleating ice. Furthermore, the influence of various particle generation methods on the freezing behavior was studied. For instance, particles were generated either by dispersion of dry sample material, or by atomization of ash-water suspensions, and then led into the Leipzig Aerosol Cloud Interaction Simulator (LACIS) where the immersion freezing behavior was examined. Whereas the immersion freezing behavior of ashes from wood burning was not affected by the particle generation method, it depended on the type of particle generation for ash from brown coal. It was also found that the common practice of treating prepared suspensions in an ultrasonic bath to avoid aggregation of particles led to an enhanced ice nucleation activity. The findings of this study suggest (a) that ash from brown coal burning may influence immersion freezing in clouds close to the source and (b) that the freezing behavior of ash particles may be altered by a change in sample preparation and/or particle generation.

  5. The Perfectly Matched Layer absorbing boundary for fluid-structure interactions using the Immersed Finite Element Method.

    PubMed

    Yang, Jubiao; Yu, Feimi; Krane, Michael; Zhang, Lucy T

    2018-01-01

    In this work, a non-reflective boundary condition, the Perfectly Matched Layer (PML) technique, is adapted and implemented in a fluid-structure interaction numerical framework to demonstrate that proper boundary conditions are necessary to capture not only correct wave propagation in a flow field, but also the behavior and response of the solid interacting with it. While most research on non-reflective boundary conditions has focused on fluids, little effort has been made in fluid-structure interaction settings. In this study, the effectiveness of the PML is closely examined in both pure fluid and fluid-structure interaction settings upon incorporating the PML algorithm into a fully-coupled fluid-structure interaction framework, the Immersed Finite Element Method. The performance of the PML boundary condition is evaluated and compared to reference solutions with a variety of benchmark test cases, including known and expected solutions of aeroacoustic wave propagation as well as vortex shedding and advection. The application of the PML in numerical simulations of fluid-structure interaction is then investigated to demonstrate the efficacy and necessity of such boundary treatment in order to capture the correct solid deformation and flow field without requiring a significantly larger computational domain.

  6. SciEthics Interactive: Science and Ethics Learning in a Virtual Environment

    ERIC Educational Resources Information Center

    Nadolny, Larysa; Woolfrey, Joan; Pierlott, Matthew; Kahn, Seth

    2013-01-01

    Learning in immersive 3D environments allows students to collaborate, build, and interact with difficult course concepts. This case study examines the design and development of the TransGen Island within the SciEthics Interactive project, a National Science Foundation-funded, 3D virtual world emphasizing learning science content in the context of…

  7. CrashEd--A Live Immersive, Learning Experience Embedding STEM Subjects in a Realistic, Interactive Crime Scene

    ERIC Educational Resources Information Center

    Bassford, Marie L.; Crisp, Annette; O'Sullivan, Angela; Bacon, Joanne; Fowler, Mark

    2016-01-01

    Interactive experiences are rapidly becoming popular via the surge of "escape rooms"; part game and part theatre, the "escape" experience is exploding globally, having gone from zero offered at the outset of 2010 to at least 2800 different experiences available worldwide today. CrashEd is an interactive learning experience that…

  8. A matrix-free implicit unstructured multigrid finite volume method for simulating structural dynamics and fluid structure interaction

    NASA Astrophysics Data System (ADS)

    Lv, X.; Zhao, Y.; Huang, X. Y.; Xia, G. H.; Su, X. H.

    2007-07-01

    A new three-dimensional (3D) matrix-free implicit unstructured multigrid finite volume (FV) solver for structural dynamics is presented in this paper. The solver is first validated using classical 2D and 3D cantilever problems. It is shown that very accurate predictions of the fundamental natural frequencies of the problems can be obtained by the solver with fast convergence rates. This method has been integrated into our existing FV compressible solver [X. Lv, Y. Zhao, et al., An efficient parallel/unstructured-multigrid preconditioned implicit method for simulating 3d unsteady compressible flows with moving objects, Journal of Computational Physics 215(2) (2006) 661-690] based on the immersed membrane method (IMM) [X. Lv, Y. Zhao, et al., as mentioned above]. Results for the interaction between the fluid and an immersed fixed-free cantilever are also presented to demonstrate the potential of this integrated fluid-structure interaction approach.

  9. Designing Experiential Modes: A Key Focus for Immersive Learning Environments

    ERIC Educational Resources Information Center

    Appelman, Robert

    2005-01-01

    A student sitting in a class and listening to an instructor talk is experiencing a particular mode of instruction sensed through visual and audio channels. She is aware that she is in the center of a classroom and also in close proximity to other students. Occasionally they gesture to the instructor at the front of the room, who stops talking when…

  10. A Critical Review of the Use of Virtual Reality in Construction Engineering Education and Training.

    PubMed

    Wang, Peng; Wu, Peng; Wang, Jun; Chi, Hung-Lin; Wang, Xiangyu

    2018-06-08

    Virtual Reality (VR) has been rapidly recognized and implemented in construction engineering education and training (CEET) in recent years due to its benefits of providing an engaging and immersive environment. The objective of this review is to critically collect and analyze VR applications in CEET, covering all VR-related journal papers published from 1997 to 2017. The review follows a three-stage systematic analysis of VR technologies, applications and future directions. It is found that the VR technologies adopted for CEET have evolved over time, from desktop-based VR, immersive VR and 3D game-based VR, to Building Information Modelling (BIM)-enabled VR. A sibling technology, Augmented Reality (AR), has also emerged in CEET adoptions in recent years. These technologies have been applied in architecture and design visualization, construction health and safety training, equipment and operational task training, as well as structural analysis. Future research directions, including the integration of VR with emerging education paradigms and visualization technologies, are also provided. The findings are useful for both researchers and educators seeking to integrate VR into their education and training programs to improve training performance.

  11. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2011-03-01

    Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks; pervasive wireless communication; knowledge based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies, and their role in future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and will stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.

  12. High sensitivity to multisensory conflicts in agoraphobia exhibited by virtual reality.

    PubMed

    Viaud-Delmon, Isabelle; Warusfel, Olivier; Seguelas, Angeline; Rio, Emmanuel; Jouvent, Roland

    2006-10-01

    The primary aim of this study was to evaluate the effect of auditory feedback in a VR system planned for clinical use and to address the different factors that should be taken into account in building a bimodal virtual environment (VE). We conducted an experiment in which we assessed spatial performances in agoraphobic patients and normal subjects comparing two kinds of VEs, visual alone (Vis) and auditory-visual (AVis), during separate sessions. Subjects were equipped with a head-mounted display coupled with an electromagnetic sensor system and immersed in a virtual town. Their task was to locate different landmarks and become familiar with the town. In the AVis condition subjects were equipped with the head-mounted display and headphones, which delivered a soundscape updated in real-time according to their movement in the virtual town. While general performances remained comparable across the conditions, the reported feeling of immersion was more compelling in the AVis environment. However, patients exhibited more cybersickness symptoms in this condition. The results of this study point to the multisensory integration deficit of agoraphobic patients and underline the need for further research on multimodal VR systems for clinical use.

  13. Virtualized Traffic: reconstructing traffic flows from discrete spatiotemporal data.

    PubMed

    Sewall, Jason; van den Berg, Jur; Lin, Ming C; Manocha, Dinesh

    2011-01-01

    We present a novel concept, Virtualized Traffic, to reconstruct and visualize continuous traffic flows from discrete spatiotemporal data provided by traffic sensors or generated artificially to enhance a sense of immersion in a dynamic virtual world. Given the positions of each car at two recorded locations on a highway and the corresponding time instances, our approach can reconstruct the traffic flows (i.e., the dynamic motions of multiple cars over time) between the two locations along the highway for immersive visualization of virtual cities or other environments. Our algorithm is applicable to high-density traffic on highways with an arbitrary number of lanes and takes into account the geometric, kinematic, and dynamic constraints on the cars. Our method reconstructs the car motion that automatically minimizes the number of lane changes, respects safety distance to other cars, and computes the acceleration necessary to obtain a smooth traffic flow subject to the given constraints. Furthermore, our framework can process a continuous stream of input data in real time, enabling the users to view virtualized traffic events in a virtual world as they occur. We demonstrate our reconstruction technique with both synthetic and real-world input. © 2011 IEEE. Published by the IEEE Computer Society.

  14. Influence of light curing units and fluoride mouthrinse on morphological surface and color stability of a nanofilled composite resin.

    PubMed

    De Oliveira, Ana Luísa Botta Martins; Botta, Ana Carolina; Campos, Juliana Álvares Duarte Bonini; Garcia, Patrícia Petromilli Nordi Sasso

    2014-11-01

    Composite resin is a dental material susceptible to color change over time, which limits the longevity of restorations made with this material. The influence of light curing units and different fluoride mouthrinses on the surface morphology and color stability of a nanofilled composite resin was evaluated. Specimens (N = 150) were prepared and polished. The experimental groups were divided according to the type of light source (halogen and LED) and immersion media (artificial saliva, 0.05% sodium fluoride solution-manipulated, Fluordent Reach, Oral B, Fluorgard). Specimens remained in artificial saliva for a 24-h baseline period. For 60 days, they were immersed in the solutions for 1 min. Color readings were taken at baseline and after 60 days of immersion. Surface morphology was analyzed by Scanning Electron Microscopy (SEM) after 60 days of immersion. Color change data were submitted to two-way Analysis of Variance and Tukey tests (α = 0.05). Surface morphology was qualitatively analyzed. The light source factor showed no significant effect (P = 0.281), the immersion media showed a significant effect (P < 0.001), and the interaction between factors showed no significant effect (P = 0.050). According to SEM observations, no difference was noted in the surface of the specimens polymerized by the different light sources, irrespective of the immersion medium. It was concluded that the light source did not influence the color stability of the composite, irrespective of the immersion media; among the fluoride solutions analyzed, Fluorgard promoted the greatest color change, although this was not clinically perceptible. The immersion media did not influence the morphology of the studied resin. © 2014 Wiley Periodicals, Inc.

  15. Immersion ultrasonography: simultaneous A-scan and B-scan.

    PubMed

    Coleman, D J; Dallow, R L; Smith, M E

    1979-01-01

    In eyes with opaque media, ophthalmic ultrasound provides a unique source of information that can dramatically affect the course of patient management. In addition, when an ocular abnormality can be visualized, ultrasonography provides information that supplements and complements other diagnostic testing. It provides documentation and differentiation of abnormal states, such as vitreous hemorrhage and intraocular tumor, as well as differentiation of orbital tumors from inflammatory causes of exophthalmos. Additional capabilities of ultrasound are biometric determinations for calculation of intraocular lens implant powers and drug-effectiveness studies. Maximal information is derived from ultrasonography when A-scan and B-scan techniques are employed simultaneously. Flexibility of electronics, variable-frequency transducers, and the use of several different manual scanning patterns aid in detection and interpretation of results. The immersion system of ultrasonography provides these features optimally.

  16. Spectroscopic properties of triangular silver nanoplates immobilized on polyelectrolyte multilayer-modified glass substrates

    NASA Astrophysics Data System (ADS)

    Rabor, Janice B.; Kawamura, Koki; Muko, Daiki; Kurawaki, Junichi; Niidome, Yasuro

    2017-07-01

    Fabrication of surface-immobilized silver nanostructures with reproducible plasmonic properties by the dip-coating technique is difficult due to shape alteration. To address this challenge, we used a polyelectrolyte multilayer to promote immobilization of as-received triangular silver nanoplates (TSNP) on a glass substrate through electrostatic interaction. The substrate-immobilized TSNP were characterized by absorption spectrophotometry and scanning electron microscopy. The bandwidth and peak position of the localized surface plasmon resonance (LSPR) bands can be tuned by simply varying the concentration of the colloidal solution and the immersion time. TSNP immobilized from a higher-concentration colloidal solution with a longer immersion time produced broadened LSPR bands in the near-IR region, while a lower concentration with a shorter immersion time produced narrower bands in the visible region. The shape of the nanoplates was retained even at long immersion times. Analysis of peak positions and bandwidths also revealed the point at which the dominant immobilized species changed from isolated nanoplates to aggregates.

  17. Validation of an immersive virtual reality system for training near and far space neglect in individuals with stroke: a pilot study.

    PubMed

    Yasuda, Kazuhiro; Muroi, Daisuke; Ohira, Masahiro; Iwata, Hiroyasu

    2017-10-01

    Unilateral spatial neglect (USN) is defined as an impaired ability to attend to and perceive stimuli on one side, and when present, it interferes seriously with daily life. These symptoms can exist for near and far spaces combined or independently, and it is important to provide effective intervention for both near and far space neglect. The purpose of this pilot study was to propose an immersive virtual reality (VR) rehabilitation program using a head-mounted display that is able to train both near and far space neglect, and to validate the immediate effect of the VR program on both near and far space neglect. Ten USN patients underwent the VR program with a pre-post design and no control. In the virtual environment, we developed visual searching and reaching tasks using an immersive VR system. Behavioral inattention test (BIT) scores obtained pre- and immediately post-VR program were compared. BIT scores obtained pre- and post-VR program revealed that far space neglect, but not near space neglect, improved promptly after the VR program. This effect for far space neglect was observed in the cancellation task, but not in the line bisection task. Positive effects of the immersive VR program for far space neglect are suggested by the results of the present pilot study. However, further studies with rigorous designs are needed to validate its clinical effectiveness.

  18. Enhancing radiological volumes with symbolic anatomy using image fusion and collaborative virtual reality.

    PubMed

    Silverstein, Jonathan C; Dech, Fred; Kouchoukos, Philip L

    2004-01-01

    Radiological volumes are typically reviewed by surgeons using cross-sections and iso-surface reconstructions. Applications that combine collaborative stereo volume visualization with symbolic anatomic information and data fusion would expand surgeons' capabilities in interpretation of data and in planning treatment. Such an application has not been seen clinically. We are developing methods to systematically combine symbolic anatomy (term hierarchies and iso-surface atlases) with patient data using data fusion. We describe our progress toward integrating these methods into our collaborative virtual reality application. The fully combined application will be a feature-rich stereo collaborative volume visualization environment for use by surgeons, in which DICOM datasets will self-report underlying anatomy with visual feedback. Using hierarchical navigation of SNOMED-CT anatomic terms integrated with our existing Tele-immersive DICOM-based volumetric rendering application, we will display polygonal representations of anatomic systems on the fly from menus that query a database. The methods and tools involved in this application development are SNOMED-CT, DICOM, the Visible Human, volumetric fusion, and C++ on a Tele-immersive platform. This application will allow us to identify structures and display polygonal representations from atlas data overlaid with the volume rendering. First, atlas data is automatically translated, rotated, and scaled to the patient data during loading using a public-domain volumetric fusion algorithm. This generates a modified symbolic representation of the underlying canonical anatomy. Then, through the use of collision detection or intersection testing of various transparent polygonal representations, the polygonal structures are highlighted in the volumetric representation while the SNOMED names are displayed. Thus, structural names and polygonal models are associated with the visualized DICOM data. This novel juxtaposition of information promises to expand surgeons' abilities to interpret images and plan treatment.

  19. CDPP Tools in the IMPEx infrastructure

    NASA Astrophysics Data System (ADS)

    Gangloff, Michel; Génot, Vincent; Bourrel, Nataliya; Hess, Sébastien; Khodachenko, Maxim; Modolo, Ronan; Kallio, Esa; Alexeev, Igor; Al-Ubaidi, Tarek; Cecconi, Baptiste; André, Nicolas; Budnik, Elena; Bouchemit, Myriam; Dufourg, Nicolas; Beigbeder, Laurent

    2014-05-01

    The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of plasma data products from space missions and ground observatories. Besides these activities, the CDPP developed services like AMDA (http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search, and cataloguing, and 3DView (http://3dview.cdpp.eu/), which provides immersive visualisations of planetary environments and is being further developed to include simulation and observational data. Both tools implement the IMPEx protocol (http://impexfp7.oeaw.ac.at/) to give access to outputs of simulation runs and models in planetary sciences from several providers like LATMOS, FMI, SINP; prototypes have also been built to access some UCLA and CCMC simulations. These tools and their interaction will be presented together with the IMPEx simulation data model (http://impex.latmos.ipsl.fr/tools/DataModel.htm) used for the interface to model databases.

  20. High resolution ultrasonic spectroscopy system for nondestructive evaluation

    NASA Technical Reports Server (NTRS)

    Chen, C. H.

    1991-01-01

    With increased demand for high resolution ultrasonic evaluation, computer-based systems or workstations become essential. The ultrasonic spectroscopy method of nondestructive evaluation (NDE) was used to develop a high resolution ultrasonic inspection system supported by modern signal processing, pattern recognition, and neural network technologies. The completed basic system consists of a 386/20 MHz PC (IBM AT compatible), a pulser/receiver, a digital oscilloscope with serial and parallel communications to the computer, an immersion tank with motor control of X-Y axis movement, and the supporting software package, IUNDE, for interactive ultrasonic evaluation. Although the hardware components are commercially available, the software development is entirely original. By integrating signal processing, pattern recognition, maximum entropy spectral analysis, and artificial neural network functions into the system, many NDE tasks can be performed. The high resolution graphics capability provides visualization of complex NDE problems. Phase 3 efforts involve intensive marketing of the software package and collaborative work with industrial sectors.

  1. Progress in video immersion using Panospheric imaging

    NASA Astrophysics Data System (ADS)

    Bogner, Stephen L.; Southwell, David T.; Penzes, Steven G.; Brosinsky, Chris A.; Anderson, Ron; Hanna, Doug M.

    1998-09-01

    Having demonstrated significant technical and marketplace advantages over other modalities for video immersion, Panospheric™ Imaging (PI) continues to evolve rapidly. This paper reports on progress achieved since AeroSense 97. The first practical field deployment of the technology occurred in June-August 1997 during the NASA-CMU 'Atacama Desert Trek' activity, where the Nomad mobile robot was teleoperated via immersive Panospheric™ imagery from a distance of several thousand kilometers. Research using teleoperated vehicles at DRES has also verified the exceptional utility of the PI technology for achieving high levels of situational awareness, operator confidence, and mission effectiveness. Important performance enhancements have been achieved with the completion of the 4th Generation PI DSP-based array processor system. The system is now able to provide dynamic full video-rate generation of spatial and computational transformations, resulting in a programmable and fully interactive immersive video telepresence. A new multi-CCD camera architecture has been created to exploit the bandwidth of this processor, yielding a well-matched PI system with greatly improved resolution. While the initial commercial application for this technology is expected to be video teleconferencing, it also appears to have excellent potential for application in the 'Immersive Cockpit' concept. Additional progress is reported in the areas of Long Wave Infrared PI Imaging, Stereo PI concepts, PI-based Video-Servoing concepts, PI-based Video Navigation concepts, and Foveation concepts (to merge localized high-resolution views with immersive views).

  2. Assessing a VR-based learning environment for anatomy education.

    PubMed

    Hoffman, H; Murray, M; Hettinger, L; Viirre, E

    1998-01-01

    The purpose of the research proposed herein is to develop an empirical, methodological tool for the assessment of visual depth perception in virtual environments (VEs). Our goal is to develop and employ a behaviorally-based method for assessing the impact of VE design features on the perception of visual depth as indexed by the performance of fundamental perceptual-motor activities. Specifically, in this experiment we will assess the effect of two dimensions of VE system design--(1) viewing condition or "level of immersion", and (2) layout/design of the VE--on the performance of an engaging, game-like task. The characteristics of the task to be employed are as follows--(1) it places no demands on cognition in the form of problem solving, retrieval of previously learned information, or other analytic activity, in order to assure that (2) variations in task performance can be exclusively attributed to the extent to which the experimental factors influence visual depth perception. Subjects' performance will be assessed in terms of the speed and accuracy of task performance, as well as underlying dimensions of performance such as workload, fatigue, and physiological well-being (i.e., cybersickness). The results of this experiment will provide important information on the effect of VE immersion and other VE design issues on human perception and performance. Further development, refinement, and validation of this behaviorally-based methodology will be pursued to provide user-centered design criteria for the design and use of VE systems.

  3. Improving spatial perception in 5-yr.-old Spanish children.

    PubMed

    Jiménez, Andrés Canto; Sicilia, Antonio Oña; Vera, Juan Granda

    2007-06-01

    Assimilation of distance perception was studied in 70 Spanish primary school children. This assimilation involves the generation of projective images, which are acquired through two mechanisms. One mechanism is spatial perception, wherein perceptual processes develop, ensuring successful immersion in space and the acquisition of visual cues which a person may use to interpret images seen in the distance. The other mechanism is movement through space so that these images are produced. The present study evaluated the influence on improvements in spatial perception of using increasingly larger spaces for training sessions within a motor skills program. Visual parameters were measured in relation to the capture and tracking of moving objects (ocular motility) and speed of detection (visual reaction time). Analysis showed that for the group trained in increasingly larger spaces, ocular motility and visual reaction time were significantly improved during different phases of the program.

  4. Visualizing Sea Level Rise with Augmented Reality

    NASA Astrophysics Data System (ADS)

    Kintisch, E. S.

    2013-12-01

    Looking Glass is an iPhone application that visualizes future scenarios of sea level rise in 3-D, overlaid on live camera imagery in situ. Using a technology known as augmented reality, the app allows a layperson user to explore various scenarios of sea level rise using a visual interface. The user can then see, in an immersive, dynamic way, how those scenarios would affect a real place. The first part of the experience activates users' cognitive, quantitative thinking process, teaching them how global sea level rise, tides, and storm surge contribute to flooding; the second allows an emotional response to a striking visual depiction of possible future catastrophe. This project represents a partnership between a science journalist, MIT, and the Rhode Island School of Design, and the talk will touch on lessons this project provides on structuring and executing such multidisciplinary efforts in future design projects.

  5. [A new concept in digestive surgery: the computer assisted surgical procedure, from virtual reality to telemanipulation].

    PubMed

    Marescaux, J; Clément, J M; Nord, M; Russier, Y; Tassetti, V; Mutter, D; Cotin, S; Ayache, N

    1997-11-01

    Surgical simulation increasingly appears to be an essential aspect of tomorrow's surgery. The development of a hepatic surgery simulator is an advanced concept calling for a new writing system which will transform the medical world: virtual reality. Virtual reality extends the perception of our five senses by representing more than the real state of things by means of computer sciences and robotics. It consists of three concepts: immersion, navigation and interaction. Three reasons have led us to develop this simulator: the first is to provide the surgeon with a comprehensive visualisation of the organ. The second reason is to allow for planning and surgical simulation that could be compared with the detailed flight plan for a commercial jet pilot. The third lies in the fact that virtual reality is an integrated part of the concept of computer assisted surgical procedure. The project consists of a sophisticated simulator which has to meet five requirements: visual fidelity, interactivity, physical properties, physiological properties, and sensory input and output. In this report we describe how to obtain a realistic 3D model of the liver from two-dimensional (2D) medical images for anatomical and surgical training. The introduction of a tumor and the consequent planning and virtual resection are also described, as are force feedback and real-time interaction.

  6. National Laboratory for Advanced Scientific Visualization at UNAM - Mexico

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Varela, Alfredo

    2016-04-01

    In 2015, the National Autonomous University of Mexico (UNAM) joined the family of universities and research centers where advanced visualization and computing play a key role in promoting and advancing missions in research, education, and community outreach, as well as business-oriented consulting. This initiative provides access to a great variety of advanced hardware and software resources and offers a range of consulting services spanning a variety of areas related to scientific visualization, among which are: neuroanatomy, embryonic development, genome-related studies, geosciences, geography, and physics- and mathematics-related disciplines. The National Laboratory for Advanced Scientific Visualization delivers services through three main infrastructure environments: the fully immersive 3D display system (the Cave), the high-resolution parallel visualization system (the Powerwall), and the high-resolution spherical display (the Earth Simulator). The entire visualization infrastructure is interconnected to a high-performance computing cluster (HPCC) called ADA, in honor of Ada Lovelace, considered to be the first computer programmer. The Cave is an extra-large, 3.6 m wide room with images projected on the front, left, and right walls, as well as the floor. Specialized crystal-eyes LCD-shutter glasses provide strong stereo depth perception, and a variety of tracking devices allow software to track the position of a user's hand, head, and wand. The Powerwall is designed to bring large amounts of complex data together through parallel computing for team interaction and collaboration. This system is composed of 24 (6x4) high-resolution, ultra-thin (2 mm) bezel monitors connected to a high-performance GPU cluster. The Earth Simulator is a large (60") high-resolution spherical display used for visualization of global-scale data such as geophysical, meteorological, climate, and ecology data. The HPCC ADA is a 1000+ computing-core system which offers parallel computing resources to applications that require large amounts of memory as well as large, fast parallel storage. The system's temperature is controlled by an energy- and space-efficient cooling solution based on large rear-door liquid-cooled heat exchangers. This state-of-the-art infrastructure will boost research activities in the region, offer a powerful scientific tool for teaching at undergraduate and graduate levels, and enhance association and cooperation with business-oriented organizations.

  7. Cross-Cultural Nonverbal Cue Immersive Training

    DTIC Science & Technology

    2008-12-01

    our daily lives and is paramount to collaborative interaction. Both verbal and nonverbal messages interact to form human communication. Verbal...the context of the conversation. Third, eye contact and behavior is considered to be the most important in human communication, which refers to...transmitted. The face is critical in human communication since it is the most visible during interaction. Facial and emotion expression relating to

  8. Effect of attractive interactions between polymers on the effective force acting between colloids immersed in a polymer system: Analytic liquid-state theory.

    PubMed

    Chervanyov, A I

    2016-12-28

    By making use of the polymer reference interaction site model, we analytically study the effect of attractive interactions between polymers on the effective forces acting between colloids immersed in a polymer system. The performed theoretical analysis has no restrictions with respect to the polymer density and relative sizes of the colloids and polymers. The polymer-mediated (PM) potential acting between colloids is shown to depend significantly on the strength and range of the polymer-polymer interactions. In the nano-particle limit, where the colloid radius is much smaller than the polymer gyration radius, the presence of attractive polymer-polymer interactions causes only quantitative changes to the PM potential. In the opposite limit of relatively large colloids, the polymer-polymer interactions revert the sign of the total effective force acting between colloids, so that this force becomes attractive at sufficiently large polymer densities. With the objective of studying the intricate interplay between the attractive PM forces and steric repulsion in different polymer density regimes, we calculate the second virial coefficient B of the total effective potential acting between colloids. The dependence of B on the polymer density is discussed in detail, revealing several novel features of the PM interactions caused by the presence of attractive polymer-polymer interactions.
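    The second virial coefficient mentioned above follows from the standard statistical-mechanics relation B2 = -2π ∫ (e^(-U(r)/kT) - 1) r² dr. A minimal numerical sketch (using an illustrative hard-sphere potential rather than the PM potential of the paper; function names are hypothetical):

```python
import math

def second_virial(potential, beta, r_max=10.0, n=100000):
    """Evaluate B2 = -2*pi * integral_0^inf (exp(-beta*U(r)) - 1) r^2 dr
    with a simple midpoint rule, truncating the integral at r_max."""
    dr = r_max / n
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * dr
        total += (math.exp(-beta * potential(r)) - 1.0) * r * r
    return -2.0 * math.pi * total * dr

# Hard spheres of diameter sigma: U = infinity for r < sigma, 0 otherwise.
# The exact result in this case is B2 = 2*pi*sigma**3/3.
sigma = 1.0
hard_sphere = lambda r: float('inf') if r < sigma else 0.0
B2 = second_virial(hard_sphere, beta=1.0)
```

    A positive B2 signals net repulsion between the colloids, while a sign change to negative values marks the onset of the net PM attraction discussed in the abstract.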

  9. Interactive Simulator Training in Civil Construction: Evaluation from the Trainer's Perspective

    ERIC Educational Resources Information Center

    Tichon, Jennifer; Diver, Phil

    2012-01-01

    The popularity of simulators to augment training programs for operators of heavy machinery has been growing across several industries including mining, rail and more recently construction. High-fidelity, interactive simulation is typically achieved through complete immersion in brief, stressful and complex VR scenarios. The use of simulation…

  10. Hybrid finite difference/finite element immersed boundary method.

    PubMed

    Griffith, Boyce E; Luo, Xiaoyu

    2017-12-01

    The immersed boundary method is an approach to fluid-structure interaction that uses a Lagrangian description of the structural deformations, stresses, and forces along with an Eulerian description of the momentum, viscosity, and incompressibility of the fluid-structure system. The original immersed boundary methods described immersed elastic structures using systems of flexible fibers, and even now, most immersed boundary methods still require Lagrangian meshes that are finer than the Eulerian grid. This work introduces a coupling scheme for the immersed boundary method to link the Lagrangian and Eulerian variables that facilitates independent spatial discretizations for the structure and background grid. This approach uses a finite element discretization of the structure while retaining a finite difference scheme for the Eulerian variables. We apply this method to benchmark problems involving elastic, rigid, and actively contracting structures, including an idealized model of the left ventricle of the heart. Our tests include cases in which, for a fixed Eulerian grid spacing, coarser Lagrangian structural meshes yield discretization errors that are as much as several orders of magnitude smaller than errors obtained using finer structural meshes. The Lagrangian-Eulerian coupling approach developed in this work enables the effective use of these coarse structural meshes with the immersed boundary method. This work also contrasts two different weak forms of the equations, one of which is demonstrated to be more effective for the coarse structural discretizations facilitated by our coupling approach. © 2017 The Authors. International Journal for Numerical Methods in Biomedical Engineering published by John Wiley & Sons Ltd.
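    The Lagrangian-Eulerian coupling common to all immersed boundary methods spreads structural forces onto the fluid grid through a regularized delta function. A one-dimensional sketch using Peskin's classical 4-point kernel (an illustration of the general idea, not the hybrid finite difference/finite element scheme of this paper; function names are hypothetical):

```python
import math

def peskin_delta(r):
    """Peskin's 4-point regularized delta kernel (support |r| <= 2)."""
    r = abs(r)
    if r < 1.0:
        return 0.125 * (3.0 - 2.0 * r + math.sqrt(1.0 + 4.0 * r - 4.0 * r * r))
    if r < 2.0:
        return 0.125 * (5.0 - 2.0 * r - math.sqrt(-7.0 + 12.0 * r - 4.0 * r * r))
    return 0.0

def spread_force(X, F, h, n):
    """Spread a Lagrangian point force F located at X onto an Eulerian grid
    of n cells with spacing h: f_i = F * delta((x_i - X)/h) / h, where the
    grid points sit at cell centers x_i = (i + 0.5) * h."""
    f = [0.0] * n
    for i in range(n):
        x_i = (i + 0.5) * h
        f[i] = F * peskin_delta((x_i - X) / h) / h
    return f

# The kernel's discrete sum condition guarantees conservation of the total
# force under spreading: sum_i f_i * h equals F.
f = spread_force(X=0.37, F=2.5, h=0.05, n=40)
```

    Velocity interpolation back to the Lagrangian points uses the same kernel, which is what makes the coupling conservative; the paper's contribution is to let the structural side of this exchange live on an independent finite element mesh.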

  11. Resist development status for immersion lithography

    NASA Astrophysics Data System (ADS)

    Tsuji, Hiromitsu; Yoshida, Masaaki; Ishizuka, Keita; Hirano, Tomoyuki; Endo, Kotaro; Sato, Mitsuru

    2005-05-01

    Immersion lithography has already demonstrated superior performance for next generation semiconductor manufacturing, while some challenges with the contact between immersion fluids and resist still remain. There are many interactions to be considered with regard to the solid-liquid interface. Resist elution in particular requires very careful attention, since the impact on the lens and fluid supply system in the exposure tool could pose a significant risk at the manufacturing stage. TOK developed a screening procedure to detect resist elution of ion species down to ppb levels during non-exposure and post-exposure steps. It was found that PAG cation elution was affected by molecular weight and structure, while PAG anion elution was dependent on molecular structure and mobility. In this paper, lithographic performance is also discussed with the low-elution type resist.

  12. Molecular modeling of the process of reversible dissolution of the collagen protein under the action of tissue-clearing agents

    NASA Astrophysics Data System (ADS)

    Dvoretsky, K. N.; Berezin, K. V.; Chernavina, M. L.; Likhter, A. M.; Shagautdinova, I. T.; Antonova, E. M.; Rybakov, A. V.; Grechukhina, O. N.; Tuchin, V. V.

    2018-04-01

    The interaction of glycerol immersion agent with collagen mimetic peptide ((GPH)9)3 and a fragment of the microfibril 5((GPH)12)3 was studied by the classical molecular dynamics method using the GROMACS software. The change in geometric parameters of collagen α-chains at various concentrations of an aqueous solution of glycerol is analyzed. It is shown that these changes nonlinearly depend on the concentration and are limited to a certain level, which correlates with the experimental data on optical clearing efficiency of human skin. A hypothesis on the cause of the decreased efficiency of optical skin clearing at high immersion agent concentrations is put forward. The molecular mechanism of immersion optical clearing of biological tissues is discussed.

  13. Immersive Environments - A Connectivist Approach

    NASA Astrophysics Data System (ADS)

    Loureiro, Ana; Bettencourt, Teresa

    We are conducting a research project with the aim of achieving better and more efficient ways to facilitate teaching and learning in Higher Level Education. We have chosen virtual environments, with particular emphasis on the Second Life® platform augmented by web 2.0 tools, to develop the study. The Second Life® environment has some interesting characteristics that captured our attention: it is immersive; it is a real-world simulator; it is a social network; it allows real-time communication, cooperation, collaboration and interaction; and it is a safe and controlled environment. We specifically chose tools from web 2.0 that enable sharing and a collaborative way of learning. Through understanding the characteristics of this learning environment, we believe that immersive learning, along with other virtual tools, can be integrated into today's pedagogical practices.

  14. Sensor Spatial Distortion, Visual Latency, and Update Rate Effects on 3D Tracking in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Ellis, S. R.; Adelstein, B. D.; Baumeler, S.; Jense, G. J.; Jacoby, R. H.; Trejo, Leonard (Technical Monitor)

    1998-01-01

    Several common defects that we have sought to minimize in immersive virtual environments are: static sensor spatial distortion, visual latency, and low update rates. Human performance within our environments during large-amplitude 3D tracking was assessed by objective and subjective methods in the presence and absence of these defects. Results show that 1) removal of our relatively small spatial sensor distortion had minor effects on the tracking activity, 2) an Adapted Cooper-Harper controllability scale proved the most sensitive subjective indicator of the degradation of dynamic fidelity caused by increasing latency and decreasing frame rates, and 3) performance, as measured by normalized RMS tracking error or subjective impressions, was more markedly influenced by changing visual latency than by update rate.
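    Normalized RMS tracking error, the objective measure cited above, can be computed as the RMS target-response difference divided by the RMS amplitude of the target signal; the exact normalization used by the authors is an assumption here, and all names are hypothetical. A minimal sketch:

```python
import math

def normalized_rms_error(target, response):
    """RMS of the tracking error, normalized by the RMS amplitude of the
    target signal, so that a score of 1.0 means the error is as large as
    the target excursion itself."""
    n = len(target)
    err = math.sqrt(sum((t - r) ** 2 for t, r in zip(target, response)) / n)
    amp = math.sqrt(sum(t ** 2 for t in target) / n)
    return err / amp

# A response lagging a sinusoidal target by one sample yields a small
# but non-zero normalized error:
target = [math.sin(0.1 * k) for k in range(500)]
response = [0.0] + target[:-1]
score = normalized_rms_error(target, response)
```

    Because the score is dimensionless, it allows tracking performance to be compared across conditions with different target amplitudes, such as the latency and update-rate manipulations described above.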

  15. Manifold compositions, music visualization, and scientific sonification in an immersive virtual-reality environment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaper, H. G.

    1998-01-05

    An interdisciplinary project encompassing sound synthesis, music composition, sonification, and visualization of music is facilitated by the high-performance computing capabilities and the virtual-reality environments available at Argonne National Laboratory. The paper describes the main features of the project's centerpiece, DIASS (Digital Instrument for Additive Sound Synthesis); "A.N.L.-folds", an equivalence class of compositions produced with DIASS; and the application of DIASS in two experiments in the sonification of complex scientific data. Some of the larger issues connected with this project, such as the changing ways in which both scientists and composers perform their tasks, are briefly discussed.

  16. Postural and Spatial Orientation Driven by Virtual Reality

    PubMed Central

    Keshner, Emily A.; Kenyon, Robert V.

    2009-01-01

    Orientation in space is a perceptual variable intimately related to postural orientation that relies on visual and vestibular signals to correctly identify our position relative to vertical. We have combined a virtual environment with motion of a posture platform to produce visual-vestibular conditions that allow us to explore how motion of the visual environment may affect perception of vertical and, consequently, affect postural stabilizing responses. In order to involve a higher level perceptual process, we needed to create a visual environment that was immersive. We did this by developing visual scenes that possess contextual information using color, texture, and 3-dimensional structures. Update latency of the visual scene was close to physiological latencies of the vestibulo-ocular reflex. Using this system we found that even when healthy young adults stand and walk on a stable support surface, they are unable to ignore wide field of view visual motion and they adapt their postural orientation to the parameters of the visual motion. Balance training within our environment elicited measurable rehabilitation outcomes. Thus we believe that virtual environments can serve as a clinical tool for evaluation and training of movement in situations that closely reflect conditions found in the physical world. PMID:19592796

  17. Neural mechanisms of limb position estimation in the primate brain.

    PubMed

    Shi, Ying; Buneo, Christopher A

    2011-01-01

    Understanding the neural mechanisms of limb position estimation is important both for comprehending the neural control of goal-directed arm movements and for developing neuroprosthetic systems designed to replace lost limb function. Here we examined the role of area 5 of the posterior parietal cortex in estimating limb position based on visual and somatic (proprioceptive, efference copy) signals. Single unit recordings were obtained as monkeys reached to visual targets presented in a semi-immersive virtual reality environment. On half of the trials animals were required to maintain their limb position at these targets while receiving both visual and non-visual feedback of their arm position, while on the other trials visual feedback was withheld. When examined individually, many area 5 neurons were tuned to the position of the limb in the workspace, but very few neurons modulated their firing rates based on the presence/absence of visual feedback. At the population level, however, decoding of limb position was somewhat more accurate when visual feedback was provided. These findings support a role for area 5 in limb position estimation but also suggest that visual signals regarding limb position are only weakly represented in this area, and only at the population level.
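    The population-level decoding mentioned above can be illustrated with a minimal sketch: firing rates of a simulated neural population, each unit linearly tuned to 2-D limb position, are regressed against position with a linear least-squares decoder. The population size, tuning model, noise level, and train/test split are all illustrative assumptions, not the study's actual recordings or decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: 40 neurons, each tuned (linearly, for simplicity)
# to the 2-D position of the limb in the workspace, plus Gaussian noise.
n_neurons, n_trials = 40, 200
true_pos = rng.uniform(-10, 10, size=(n_trials, 2))   # limb positions (cm)
tuning = rng.normal(size=(2, n_neurons))              # per-neuron position weights
baseline = rng.uniform(5, 15, size=n_neurons)         # baseline rates (spikes/s)
rates = true_pos @ tuning + baseline + rng.normal(0, 1.0, size=(n_trials, n_neurons))

# Linear decoder: regress position on firing rates (train/test split).
train_n = 150
X_train = np.column_stack([rates[:train_n], np.ones(train_n)])   # add intercept
W, *_ = np.linalg.lstsq(X_train, true_pos[:train_n], rcond=None)

X_test = np.column_stack([rates[train_n:], np.ones(n_trials - train_n)])
pred = X_test @ W
rmse = np.sqrt(np.mean((pred - true_pos[train_n:]) ** 2))
print(f"decoding RMSE: {rmse:.2f} cm")
```

    In this toy setting, adding a second feedback-dependent signal to the rates would improve decoding only in the aggregate, mirroring the paper's observation that visual feedback effects appear at the population rather than the single-neuron level.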

  18. An immersed-shell method for modelling fluid–structure interactions

    PubMed Central

    Viré, A.; Xiang, J.; Pain, C. C.

    2015-01-01

    The paper presents a novel method for numerically modelling fluid–structure interactions. The method consists of solving the fluid-dynamics equations on an extended domain, where the computational mesh covers both fluid and solid structures. The fluid and solid velocities are relaxed to one another through a penalty force. The latter acts on a thin shell surrounding the solid structures. Additionally, the shell is represented on the extended domain by a non-zero shell-concentration field, which is obtained by conservatively mapping the shell mesh onto the extended mesh. The paper outlines the theory underpinning this novel method, referred to as the immersed-shell approach. It also shows how the coupling between a fluid- and a structural-dynamics solver is achieved. At this stage, results are shown for cases of fundamental interest. PMID:25583857
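    The core coupling idea, relaxing the fluid velocity toward the solid velocity through a penalty force restricted to a thin shell marked by a non-zero concentration field, can be sketched in one dimension. The grid, penalty coefficient, time step, and explicit update are illustrative assumptions for the sketch, not the paper's discretization.

```python
import numpy as np

# 1-D sketch of the penalty coupling: the fluid velocity is relaxed toward
# the solid velocity wherever the shell-concentration field alpha is non-zero.
nx, dt, beta = 100, 1e-3, 500.0    # grid size, time step, penalty coefficient (assumed)
x = np.linspace(0.0, 1.0, nx)

u_fluid = np.ones(nx)              # uniform background flow
u_solid = 0.0                      # stationary structure

# Shell concentration: a thin band around x = 0.5 representing the shell
# mesh conservatively mapped onto the extended (fluid) mesh.
alpha = np.where(np.abs(x - 0.5) < 0.02, 1.0, 0.0)

# Explicit relaxation steps: du/dt = beta * alpha * (u_solid - u_fluid)
for _ in range(1000):
    penalty = beta * alpha * (u_solid - u_fluid)
    u_fluid = u_fluid + dt * penalty

# Inside the shell the fluid velocity has relaxed to the solid's; outside
# the shell the flow is untouched.
print(u_fluid[np.abs(x - 0.5) < 0.02].max())
```

    The same penalty force would enter the momentum equation with opposite sign on the structural side, which is how the fluid and solid solvers exchange information in this family of immersed methods.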

  19. Visual influence on path integration in darkness indicates a multimodal representation of large-scale space

    PubMed Central

    Tcheang, Lili; Bülthoff, Heinrich H.; Burgess, Neil

    2011-01-01

    Our ability to return to the start of a route recently performed in darkness is thought to reflect path integration of motion-related information. Here we provide evidence that motion-related interoceptive representations (proprioceptive, vestibular, and motor efference copy) combine with visual representations to form a single multimodal representation guiding navigation. We used immersive virtual reality to decouple visual input from motion-related interoception by manipulating the rotation or translation gain of the visual projection. First, participants walked an outbound path with both visual and interoceptive input, and returned to the start in darkness, demonstrating the influences of both visual and interoceptive information in a virtual reality environment. Next, participants adapted to visual rotation gains in the virtual environment, and then performed the path integration task entirely in darkness. Our findings were accurately predicted by a quantitative model in which visual and interoceptive inputs combine into a single multimodal representation guiding navigation, and are incompatible with a model of separate visual and interoceptive influences on action (in which path integration in darkness must rely solely on interoceptive representations). Overall, our findings suggest that a combined multimodal representation guides large-scale navigation, consistent with a role for visual imagery or a cognitive map. PMID:21199934
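    The single-multimodal-representation model described above can be caricatured as a reliability-weighted fusion of visual and interoceptive rotation signals: manipulating the visual gain shifts the combined estimate that later guides path integration in darkness. The weight and gain values below are illustrative assumptions, not the study's fitted parameters.

```python
# Minimal sketch of a single multimodal representation for turn estimation:
# visual and interoceptive rotation signals are fused with fixed reliability
# weights, so adaptation to a manipulated visual gain biases the estimate
# that subsequently guides walking in darkness.

def combined_turn(physical_turn_deg, visual_gain, w_visual=0.6):
    """Reliability-weighted fusion of visual and interoceptive turn signals."""
    visual = physical_turn_deg * visual_gain   # rotation shown by the projection
    interoceptive = physical_turn_deg          # vestibular/proprioceptive signal
    return w_visual * visual + (1.0 - w_visual) * interoceptive

# With veridical vision (gain 1.0) the estimate matches the physical turn.
print(combined_turn(90.0, visual_gain=1.0))

# With a 1.3x visual rotation gain, a 90 degree physical turn is encoded as
# larger, predicting systematic errors on the homing path in darkness.
print(combined_turn(90.0, visual_gain=1.3))
```

    A separate-influences model would instead predict that removing vision leaves the interoceptive estimate untouched; the paper's data favor the fused account sketched here.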

  20. NASA Virtual Glovebox: An Immersive Virtual Desktop Environment for Training Astronauts in Life Science Experiments

    NASA Technical Reports Server (NTRS)

    Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard

    2003-01-01

    The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real-time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.

  1. Long term storage test of titanium material with liquid fluorine propellant

    NASA Technical Reports Server (NTRS)

    Denson, J. R.; English, W. D.; Roth, J.; Toy, A.

    1979-01-01

    The compatibility of 6AL-4V Ti with propellant grade GF2 and LF2 at 77 K for up to 3 years was investigated. Titanium double coupons, annealed or heat treated, with 16 or 64 RMS finishes, were immersed in F2 in individual Pyrex capsules and stored under LN2 for 29 and 39 months. Pre- and post-immersion tests were performed on the propellant and coupons. Chemical analysis of the propellant did not reveal any significant changes due to titanium corrosion. Gravimetric, visual, microscopic, and metallurgical examination with pitting analysis did not reveal gross corrosion of the titanium, although pitting appears to be greater after 39 months' exposure. The increase in pit size and number raises the possibility of unpredictable crack propagation instability. Fracture toughness tests are necessary to define this possibility.

  2. Scientific Visualization for Atmospheric Data Analysis in Collaborative Virtual Environments

    NASA Astrophysics Data System (ADS)

    Engelke, Wito; Flatken, Markus; Garcia, Arturo S.; Bar, Christian; Gerndt, Andreas

    2016-04-01

    1 INTRODUCTION The three-year European research project CROSS DRIVE (Collaborative Rover Operations and Planetary Science Analysis System based on Distributed Remote and Interactive Virtual Environments) started in January 2014. The research and development within this project is motivated by three use case studies: landing site characterization, atmospheric science and rover target selection [1]. Currently the implementation for the second use case is in its final phase [2]. Here, the requirements were generated based on the domain experts' input and led to the development and integration of appropriate methods for visualization and analysis of atmospheric data. The methods range from volume rendering, interactive slicing, and iso-surface techniques to interactive probing. All visualization methods are integrated in DLR's Terrain Rendering application. With this, the high-resolution surface data visualization can be enriched with additional methods appropriate for atmospheric data sets. This results in an integrated virtual environment where scientists can interactively explore their data sets directly within the correct context. The data sets include volumetric data of the martian atmosphere, precomputed two-dimensional maps and vertical profiles. In most cases the surface data as well as the atmospheric data has global coverage and is time dependent. Furthermore, all interaction is synchronized between different connected application instances, allowing for collaborative sessions between distant experts. 2 VISUALIZATION TECHNIQUES Although the application is currently used for visualization of data sets related to Mars, the techniques can be applied to other data sets as well. Currently the prototype is capable of handling 2D and 2.5D surface data as well as 4D atmospheric data. 
Specifically, the surface data is presented using an LoD approach which is based on the HEALPix tessellation of a sphere [3, 4, 5] and can handle data sets in the order of terabytes. The combination of different data sources (e.g., MOLA, HRSC, HiRISE) and selection of presented data (e.g., infrared, spectral, imagery) is also supported. Furthermore, the data is presented unchanged and with the highest possible resolution for the target setup (e.g., power-wall, workstation, laptop) and view distance. The visualization techniques for the volumetric data sets can handle VTK [6] based data sets and also support different grid types as well as a time component. In detail, the integrated volume rendering uses a GPU-based ray casting algorithm which was adapted to work in spherical coordinate systems. This approach results in interactive frame rates without compromising visual fidelity. Besides direct visualization via volume rendering, the prototype supports interactive slicing, extraction of iso-surfaces and probing. The latter can also be used for side-by-side comparison and on-the-fly diagram generation within the application. Similarly to the surface data, a combination of different data sources is supported as well. For example, the extracted iso-surface of a scalar pressure field can be used for the visualization of the temperature. The software development is supported by the ViSTA VR-toolkit [7] and supports different target systems as well as a wide range of VR devices. Furthermore, the prototype is scalable to run on laptops, workstations and cluster setups. REFERENCES [1] A. S. Garcia, D. J. Roberts, T. Fernando, C. Bar, R. Wolff, J. Dodiya, W. Engelke, and A. Gerndt, "A collaborative workspace architecture for strengthening collaboration among space scientists," in IEEE Aerospace Conference, (Big Sky, Montana, USA), 7-14 March 2015. [2] W. Engelke, "Mars Cartography VR System 2/3." German Aerospace Center (DLR), 2015. Project Deliverable D4.2. [3] E. Hivon, F. 
K. Hansen, and A. J. Banday, "The HEALPix primer," arXiv preprint astro-ph/9905275, 1999. [4] K. M. Gorski, E. Hivon, A. Banday, B. D. Wandelt, F. K. Hansen, M. Reinecke, and M. Bartelmann, "HEALPix: a framework for high-resolution discretization and fast analysis of data distributed on the sphere," The Astrophysical Journal, vol. 622, no. 2, p. 759, 2005. [5] R. Westerteiger, A. Gerndt, and B. Hamann, "Spherical terrain rendering using the hierarchical HEALPix grid," VLUDS, vol. 11, pp. 13-23, 2011. [6] W. Schroeder, K. Martin, and B. Lorensen, The Visualization Toolkit. Kitware, 4th ed., 2006. [7] T. van Reimersdahl, T. Kuhlen, A. Gerndt, J. Henrichs, and C. Bischof, "ViSTA: a multimodal, platform-independent VR-toolkit based on WTK, VTK, and MPI," in Proceedings of the 4th International Immersive Projection Technology Workshop (IPT), 2000.
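    The idea of adapting a ray caster to a spherical coordinate system, as the record's volume rendering does, can be sketched on the CPU: march along the ray in Cartesian space, convert each sample point to (radius, latitude, longitude) for the field lookup, and composite front to back with early ray termination. The scalar field, transfer function, and step counts below are illustrative stand-ins, not the DLR implementation.

```python
import numpy as np

def scalar_field(r, lat, lon):
    """Toy atmospheric field: density falls off with altitude above r = 1.
    (lat/lon accepted to mirror a spherical-grid lookup; unused in this toy.)"""
    return np.exp(-10.0 * np.clip(r - 1.0, 0.0, None))

def transfer(value):
    """Map a scalar sample to (color intensity, opacity)."""
    return value, 0.05 * value

def cast_ray(origin, direction, n_steps=200, t_max=4.0):
    """March a ray through the spherical volume, compositing front to back."""
    direction = direction / np.linalg.norm(direction)
    dt = t_max / n_steps
    color, alpha = 0.0, 0.0
    for i in range(n_steps):
        p = origin + (i + 0.5) * dt * direction    # Cartesian sample point
        r = np.linalg.norm(p)
        if r < 1.0:                                # hit the planet: terminate
            break
        lat = np.arcsin(p[2] / r)                  # convert to spherical coords
        lon = np.arctan2(p[1], p[0])
        c, a = transfer(scalar_field(r, lat, lon))
        color += (1.0 - alpha) * a * c             # front-to-back compositing
        alpha += (1.0 - alpha) * a
        if alpha > 0.99:                           # early ray termination
            break
    return color, alpha

# A ray grazing the atmosphere accumulates far more opacity than one missing it.
grazing = cast_ray(np.array([-3.0, 0.0, 1.05]), np.array([1.0, 0.0, 0.0]))
missing = cast_ray(np.array([-3.0, 0.0, 2.50]), np.array([1.0, 0.0, 0.0]))
print(grazing[1] > missing[1])
```

    On the GPU, the same per-ray loop runs in a fragment or compute shader, which is what keeps the frame rates interactive for 4D atmospheric data.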

  3. The Two-Way Language Bridge: Co-Constructing Bilingual Language Learning Opportunities

    ERIC Educational Resources Information Center

    Martin-Beltran, Melinda

    2010-01-01

    Using a sociocultural theoretical lens, this study examines the nature of student interactions in a dual immersion school to analyze affordances for bilingual language learning, language exchange, and co-construction of language expertise. This article focuses on data from audio- and video-recorded interactions of fifth-grade students engaged in…

  4. Interacting with the World--Moving History beyond the Classroom.

    ERIC Educational Resources Information Center

    Bebensee, Larry; Evans, Mark

    1990-01-01

    Describes a project in which students from 11 countries took part in an Arab-Israeli Conflict simulation-- an interactive communication simulation that immersed students in the complex dynamics of international reality. The project is a pilot program from the Region of Peel School Board, Ontario. Example of simulation is appended. (SLM)

  5. What Matters Most when Students and Teachers Use Interactive Whiteboards in Mathematics Classrooms?

    ERIC Educational Resources Information Center

    McQuillan, Kimberley; Northcote, Maria; Beamish, Peter

    2012-01-01

    Teachers are encouraged to immerse their students in rich and engaging learning environments (NSW Department of Education and Training, 2003). One teaching tool that can facilitate the creation of rich learning environments is the interactive whiteboard (IWB) (Baker, 2009). When teaching mathematics, the varied representational aspects of IWBs can…

  6. Color stability, radiopacity, and chemical characteristics of white mineral trioxide aggregate associated with 2 different vehicles in contact with blood.

    PubMed

    Guimarães, Bruno Martini; Tartari, Talita; Marciano, Marina Angélica; Vivan, Rodrigo Ricci; Mondeli, Rafael Francisco Lia; Camilleri, Josette; Duarte, Marco Antonio Hungaro

    2015-06-01

    Discoloration of mineral trioxide aggregate (MTA) can be exacerbated by the interaction of the cement with body fluids such as blood. This study aimed to analyze the color alteration, chemical characteristics, and radiopacity of MTA manipulated with 2 different vehicles after immersion in blood or distilled water (DW). MTA mixed with 100% DW or 80% DW/20% propylene glycol (PG) as vehicles were placed into rubber rings and incubated at 37°C and 100% relative humidity until set. Color assessment and scanning electron microscopy/energy-dispersive spectroscopy analysis were performed after setting and repeated 7, 15, and 30 days after immersion in blood and DW. Statistical analysis for color alteration and radiopacity was performed using nonparametric Kruskal-Wallis and Dunn tests (P < .05). When 80% DW/20% PG was used as the vehicle, significantly lower color alterations were observed for all time periods compared with 100% DW when immersed in blood (P < .05). All surfaces displayed morphologic changes after immersion in both media because of loss of bismuth. A decrease in radiopacity was observed over time in all groups, with a statistically significant difference after 30 days for the DW group immersed in blood and the 80% DW/20% PG group immersed in both media (P < .05). The ratio of 80% DW/20% PG as a vehicle for MTA results in a lower color alteration when in contact with blood. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  7. A Theory of Heterogeneous Ice Nucleation in the Immersion Mode

    NASA Astrophysics Data System (ADS)

    Barahona, D.

    2017-12-01

    Immersion ice nucleation is likely involved in the initiation of precipitation and determines to a large extent the phase partitioning in convective clouds. Theoretical models commonly used to describe immersion freezing in atmospheric models are based on classical nucleation theory (CNT). CNT, however, neglects important interactions near the immersed particle that may affect nucleation rates. This work introduces a new theory of immersion freezing based on two premises. First, immersion ice nucleation is mediated by the modification of the properties of water near the particle-liquid interface rather than by the geometry of the ice germ. Second, the same mechanism that leads to the decrease in the work of germ formation also decreases the mobility of water molecules near the immersed particle. These two premises make it possible to establish general thermodynamic constraints on the ice nucleation rate. Analysis of the new theory shows that active sites likely trigger ice nucleation, but they control neither the overall nucleation rate nor the probability of freezing. It also suggests that materials with different ice nucleation efficiency may exhibit similar freezing temperatures under similar conditions but differ in their sensitivity to particle surface area and cooling rate. The theory suggests that many species are very efficient at nucleating ice and that highly effective ice-nucleating particles (INPs) are not uncommon in the atmosphere; however, ice nucleation rates may be slower than currently believed. Predicted nucleation rates show good agreement with experimental results for a diverse set of atmospherically relevant materials including dust, black carbon and bacterial ice-nucleating particles. The application of the new theory within the NASA Global Earth System Model (GEOS-5) is also discussed.

  8. Virtual reality as a tool for cross-cultural communication: an example from military team training

    NASA Astrophysics Data System (ADS)

    Downes-Martin, Stephen; Long, Mark; Alexander, Joanna R.

    1992-06-01

    A major problem with communication across cultures, whether professional or national, is that simple language translation is often insufficient to communicate the concepts. This is especially true when the communicators come from highly specialized fields of knowledge or from national cultures with long histories of divergence. This problem becomes critical when the goal of the communication is national negotiation dealing with such high-risk items as arms negotiation or trade wars. Virtual Reality technology has considerable potential for facilitating communication across cultures, by immersing the communicators within multiple visual representations of the concepts, and providing control over those representations. Military distributed team training provides a model for virtual reality suitable for cross-cultural communication such as negotiation. In both team training and negotiation, the participants must cooperate, agree on a set of goals, and achieve mastery over the concepts being negotiated. Team training technologies suitable for supporting cross-cultural negotiation exist (branch wargaming, computer image generation and visualization, distributed simulation), and have developed along different lines than traditional virtual reality technology. Team training de-emphasizes the realism of physiological interfaces between the human and the virtual reality, and emphasizes the interaction of humans with each other and with intelligent simulated agents within the virtual reality. This approach to virtual reality is suggested as being more fruitful for future work.

  9. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads.

    PubMed

    Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-05-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.

  10. Interactive Molecular Graphics for Augmented Reality Using HoloLens.

    PubMed

    Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas

    2018-06-13

    Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.

  11. Simulations of two sedimenting-interacting spheres with different sizes and initial configurations using immersed boundary method

    NASA Astrophysics Data System (ADS)

    Liao, Chuan-Chieh; Hsiao, Wen-Wei; Lin, Ting-Yu; Lin, Chao-An

    2015-06-01

    Numerical investigations are carried out for the drafting, kissing and tumbling (DKT) phenomenon of two freely falling spheres within a long container by using an immersed-boundary method. The method is first validated with flows induced by a sphere settling under gravity in a small container, for which experimental data are available. The hydrodynamic interactions of two spheres are then studied for different sizes and initial configurations. When a regular sphere is placed below the larger one, the duration of kissing decreases as the diameter ratio increases. On the other hand, the duration of the kissing stage increases with the diameter ratio when the large sphere is placed below the regular one, and there are no DKT interactions beyond a threshold diameter ratio. Also, the gap between homogeneous spheres remains constant at the terminal velocity, whereas the gap between inhomogeneous spheres increases due to their differential terminal velocities.
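    The size dependence behind the differential terminal velocities can be sketched with the Stokes-drag estimate, v = (rho_p - rho_f) g d^2 / (18 mu), which is only valid at low Reynolds number and is an illustration here, not the flow regime or parameters of the simulations in the record.

```python
# Stokes-regime terminal velocity: a sketch of why sphere size drives the
# drafting-kissing-tumbling interaction -- velocity scales with d^2, so a
# larger trailing sphere catches up with a smaller leading one. Material
# properties below are arbitrary illustrative values.
def terminal_velocity(d, rho_p=2500.0, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d (m)."""
    return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

v_small = terminal_velocity(100e-6)   # 100 micron sphere
v_large = terminal_velocity(200e-6)   # 200 micron sphere
print(v_large / v_small)              # doubling d quadruples the velocity
```

    In the simulations themselves the velocities come from the resolved flow via the immersed-boundary forcing, but the same d^2 scaling explains why inhomogeneous pairs separate at terminal velocity while homogeneous pairs keep a constant gap.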

  12. Playing in or out of character: user role differences in the experience of interactive storytelling.

    PubMed

    Roth, Christian; Vermeulen, Ivar; Vorderer, Peter; Klimmt, Christoph; Pizzi, David; Lugrin, Jean-Luc; Cavazza, Marc

    2012-11-01

    Interactive storytelling (IS) is a promising new entertainment technology synthesizing preauthored narrative with dynamic user interaction. Existing IS prototypes employ different modes to involve users in a story, ranging from individual avatar control to comprehensive control over the virtual environment. The current experiment tested whether different player modes (exerting local vs. global influence) yield different user experiences (e.g., senses of immersion vs. control). A within-subject design involved 34 participants playing the cinematic IS drama "Emo Emma" both in the local (actor) and the global (ghost) mode. The latter mode allowed free movement in the virtual environment and hidden influence on characters, objects, and story development. As expected, control-related experiential qualities (effectance, autonomy, flow, and pride) were more intense for players in the global (ghost) mode. Immersion-related experiences did not differ over modes. Additionally, men preferred the sense of command facilitated by the ghost mode, whereas women preferred the sense of involvement facilitated by the actor mode.

  13. Evaluation of an Interactive Undergraduate Cosmology Curriculum

    NASA Astrophysics Data System (ADS)

    White, Aaron; Coble, Kimberly A.; Martin, Dominique; Hayes, Patrycia; Targett, Tom; Cominsky, Lynn R.

    2018-06-01

    The Big Ideas in Cosmology is an immersive set of web-based learning modules that integrates text, figures, and visualizations with short and long interactive tasks as well as labs that allow students to manipulate and analyze real cosmological data. This enables the transformation of general education astronomy and cosmology classes from primarily lecture and book-based courses to a format that builds important STEM skills, while engaging those outside the field with modern discoveries and a more realistic sense of practices and tools used by professional astronomers. Over two semesters, we field-tested the curriculum in general education cosmology classes at a state university in California [N ~ 80]. We administered pre- and post-instruction multiple-choice and open-ended content surveys as well as the CLASS, to gauge the effectiveness of the course and modules. Questions addressed included the structure, composition, and evolution of the universe, including students’ reasoning and “how we know.” Module development and evaluation was supported by NASA ROSES E/PO Grant #NNX10AC89G, the Illinois Space Grant Consortium, the Fermi E/PO program, Sonoma State University’s Space Science Education and Public Outreach Group, and San Francisco State University. The modules are published by Great River Learning/Kendall-Hunt.

  14. VR versus LF: towards the limitation-free 3D

    NASA Astrophysics Data System (ADS)

    Balogh, Tibor; Kara, Peter A.

    2017-06-01

    The evolution of 3D technologies shows a cyclical learning curve with a series of hypes and dead ends, with mistakes and consequences. 3D images contain significantly more information than the corresponding 2D ones, so 3D display systems must be built on more pixels or higher-speed components. For true 3D, this factor is on the order of 100x, which is a real technological challenge. If it is not met, the capabilities of 3D systems are compromised: headgear is needed, viewers must be positioned or tracked, devices become single-user, parallax is lost, cues go missing, and so on. The temptation is always there: why provide all the information rather than just what the person absorbs at that moment (subjective versus objective visualization)? Virtual Reality (VR) glasses have been around for more than two decades. With the latest technical improvements, VR became the next hype. 3D immersion was added as a new phenomenon; however, VR represents an isolated experience, and still requires headgear and a controlled environment. Augmented Reality (AR) in this sense is different. Will the VR/AR hype with its headgear be a dead end? While VR headsets may sell better than smart glasses or 3D TV glasses, consider also that using the technology may require a set of behavioral changes that the majority of people do not want to make. Displays and technologies that restrict viewers, or cause any discomfort, will not be accepted in the long term. The newer wave of 3D is forecast for 2018-2020, answering the need for unaided, limitation-free 3D experience. Light Field (LF) systems represent the next generation in 3D. The HoloVizio system, having a capacity on the order of 100x, offers a natural, restriction-free 3D experience over a full field of view, enabling collaborative use by an unlimited number of viewers, even in a wider, immersive space. As a scalable technology, the display range goes from monitor-style units, through automotive 3D HUDs and screen-less solutions, up to cinema systems, and Holografika is working on interactive large-scale immersive systems and glasses-free 3D LED walls.

  15. Defense Science Board 2006 Summer Study on 21st Century Strategic Technology Vectors, Volume 2: Critical Capabilities and Enabling Technologies

    DTIC Science & Technology

    2007-02-01

    neurosciences, particularly those analytic elements that create models to assist in understanding individual and... precision geo-location 10. Cause-effect models (environment, infrastructure, socio-cultural, DIME, PMESII) 11. Storytelling, gisting and advanced... sources/TRL 5 Storytelling, gisting and advanced visualization)/TRL 2-5 High-fidelity, socio-culturally relevant immersive games, training and mission

  16. Eye-tracking novice and expert geologist groups in the field and laboratory

    NASA Astrophysics Data System (ADS)

    Cottrell, R. D.; Evans, K. M.; Jacobs, R. A.; May, B. B.; Pelz, J. B.; Rosen, M. R.; Tarduno, J. A.; Voronov, J.

    2010-12-01

    We are using an Active Vision approach to learn how novices and expert geologists acquire visual information in the field. The Active Vision approach emphasizes that visual perception is an active process wherein new information is acquired about a particular environment through exploratory eye movements. Eye movements are not only influenced by physical stimuli, but are also strongly influenced by high-level perceptual and cognitive processes. Eye-tracking data were collected on ten novices (undergraduate geology students) and three experts during a 10-day field trip across California focused on neotectonics. In addition, high-resolution panoramic images were captured at each key locality for use in a semi-immersive laboratory environment. Examples of each data type will be presented. The number of observers will be increased in subsequent field trips, but expert/novice differences are already apparent in the first set of individual eye-tracking records, including gaze time, gaze pattern and object recognition. We will review efforts to quantify these patterns, and the development of semi-immersive environments to display geologic scenes. The research is a collaborative effort among Earth scientists, cognitive scientists and imaging scientists at the University of Rochester and the Rochester Institute of Technology, with funding from the National Science Foundation.

  17. Lower-limb hot-water immersion acutely induces beneficial hemodynamic and cardiovascular responses in peripheral arterial disease and healthy, elderly controls.

    PubMed

    Thomas, Kate N; van Rij, André M; Lucas, Samuel J E; Cotter, James D

    2017-03-01

    Passive heat induces beneficial perfusion profiles, provides substantive cardiovascular strain, and reduces blood pressure, thereby holding potential for healthy and cardiovascular disease populations. The aim of this study was to assess acute responses to passive heat via lower-limb, hot-water immersion in patients with peripheral arterial disease (PAD) and healthy, elderly controls. Eleven patients with PAD (age 71 ± 6 yr, 7 male, 4 female) and 10 controls (age 72 ± 7 yr, 8 male, 2 female) underwent hot-water immersion (30-min waist-level immersion in 42.1 ± 0.6°C water). Before, during, and following immersion, brachial and popliteal artery diameter, blood flow, and shear stress were assessed using duplex ultrasound. Lower-limb perfusion was measured also using venous occlusion plethysmography and near-infrared spectroscopy. During immersion, shear rate increased (P < 0.0001) comparably between groups in the popliteal artery (controls: +183 ± 26%; PAD: +258 ± 54%) and brachial artery (controls: +117 ± 24%; PAD: +107 ± 32%). Lower-limb blood flow increased significantly in both groups, as measured from duplex ultrasound (>200%), plethysmography (>100%), and spectroscopy, while central and peripheral pulse-wave velocity decreased in both groups. Mean arterial blood pressure was reduced by 22 ± 9 mmHg (main effect P < 0.0001, interaction P = 0.60) during immersion, and remained 7 ± 7 mmHg lower 3 h afterward. In PAD, popliteal shear profiles and claudication both compared favorably with those measured immediately following symptom-limited walking. A 30-min hot-water immersion is a practical means of delivering heat therapy to PAD patients and healthy, elderly individuals to induce appreciable systemic (chronotropic and blood pressure lowering) and hemodynamic (upper and lower-limb perfusion and shear rate increases) responses. Copyright © 2017 the American Physiological Society.

  18. Spatial awareness in immersive virtual environments revealed in open-loop walking

    NASA Astrophysics Data System (ADS)

    Turano, Kathleen A.; Chaudhury, Sidhartha

    2005-03-01

    People are able to walk without vision to previously viewed targets in the real world. This ability to update one's position in space has been attributed to a path integration system that uses internally generated self-motion signals together with the perceived object-to-self distance of the target. In a previous study using an immersive virtual environment (VE), we found that many subjects were unable to walk without vision to a previously viewed target located 4 m away. Their walking paths were influenced by the room structure that varied trial to trial. In this study we investigated whether the phenomenon is specific to a VE by testing subjects in a real world and a VE. The real world was viewed with field restricting goggles and via cameras using the same head-mounted display as in the VE. The results showed that only in the VE were walking paths influenced by the room structure. Women were more affected than men, and the effect decreased over trials and after subjects performed the task in the real world. The results also showed that a brief (<0.5 s) exposure to the visual scene during self-motion was sufficient to reduce the influence of the room structure on walking paths. The results are consistent with the idea that without visual experience within the VE, the path integration system is unable to effectively update one's spatial position. As a result, people rely on other cues to define their position in space. Women, unlike men, choose to use visual cues about environmental structure to reorient.

  19. The Adaptive Effects Of Virtual Interfaces: Vestibulo-Ocular Reflex and Simulator Sickness.

    DTIC Science & Technology

    1998-08-07

    rearrangement: a pattern of stimulation differing from that existing as a result of normal interactions with the real world. Stimulus rearrangements can ... is immersive and interactive. virtual interface: a system of transducers, signal processors, computer hardware and software that create an ... interactive medium through which: 1) information is transmitted to the senses in the form of two- and three-dimensional virtual images and 2) psychomotor

  20. Calorimetric Study of Alkali Metal Ion (K +, Na +, Li +) Exchange in a Clay-Like MXene

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Geetu; Muthuswamy, Elayaraja; Naguib, Michael

    Intercalation of ions in layered materials has been explored to improve the rate capability in Li-ion batteries and supercapacitors. This work investigates the energetics of alkali ion exchange in a clay-like MXene, Ti3C2Tx, where Tx stands for anionic surface moieties, by immersion calorimetry in aqueous solutions. The measured immersion enthalpies of clay-like Ti3C2Tx, ΔHimm, at 25 °C in 1 M KCl, 1 M NaCl, 1 M LiCl, and nanopure water are -9.19 (±0.56), -5.90 (±0.31), -1.31 (±0.20), and -1.29 (±0.13) kJ/mol of MXene, respectively. Inductively coupled plasma mass spectrometry is used to obtain the concentrations of alkali ions in the solid and aqueous phases. Using these concentrations, the enthalpies of exchange of alkali metal ions (Li+, Na+, and K+) are calculated; ΔHex in 1 M KCl, 1 M NaCl, 1 M LiCl, and nanopure water are -9.3 (±2.2), 21.0 (±0.9), -1.3 (±0.2), and 302.4 (±0.6) kJ/mol of MXene, respectively. Both immersion and exchange enthalpies are most exothermic for potassium. This suggests that K+ ions interact more strongly with anions present in the interlayers of this MXene than Na+ and Li+ ions. Water vapor adsorption calorimetry indicates very weak interaction of water with the MXene, while immersion calorimetry suggests a weakly hydrophilic nature of the MXene surface.
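    The ranking reported in the abstract can be double-checked with a minimal sketch (only the quoted immersion enthalpies are used; the variable names are ours, not the authors'):

    ```python
    # Immersion enthalpies of clay-like Ti3C2Tx at 25 °C, in kJ/mol of MXene,
    # as quoted in the abstract (uncertainties omitted for brevity).
    dH_imm = {"1 M KCl": -9.19, "1 M NaCl": -5.90, "1 M LiCl": -1.31, "water": -1.29}

    # The most exothermic immersion is the most negative enthalpy.
    most_exothermic = min(dH_imm, key=dH_imm.get)
    print(most_exothermic)  # → 1 M KCl
    ```

    Consistent with the text, potassium chloride gives the most exothermic immersion.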

  1. Calorimetric Study of Alkali Metal Ion (K +, Na +, Li +) Exchange in a Clay-Like MXene

    DOE PAGES

    Sharma, Geetu; Muthuswamy, Elayaraja; Naguib, Michael; ...

    2017-06-21

    Intercalation of ions in layered materials has been explored to improve the rate capability in Li-ion batteries and supercapacitors. This work investigates the energetics of alkali ion exchange in a clay-like MXene, Ti3C2Tx, where Tx stands for anionic surface moieties, by immersion calorimetry in aqueous solutions. The measured immersion enthalpies of clay-like Ti3C2Tx, ΔHimm, at 25 °C in 1 M KCl, 1 M NaCl, 1 M LiCl, and nanopure water are -9.19 (±0.56), -5.90 (±0.31), -1.31 (±0.20), and -1.29 (±0.13) kJ/mol of MXene, respectively. Inductively coupled plasma mass spectrometry is used to obtain the concentrations of alkali ions in the solid and aqueous phases. Using these concentrations, the enthalpies of exchange of alkali metal ions (Li+, Na+, and K+) are calculated; ΔHex in 1 M KCl, 1 M NaCl, 1 M LiCl, and nanopure water are -9.3 (±2.2), 21.0 (±0.9), -1.3 (±0.2), and 302.4 (±0.6) kJ/mol of MXene, respectively. Both immersion and exchange enthalpies are most exothermic for potassium. This suggests that K+ ions interact more strongly with anions present in the interlayers of this MXene than Na+ and Li+ ions. Water vapor adsorption calorimetry indicates very weak interaction of water with the MXene, while immersion calorimetry suggests a weakly hydrophilic nature of the MXene surface.

  2. Recent developments in virtual experience design and production

    NASA Astrophysics Data System (ADS)

    Fisher, Scott S.

    1995-03-01

    Today, the media of VR and Telepresence are in their infancy and the emphasis is still on technology and engineering. But, it is not the hardware people might use that will determine whether VR becomes a powerful medium--instead, it will be the experiences that they are able to have that will drive its acceptance and impact. A critical challenge in the elaboration of these telepresence capabilities will be the development of environments that are as unpredictable and rich in interconnected processes as an actual location or experience. This paper will describe the recent development of several Virtual Experiences including: `Menagerie', an immersive Virtual Environment inhabited by virtual characters designed to respond to and interact with its users; and `The Virtual Brewery', an immersive public VR installation that provides multiple levels of interaction in an artistic interpretation of the brewing process.

  3. Local and Remote Cooperation With Virtual and Robotic Agents: A P300 BCI Study in Healthy and People Living With Spinal Cord Injury.

    PubMed

    Tidoni, Emmanuele; Abu-Alqumsan, Mohammad; Leonardis, Daniele; Kapeller, Christoph; Fusco, Gabriele; Guger, Cristoph; Hintermuller, Cristoph; Peer, Angelika; Frisoli, Antonio; Tecchia, Franco; Bergamasco, Massimo; Aglioti, Salvatore Maria

    2017-09-01

    The development of technological applications that allow people to control and embody external devices within social interaction settings represents a major goal for current and future brain-computer interface (BCI) systems. Prior research has suggested that embodied systems may ameliorate BCI end-users' experience and accuracy in controlling external devices. Along these lines, we developed an immersive P300-based BCI application with a head-mounted display for virtual-local and robotic-remote social interactions and explored in a group of healthy participants the role of proprioceptive feedback in the control of a virtual surrogate (Study 1). Moreover, we compared the performance of a small group of people with spinal cord injury (SCI) to a control group of healthy subjects during virtual and robotic social interactions (Study 2), where both groups received a proprioceptive stimulation. Our attempt to combine immersive environments, BCI technologies and the neuroscience of body ownership suggests that providing realistic multisensory feedback still represents a challenge. Results have shown that healthy participants and people living with SCI used the BCI within the immersive scenarios with good levels of performance (as indexed by task accuracy, optimization calls and Information Transfer Rate) and perceived control of the surrogates. Proprioceptive feedback did not alter performance measures or body ownership sensations. Further studies are necessary to test whether sensorimotor experience represents an opportunity to improve the use of future embodied BCI applications.

  4. Listeners' expectation of room acoustical parameters based on visual cues

    NASA Astrophysics Data System (ADS)

    Valente, Daniel L.

    Despite many studies investigating auditory spatial impressions in rooms, few have addressed the impact of simultaneous visual cues on localization and the perception of spaciousness. The current research presents an immersive audio-visual study, in which participants are instructed to make spatial congruency and quantity judgments in dynamic cross-modal environments. The results of these psychophysical tests suggest the importance of consilient audio-visual presentation to the legibility of an auditory scene. Several studies have looked into audio-visual interaction in room perception in recent years, but these studies rely on static images, speech signals, or photographs alone to represent the visual scene. Building on these studies, the aim is to propose a testing method that uses monochromatic compositing (blue-screen technique) to position a studio recording of a musical performance in a number of virtual acoustical environments and ask subjects to assess these environments. In the first experiment of the study, video footage was taken from five rooms varying in physical size from a small studio to a small performance hall. Participants were asked to perceptually align two distinct acoustical parameters---early-to-late reverberant energy ratio and reverberation time---of two solo musical performances in five contrasting visual environments according to their expectations of how the room should sound given its visual appearance. In the second experiment in the study, video footage shot from four different listening positions within a general-purpose space was coupled with sounds derived from measured binaural impulse responses (IRs). The relationship between the presented image, sound, and virtual receiver position was examined. It was found that many visual cues caused different perceived events of the acoustic environment. 
This included the visual attributes of the space in which the performance was located as well as the visual attributes of the performer. The addressed visual makeup of the performer included: (1) an actual video of the performance, (2) a surrogate image of the performance, for example a loudspeaker's image reproducing the performance, (3) no visual image of the performance (empty room), or (4) a multi-source visual stimulus (actual video of the performance coupled with two images of loudspeakers positioned to the left and right of the performer). For this experiment, perceived auditory events of sound were measured in terms of two subjective spatial metrics: Listener Envelopment (LEV) and Apparent Source Width (ASW). These metrics were hypothesized to be dependent on the visual imagery of the presented performance. Data were also collected by having participants match direct and reverberant sound levels for the presented audio-visual scenes. In the final experiment, participants judged spatial expectations of an ensemble of musicians presented in the five physical spaces from Experiment 1. Supporting data were accumulated in two stages. First, participants were given an audio-visual matching test, in which they were instructed to align the auditory width of a performing ensemble to a varying set of audio and visual cues. In the second stage, a conjoint analysis design paradigm was explored to extrapolate the relative magnitude of the explored audio-visual factors in affecting three assessed response criteria: Congruency (the perceived match-up of the auditory and visual cues in the assessed performance), ASW and LEV. Results show that both auditory and visual factors affect the collected responses, and that the two sensory modalities coincide in distinct interactions. 
This study reveals participant resiliency in the presence of forced auditory-visual mismatch: Participants are able to adjust the acoustic component of the cross-modal environment in a statistically similar way despite randomized starting values for the monitored parameters. Subjective results of the experiments are presented along with objective measurements for verification.

  5. Mining Interactions in Immersive Learning Environments for Real-Time Student Feedback

    ERIC Educational Resources Information Center

    Kennedy, Gregor; Ioannou, Ioanna; Zhou, Yun; Bailey, James; O'Leary, Stephen

    2013-01-01

    The analysis and use of data generated by students' interactions with learning systems or programs--learning analytics--has recently gained widespread attention in the educational technology community. Part of the reason for this interest is based on the potential of learning analytic techniques such as data mining to find hidden patterns in…

  6. Sounds of silence: How to animate virtual worlds with sound

    NASA Technical Reports Server (NTRS)

    Astheimer, Peter

    1993-01-01

    Sounds are an integral and sometimes annoying part of our daily life. Virtual worlds which imitate natural environments gain a lot of authenticity from fast, high quality visualization combined with sound effects. Sounds help to increase the degree of immersion for human dwellers in imaginary worlds significantly. The virtual reality toolkit of IGD (Institute for Computer Graphics) features a broad range of standard visual and advanced real-time audio components which interpret an object-oriented definition of the scene. The virtual reality system 'Virtual Design' realized with the toolkit enables the designer of virtual worlds to create a true audiovisual environment. Several examples on video demonstrate the usage of the audio features in Virtual Design.

  7. An immersed boundary method for fluid-structure interaction with compressible multiphase flows

    NASA Astrophysics Data System (ADS)

    Wang, Li; Currao, Gaetano M. D.; Han, Feng; Neely, Andrew J.; Young, John; Tian, Fang-Bao

    2017-10-01

    This paper presents a two-dimensional immersed boundary method for fluid-structure interaction with compressible multiphase flows involving large structure deformations. This method involves three important parts: flow solver, structure solver and fluid-structure interaction coupling. In the flow solver, the compressible multiphase Navier-Stokes equations for ideal gases are solved by a finite difference method based on a staggered Cartesian mesh, where a fifth-order accuracy Weighted Essentially Non-Oscillation (WENO) scheme is used to handle spatial discretization of the convective term, a fourth-order central difference scheme is employed to discretize the viscous term, the third-order TVD Runge-Kutta scheme is used to discretize the temporal term, and the level-set method is adopted to capture the multi-material interface. In this work, the structure considered is a geometrically non-linear beam which is solved by using a finite element method based on the absolute nodal coordinate formulation (ANCF). The fluid dynamics and the structure motion are coupled in a partitioned iterative manner with a feedback penalty immersed boundary method where the flow dynamics is defined on a fixed Lagrangian grid and the structure dynamics is described on a global coordinate. We perform several validation cases (including fluid over a cylinder, structure dynamics, flow induced vibration of a flexible plate, deformation of a flexible panel induced by shock waves in a shock tube, an inclined flexible plate in a hypersonic flow, and shock-induced collapse of a cylindrical helium cavity in the air), and compare the results with experimental and other numerical data. The present results agree well with the published data and the current experiment. Finally, we further demonstrate the versatility of the present method by applying it to a flexible plate interacting with multiphase flows.
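    The feedback penalty coupling named above can be illustrated with a minimal 1-D toy (a sketch of Goldstein-style PI feedback forcing, not the authors' implementation; the gains alpha and beta, the unit mass, and the explicit Euler update are illustrative assumptions):

    ```python
    # Feedback penalty immersed-boundary forcing: the force
    # F = alpha * ∫(u - U_b) dt + beta * (u - U_b)
    # drives the fluid velocity u interpolated at a Lagrangian marker toward
    # the boundary velocity U_b. Gains and time step are assumed, not from the paper.
    def penalty_force(u_marker, u_boundary, err_integral, dt, alpha=-1e4, beta=-1e2):
        err = u_marker - u_boundary
        err_integral += err * dt                      # accumulate the velocity mismatch
        return alpha * err_integral + beta * err, err_integral

    # Toy relaxation: a unit velocity mismatch decays under the feedback force.
    u, acc, dt = 1.0, 0.0, 1e-3
    for _ in range(200):
        force, acc = penalty_force(u, 0.0, acc, dt)
        u += force * dt                               # unit-mass explicit Euler update
    ```

    The negative gains make the pair behave like a damped oscillator about the boundary velocity, which is why the mismatch decays rather than drifts.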

  8. Virtual Reality as a Story Telling Platform for Geoscience Communication

    NASA Astrophysics Data System (ADS)

    Lazar, K.; Moysey, S. M.

    2017-12-01

    Capturing the attention of students and the public is a critical step for increasing societal interest and literacy in earth science issues. Virtual reality (VR) provides a means for geoscience engagement that is well suited to place-based learning through exciting and immersive experiences. One approach is to create fully-immersive virtual gaming environments where players interact with physical objects, such as rock samples and outcrops, to pursue geoscience learning goals. Developing an experience like this, however, can require substantial programming expertise and resources. At the other end of the development spectrum, it is possible for anyone to create immersive virtual experiences with 360-degree imagery, which can be made interactive using easy to use VR editing software to embed videos, audio, images, and other content within the 360-degree image. Accessible editing tools like these make the creation of VR experiences something that anyone can tackle. Using the VR editor ThingLink and imagery from Google Maps, for example, we were able to create an interactive tour of the Grand Canyon, complete with embedded assessments, in a matter of hours. The true power of such platforms, however, comes from the potential to engage students as content authors to create and share stories of place that explore geoscience issues from their personal perspective. For example, we have used combinations of 360-degree images with interactive mapping and web platforms to enable students with no programming experience to create complex web apps as highly engaging story telling platforms. We highlight here examples of how we have implemented such story telling approaches with students to assess learning in courses, to share geoscience research outcomes, and to communicate issues of societal importance.

  9. A standardized set of 3-D objects for virtual reality research and applications.

    PubMed

    Peeters, David

    2018-06-01

    The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.

  10. Envisioning the future of home care: applications of immersive virtual reality.

    PubMed

    Brennan, Patricia Flatley; Arnott Smith, Catherine; Ponto, Kevin; Radwin, Robert; Kreutz, Kendra

    2013-01-01

    Accelerating the design of technologies to support health in the home requires (1) a better understanding of how the household context shapes consumer health behaviors and (2) the opportunity to afford engineers, designers, and health professionals the chance to systematically study the home environment. We developed the Living Environments Laboratory (LEL) with a fully immersive, six-sided virtual reality CAVE to enable recreation of a broad range of household environments. We have successfully developed a virtual apartment, including a kitchen, living space, and bathroom. Over 2000 people have visited the LEL CAVE. Participants use an electronic wand to activate common household affordances such as opening a refrigerator door or lifting a cup. Challenges currently being explored include creating natural gestures to interface with virtual objects, developing robust, simple procedures to capture actual living environments and render them in a 3D visualization, and devising systematic, stable terminologies to characterize home environments.

  11. The influence of visual and vestibular orientation cues in a clock reading task.

    PubMed

    Davidenko, Nicolas; Cheong, Yeram; Waterman, Amanda; Smith, Jacob; Anderson, Barrett; Harmon, Sarah

    2018-05-23

    We investigated how performance in the real-life perceptual task of analog clock reading is influenced by the clock's orientation with respect to egocentric, gravitational, and visual-environmental reference frames. In Experiment 1, we designed a simple clock-reading task and found that observers' reaction time to correctly tell the time depends systematically on the clock's orientation. In Experiment 2, we dissociated egocentric from environmental reference frames by having participants sit upright or lie sideways while performing the task. We found that both reference frames substantially contribute to response times in this task. In Experiment 3, we placed upright or rotated participants in an upright or rotated immersive virtual environment, which allowed us to further dissociate vestibular from visual cues to the environmental reference frame. We found evidence of environmental reference frame effects only when visual and vestibular cues were aligned. We discuss the implications for the design of remote and head-mounted displays. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Visualization of fluid turbulence and acoustic cavitation during phacoemulsification.

    PubMed

    Tognetto, Daniele; Sanguinetti, Giorgia; Sirotti, Paolo; Brezar, Edoardo; Ravalico, Giuseppe

    2005-02-01

    To describe a technique for visualizing fluid turbulence and cavitational energy created by ultrasonic phaco tips. University Eye Clinic of Trieste, Trieste, Italy. Generation of cavitational energy by the phaco tip was visualized using an optical test bench comprising several components. The technique uses a telescope system to expand a laser light source into a coherent, collimated beam of light with a diameter of approximately 50.0 mm. The expanded laser beam shines on the test tube containing the tip activated in a medium of water or ophthalmic viscosurgical device (OVD). Two precision optical collimators complete the optical test bench and form the system used to focus data onto a charge-coupled device television camera connected to a recorder. Images of irrigation, irrigation combined with aspiration, irrigation/aspiration, and phacosonication were obtained with the tip immersed in a tube containing water or OVD. Optical image processing enabled acoustic cavitation to be visualized during phacosonication. The system is a possible means of evaluating a single phaco apparatus power setting and comparing phaco machines and techniques.

  13. The influence of an immersive virtual environment on the segmental organization of postural stabilizing responses.

    PubMed

    Keshner, E A; Kenyon, R V

    2000-01-01

    We examined the effect of a 3-dimensional stereoscopic scene on segmental stabilization. Eight subjects participated in static sway and locomotion experiments with a visual scene that moved sinusoidally or at constant velocity about the pitch or roll axes. Segmental displacements, Fast Fourier Transforms, and Root Mean Square values were calculated. In both pitch and roll, subjects exhibited greater magnitudes of motion in head and trunk than ankle. Smaller amplitudes and frequent phase reversals suggested control of the ankle by segmental proprioceptive inputs and ground reaction forces rather than by the visual-vestibular signals. Postural controllers may set limits of motion at each body segment rather than be governed solely by a perception of the visual vertical. Two locomotor strategies were also exhibited, implying that some subjects could override the effect of the roll axis optic flow field. Our results demonstrate task dependent differences that argue against using static postural responses to moving visual fields when assessing more dynamic tasks.

  14. Usability of stereoscopic view in teleoperation

    NASA Astrophysics Data System (ADS)

    Boonsuk, Wutthigrai

    2015-03-01

    There has recently been tremendous growth in the area of 3D stereoscopic visualization. The 3D stereoscopic visualization technology has been used in a growing number of consumer products such as 3D televisions and 3D glasses for gaming systems. This technology refers to the idea that the human brain develops depth perception by retrieving information from the two eyes. Our brain combines the left and right images on the retinas and extracts depth information. Therefore, viewing two video images taken a slight distance apart, as shown in Figure 1, can create an illusion of depth [8]. Proponents of this technology argue that the stereo view of 3D visualization increases user immersion and performance, as more information is gained through 3D vision as compared to the 2D view. However, it is still uncertain whether the additional information gained from 3D stereoscopic visualization can actually improve user performance in real-world situations such as teleoperation.
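    The depth cue being exploited here can be made concrete with the standard rectified-stereo triangulation relation (a sketch; the numbers below are illustrative assumptions, not values from the paper):

    ```python
    # For a rectified stereo pair, a point's depth Z follows from triangulation:
    # Z = f * B / d, with f the focal length in pixels, B the camera baseline,
    # and d the disparity (horizontal pixel shift between the two views).
    def depth_from_disparity(f_px, baseline_m, disparity_px):
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a finite depth")
        return f_px * baseline_m / disparity_px

    z = depth_from_disparity(800, 0.065, 13)  # ≈ 4.0 m for an eye-like 6.5 cm baseline
    ```

    The inverse relationship between disparity and depth is why stereo adds most information at close range, which matters for near-field teleoperation tasks.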

  15. High Resolution Visualization Applied to Future Heavy Airlift Concept Development and Evaluation

    NASA Technical Reports Server (NTRS)

    FordCook, A. B.; King, T.

    2012-01-01

    This paper explores the use of high resolution 3D visualization tools for exploring the feasibility and advantages of future military cargo airlift concepts and evaluating compatibility with existing and future payload requirements. Realistic 3D graphic representations of future airlifters are immersed in rich, supporting environments to demonstrate concepts of operations to key personnel for evaluation, feedback, and development of critical joint support. Accurate concept visualizations are reviewed by commanders, platform developers, loadmasters, soldiers, scientists, engineers, and key principal decision makers at various stages of development. The insight gained through the review of these physically and operationally realistic visualizations is essential to refining design concepts to meet competing requirements in a fiscally conservative defense finance environment. In addition, highly accurate 3D geometric models of existing and evolving large military vehicles are loaded into existing and proposed aircraft cargo bays. In this virtual aircraft test-loading environment, materiel developers, engineers, managers, and soldiers can realistically evaluate the compatibility of current and next-generation airlifters with proposed cargo.

  16. Reversible Tailoring of Mechanical Properties of Carbon Nanotube Forests by Immersing in Solvents

    DTIC Science & Technology

    2014-12-07

    quantify the strength of vdW interactions between CNTs, the Hamaker constant of CNTs in vacuum, A_v = V·12πD_G², was evaluated, where 'V' is the vdW...effectively do not interact with each other. Therefore, we assumed curved surface-surface vdW interaction between two CNTs to evaluate the Hamaker ...of the vdW forces are directly proportional to the Hamaker constant, which depends on the macroscopic properties of the interacting objects and the
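    The truncated formula in this snippet appears to be the standard flat-surface relation between vdW interaction energy per unit area and the Hamaker constant; the following is a hedged reconstruction of that reading, with illustrative inputs rather than values from the report:

    ```python
    import math

    # For two parallel flat surfaces a distance D apart, the vdW interaction
    # energy per unit area is V = -A / (12 * pi * D**2); inverting gives the
    # Hamaker constant A = -V * 12 * pi * D**2. Inputs below are illustrative.
    def hamaker_from_energy(v_per_area, gap):
        """v_per_area in J/m^2 (negative for attraction), gap in meters."""
        return -v_per_area * 12.0 * math.pi * gap**2

    A = hamaker_from_energy(-0.05, 0.34e-9)  # ~2.2e-19 J at a graphitic 0.34 nm gap
    ```

    A result on the order of 1e-19 J is the expected magnitude for carbon surfaces, which is a quick sanity check on the reconstructed relation.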

  17. Authoring Immersive Mixed Reality Experiences

    NASA Astrophysics Data System (ADS)

    Misker, Jan M. V.; van der Ster, Jelle

    Creating a mixed reality experience is a complicated endeavour. From our practice as a media lab in the artistic domain we found that engineering is “only” a first step in creating a mixed reality experience. Designing the appearance and directing the user experience are equally important for creating an engaging, immersive experience. We found that mixed reality artworks provide a very good test bed for studying these topics. This chapter details three steps required for authoring mixed reality experiences: engineering, designing and directing. We will describe a platform (VGE) for creating mixed reality environments that incorporates these steps. A case study (EI4) is presented in which this platform was used to not only engineer the system, but in which an artist was given the freedom to explore the artistic merits of mixed reality as an artistic medium, which involved areas such as the look and feel, multimodal experience and interaction, immersion as a subjective emotion and game play scenarios.

  18. The Emergence of Agent-Based Technology as an Architectural Component of Serious Games

    NASA Technical Reports Server (NTRS)

    Phillips, Mark; Scolaro, Jackie; Scolaro, Daniel

    2010-01-01

    The evolution of games as an alternative to traditional simulations in the military context has been gathering momentum over the past five years, even though the exploration of their use in the serious sense has been ongoing since the mid-nineties. Much of the focus has been on the aesthetics of the visuals provided by the core game engine as well as the artistry provided by talented development teams to produce not only breathtaking artwork, but highly immersive game play. Consideration of game technology is now so much a part of the modeling and simulation landscape that it is becoming difficult to distinguish traditional simulation solutions from game-based approaches. But games have yet to provide the much needed interactive free play that has been the domain of semi-autonomous forces (SAF). The component-based middleware architecture that game engines provide promises a great deal in terms of options for the integration of agent solutions to support the development of non-player characters that engage the human player without the deterministic nature of scripted behaviors. However, there are a number of hard-learned lessons on the modeling and simulation side of the equation that game developers have yet to learn, such as: correlation of heterogeneous systems, scalability of both terrain and numbers of non-player entities, and the bi-directional nature of simulation to game interaction provided by Distributed Interactive Simulation (DIS) and High Level Architecture (HLA).

  19. NASA's "Eyes On The Solar System:" A Real-time, 3D-Interactive Tool to Teach the Wonder of Planetary Science

    NASA Astrophysics Data System (ADS)

    Hussey, K.

    2014-12-01

    NASA's Jet Propulsion Laboratory is using video game technology to immerse students, the general public and mission personnel in our solar system and beyond. "Eyes on the Solar System," a cross-platform, real-time, 3D-interactive application that can run on-line or as a stand-alone "video game," is of particular interest to educators looking for inviting tools to capture students' interest in a format they like and understand (eyes.nasa.gov). It gives users an extraordinary view of our solar system by virtually transporting them across space and time to make first-person observations of spacecraft, planetary bodies and NASA/ESA missions in action. Key scientific results illustrated with video presentations, supporting imagery and web links are embedded contextually into the solar system. Educators who want an interactive, game-based approach to engage students in learning Planetary Science will see how "Eyes" can be effectively used to teach its principles to grades 3 through 14. The presentation will include a detailed demonstration of the software along with a description/demonstration of how this technology is being adapted for education. There will also be a preview of coming attractions. This work is being conducted by the Visualization Technology Applications and Development Group at NASA's Jet Propulsion Laboratory, the same team responsible for "Eyes on the Earth 3D" and "Eyes on Exoplanets," which can be viewed at eyes.nasa.gov/earth and eyes.nasa.gov/exoplanets.

  20. A numerical comparison with an exact solution for the transient response of a cylinder immersed in a fluid. [computer simulated underwater tests to determine transient response of a submerged cylindrical shell

    NASA Technical Reports Server (NTRS)

    Giltrud, M. E.; Lucas, D. S.

    1979-01-01

    The transient response of an elastic cylindrical shell immersed in an acoustic medium and engulfed by a plane wave is determined numerically. The method employs the USA-STAGS code, which uses the finite element method for the structural analysis and the doubly asymptotic approximation for the fluid-structure interaction. The calculations are compared to an exact analysis for two separate loading cases: a plane step wave and an exponentially decaying plane wave.
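
The two incident loadings compared in this record have simple closed forms. The following sketch shows the pressure histories that would drive such a calculation; the amplitude p0 and decay constant tau are illustrative values, not taken from the report.

```python
import math

def step_wave(t, p0=1.0):
    """Incident pressure for a plane step wave: constant p0 after arrival at t = 0."""
    return p0 if t >= 0.0 else 0.0

def decaying_wave(t, p0=1.0, tau=0.5):
    """Incident pressure for an exponentially decaying plane wave."""
    return p0 * math.exp(-t / tau) if t >= 0.0 else 0.0

# Sample both load histories on a coarse time grid
times = [i * 0.25 for i in range(5)]
step = [step_wave(t) for t in times]
decay = [round(decaying_wave(t), 3) for t in times]
```

In a structural code these histories would be evaluated at each time step and applied over the wetted surface as the wave sweeps past the shell.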

  1. Symmetric and asymmetric wormholes immersed in rotating matter

    NASA Astrophysics Data System (ADS)

    Hoffmann, Christian; Ioannidou, Theodora; Kahlen, Sarah; Kleihaus, Burkhard; Kunz, Jutta

    2018-06-01

    We consider four-dimensional wormholes immersed in bosonic matter. While their existence is based on the presence of a phantom field, many of their interesting physical properties are bestowed upon them by an ordinary complex scalar field, which carries only a mass term, but no self-interactions. For instance, the rotation of the scalar field induces a rotation of the throat as well. Moreover, the bosonic matter need not be symmetrically distributed in both asymptotically flat regions, leading to symmetric and asymmetric rotating wormhole spacetimes. The presence of the rotating matter also allows for wormholes with a double throat.

  2. 2-D transmitral flows simulation by means of the immersed boundary method on unstructured grids

    NASA Astrophysics Data System (ADS)

    Denaro, F. M.; Sarghini, F.

    2002-04-01

    Interaction between computational fluid dynamics and clinical research has recently allowed a deeper understanding of the physiology of complex phenomena involving cardio-vascular mechanisms. The aim of this paper is to develop a simplified numerical model based on the Immersed Boundary Method and to perform numerical simulations in order to study the cardiac diastolic phase, during which the left ventricle is filled with blood flowing from the atrium through the mitral valve. As one of the diagnostic problems faced by clinicians is the lack of a univocal definition of diastolic performance from the velocity measurements obtained by echo-Doppler techniques, numerical simulations are expected to provide insight both into the physics of the diastole and into the interpretation of experimental data. An innovative application of the Immersed Boundary Method on unstructured grids is presented, fulfilling accuracy requirements related to the development of a thin boundary layer along the moving immersed boundary. This coupling between unstructured meshes and the Immersed Boundary Method appears to be a promising technique when a wide range of spatial scales is involved together with a moving boundary. Numerical simulations are performed in a range of physiological parameters, and a qualitative comparison with experimental data is presented in order to demonstrate that, despite the simplified model, the main physiological characteristics of the diastole are well represented.
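
The Immersed Boundary Method couples a moving boundary to the flow through a regularized delta function. As a minimal illustration (a 1-D, structured-grid sketch using the classic 4-point cosine delta, not this paper's unstructured-grid formulation), grid velocities can be interpolated to an immersed marker point like this:

```python
import math

def delta4(r):
    """4-point cosine regularized delta function (support |r| <= 2 grid cells)."""
    return 0.25 * (1.0 + math.cos(math.pi * r / 2.0)) if abs(r) < 2.0 else 0.0

def interpolate(u_grid, h, x_marker):
    """Interpolate a 1-D grid field to an immersed (Lagrangian) marker point."""
    total = 0.0
    for i, u in enumerate(u_grid):
        r = (x_marker - i * h) / h
        total += u * delta4(r)  # weight each nearby grid node
    return total

# A uniform field is reproduced exactly, since the delta weights sum to one
u = [2.0] * 10
val = interpolate(u, 0.1, 0.37)
```

The same delta function is used in reverse to spread boundary forces from the markers back onto the surrounding grid nodes.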

  3. Enhanced Virtual Presence for Immersive Visualization of Complex Situations for Mission Rehearsal

    DTIC Science & Technology

    1997-06-01

    taken. We propose to join both these technologies together in a registration device. The registration device would be small and portable and easily...registering the panning of the camera (or other sensing device) and also stitch together the shots to automatically generate panoramic files necessary to...database and as the base information changes each of the linked drawings is automatically updated. Filename Format A specific naming convention should be

  4. Educational Uses of Virtual Reality Technology.

    DTIC Science & Technology

    1998-01-01

    technology. It is affordable in that a basic level of technology can be achieved on most existing personal computers at either no cost or some minimal...actually present in a virtual environment is termed "presence" and is an artifact of being visually immersed in the computer-generated virtual world...Carolina University, VREL Teachers 1996 onward...VR in Education University of Illinois, National Center for Supercomputing Applications

  5. Fun and Games: using Games and Immersive Exploration to Teach Earth and Space Science

    NASA Astrophysics Data System (ADS)

    Reiff, P. H.; Sumners, C.

    2011-12-01

    We have been using games to teach Earth and Space Science for over 15 years. Our software "TicTacToe" has been used continuously at the Houston Museum of Natural Science since 2002. It is the single piece of educational software in the "Earth Forum" suite that holds the attention of visitors the longest - averaging over 10 minutes compared to 1-2 minutes for the other software kiosks. We now have question sets covering the solar system, space weather, and Earth science. In 2010 we introduced a new game technology - that of immersive interactive explorations. In our "Tikal Explorer", visitors use a game pad to navigate a three-dimensional environment of the Classic Maya city of Tikal. Teams of students climb pyramids, look for artifacts, identify plants and animals, and sight astronomical alignments that predict the annual return of the rains. We also have a new 3D exploration of the International Space Station, where students can fly around and inside the ISS. These interactive explorations are very natural to the video-game generation, and promise to bring educational objectives to experiences that had previously been used strictly for gaming. If space permits, we will set up our portable Discovery Dome in the poster session for a full immersive demonstration of these game environments.

  6. The electro-structural behaviour of yarn-like carbon nanotube fibres immersed in organic liquids

    NASA Astrophysics Data System (ADS)

    Terrones, Jeronimo; Windle, Alan H.; Elliott, James A.

    2014-10-01

    Yarn-like carbon nanotube (CNT) fibres are a hierarchically-structured material with a variety of promising applications such as high performance composites, sensors and actuators, smart textiles, and energy storage and transmission. However, in order to fully realize these possibilities, a more detailed understanding of their interactions with the environment is required. In this work, we describe a simplified representation of the hierarchical structure of the fibres from which several mathematical models are constructed to explain electro-structural interactions of fibres with organic liquids. A balance between the elastic and surface energies of the CNT bundle network in different media allows the determination of the maximum lengths that open junctions can sustain before collapsing to minimize the surface energy. This characteristic length correlates well with the increase of fibre resistance upon immersion in organic liquids. We also study the effect of charge accumulation in open interbundle junctions and derive expressions to describe experimental data on the non-ohmic electrical behaviour of fibres immersed in polar liquids. Our analyses suggest that the non-ohmic behaviour is caused by progressively shorter junctions collapsing as the voltage is increased. Since our models are not based on any property unique to carbon nanotubes, they should also be useful to describe other hierarchical structures.

  7. Integrated Web-Based Immersive Exploration of the Coordinated Canyon Experiment Data using Open Source STOQS Software

    NASA Astrophysics Data System (ADS)

    McCann, M. P.; Gwiazda, R.; O'Reilly, T. C.; Maier, K. L.; Lundsten, E. M.; Parsons, D. R.; Paull, C. K.

    2017-12-01

    The Coordinated Canyon Experiment (CCE) in Monterey Submarine Canyon has produced a wealth of oceanographic measurements whose analysis will improve understanding of turbidity current processes. Exploration of this data set, consisting of over 60 parameters from 15 platforms, is facilitated by using the open source Spatial Temporal Oceanographic Query System (STOQS) software (https://github.com/stoqs/stoqs). The Monterey Bay Aquarium Research Institute (MBARI) originally developed STOQS to help manage and visualize upper water column oceanographic measurements, but the generality of its data model permits effective use for any kind of spatial/temporal measurement data. STOQS consists of a PostgreSQL database and server-side Python/Django software; the client-side is jQuery JavaScript supporting AJAX requests to update a single page web application. The User Interface (UI) is optimized to provide a quick overview of data in spatial and temporal dimensions, as well as in parameter, platform, and data value space. A user may zoom into any feature of interest and select it, initiating a filter operation that updates the UI with an overview of all the data in the new filtered selection. When details are desired, radio buttons and checkboxes are selected to generate a number of different types of visualizations. These include color-filled temporal section and line plots, parameter-parameter plots, 2D map plots, and interactive 3D spatial visualizations. The Extensible 3D (X3D) standard and X3DOM JavaScript library provide the technology for presenting animated 3D data directly within the web browser. Most of the oceanographic measurements from the CCE (e.g. mooring mounted ADCP and CTD data) are easily visualized using established methods. However, unified integration and multiparameter display of several concurrently deployed sensors across a network of platforms is a challenge we hope to solve. 
STOQS also allows display of data from a new instrument, the Benthic Event Detector (BED). The BED records 50 Hz samples of orientation and acceleration when it moves. These data are converted to the CF-NetCDF format and then loaded into a STOQS database. Using the Spatial-3D view, a user may interact with a virtual playback of BED motions, giving new insight into submarine canyon sediment density flows.
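
The filter-then-overview interaction described above can be sketched in miniature. The record fields and values below are hypothetical stand-ins, not the actual STOQS data model:

```python
from datetime import datetime, timedelta

# Hypothetical measurement records; field names are illustrative only
t0 = datetime(2016, 1, 15)
records = [
    {"time": t0 + timedelta(hours=i), "depth": 10.0 * i,
     "platform": "BED", "value": 0.1 * i}
    for i in range(10)
]

def filter_records(records, t_min, t_max, d_max):
    """Mimic the UI filter step: keep measurements inside a time window and above a depth."""
    return [r for r in records
            if t_min <= r["time"] <= t_max and r["depth"] <= d_max]

def overview(selection):
    """Summarize the filtered selection, as the UI does after each filter operation."""
    values = [r["value"] for r in selection]
    return {"count": len(values), "min": min(values), "max": max(values)}

sel = filter_records(records, t0, t0 + timedelta(hours=5), 40.0)
stats = overview(sel)
```

Each zoom-and-select in the real UI triggers an equivalent server-side query, and the returned overview repopulates the plots for the new selection.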

  8. Interactive terrain visualization enables virtual field work during rapid scientific response to the 2010 Haiti earthquake

    USGS Publications Warehouse

    Cowgill, Eric; Bernardin, Tony S.; Oskin, Michael E.; Bowles, Christopher; Yikilmaz, M. Burak; Kreylos, Oliver; Elliott, Austin J.; Bishop, Scott; Gold, Ryan D.; Morelan, Alexander; Bawden, Gerald W.; Hamann, Bernd; Kellogg, Louise

    2012-01-01

    The moment magnitude (Mw) 7.0 12 January 2010 Haiti earthquake is the first major earthquake for which a large-footprint LiDAR (light detection and ranging) survey was acquired within several weeks of the event. Here, we describe the use of virtual reality data visualization to analyze massive amounts (67 GB on disk) of multiresolution terrain data during the rapid scientific response to a major natural disaster. In particular, we describe a method for conducting virtual field work using both desktop computers and a 4-sided, 22 m³ CAVE immersive virtual reality environment, along with the KeckCAVES (Keck Center for Active Visualization in the Earth Sciences) software tools LiDAR Viewer, to analyze LiDAR point-cloud data, and Crusta, for 2.5-dimensional surficial geologic mapping on a bare-earth digital elevation model. This system enabled virtual field work that yielded remote observations of the topographic expression of active faulting within an ∼75-km-long section of the eastern Enriquillo–Plantain Garden fault spanning the 2010 epicenter. Virtual field observations indicated that the geomorphic evidence of active faulting and ancient surface rupture varies along strike. Landform offsets of 6–50 m along the Enriquillo–Plantain Garden fault east of the 2010 epicenter and closest to Port-au-Prince attest to repeated recent surface-rupturing earthquakes there. In the west, the fault trace is well defined by displaced landforms, but it is not as clear as in the east. The 2010 epicenter is within a transition zone between these sections that extends from Grand Goâve in the west to Fayette in the east. Within this transition, between L'Acul (lat 72°40′W) and the Rouillone River (lat 72°35′W), the Enriquillo–Plantain Garden fault is undefined along an embayed low-relief range front, with little evidence of recent surface rupture. 
Based on the geometry of the eastern and western faults that show evidence of recent surface rupture, we propose that the 2010 event occurred within a stepover that appears to have served as a long-lived boundary between rupture segments, explaining the lack of 2010 surface rupture. This study demonstrates how virtual reality–based data visualization has the potential to transform rapid scientific response by enabling virtual field studies and real-time interactive analysis of massive terrain data sets.

  9. Life in unexpected places: Employing visual thinking strategies in global health training.

    PubMed

    Allison, Jill; Mulay, Shree; Kidd, Monica

    2017-01-01

    The desire to make meaning out of images, metaphor, and other representations indicates higher-order cognitive skills that can be difficult to teach, especially in the complex and unfamiliar environments like those encountered in many global health experiences. Because reflecting on art can help develop medical students' imaginative and interpretive skills, we used visual thinking strategies (VTS) during an immersive 4-week global health elective for medical students to help them construct new understanding of the social determinants of health in a low-resource setting. We were aware of no previous formal efforts to use art in global health training. We assembled a group of eight medical students in front of a street mural in Kathmandu and used VTS methods to interpret the scene with respect to the social determinants of health. We recorded and transcribed the conversation and conducted a thematic analysis of student responses. Students shared observations about the mural in a supportive, nonjudgmental fashion. Two main themes emerged from their observations: those of human-environment interactions (specifically community dynamics, subsistence land use, resources, and health) and entrapment/control, particularly relating to expectations of, and demands on, women in traditional farming communities. They used the images as well as their experience in Nepali communities to consolidate complex community health concepts. VTS helped students articulate their deepening understanding of the social determinants of health in Nepal, suggesting that reflection on visual art can help learners apply, analyze, and evaluate complex concepts in global health. We demonstrate the relevance of drawing upon many aspects of cultural learning, regarding art as a kind of text that holds valuable information. These findings may help provide innovative opportunities for teaching and evaluating global health training in the future.

  10. Transduction between worlds: using virtual and mixed reality for earth and planetary science

    NASA Astrophysics Data System (ADS)

    Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.

    2017-12-01

    Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.

  11. Implementing virtual reality interfaces for the geosciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, W.; Jacobsen, J.; Austin, A.

    1996-06-01

    For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum running the gamut from command-line interfaces to completely immersive environments. At one end, the user enters three- or six-dimensional parameters with the keyboard: rich, extensible and often complex languages are a vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is an obstacle in that typing is cumbersome, error-prone and typically slow. At the other end, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g. attached to a head-mounted display, and the user can interact with these parameters by means of highly developed motor skills. Two specific geoscience application areas are highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.
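
The "rake" seeds a row of streamlines that are then integrated through the velocity field. A minimal sketch, using forward-Euler integration in a stand-in solid-body-rotation field (the reservoir simulator's actual field is not available here), might look like:

```python
def velocity(x, y):
    """Illustrative solid-body-rotation field; a flow solver would supply this."""
    return -y, x

def streamline(x0, y0, dt=0.01, steps=100):
    """Trace one streamline with forward-Euler steps from a seed point."""
    path = [(x0, y0)]
    x, y = x0, y0
    for _ in range(steps):
        u, v = velocity(x, y)
        x, y = x + dt * u, y + dt * v
        path.append((x, y))
    return path

def rake(n=5, x=1.0, y0=-0.5, y1=0.5):
    """Seed n streamlines along a vertical 'rake' segment, as the virtual-well icon does."""
    seeds = [(x, y0 + (y1 - y0) * i / (n - 1)) for i in range(n)]
    return [streamline(sx, sy) for sx, sy in seeds]

lines = rake()
```

Moving the rake in the virtual environment amounts to changing the seed segment's endpoints and re-running the integration, which is why a tracked hand-held device makes the interaction so direct.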

  12. "torino 1911" Project: a Contribution of a Slam-Based Survey to Extensive 3d Heritage Modeling

    NASA Astrophysics Data System (ADS)

    Chiabrando, F.; Della Coletta, C.; Sammartano, G.; Spanò, A.; Spreafico, A.

    2018-05-01

    In the framework of digital documentation of complex environments, advanced geomatics research offers integrated solutions and multi-sensor strategies for accurate 3D reconstruction of stratified structures and articulated volumes in the heritage domain. The use of handheld devices for rapid mapping, both image- and range-based, can help produce easy-to-use, easily navigable 3D models for documentation projects. This type of reality-based modelling, with its integrated geometric and radiometric aspects, can support valorisation and communication projects including virtual reconstructions, interactive navigation settings, and immersive reality for dissemination purposes, evoking past places and atmospheres. This research is situated within the "Torino 1911" project, led by the University of San Diego (California) in cooperation with PoliTo. The entire project is conceived for multi-scale reconstruction of the real and no-longer-existing structures across the whole park space of more than 400,000 m2, for a virtual and immersive visualization of the Turin 1911 International "Fabulous Exposition" event, held in the Valentino Park. In the presented research, a 3D metric documentation workflow is proposed and validated in order to exploit the potential of LiDAR mapping by a handheld SLAM-based device, the ZEB REVO Real Time instrument by GeoSLAM (2017 release), instead of consolidated TLS systems. Starting from these models, the crucial aspects of trajectory performance in the 3D reconstruction and of radiometric content from imaging approaches are considered, specifically through the compared use of common DSLR cameras and portable sensors.

  13. An evaluation-guided approach for effective data visualization on tablets

    NASA Astrophysics Data System (ADS)

    Games, Peter S.; Joshi, Alark

    2015-01-01

    There is a rising trend of data analysis and visualization tasks being performed on a tablet device. Apps with interactive data visualization capabilities are available for a wide variety of domains. We investigate whether users grasp how to effectively interpret and interact with visualizations. We conducted a detailed user evaluation to study the abilities of individuals with respect to analyzing data on a tablet through an interactive visualization app. Based upon the results of the user evaluation, we find that most subjects performed well at understanding and interacting with simple visualizations, specifically tables and line charts. A majority of the subjects struggled with identifying interactive widgets, recognizing interactive widgets with overloaded functionality, and understanding visualizations which do not display data for sorted attributes. Based on our study, we identify guidelines for designers and developers of mobile data visualization apps that include recommendations for effective data representation and interaction.

  14. Integrating visualization and interaction research to improve scientific workflows.

    PubMed

    Keefe, Daniel F

    2010-01-01

    Scientific-visualization research is, nearly by necessity, interdisciplinary. In addition to their collaborators in application domains (for example, cell biology), researchers regularly build on close ties with disciplines related to visualization, such as graphics, human-computer interaction, and cognitive science. One of these ties is the connection between visualization and interaction research. This isn't a new direction for scientific visualization (see the "Early Connections" sidebar). However, momentum recently seems to be increasing toward integrating visualization research (for example, effective visual presentation of data) with interaction research (for example, innovative interactive techniques that facilitate manipulating and exploring data). We see evidence of this trend in several places, including the visualization literature and conferences.

  15. "That's when It Hit Home": Creating Interactive, Collective, and Transformative Learning Experiences through the Traveling Classroom

    ERIC Educational Resources Information Center

    Gengler, Amanda Marie

    2010-01-01

    Travel is a powerful pedagogical tool for critical and feminist teachers, as it leads to learning that is uniquely interactive, collective, and transformative. It places students in immersive contact with real-world realities, which the teachers strive to help them see, come to terms with, and connect to the positionality of their own lived…

  16. Why mucosal health?

    USDA-ARS?s Scientific Manuscript database

    Aquaculture species depend more heavily on mucosal barriers than their terrestrial agricultural counterparts as they are continuously interacting with the aquatic microbiota. Unlike classical immune centers, such as the spleen and kidney, the accessibility of mucosal surfaces through immersion/dip t...

  17. Electrorotation of a metal sphere immersed in an electrolyte of finite Debye length.

    PubMed

    García-Sánchez, Pablo; Ramos, Antonio

    2015-11-01

    We theoretically study the rotation induced on a metal sphere immersed in an electrolyte and subjected to a rotating electric field. The rotation arises from the interaction of the field with the electric charges induced at the metal-electrolyte interface, i.e., the induced electrical double layer (EDL). Particle rotation is due both to the torque on the induced dipole and to induced-charge electro-osmotic flow (ICEO). The interaction of the electric field with the induced dipole gives rise to counter-field rotation, i.e., rotation in the direction opposite to that of the electric field, while ICEO generates co-field rotation of the sphere. For a thin EDL, ICEO generates negligible rotation. With increasing EDL thickness, co-field rotation appears and, in the limit of a very thick EDL, it compensates the counter-field rotation induced by the electrical torque. We also report computations of the rotating fluid velocity field around the sphere.

  18. Virtually compliant: Immersive video gaming increases conformity to false computer judgments.

    PubMed

    Weger, Ulrich W; Loughnan, Stephen; Sharma, Dinkar; Gonidis, Lazaros

    2015-08-01

    Real-life encounters with face-to-face contact are on the decline in a world in which many routine tasks are delegated to virtual characters, a development that bears both opportunities and risks. Interacting with such virtual-reality beings is particularly common during role-playing videogames, in which we incarnate into the virtual reality of an avatar. Video gaming is known to lead to the training and development of real-life skills and behaviors; hence, in the present study we sought to explore whether role-playing video gaming primes individuals' identification with a computer enough to increase computer-related social conformity. Following immersive video gaming, individuals were indeed more likely to give up their own best judgment and to follow the vote of computers, especially when the stimulus context was ambiguous. Implications for human-computer interactions and for our understanding of the formation of identity and self-concept are discussed.

  19. Hydrodynamics of a three-dimensional self-propelled flexible plate

    NASA Astrophysics Data System (ADS)

    Ryu, Jaeha; Sung, Hyung Jin

    2017-11-01

    A three-dimensional self-propelled flexible plate in a quiescent flow was simulated using the immersed boundary method. The clamped leading edge of the flexible plate was forced into a vertical oscillation while free to move horizontally. To reveal the hydrodynamics of the plate, the averaged cruising speed (UC), the input power (P), and the swimming efficiency (η) were analyzed as a function of the bending rigidity (γ) and the flapping frequency (f). The velocity field around the plate and the force exerted on the plate were examined to elucidate the dynamic interaction between the plate and the surrounding fluid. The kinematics of the plate, the maximum angle of attack (ϕmax), and the mean effective length (Leff) were examined to account for the hydrodynamics of the self-propelled flexible plate. The vortical structures around the plate were visualized, and the influence of the tip vortex on the swimming efficiency was explored qualitatively and quantitatively. This work was supported by the Creative Research Initiatives (No. 2017-013369) program of the National Research Foundation of Korea (MSIP).

  20. Community Resilience Informed by Science and Experience (C-RISE)

    NASA Astrophysics Data System (ADS)

    Young Morse, R.; Peake, L.; Bowness, G.

    2017-12-01

    The Gulf of Maine Research Institute is developing an interactive learning experience that engages participants in the interdependence of humans and the environment, the cycles of observation and experiment that advance science knowledge, and the changes we see now and that are predicted for sea level and storm frequency. These scientific concepts and principles will be brought to human scale through the connection to the challenge of city planning in our harbor communities. We are leveraging the ESRI Story Maps platform to build rich visualization-based narratives that feature NOAA maps, data and tools. Our program participants work in teams to navigate the content and participate in facilitated group discussions led by our educators. Based on the adult learning experience, and in concert with new content being developed for the LabVenture program around the theme of climate change, we will develop a learning experience for 5th and 6th graders. Our goal is to immerse 1000+ adults from target communities in the Greater Portland region as well as 8000+ middle school students from throughout the state in the experience.
