Transduction between worlds: using virtual and mixed reality for earth and planetary science
NASA Astrophysics Data System (ADS)
Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.
2017-12-01
Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
Earth Science Learning in SMALLab: A Design Experiment for Mixed Reality
ERIC Educational Resources Information Center
Birchfield, David; Megowan-Romanowicz, Colleen
2009-01-01
Conversational technologies such as email, chat rooms, and blogs have made the transition from novel communication technologies to powerful tools for learning. Currently virtual worlds are undergoing the same transition. We argue that the next wave of innovation is at the level of the computer interface, and that mixed-reality environments offer…
Chuah, Joon Hao; Lok, Benjamin; Black, Erik
2013-04-01
Health sciences students often practice and are evaluated on interview and exam skills by working with standardized patients (people who role-play having a disease or condition). However, standardized patients do not exist for certain vulnerable populations such as children and the intellectually disabled. As a result, students receive little to no exposure to vulnerable populations before becoming working professionals. To address this problem and thereby increase exposure to vulnerable populations, we propose using virtual humans to simulate members of vulnerable populations. We created a mixed reality pediatric patient that allowed students to practice pediatric developmental exams. Practicing several exams is necessary for students to understand how to properly interact with and correctly assess a variety of children. Practice also increases a student's confidence in performing the exam. Effective practice requires students to treat the virtual child realistically. Treating the child realistically might be affected by how the student and virtual child physically interact, so we created two object interaction interfaces - a natural interface and a mouse-based interface. We tested the complete mixed reality exam and also compared the two object interaction interfaces in a within-subjects user study with 22 participants. Our results showed that the participants accepted the virtual child as a child and treated it realistically. Participants also preferred the natural interface, but the interface did not affect how realistically participants treated the virtual child.
Yu, Xunyi; Ganz, Aura
2011-01-01
In this paper we introduce a Mixed Reality Triage and Evacuation game, MiRTE, that is used in the development, testing and training of Mass Casualty Incident (MCI) information systems for first responders. Using the Source game engine from Valve Software, MiRTE creates immersive virtual environments to simulate various incident scenarios, and enables interactions between multiple players/first responders. What distinguishes it from a pure computer simulation game is that it can interface with external mass casualty incident management systems, such as DIORAMA. The game enables system developers to specify the technical requirements of the underlying technology and to test different design alternatives. After the information system hardware and software are completed, the game can simulate various algorithms, such as localization technologies, and connect to the actual user interface on PCs and smartphones. We implemented and tested the game with the DIORAMA system.
2011-01-01
Background: Although principles based in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and can form a methodology that can be generalized to interactive stroke rehabilitation, is presently unavailable. Results: This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. Our resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents key principles of the unified approach. Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions: The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and ultimately be adapted for home-based application. They furthermore can be applied to other rehabilitation needs beyond stroke. PMID:21875441
Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.
Mateu, Juan; Lasala, María José; Alamán, Xavier
2015-08-31
In this paper, we present Virtual Touch, a toolkit that supports the development of educational activities in a mixed reality environment in which various tangible elements connect a virtual world with the real world. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, shielding the creator of educational applications from the technical details of tangible interfaces and virtual worlds. It is therefore specifically designed to enable teachers themselves to create educational activities for their students in a simple way, given that teachers generally lack advanced knowledge of computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.
A New Design for Airway Management Training with Mixed Reality and High Fidelity Modeling.
Shen, Yunhe; Hananel, David; Zhao, Zichen; Burke, Daniel; Ballas, Crist; Norfleet, Jack; Reihsen, Troy; Sweet, Robert
2016-01-01
Restoring airway function is a vital task in many medical scenarios. Although various simulation tools are available for learning such skills, recent research indicates that the fidelity of airway management simulation deserves further improvement. In this study, we designed and implemented a new prototype for practicing relevant tasks including laryngoscopy, intubation and cricothyrotomy. A large set of anatomical details and landmarks was meticulously selected, reconstructed from medical scans, and 3D-printed or molded into the airway intervention model. This training model was augmented by virtually and physically presented interactive modules that interoperate with motion tracking and sensor data feedback. Implementation results showed that this design is a feasible approach to developing higher-fidelity airway models that can be integrated with mixed reality interfaces.
A 3-D mixed-reality system for stereoscopic visualization of medical dataset.
Ferrari, Vincenzo; Megali, Giuseppe; Troia, Elena; Pietrabissa, Andrea; Mosca, Franco
2009-11-01
We developed a simple, light, and cheap 3-D visualization device based on mixed reality that physicians can use to view preoperative radiological exams in a natural way. The system lets the user see stereoscopic "augmented images," created by mixing 3-D virtual models of anatomy, obtained by processing preoperative volumetric radiological images (computed tomography or MRI), with live images of the patient grabbed by cameras. The interface of the system consists of a head-mounted display equipped with two high-definition cameras. The cameras are mounted at the position of the user's eyes and grab live images of the patient from the user's point of view. The system does not use any external tracker to detect movements of the user or the patient. Tracking of the user's head movements and alignment of the virtual patient with the real one are performed using machine vision methods applied to pairs of live images. Experimental results on frame rate and alignment precision between the virtual and real patient demonstrate that the machine vision methods used for localization are appropriate for this application and that systems based on stereoscopic mixed reality are feasible and can be proficiently adopted in clinical practice.
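The abstract does not spell out its vision-based alignment method, but rigid landmark registration is a standard building block for aligning a virtual patient model with 3-D landmarks triangulated from a stereo camera pair. Below is a minimal sketch of the Kabsch algorithm in Python; the landmark coordinates and the ground-truth transform are simulated for illustration and do not come from the paper:

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch algorithm: find rotation R and translation t minimizing
    the sum of squared distances ||R @ src_i + t - dst_i||^2."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical landmarks: model coordinates vs. positions observed on the
# patient (simulated here by applying a known rotation and translation).
rng = np.random.default_rng(1)
model_pts = rng.normal(size=(6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([2.0, -1.0, 10.0])
patient_pts = model_pts @ R_true.T + t_true

R, t = rigid_align(model_pts, patient_pts)
aligned = model_pts @ R.T + t
print(np.allclose(aligned, patient_pts, atol=1e-8))  # True
```

In a real system the correspondences would come from feature detection in the stereo image pairs rather than from a simulated transform.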
Adaptive multimodal interaction in mobile augmented reality: A conceptual framework
NASA Astrophysics Data System (ADS)
Abidin, Rimaniza Zainal; Arshad, Haslina; Shukri, Saidatul A'isyah Ahmad
2017-10-01
Recently, Augmented Reality (AR) has emerged as a technology in many mobile applications. Mobile AR is a medium for displaying information merged with the real-world environment in a single view. There are four main types of mobile augmented reality interfaces, one of which is the multimodal interface. A multimodal interface processes two or more combined user input modes (such as speech, pen, touch, manual gesture, gaze, and head and body movements) in a coordinated manner with multimedia system output. Many frameworks have been proposed to guide designers in developing multimodal applications, including in augmented reality environments, but there has been little work reviewing frameworks for adaptive multimodal interfaces in mobile augmented reality. The main goal of this study is to propose a conceptual framework that illustrates the adaptive multimodal interface in mobile augmented reality. We reviewed several frameworks proposed in the fields of multimodal interfaces, adaptive interfaces and augmented reality, analyzed the components of these frameworks, and assessed which of them can be applied on mobile devices. Our framework can serve as a guide for designers and developers creating mobile AR applications with adaptive multimodal interfaces.
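As an illustration of the kind of component such a framework coordinates, the sketch below implements a toy late-fusion dispatcher that pairs a speech command with a touch or gaze target arriving within a short time window. The class names, event fields, and fusion rule are all illustrative assumptions, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class InputEvent:
    modality: str       # e.g. "speech", "touch", "gaze"
    payload: str        # recognized command or selected target
    timestamp: float    # seconds

@dataclass
class MultimodalFusion:
    """Combine events from several modalities that arrive within a short
    fusion window, then emit a single (action, target) command."""
    window: float = 0.5                       # fusion window in seconds
    pending: list = field(default_factory=list)

    def feed(self, event: InputEvent):
        # Discard events older than the fusion window, then buffer this one.
        self.pending = [e for e in self.pending
                        if event.timestamp - e.timestamp <= self.window]
        self.pending.append(event)
        return self.fuse()

    def fuse(self):
        modalities = {e.modality: e.payload for e in self.pending}
        # Example rule: speech names the action, touch/gaze names the target.
        if "speech" in modalities and ("touch" in modalities or "gaze" in modalities):
            target = modalities.get("touch", modalities.get("gaze"))
            return (modalities["speech"], target)
        return None   # not enough information yet

fusion = MultimodalFusion()
fusion.feed(InputEvent("touch", "building_3", 1.00))          # buffered, no command
print(fusion.feed(InputEvent("speech", "show_info", 1.20)))   # ('show_info', 'building_3')
```

An adaptive interface would additionally adjust the fusion rule or window at runtime, e.g. based on context or sensor confidence.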
Authoring Immersive Mixed Reality Experiences
NASA Astrophysics Data System (ADS)
Misker, Jan M. V.; van der Ster, Jelle
Creating a mixed reality experience is a complicated endeavour. From our practice as a media lab in the artistic domain we found that engineering is "only" a first step in creating a mixed reality experience. Designing the appearance and directing the user experience are equally important for creating an engaging, immersive experience. We found that mixed reality artworks provide a very good test bed for studying these topics. This chapter details three steps required for authoring mixed reality experiences: engineering, designing and directing. We describe a platform (VGE) for creating mixed reality environments that incorporates these steps. A case study (EI4) is presented in which this platform was used not only to engineer the system but also to give an artist the freedom to explore the merits of mixed reality as an artistic medium, in areas such as look and feel, multimodal experience and interaction, immersion as a subjective emotion, and gameplay scenarios.
Towards multi-platform software architecture for Collaborative Teleoperation
NASA Astrophysics Data System (ADS)
Domingues, Christophe; Otmane, Samir; Davesne, Frederic; Mallem, Malik
2009-03-01
Augmented Reality (AR) can provide a Human Operator (HO) with real help in achieving complex tasks, such as remote control of robots and cooperative teleassistance. Using appropriate augmentations, the HO can interact faster, safer and more easily with the remote real world. In this paper, we present an extension of an existing distributed software and network architecture for collaborative teleoperation based on networked human-scaled mixed reality and a mobile platform. The first teleoperation system was composed of a VR application and a Web application. However, the two systems could not be used together, making simultaneous control of a distant robot impossible. Our goal is to update the teleoperation system to permit heterogeneous collaborative teleoperation between the two platforms. An important feature of this interface is the use of different virtual reality platforms and different mobile platforms to control one or many robots.
Mixed Reality with HoloLens: Where Virtual Reality Meets Augmented Reality in the Operating Room.
Tepper, Oren M; Rudy, Hayeem L; Lefkowitz, Aaron; Weimer, Katie A; Marks, Shelby M; Stern, Carrie S; Garfein, Evan S
2017-11-01
Virtual reality and augmented reality devices have recently been described in the surgical literature. The authors have previously explored various iterations of these devices, and although they show promise, it has become clear that virtual reality and/or augmented reality devices alone do not adequately meet the demands of surgeons. The solution may lie in a hybrid technology known as mixed reality, which merges many virtual reality and augmented reality features. Microsoft's HoloLens, the first commercially available mixed reality device, provides surgeons with intraoperative, hands-free access to complex data, the real environment, and bidirectional communication. This report describes the use of HoloLens in the operating room to improve decision-making and surgical workflow. The pace of mixed reality-related technological development will undoubtedly be rapid in the coming years, and plastic surgeons are ideally suited to both lead and benefit from this advance.
Future Cyborgs: Human-Machine Interface for Virtual Reality Applications
2007-04-01
Powell, Robert R., Major, USAF
Reality Check: Basics of Augmented, Virtual, and Mixed Reality.
Brigham, Tara J
2017-01-01
Augmented, virtual, and mixed reality applications all aim to enhance a user's current experience or reality. While variations of this technology are not new, within the last few years there has been a significant increase in the number of artificial reality devices or applications available to the general public. This column will explain the difference between augmented, virtual, and mixed reality and how each application might be useful in libraries. It will also provide an overview of the concerns surrounding these different reality applications and describe how and where they are currently being used.
Hands in space: gesture interaction with augmented-reality interfaces.
Billinghurst, Mark; Piumsomboon, Tham; Huidong Bai
2014-01-01
Researchers at the Human Interface Technology Laboratory New Zealand (HIT Lab NZ) are investigating free-hand gestures for natural interaction with augmented-reality interfaces. They've applied the results to systems for desktop computers and mobile devices.
Fast mental states decoding in mixed reality.
De Massari, Daniele; Pacheco, Daniel; Malekshahi, Rahim; Betella, Alberto; Verschure, Paul F M J; Birbaumer, Niels; Caria, Andrea
2014-01-01
The combination of Brain-Computer Interface (BCI) technology, allowing online monitoring and decoding of brain activity, with virtual and mixed reality (MR) systems may help to shape and guide implicit and explicit learning using ecological scenarios. Real-time information on ongoing brain states acquired through BCI might be exploited for controlling data presentation in virtual environments. Brain state discrimination during mixed reality experience is thus critical for adapting specific data features to contingent brain activity. In this study we recorded electroencephalographic (EEG) data while participants experienced MR scenarios implemented through the eXperience Induction Machine (XIM). The XIM is a novel framework modeling the integration of a sensing system that evaluates and measures physiological and psychological states with a number of actuators and effectors that coherently react to the user's actions. We then assessed continuous EEG-based discrimination of spatial navigation, reading and calculation performed in MR, using linear discriminant analysis (LDA) and support vector machine (SVM) classifiers. Dynamic single-trial classification showed high accuracy of LDA and SVM classifiers in detecting multiple brain states as well as in differentiating between high and low mental workload, using a 5 s time-window shifting every 200 ms. Our results indicate overall better performance of LDA with respect to SVM and suggest applicability of our approach in a BCI-controlled MR scenario. Ultimately, successful prediction of brain states might be used to drive adaptation of data representation in order to boost information processing in MR.
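The classification pipeline described (5 s windows shifted every 200 ms, classified with LDA and SVM) can be sketched as follows. The sampling rate, the log-variance feature, and the synthetic data are assumptions made for illustration and are not the authors' actual pipeline:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

# Window parameters from the abstract; the sampling rate is assumed.
FS = 250                # samples per second (assumption)
WIN = 5 * FS            # 5 s analysis window
STEP = int(0.2 * FS)    # 200 ms shift

def sliding_windows(eeg, win=WIN, step=STEP):
    """Yield successive windows over a (samples, channels) EEG array."""
    for start in range(0, eeg.shape[0] - win + 1, step):
        yield eeg[start:start + win]

def log_var_features(window):
    """Toy feature: log-variance per channel, a common EEG baseline."""
    return np.log(window.var(axis=0) + 1e-12)

# Synthetic stand-in for labeled training feature vectors (2 mental states).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))      # 200 examples, 8 channels
y = np.repeat([0, 1], 100)
X[y == 1] += 0.5                   # make the classes separable for the demo

lda = LinearDiscriminantAnalysis().fit(X, y)
svm = SVC(kernel="linear").fit(X, y)

# Online use: classify each new window as it arrives from the EEG stream.
stream = rng.normal(size=(FS * 10, 8))   # 10 s of fake 8-channel EEG
preds = [int(lda.predict(log_var_features(w).reshape(1, -1))[0])
         for w in sliding_windows(stream)]
print(len(preds))   # 26 windows: a 5 s window at 200 ms steps over 10 s
```

Real EEG decoding would use band-power or spatial-filter features and cross-validated training data, but the sliding-window dispatch structure is the same.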
ERIC Educational Resources Information Center
Taçgin, Zeynep; Arslan, Ahmet
2017-01-01
The purpose of this study is to determine perception of postgraduate Computer Education and Instructional Technologies (CEIT) students regarding the concepts of Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), Augmented Virtuality (AV) and Mirror Reality; and to offer a table that includes differences and similarities between…
Mixed reality ultrasound guidance system: a case study in system development and a cautionary tale.
Ameri, Golafsoun; Baxter, John S H; Bainbridge, Daniel; Peters, Terry M; Chen, Elvis C S
2018-04-01
Real-time ultrasound has become a crucial aspect of several image-guided interventions. One of the main constraints of such an approach is the difficulty of interpreting the limited field of view of the image, a problem that has recently been addressed using mixed reality, such as augmented reality and augmented virtuality. The growing popularity and maturity of mixed reality has led to a series of informal guidelines to direct development of new systems and to facilitate regulatory approval. However, the goals of mixed reality image guidance systems and the guidelines for their development have not been thoroughly discussed. The purpose of this paper is to identify and critically examine development guidelines in the context of a mixed reality ultrasound guidance system through a case study. A mixed reality ultrasound guidance system tailored to central line insertions was developed in close collaboration with an expert user. This system outperformed ultrasound-only guidance in a novice user study and has obtained clearance for clinical use in humans. A phantom study with 25 experienced physicians was carried out to compare the performance of the mixed reality ultrasound system against conventional ultrasound-only guidance. Despite the previous promising results, there was no statistically significant difference between the two systems. Guidelines for developing mixed reality image guidance systems cannot be applied indiscriminately. Each design decision, no matter how well justified, should be the subject of scientific and technical investigation. Iterative and small-scale evaluation can readily unearth issues and previously unknown or implicit system requirements. We recommend a wary eye in the development of mixed reality ultrasound image guidance systems, emphasizing small-scale iterative evaluation alongside system development.
Ultimately, we recommend that the image-guided intervention community furthers and deepens this discussion into best practices in developing image-guided interventions.
A haptic interface for virtual simulation of endoscopic surgery.
Rosenberg, L B; Stredney, D
1996-01-01
Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware that allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.
Building intuitive 3D interfaces for virtual reality systems
NASA Astrophysics Data System (ADS)
Vaidya, Vivek; Suryanarayanan, Srikanth; Seitel, Mathias; Mullick, Rakesh
2007-03-01
An exploration of techniques for developing intuitive and efficient user interfaces for virtual reality systems. This work seeks to understand which paradigms from the better-understood world of 2D user interfaces remain viable within 3D environments. To establish this, a new user interface was created that applied various understood principles of interface design. A user study was then performed in which it was compared with an earlier interface on a series of medical visualization tasks.
DVV: a taxonomy for mixed reality visualization in image guided surgery.
Kersten-Oertel, Marta; Jannin, Pierre; Collins, D Louis
2012-02-01
Mixed reality visualizations are increasingly studied for use in image guided surgery (IGS) systems, yet few mixed reality systems have been introduced for daily use into the operating room (OR). This may be the result of several factors: the systems are developed from a technical perspective, are rarely evaluated in the field, and/or lack consideration of the end user and the constraints of the OR. We introduce the Data, Visualization processing, View (DVV) taxonomy which defines each of the major components required to implement a mixed reality IGS system. We propose that these components be considered and used as validation criteria for introducing a mixed reality IGS system into the OR. A taxonomy of IGS visualization systems is a step toward developing a common language that will help developers and end users discuss and understand the constituents of a mixed reality visualization system, facilitating a greater presence of future systems in the OR. We evaluate the DVV taxonomy based on its goodness of fit and completeness. We demonstrate the utility of the DVV taxonomy by classifying 17 state-of-the-art research papers in the domain of mixed reality visualization IGS systems. Our classification shows that few IGS visualization systems' components have been validated and even fewer are evaluated.
Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life
NASA Astrophysics Data System (ADS)
Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia
2011-03-01
Users of immersive virtual reality environments have reported a wide variety of side and after effects, including confusion between characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and the real can be turned around to explore the possibilities for immersion with minimal technological support in virtual world group training simulations. This paper will describe observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and Veterans Administration San Diego Healthcare System (VASDHS) to develop training exercises for nurses, doctors and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, moments of the confusion of boundaries between real and virtual, we can better understand methods for creating immersion. I will use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will examine moments of training to learn the software interface, moments within the drill, and interviews after the drill.
Collaborative Embodied Learning in Mixed Reality Motion-Capture Environments: Two Science Studies
ERIC Educational Resources Information Center
Johnson-Glenberg, Mina C.; Birchfield, David A.; Tolentino, Lisa; Koziupa, Tatyana
2014-01-01
These 2 studies investigate the extent to which an Embodied Mixed Reality Learning Environment (EMRELE) can enhance science learning compared to regular classroom instruction. Mixed reality means that physical tangible and digital components were present. The content for the EMRELE required that students map abstract concepts and relations onto…
NASA Astrophysics Data System (ADS)
Miranda, Mateus R.; Costa, Henrik; Oliveira, Luiz; Bernardes, Thiago; Aguiar, Carla; Miosso, Cristiano; Oliveira, Alessandro B. S.; Diniz, Alberto C. G. C.; Domingues, Diana Maria G.
2015-03-01
This paper describes an experimental platform used to evaluate the performance of individuals training in immersive physiological games. The platform is embedded in a virtual reality CAVE and consists of a base frame with three-degree-of-freedom actuators, a sensor-array interface, and physiological sensors. Physiological data on breathing, galvanic skin response (GSR) and the pressure of the user's hand, together with a subjective questionnaire, were collected during the experiments. The theoretical background draws on software engineering, biomedical engineering (ergonomics) and creative technologies in order to present this case study: an evaluation of a vehicular simulator located inside the CAVE. The analysis of the simulator compares drivers' physiological data obtained at rest and after the experience, with and without motion in the simulator. Screen images are also captured over time during the embedded experience, and the collected signals are examined through physiological data visualization (average-frequency and RMS graphics), reinforced by the subjective questionnaire as an account of the lived experience the technological apparatus provides. The immersion experience inside the CAVE makes it possible to replicate behaviors from physical spaces inside a data space enhanced by physiological properties. In this context, the biocybrid condition is expanded beyond art and entertainment, as it is applied to automotive engineering and biomedical engineering. The kinesthetic sensations, amplified by synesthesia, replicate the sensation of displacement inside an automobile, as well as the sensations of vibration and vertical movement typical of a vehicle, different speeds, collisions, etc.
The contribution of this work is the possibility of tracing a stress-analysis protocol for drivers operating a vehicle, deriving affective behaviors from physiological data mixed with embedded simulation in mixed reality.
The Input-Interface of Webcam Applied in 3D Virtual Reality Systems
ERIC Educational Resources Information Center
Sun, Huey-Min; Cheng, Wen-Lin
2009-01-01
Our research explores a virtual reality application based on a Web camera (Webcam) input interface. The interface can replace the mouse, inferring the user's intended direction by the method of frame difference. We divide each Webcam frame into nine grids and use background registration to compute the moving object. In order to…
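The frame-difference idea in this abstract can be sketched in a few lines; the 3x3 grid layout, the threshold value and the NumPy formulation below are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def motion_grid(prev_frame, curr_frame, background, thresh=25):
    """Locate motion within a 3x3 grid of the frame.

    Frame differencing flags pixels that changed between consecutive
    frames; background registration discards pixels that still match
    a static background model. Returns the (row, col) of the grid
    cell containing the most moving pixels, or None if nothing moved.
    """
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int)) > thresh
    fg = np.abs(curr_frame.astype(int) - background.astype(int)) > thresh
    moving = diff & fg                       # changed AND not background
    h, w = moving.shape
    counts = moving[: h - h % 3, : w - w % 3] \
        .reshape(3, h // 3, 3, w // 3).sum(axis=(1, 3))
    if counts.max() == 0:
        return None
    return tuple(int(i) for i in np.unravel_index(counts.argmax(), counts.shape))
```

A grayscale frame stream would call this once per frame and map the winning cell to a cursor direction.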
Open multi-agent control architecture to support virtual-reality-based man-machine interfaces
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel
2001-10-01
Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for the commanding and supervision of complex automation systems. The user-interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task-deduction component and automatic action-planning capabilities. In order to realize man-machine interfaces for complex applications, not only does the Virtual Reality part have to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems controlled by Virtual Reality-based man-machine interfaces. The architecture not only provides a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations which facilitate the user's job of commanding and supervising a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate, in real time, information from sensors at different levels of abstraction helps to make the realized automation system very responsive to real-world changes. In this paper, the architecture will be described comprehensively, its main building blocks will be discussed, and one realization, built on an open-source real-time operating system, will be presented. The software design and the features of the architecture which make it generally applicable to the distributed control of automation agents in real-world applications will be explained. Furthermore, its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is only one example which will be described.
My thoughts through a robot's eyes: an augmented reality-brain-machine interface.
Kansaku, Kenji; Hata, Naoki; Takano, Kouji
2010-02-01
A brain-machine interface (BMI) uses neurophysiological signals from the brain to control external devices, such as robot arms or computer cursors. Combining augmented reality with a BMI, we show that the user's brain signals successfully controlled an agent robot and operated devices in the robot's environment. The user's thoughts became reality through the robot's eyes, enabling the augmentation of real environments outside the anatomy of the human body.
Samosky, Joseph T; Allen, Pete; Boronyak, Steve; Branstetter, Barton; Hein, Steven; Juhas, Mark; Nelson, Douglas A; Orebaugh, Steven; Pinto, Rohan; Smelko, Adam; Thompson, Mitch; Weaver, Robert A
2011-01-01
We are developing a simulator of peripheral nerve block utilizing a mixed-reality approach: the combination of a physical model, an MRI-derived virtual model, mechatronics and spatial tracking. Our design uses tangible (physical) interfaces to simulate surface anatomy, haptic feedback during needle insertion, mechatronic display of muscle twitch corresponding to the specific nerve stimulated, and visual and haptic feedback for the injection syringe. The twitch response is calculated incorporating the sensed output of a real neurostimulator. The virtual model is isomorphic with the physical model and is derived from segmented MRI data. This model provides the subsurface anatomy and, combined with electromagnetic tracking of a sham ultrasound probe and a standard nerve block needle, supports simulated ultrasound display and measurement of needle location and proximity to nerves and vessels. The needle tracking and virtual model also support objective performance metrics of needle targeting technique.
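If a nerve is approximated as a straight line segment between two landmarks in the tracked coordinate frame (a simplifying assumption for illustration; the simulator's actual model is segmented MRI anatomy), the needle-proximity metric reduces to a point-to-segment distance:

```python
import numpy as np

def needle_tip_to_nerve(tip, nerve_a, nerve_b):
    """Distance (in the input units, e.g. mm) from a tracked needle
    tip to a nerve modeled as the segment nerve_a -> nerve_b.
    Clamping t to [0, 1] keeps the closest point on the segment
    rather than on the infinite line.
    """
    tip, a, b = map(np.asarray, (tip, nerve_a, nerve_b))
    ab = b - a
    t = np.clip(np.dot(tip - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(tip - (a + t * ab)))
```

Electromagnetic tracking would supply `tip` each frame, and the distance could drive both the simulated ultrasound overlay and objective targeting metrics.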
Tools virtualization for command and control systems
NASA Astrophysics Data System (ADS)
Piszczek, Marek; Maciejewski, Marcin; Pomianek, Mateusz; Szustakowski, Mieczysław
2017-10-01
Information management is an inseparable part of the command process. As a result, the person making decisions at the command post interacts with data-providing devices in various ways. The tools-virtualization process can introduce a number of significant modifications into the design of solutions for management and command. The general idea involves replacing a physical device's user interface with its digital representation (a so-called virtual instrument). A more advanced level of system "digitalization" is to use mixed-reality environments. In solutions using augmented reality (AR), a customized HMI is displayed to the operator as he approaches each device. Devices are identified by image recognition of photo codes. Visualization is achieved by an (optical) see-through head-mounted display (HMD); control can be exercised, for example, by means of a handheld touch panel. Using an immersive virtual environment, the command center can be digitally reconstructed: a workstation requires only a VR system (HMD) and access to the information network, and the operator can interact with devices much as he would in the real world (for example, with virtual hands). Thanks to techniques such as central-vision analysis and eye tracking, MR systems offer another useful feature, reducing the requirements for system data throughput, since at any moment processing can focus on the single device in view. Experiments carried out using the Moverio BT-200 and SteamVR systems, and the results of experimental application testing, clearly indicate the ability to create a fully functional information system with the use of mixed-reality technology.
Practical system for generating digital mixed reality video holograms.
Song, Joongseok; Kim, Changseob; Park, Hanhoon; Park, Jong-Il
2016-07-10
We propose a practical system that can effectively mix the depth data of real and virtual objects by using a Z-buffer, and can quickly generate digital mixed reality video holograms by using multiple graphics processing units (GPUs). In an experiment, we verify that real and virtual objects can be merged naturally at free viewing angles, and that the occlusion problem is handled well. Furthermore, we demonstrate that the proposed system can generate mixed reality video holograms at 7.6 frames per second. Finally, system performance is further assessed through users' subjective evaluations.
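The Z-buffer mixing step can be illustrated as a per-pixel depth test over registered RGB-D images; this is a minimal single-threaded sketch of the principle, not the multi-GPU hologram pipeline the paper describes:

```python
import numpy as np

def composite_by_depth(real_rgb, real_z, virt_rgb, virt_z):
    """Per-pixel depth test: whichever surface is nearer to the
    camera wins, which resolves real/virtual occlusion automatically.
    Inputs: HxWx3 color arrays and HxW depth maps in the same frame.
    """
    near = (virt_z < real_z)[..., None]     # virtual in front of real?
    return np.where(near, virt_rgb, real_rgb)
```

The resulting mixed RGB-D image would then feed the hologram-generation stage.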
Virtual Reality: An Instructional Medium for Visual-Spatial Tasks.
ERIC Educational Resources Information Center
Regian, J. Wesley; And Others
1992-01-01
Describes an empirical exploration of the instructional potential of virtual reality as an interface for simulation-based training. Shows that subjects learned spatial-procedural and spatial-navigational skills in virtual reality. (SR)
Virtual Reality: Toward Fundamental Improvements in Simulation-Based Training.
ERIC Educational Resources Information Center
Thurman, Richard A.; Mattoon, Joseph S.
1994-01-01
Considers the role and effectiveness of virtual reality in simulation-based training. The theoretical and practical implications of verity, integration, and natural versus artificial interface are discussed; a three-dimensional classification scheme for virtual reality is described; and the relationship between virtual reality and other…
Adler, Adir; Ben-Ari, Adital
2018-01-01
Until recently, the literature that addressed the phenomenon of mixed-orientation relationships, in which the female partner is straight and the male partner is non-straight, has focused mainly on the men's perspective. Most of the studies have employed a pessimistic tone, underscoring the obstacles faced by each of the partners. This study was designed to understand how women of mixed-orientation relationships construct their reality within such a relationship, focusing on elements that assist them in maintaining those relationships. Based on the phenomenological paradigm, in-depth interviews with eight women in mixed-orientation relationships were conducted. The findings indicate that in order to adapt to their newly constructed reality, women reframe various individual, marital, and social aspects in their lives. Those reframing processes constituted a point of departure to developing a conceptual model, which outlines the journey to reality reconstruction among women in mixed-orientation relationships.
Mixed-Reality Prototypes to Support Early Creative Design
NASA Astrophysics Data System (ADS)
Safin, Stéphane; Delfosse, Vincent; Leclercq, Pierre
The domain we address is creative design, mainly architecture. Rooted in a multidisciplinary approach as well as a deep understanding of architecture and design, our method aims at proposing adapted mixed-reality solutions to support two crucial activities: sketch-based preliminary design and distant synchronous collaboration in design. This chapter provides a summary of our work on a mixed-reality device, based on a drawing table (the Virtual Desktop), designed specifically to address real-life/business-focused issues. We explain our methodology, describe the two supported activities and the related users’ needs, detail the technological solution we have developed, and present the main results of multiple evaluation sessions. We conclude with a discussion of the usefulness of a profession-centered methodology and the relevance of mixed reality to support creative design activities.
A Modular Set of Mixed Reality Simulators for Blind and Guided Procedures
2016-08-01
AWARD NUMBER: W81XWH-14-1-0113. TITLE: A Modular Set of Mixed Reality Simulators for "Blind" and Guided Procedures. PRINCIPAL INVESTIGATOR: … Reporting period: …2015 – 07/31/2016. Deliverables noted in the report include … an editor developed to facilitate creation by non-technical educators of ITs for the set of modular simulators, (c) a curriculum for self-study and self…
Linking Audio and Visual Information while Navigating in a Virtual Reality Kiosk Display
ERIC Educational Resources Information Center
Sullivan, Briana; Ware, Colin; Plumlee, Matthew
2006-01-01
3D interactive virtual reality museum exhibits should be easy to use, entertaining, and informative. If the interface is intuitive, it will allow the user more time to learn the educational content of the exhibit. This research deals with interface issues concerning activating audio descriptions of images in such exhibits while the user is…
ERIC Educational Resources Information Center
Franchi, Jorge
1994-01-01
Highlights of this overview of virtual reality include optics; interface devices; virtual worlds; potential applications, including medicine and archaeology; problems, including costs; current research and development; future possibilities; and a listing of vendors and suppliers of virtual reality products. (Contains 11 references.) (LRW)
Enhancing Health-Care Services with Mixed Reality Systems
NASA Astrophysics Data System (ADS)
Stantchev, Vladimir
This work presents a development approach for mixed reality systems in health care. Although health-care service costs account for 5-15% of GDP in developed countries, the sector has been remarkably resistant to the introduction of technology-supported optimizations. Digitalization of data storage and processing, in the form of electronic patient records (EPR) and hospital information systems (HIS), is a first necessary step. Contrary to typical business functions (e.g., accounting or CRM), a health-care service is characterized by a knowledge-intensive decision process and by the use of specialized devices ranging from stethoscopes to complex surgical systems. Mixed reality systems can help fill the gap between highly patient-specific health-care services that need a variety of technical resources on the one side, and the streamlined process flow that typical process-supporting information systems expect on the other. To achieve this, we present a development approach that includes an evaluation of existing tasks and processes within the health-care service and of the information systems that currently support it, as well as identification of decision paths and actions that can benefit from mixed reality systems. The result is a mixed reality system that allows a clinician to monitor the elements of the physical world and to blend them with virtual information provided by the systems. He or she can also plan and schedule treatments and operations in the digital world, depending on status information from this mixed reality.
Virtual Reality: Real Promises and False Expectations.
ERIC Educational Resources Information Center
Homan, Willem J.
1994-01-01
Examines virtual reality (VR), and discusses the dilemma of defining VR, the limitations of the current technology, and the implications of VR for education. Highlights include a VR experience; human factors and the interface; and altered reality versus VR. (Author/AEF)
Augmented Reality, Virtual Reality and Their Effect on Learning Style in the Creative Design Process
ERIC Educational Resources Information Center
Chandrasekera, Tilanka; Yoon, So-Yeon
2018-01-01
Research has shown that user characteristics such as preference for using an interface can result in effective use of the interface. Research has also suggested that there is a relationship between learner preference and creativity. This study uses the VARK learning styles inventory to assess students' learning styles, then explores how this learning…
Comparing two types of navigational interfaces for Virtual Reality.
Teixeira, Luís; Vilar, Elisângela; Duarte, Emília; Rebelo, Francisco; da Silva, Fernando Moreira
2012-01-01
Previous studies suggest significant differences between navigating virtual environments in a life-like walking manner (i.e., using treadmills or walk-in-place techniques) and virtual navigation (i.e., flying while actually standing still). The latter option, which usually involves hand-centric devices (e.g., joysticks), is the most common in Virtual Reality-based studies, mostly due to lower costs and lower space and technology demands. Recently, however, new interaction devices originally conceived for videogames have become available, offering interesting potential for research. This study aimed to explore the potential of the Nintendo Wii Balance Board as a navigation interface in a Virtual Environment presented in an immersive Virtual Reality system. Comparing participants' performance during a simulated emergency egress allows the adequacy of such an alternative navigation interface to be determined on the basis of empirical results. Forty university students participated in this study. Results show that participants were more efficient when performing navigation tasks with the Joystick than with the Balance Board. However, there were no significant differences in behavioral compliance with exit signs. Therefore, this study suggests that, at least for tasks similar to the one studied, the Balance Board has good potential to be used as a navigation interface for Virtual Reality systems.
Ambient intelligence in health care.
Riva, Giuseppe
2003-06-01
Ambient Intelligence (AmI) is a new paradigm in information technology, in which people are empowered through a digital environment that is aware of their presence and context, and is sensitive, adaptive, and responsive to their needs, habits, gestures and emotions. The most ambitious expression of AmI is Intelligent Mixed Reality (IMR), an evolution of traditional virtual reality environments. Using IMR, it is possible to integrate computer interfaces into the real environment, so that the user can interact with other individuals and with the environment itself in the most natural and intuitive way. How does the emergence of the AmI paradigm influence the future of health care? Using a scenario-based approach, this paper outlines the possible role of AmI in health care by focusing on both its technological and relational nature. In this sense, clinicians and health-care providers who want to exploit AmI's potential need to pay significant attention to technology, ergonomics, project management, human factors and organizational changes in the structure of the relevant health service.
1998-03-01
…Research Laboratory's Virtual Reality Responsive Workbench (VRRWB) and Dragon software system, which together address the problem of battle space…and describe the lessons which have been learned. Keywords: interactive graphics, workbench, battle space visualization, virtual reality, user interface.
Peña-Tapia, Elena; Martín-Barrio, Andrés; Olivares-Méndez, Miguel A.
2017-01-01
Multi-robot missions are a challenge for operators in terms of workload and situational awareness. These operators have to receive data from the robots, extract information, understand the situation properly, make decisions, generate the adequate commands, and send them to the robots. The consequences of excessive workload and lack of awareness can vary from inefficiencies to accidents. This work focuses on the study of future operator interfaces of multi-robot systems, taking into account relevant issues such as multimodal interactions, immersive devices, predictive capabilities and adaptive displays. Specifically, four interfaces have been designed and developed: a conventional, a predictive conventional, a virtual reality and a predictive virtual reality interface. The four interfaces have been validated by the performance of twenty-four operators that supervised eight multi-robot missions of fire surveillance and extinguishing. The results of the workload and situational awareness tests show that virtual reality improves the situational awareness without increasing the workload of operators, whereas the effects of predictive components are not significant and depend on their implementation. PMID:28749407
Mixing realities at Ismar 2009: scary and wondrous.
Stapleton, Christopher; Rolland, Jannick
2010-01-01
The Eighth IEEE International Symposium on Mixed and Augmented Reality (Ismar 2009) combined a traditional science-and-technology track with an art, media, and humanities track to provide a nontraditional cross-disciplinary view of an increasingly important and growing research area.
Real-time 3D human capture system for mixed-reality art and entertainment.
Nguyen, Ta Huynh Duy; Qui, Tran Cong Thien; Xu, Ke; Cheok, Adrian David; Teo, Sze Lee; Zhou, ZhiYing; Mallawaarachchi, Asitha; Lee, Shang Ping; Liu, Wei; Teo, Hui Siang; Thang, Le Nam; Li, Yu; Kato, Hirokazu
2005-01-01
A real-time system for capturing humans in 3D and placing them into a mixed reality environment is presented in this paper. The subject is captured by nine cameras surrounding her. Looking through a head-mounted display with a forward-facing camera pointed at a marker, the user can see the 3D image of this subject overlaid onto a mixed reality scene. The 3D images of the subject viewed from this viewpoint are constructed using a robust and fast shape-from-silhouette algorithm. The paper also presents several techniques to improve image quality and speed up the whole system. The frame rate of our system is around 25 fps using only standard Intel processor-based personal computers. Besides a remote live 3D conferencing and collaboration system, we also describe an application of the system in art and entertainment, named Magic Land, a mixed reality environment where captured human avatars and 3D computer-generated virtual animations can form an interactive story and play with each other. This system demonstrates many technologies in human-computer interaction: mixed reality, tangible interaction, and 3D communication. The results of the user study not only emphasize the benefits but also address some issues of these technologies.
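A shape-from-silhouette reconstruction of the kind mentioned here can be sketched as voxel carving: a voxel survives only if it falls inside every camera's silhouette. The three orthographic views below are a deliberate simplification of the paper's nine calibrated perspective cameras:

```python
import numpy as np

def visual_hull(sil_x, sil_y, sil_z):
    """Carve an N^3 boolean voxel grid from three orthographic binary
    silhouettes (projections along the x, y and z axes). A voxel is
    kept only if every silhouette contains its projection.
    Grid convention: vox[x, y, z]; sil_x is indexed [y, z], etc.
    """
    n = sil_x.shape[0]
    vox = np.ones((n, n, n), dtype=bool)
    vox &= sil_x[None, :, :]   # projection along x sees the (y, z) plane
    vox &= sil_y[:, None, :]   # projection along y sees the (x, z) plane
    vox &= sil_z[:, :, None]   # projection along z sees the (x, y) plane
    return vox
```

With calibrated perspective cameras, the same test is done by projecting each voxel center through each camera matrix instead of broadcasting along an axis.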
Virtual Reality: A Dream Come True or a Nightmare.
ERIC Educational Resources Information Center
Cornell, Richard; Bailey, Dan
Virtual Reality (VR) is a new medium which allows total stimulation of one's senses through human/computer interfaces. VR has applications in training simulators, nano-science, medicine, entertainment, electronic technology, and manufacturing. This paper focuses on some current and potential problems of virtual reality and virtual environments…
Will Anything Useful Come Out of Virtual Reality? Examination of a Naval Application
1993-05-01
The term virtual reality can encompass varying meanings, but some generally accepted attributes of a virtual environment are that it is immersive…technology, but at present there are few practical applications which are utilizing the broad range of virtual reality technology. This paper will discuss an…Keywords: operability, operator functions, virtual reality, man-machine interface, decision aids/decision making, decision support, ASW.
Jones, Jake S.
1999-01-01
An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch.
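The gaze-plus-confirmation interaction the patent describes can be sketched as a small per-frame state machine; the dwell threshold and the highlight behavior below are illustrative assumptions, not values from the patent:

```python
class GazeButton:
    """Gaze-operated command button. Sustained gaze produces the
    perceptible change (modeled here as a 'highlighted' state), and
    a confirming action such as a thumb-switch press issues the
    command associated with the button.
    """
    def __init__(self, command, dwell_frames=30):
        self.command = command
        self.dwell_frames = dwell_frames   # frames of gaze before highlight
        self._gazed = 0

    def update(self, gazed_at, confirm_pressed):
        """Call once per frame; returns the command when it is issued."""
        self._gazed = self._gazed + 1 if gazed_at else 0
        highlighted = self._gazed >= self.dwell_frames
        if highlighted and confirm_pressed:
            self._gazed = 0                # re-arm after firing
            return self.command
        return None
```

A per-frame loop would feed it the result of a gaze-ray intersection test and the thumb-switch state.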
Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo
2014-04-15
Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii and its peripheral nunchuks-balance board, head mounted displays and joystick allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes in virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brain, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.
NASA Technical Reports Server (NTRS)
Bejczy, Antal K.
1995-01-01
This presentation focuses on the application of computer graphics or 'virtual reality' (VR) techniques as a human-computer interface tool in the operation of telerobotic systems. VR techniques offer very valuable task realization aids for planning, previewing and predicting robotic actions, operator training, and for visual perception of non-visible events like contact forces in robotic tasks. The utility of computer graphics in telerobotic operation can be significantly enhanced by high-fidelity calibration of virtual reality images to actual TV camera images. This calibration will even permit the creation of artificial (synthetic) views of task scenes for which no TV camera views are available.
Wearable computer for mobile augmented-reality-based controlling of an intelligent robot
NASA Astrophysics Data System (ADS)
Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino
2000-10-01
An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or in conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans is needed. This requires means for controlling the robot from somewhere else, i.e., teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects into the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated the existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal, the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput, and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustained in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.
Three-Dimensional User Interfaces for Immersive Virtual Reality
NASA Technical Reports Server (NTRS)
vanDam, Andries
1997-01-01
The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.
Mixed reality virtual pets to reduce childhood obesity.
Johnsen, Kyle; Ahn, Sun Joo; Moore, James; Brown, Scott; Robertson, Thomas P; Marable, Amanda; Basu, Aryabrata
2014-04-01
Novel approaches are needed to reduce the high rates of childhood obesity in the developed world. While multifactorial in cause, a major factor is the increasingly sedentary lifestyle of children. Our research shows that a mixed reality system that is of interest to children can be a powerful motivator of healthy activity. We designed and constructed a mixed reality system that allowed children to exercise, play with, and train a virtual pet using their own physical activity as input. The health, happiness, and intelligence of each virtual pet grew as its associated child owner exercised more, reached goals, and interacted with the pet. We report results of a research study involving 61 children from a local summer camp that shows a large increase in recorded and observed activity, alongside observational evidence that the virtual pet was responsible for that change. These results, and the ease with which the system integrated into the camp environment, demonstrate the practical potential of mixed reality to impact the exercise behaviors of children.
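The pet-growth mechanic described above might look like the following toy update rule; the attribute names, step scaling and caps are invented for illustration, since the abstract does not give the system's actual formula:

```python
def update_pet(pet, steps, goal_met):
    """Grow a virtual pet's attributes from one activity report.

    Health rises with recorded steps, happiness and intelligence
    with goal completion; all attributes are capped at 100. Returns
    a new dict rather than mutating the input.
    """
    pet = dict(pet)
    pet["health"] = min(100, pet["health"] + steps // 1000)
    pet["happiness"] = min(100, pet["happiness"] + (5 if goal_met else 0))
    pet["intelligence"] = min(100, pet["intelligence"] + (2 if goal_met else 0))
    return pet
```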
Heads up and camera down: a vision-based tracking modality for mobile mixed reality.
DiVerdi, Stephen; Höllerer, Tobias
2008-01-01
Anywhere Augmentation pursues the goal of lowering the initial investment of time and money necessary to participate in mixed reality work, bridging the gap between researchers in the field and regular computer users. Our paper contributes to this goal by introducing the GroundCam, a cheap tracking modality with no significant setup necessary. By itself, the GroundCam provides high frequency, high resolution relative position information similar to an inertial navigation system, but with significantly less drift. We present the design and implementation of the GroundCam, analyze the impact of several design and run-time factors on tracking accuracy, and consider the implications of extending our GroundCam to different hardware configurations. Motivated by the performance analysis, we developed a hybrid tracker that couples the GroundCam with a wide area tracking modality via a complementary Kalman filter, resulting in a powerful base for indoor and outdoor mobile mixed reality work. To conclude, the performance of the hybrid tracker and its utility within mixed reality applications is discussed.
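The idea of coupling a drifting high-rate tracker with sparse absolute fixes can be illustrated by a fixed-gain complementary filter in one dimension; the paper uses a complementary Kalman filter, so this constant-alpha version is only a sketch of the principle:

```python
def complementary_fuse(rel_positions, abs_positions, alpha=0.98):
    """Fuse a high-rate but drifting relative track with occasional
    absolute fixes (None when the wide-area tracker has no sample).
    The relative track is effectively high-passed (only its deltas
    are used) and the absolute track low-passed (blended in slowly).
    """
    fused, est = [], 0.0
    prev_rel = rel_positions[0]
    for rel, abs_fix in zip(rel_positions, abs_positions):
        est += rel - prev_rel            # integrate relative motion
        prev_rel = rel
        if abs_fix is not None:          # pull slowly toward the fix
            est = alpha * est + (1 - alpha) * abs_fix
        fused.append(est)
    return fused
```

With `alpha` near 1, short-term motion follows the responsive relative sensor while long-term drift is bounded by the absolute one, which is the behavior the hybrid GroundCam tracker needs.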
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
A virtual museum displays and simulates the functions of a real museum on the Internet in the form of interactive 3D virtual reality. Built on the Virtual Reality Modeling Language (VRML), an effective virtual museum, and its interaction with the offline museum, depends on making full use of 3D panorama, virtual reality and augmented reality techniques, and on innovatively applying dynamic environment modeling, real-time 3D graphics generation, system integration and other key virtual reality techniques to ensure a coherent overall design. The 3D panorama technique, also known as panoramic photography, is based on static images of reality. Virtual reality is a computer simulation technique that creates an interactive 3D dynamic visual world for the user to experience. Augmented reality, also known as mixed reality, simulates and blends in information (visual, sound, taste, touch, etc.) that is difficult for humans to experience directly in reality. Together, these technologies make the virtual museum possible. It will not only bring better experience and convenience to the public, but also help improve the influence and cultural functions of the real museum.
Jones, J.S.
1999-01-12
An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment are disclosed. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch. 4 figs.
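The gaze-plus-confirm selection scheme in the patent abstract above can be sketched in a few lines. This is a hypothetical illustration of the described interaction (dwell highlights the button; an optional confirming switch press issues the command), not the patented apparatus; all names are invented.

```python
# Hypothetical sketch of gaze-directed command issuing: gazing at a
# virtual button produces a perceptible change (highlight), and the
# command fires only after an optional confirming action such as a
# thumb-switch press.

class GazeButton:
    def __init__(self, command, needs_confirm=True):
        self.command = command
        self.needs_confirm = needs_confirm
        self.highlighted = False

    def update(self, gazed_at, switch_pressed):
        """Per-frame update; returns the command to issue, or None."""
        self.highlighted = gazed_at  # perceptible change while gazed at
        if not gazed_at:
            return None
        if self.needs_confirm and not switch_pressed:
            return None
        return self.command
```

A real system would add a dwell-time threshold before highlighting, to avoid triggering on passing glances.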
Evaluation of User Acceptance of Mixed Reality Technology
ERIC Educational Resources Information Center
Yusoff, Rasimah Che Mohd; Zaman, Halimah Badioze; Ahmad, Azlina
2011-01-01
This study investigates users' perception and acceptance of mixed reality (MR) technology. Acceptance of new information technologies has been an important research area since the 1990s. It is important to understand the reasons why people accept information technologies, as this can help to improve design, evaluation and prediction how users will…
Teaching and Learning in the Mixed-Reality Science Classroom
ERIC Educational Resources Information Center
Tolentino, Lisa; Birchfield, David; Megowan-Romanowicz, Colleen; Johnson-Glenberg, Mina C.; Kelliher, Aisling; Martinez, Christopher
2009-01-01
As emerging technologies become increasingly inexpensive and robust, there is an exciting opportunity to move beyond general purpose computing platforms to realize a new generation of K-12 technology-based learning environments. Mixed-reality technologies integrate real world components with interactive digital media to offer new potential to…
Incorporating Technology in Teaching Musical Instruments
ERIC Educational Resources Information Center
Prodan, Angelica
2017-01-01
After discussing some of the drawbacks of using Skype for long distance music lessons, Angelica Prodan describes three different types of Artificial Reality (Virtual Reality, Augmented Reality and Mixed or Merged Reality). She goes on to describe the beneficial applications of technology, with results otherwise impossible to achieve in areas such…
Brain-computer interface: changes in performance using virtual reality techniques.
Ron-Angevin, Ricardo; Díaz-Estrella, Antonio
2009-01-09
The ability to control electroencephalographic (EEG) signals when different mental tasks are carried out would provide a method of communication for people with serious motor function problems. This system is known as a brain-computer interface (BCI). Due to the difficulty of controlling one's own EEG signals, a suitable training protocol is required to motivate subjects, as it is necessary to provide some type of visual feedback allowing subjects to see their progress. Conventional systems of feedback are based on simple visual presentations, such as a horizontal bar extension. However, virtual reality is a powerful tool with graphical possibilities to improve BCI-feedback presentation. The objective of the study is to explore the advantages of the use of feedback based on virtual reality techniques compared to conventional systems of feedback. Sixteen untrained subjects, divided into two groups, participated in the experiment. A group of subjects was trained using a BCI system, which uses conventional feedback (bar extension), and another group was trained using a BCI system, which submits subjects to a more familiar environment, such as controlling a car to avoid obstacles. The obtained results suggest that EEG behaviour can be modified via feedback presentation. Significant differences in classification error rates between both interfaces were obtained during the feedback period, confirming that an interface based on virtual reality techniques can improve the feedback control, specifically for untrained subjects.
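The two feedback conditions contrasted in the study above — a conventional bar extension versus a VR car driven by the same classifier output — can be sketched as two mappings of one control signal. The function names and scaling ranges are illustrative assumptions, not the study's implementation.

```python
# Sketch of two feedback mappings driven by the same BCI classifier
# output: a conventional horizontal bar extension, and the lateral
# position of a virtual car avoiding obstacles. Ranges are made up.

def bar_feedback(classifier_output, max_len=100.0):
    """Conventional feedback: bar length proportional to output in [0, 1]."""
    clipped = min(max(classifier_output, 0.0), 1.0)
    return max_len * clipped

def car_feedback(classifier_output, road_width=10.0):
    """VR-style feedback: map output in [-1, 1] to lateral car position,
    so the same mental task steers the car left or right."""
    clipped = min(max(classifier_output, -1.0), 1.0)
    return 0.5 * road_width * clipped
```

The point of the study is that the mapping itself is trivial; what changes subject performance is the familiarity and engagement of the rendered context.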
Naval Applications of Virtual Reality,
1993-01-01
Scanned report fragment. Citation: Expert Virtual Reality Special Report, pp. 67-72. Subject terms: man-machine interface; virtual reality; decision support; collective and individual performance. Contributors named in the fragment: Mark Gembicki and David Rousseau.
Usability engineering for augmented reality: employing user-based studies to inform design.
Gabbard, Joseph L; Swan, J Edward
2008-01-01
A major challenge, and thus opportunity, in the field of human-computer interaction and specifically usability engineering is designing effective user interfaces for emerging technologies that have no established design guidelines or interaction metaphors or introduce completely new ways for users to perceive and interact with technology and the world around them. Clearly, augmented reality is one such emerging technology. We propose a usability engineering approach that employs user-based studies to inform design, by iteratively inserting a series of user-based studies into a traditional usability engineering lifecycle to better inform initial user interface designs. We present an exemplar user-based study conducted to gain insight into how users perceive text in outdoor augmented reality settings and to derive implications for design in outdoor augmented reality. We also describe lessons learned from our experiences conducting user-based studies as part of the design process.
A Cross-National Mixed-Method Study of Reality Pedagogy
ERIC Educational Resources Information Center
Sirrakos, George, Jr.; Fraser, Barry J.
2017-01-01
This mixed-methods cross-national study investigated the effectiveness of reality pedagogy (an approach in which teachers become part of students' activities, practices and rituals) in terms of changes in student perceptions of their learning environment and attitudes towards science. A questionnaire was administered to 142 students in grades 8-10…
Embedding Mixed-Reality Laboratories into E-Learning Systems for Engineering Education
ERIC Educational Resources Information Center
Al-Tikriti, Munther N.; Al-Aubidy, Kasim M.
2013-01-01
E-learning, virtual learning and mixed reality techniques are now a global integral part of the academic and educational systems. They provide easier access to educational opportunities to a very wide spectrum of individuals to pursue their educational and qualification objectives. These modern techniques have the potentials to improve the quality…
Teaching and Learning in the Mixed-Reality Science Classroom
NASA Astrophysics Data System (ADS)
Tolentino, Lisa; Birchfield, David; Megowan-Romanowicz, Colleen; Johnson-Glenberg, Mina C.; Kelliher, Aisling; Martinez, Christopher
2009-12-01
As emerging technologies become increasingly inexpensive and robust, there is an exciting opportunity to move beyond general purpose computing platforms to realize a new generation of K-12 technology-based learning environments. Mixed-reality technologies integrate real world components with interactive digital media to offer new potential to combine best practices in traditional science learning with the powerful affordances of audio/visual simulations. This paper introduces the realization of a learning environment called SMALLab, the Situated Multimedia Arts Learning Laboratory. We present a recent teaching experiment for high school chemistry students. A mix of qualitative and quantitative research documents the efficacy of this approach for students and teachers. We conclude that mixed-reality learning is viable in mainstream high school classrooms and that students can achieve significant learning gains when this technology is co-designed with educators.
ERIC Educational Resources Information Center
Jowsey, Susan; Aguayo, Claudio
2017-01-01
Mixed Reality learning environments can provide opportunities to educationally enhance previously isolated scientific concepts by using art and technology as mediums for understanding the world. Participatory experiences provide a kinetic means of comprehending often-abstract knowledge, creating the conditions for sensory learning that is…
Emboldened by Embodiment: Six Precepts for Research on Embodied Learning and Mixed Reality
ERIC Educational Resources Information Center
Lindgren, Robb; Johnson-Glenberg, Mina
2013-01-01
The authors describe an emerging paradigm of educational research that pairs theories of embodied learning with a class of immersive technologies referred to as "mixed reality" (MR). MR environments merge the digital with the physical, where, for example, students can use their bodies to simulate an orbit around a virtual planet. Recent…
An Assessment of a Mixed Reality Environment: Toward an Ethnomethodological Approach
ERIC Educational Resources Information Center
Dugdale, Julie; Pallamin, Nico; Pavard, Bernard
2006-01-01
Training firefighters is a difficult process in which emotions and nonverbal behaviors play an important role. The authors have developed a mixed reality environment for training a small group of firefighters, which takes into account these aspects. The assessment of the environment was made up of three phases: assessing the virtual agents to…
Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.
Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A
2013-01-01
Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
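The final compositing step described above — drawing the segmented real hand over the rendered scene so it is never occluded by the virtual tool — can be sketched with array masking. This is a minimal stand-in for the paper's pipeline: a naive color-range test replaces their color-based segmentation, and the threshold values are invented for the example.

```python
import numpy as np

# Illustrative sketch: pixels classified as the user's hand (here by a
# crude RGB range standing in for proper color-based segmentation) are
# composited over the rendered virtual scene, so the hand occludes
# virtual content rather than the reverse. Thresholds are assumptions.

def composite_hand(camera_rgb, virtual_rgb, lo=(80, 40, 30), hi=(255, 180, 150)):
    lo = np.asarray(lo)
    hi = np.asarray(hi)
    # Boolean mask: True where the camera pixel falls in the "skin" range.
    mask = np.all((camera_rgb >= lo) & (camera_rgb <= hi), axis=-1)
    out = virtual_rgb.copy()
    out[mask] = camera_rgb[mask]  # hand pixels win over virtual content
    return out
```

The paper's remaining steps (tracking-based segmentation of the haptic device, image-based background repainting, misalignment correction) determine *which* pixels feed this final overlay.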
Intelligent virtual reality in the setting of fuzzy sets
NASA Technical Reports Server (NTRS)
Dockery, John; Littman, David
1992-01-01
The authors have previously introduced the concept of virtual reality worlds governed by artificial intelligence. Creation of an intelligent virtual reality was further proposed as a universal interface for the handicapped. This paper extends consideration of intelligent virtual reality to a context in which fuzzy set principles are explored as a major tool for implementing theory in the domain of applications to the disabled.
[Display technologies for augmented reality in medical applications].
Eck, Ulrich; Winkler, Alexander
2018-04-01
One of the main challenges for modern surgery is the effective use of the many available imaging modalities and diagnostic methods. Augmented reality systems can be used in the future to blend patient and planning information into the view of surgeons, which can improve the efficiency and safety of interventions. In this article we present five visualization methods for integrating augmented reality displays into medical procedures and explain their advantages and disadvantages. Based on an extensive literature review, the various existing approaches for integrating augmented reality displays into medical procedures are divided into five categories and the most important research results for each approach are presented. A large number of mixed and augmented reality solutions for medical interventions have been developed as research prototypes; however, only very few systems have been tested on patients. In order to integrate mixed and augmented reality displays into medical practice, highly specialized solutions need to be developed. Such systems must meet requirements for accuracy, fidelity, ergonomics and seamless integration into the surgical workflow.
NASA Astrophysics Data System (ADS)
Starodubtsev, Illya
2017-09-01
The paper describes the implementation of a system for gesture-based interaction with virtual objects. It also discusses common problems of interaction with virtual objects and specific requirements for virtual and augmented reality interfaces.
HyFinBall: A Two-Handed, Hybrid 2D/3D Desktop VR Interface for Visualization
2013-01-01
This report presents HyFinBall, a two-handed hybrid 2D/3D desktop VR interface for visualization: the user interface (hardware and software), the design space, and preliminary results of a formal user study. This is done in the context of a rich visual analytics interface containing coordinated views with 2D and 3D visualizations. Keywords: virtual reality; user interface; two-handed interface; hybrid user interface; multi-touch; gesture.
ERIC Educational Resources Information Center
Sugimoto, Masanori
2011-01-01
This paper describes a system called GENTORO that uses a robot and a handheld projector for supporting children's storytelling activities. GENTORO differs from many existing systems in that children can make a robot play their own story in a physical space augmented by mixed-reality technologies. Pilot studies have been conducted to clarify the…
Multi-modal imaging, model-based tracking, and mixed reality visualisation for orthopaedic surgery
Fuerst, Bernhard; Tateno, Keisuke; Johnson, Alex; Fotouhi, Javad; Osgood, Greg; Tombari, Federico; Navab, Nassir
2017-01-01
Orthopaedic surgeons are still following the decades old workflow of using dozens of two-dimensional fluoroscopic images to drill through complex 3D structures, e.g. pelvis. This Letter presents a mixed reality support system, which incorporates multi-modal data fusion and model-based surgical tool tracking for creating a mixed reality environment supporting screw placement in orthopaedic surgery. A red–green–blue–depth camera is rigidly attached to a mobile C-arm and is calibrated to the cone-beam computed tomography (CBCT) imaging space via iterative closest point algorithm. This allows real-time automatic fusion of reconstructed surface and/or 3D point clouds and synthetic fluoroscopic images obtained through CBCT imaging. An adapted 3D model-based tracking algorithm with automatic tool segmentation allows for tracking of the surgical tools occluded by hand. This proposed interactive 3D mixed reality environment provides an intuitive understanding of the surgical site and supports surgeons in quickly localising the entry point and orienting the surgical tool during screw placement. The authors validate the augmentation by measuring target registration error and also evaluate the tracking accuracy in the presence of partial occlusion. PMID:29184659
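The RGBD-to-CBCT calibration above relies on the iterative closest point (ICP) algorithm. A minimal sketch of ICP, using brute-force nearest neighbours and the SVD-based (Kabsch) rigid fit, is shown below; this is a generic textbook version for illustration, not the authors' calibrated implementation.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch step: rigid R, t minimizing ||R @ src_i + t - dst_i||."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    """Iterate: match each src point to its nearest dst point, refit,
    and apply the transform. Brute-force matching is fine for a sketch;
    real systems use k-d trees and outlier rejection."""
    cur = src.copy()
    for _ in range(iters):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur
```

In the surgical setting, src would be the surface points seen by the RGBD camera and dst the surface reconstructed from CBCT, with the recovered transform defining the camera-to-imaging-space calibration.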
The mixed reality of things: emerging challenges for human-information interaction
NASA Astrophysics Data System (ADS)
Spicer, Ryan P.; Russell, Stephen M.; Rosenberg, Evan Suma
2017-05-01
Virtual and mixed reality technology has advanced tremendously over the past several years. This nascent medium has the potential to transform how people communicate over distance, train for unfamiliar tasks, operate in challenging environments, and how they visualize, interact, and make decisions based on complex data. At the same time, the marketplace has experienced a proliferation of network-connected devices and generalized sensors that are becoming increasingly accessible and ubiquitous. As the "Internet of Things" expands to encompass a predicted 50 billion connected devices by 2020, the volume and complexity of information generated in pervasive and virtualized environments will continue to grow exponentially. The convergence of these trends demands a theoretically grounded research agenda that can address emerging challenges for human-information interaction (HII). Virtual and mixed reality environments can provide controlled settings where HII phenomena can be observed and measured, new theories developed, and novel algorithms and interaction techniques evaluated. In this paper, we describe the intersection of pervasive computing with virtual and mixed reality, identify current research gaps and opportunities to advance the fundamental understanding of HII, and discuss implications for the design and development of cyber-human systems for both military and civilian use.
Direct Manipulation in Virtual Reality
NASA Technical Reports Server (NTRS)
Bryson, Steve
2003-01-01
Virtual Reality interfaces offer several advantages for scientific visualization such as the ability to perceive three-dimensional data structures in a natural way. The focus of this chapter is direct manipulation, the ability for a user in virtual reality to control objects in the virtual environment in a direct and natural way, much as objects are manipulated in the real world. Direct manipulation provides many advantages for the exploration of complex, multi-dimensional data sets, by allowing the investigator the ability to intuitively explore the data environment. Because direct manipulation is essentially a control interface, it is better suited for the exploration and analysis of a data set than for the publishing or communication of features found in that data set. Thus direct manipulation is most relevant to the analysis of complex data that fills a volume of three-dimensional space, such as a fluid flow data set. Direct manipulation allows the intuitive exploration of that data, which facilitates the discovery of data features that would be difficult to find using more conventional visualization methods. Using a direct manipulation interface in virtual reality, an investigator can, for example, move a data probe about in space, watching the results and getting a sense of how the data varies within its spatial volume.
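The "data probe" interaction described above amounts to sampling a gridded field at an arbitrary hand-controlled position. A minimal sketch using trilinear interpolation is given below; the function name and voxel-coordinate convention are assumptions made for the example, and the probe position must lie in the grid interior.

```python
import numpy as np

def sample_probe(field, pos):
    """Trilinearly interpolate a gridded 3D scalar field at a continuous
    probe position given in voxel coordinates. A stand-in for the probe
    sampling a fluid-flow data set as the user moves it through space.
    Assumes pos lies strictly inside the grid (not on the far edge)."""
    i, j, k = (int(np.floor(p)) for p in pos)
    fx, fy, fz = pos[0] - i, pos[1] - j, pos[2] - k
    c = field[i:i + 2, j:j + 2, k:k + 2]  # the 8 surrounding voxels
    # Interpolate along x, then y, then z.
    cx = c[0] * (1 - fx) + c[1] * fx
    cy = cx[0] * (1 - fy) + cx[1] * fy
    return cy[0] * (1 - fz) + cy[1] * fz
```

Each frame, the probe's tracked position is converted to voxel coordinates and this sample (or an interpolated vector, for flow data) drives the visualization attached to the probe.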
Location-Based Learning through Augmented Reality
ERIC Educational Resources Information Center
Chou, Te-Lien; Chanlin, Lih-Juan
2014-01-01
A context-aware and mixed-reality exploring tool cannot only effectively provide an information-rich environment to users, but also allows them to quickly utilize useful resources and enhance environment awareness. This study integrates Augmented Reality (AR) technology into smartphones to create a stimulating learning experience at a university…
NASA Technical Reports Server (NTRS)
2004-01-01
In 1984, researchers from Ames Research Center came together to develop advanced human interfaces for NASA's teleoperations that would come to be known as "virtual reality." The basis of the work theorized that if the sensory interfaces met a certain threshold and sufficiently supported each other, then the operator would feel present in the remote/synthetic environment, rather than present in their physical location. Twenty years later, this prolific research continues to pay dividends to society in the form of cutting-edge virtual reality products, such as an interactive audio simulation system.
Mixed-reality simulation for neurosurgical procedures.
Bova, Frank J; Rajon, Didier A; Friedman, William A; Murad, Gregory J; Hoh, Daniel J; Jacob, R Patrick; Lampotang, Samsun; Lizdas, David E; Lombard, Gwen; Lister, J Richard
2013-10-01
Surgical education is moving rapidly to the use of simulation for technical training of residents and maintenance or upgrading of surgical skills in clinical practice. To optimize the learning exercise, it is essential that both visual and haptic cues are presented to best convey a real-world experience. Many systems attempt to achieve this goal through a total virtual interface. Our objective was to demonstrate that the most critical aspect in optimizing a simulation experience is to provide visual and haptic cues that allow the training to fully mimic the real-world environment. Our approach has been to create a mixed-reality system consisting of a physical and a virtual component. A physical model of the head or spine is created with a 3-dimensional printer using deidentified patient data. The model is linked to a virtual radiographic system or an image guidance platform. A variety of surgical challenges can be presented in which the trainee must use the same anatomic and radiographic references required during actual surgical procedures. Using the aforementioned techniques, we have created simulators for ventriculostomy, percutaneous stereotactic lesion procedure for trigeminal neuralgia, and spinal instrumentation. The design and implementation of these platforms are presented. The system has provided the residents an opportunity to understand and appreciate the complex 3-dimensional anatomy of the 3 neurosurgical procedures simulated. The systems have also provided an opportunity to break procedures down into critical segments, allowing the user to concentrate on specific areas of deficiency.
Bashford, Luke; Mehring, Carsten
2016-01-01
To study body ownership and control, illusions that elicit these feelings in non-body objects are widely used. Classically introduced with the Rubber Hand Illusion, these illusions have been replicated more recently in virtual reality and by using brain-computer interfaces. Traditionally these illusions investigate the replacement of a body part by an artificial counterpart; however, as brain-computer interface research develops, it offers the possibility of exploring the case where non-body objects are controlled in addition to movements of our own limbs. We therefore propose a new illusion designed to test the feeling of ownership and control of an independent supernumerary hand. Subjects are under the impression that they control a virtual reality hand via a brain-computer interface, but in reality there is no causal connection between brain activity and virtual hand movement; correct movements are instead shown with 80% probability. These imitation brain-computer interface trials are interspersed with movements of both the subjects' real hands, which are in view throughout the experiment. We show that subjects develop strong feelings of ownership and control over the third hand, despite only receiving visual feedback with no causal link to the actual brain signals. Our illusion is crucially different from previously reported studies, as we demonstrate independent ownership and control of the third hand without loss of ownership in the real hands.
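The sham feedback schedule described above — the cued movement is shown with fixed probability regardless of brain activity — can be sketched in a few lines. The 80% figure comes from the abstract; the function name, movement labels, and alternative-selection rule are invented for illustration.

```python
import random

# Sketch of an imitation (sham) BCI trial schedule: the virtual hand
# performs the cued movement with fixed probability, independent of any
# brain signal. Movement names are hypothetical.

def sham_feedback(cued_movement, p_correct=0.8, rng=random):
    """Return the movement the virtual hand actually displays."""
    if rng.random() < p_correct:
        return cued_movement
    # Otherwise show some other movement, chosen here arbitrarily.
    alternatives = [m for m in ("open", "close", "point") if m != cued_movement]
    return rng.choice(alternatives)
```

Because the displayed movement is decoupled from the EEG, any resulting sense of control is attributable to the visual contingency alone, which is exactly what the illusion exploits.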
Pas, Elise T; Johnson, Stacy R; Larson, Kristine E; Brandenburg, Linda; Church, Robin; Bradshaw, Catherine P
2016-12-01
Most approaches aiming to reduce behavior problems among youth with Autism Spectrum Disorder (ASD) focus on individual students; however, school personnel also need professional development to better support students. This study targeted teachers' skill development to promote positive outcomes for students with ASD. The sample included 19 teachers in two non-public special education settings serving students with moderate to severe ASD. Participating teachers received professional development and coaching in classroom management, with guided practice in a mixed-reality simulator. Repeated-measures ANOVAs examining externally-conducted classroom observations revealed statistically significant improvements in teacher management and student behavior over time. Findings suggest that coaching and guided practice in a mixed-reality simulator is perceived as acceptable and may reduce behavior problems among students with ASD.
NASA Astrophysics Data System (ADS)
Krum, David M.; Sadek, Ramy; Kohli, Luv; Olson, Logan; Bolas, Mark
2010-01-01
As part of the Institute for Creative Technologies and the School of Cinematic Arts at the University of Southern California, the Mixed Reality lab develops technologies and techniques for presenting realistic immersive training experiences. Such experiences typically place users within a complex ecology of social actors, physical objects, and collections of intents, motivations, relationships, and other psychological constructs. Currently, it remains infeasible to completely synthesize the interactivity and sensory signatures of such ecologies. For this reason, the lab advocates mixed reality methods for training and conducts experiments exploring such methods. Currently, the lab focuses on understanding and exploiting the elasticity of human perception with respect to representational differences between real and virtual environments. This paper presents an overview of three projects: techniques for redirected walking, displays for the representation of virtual humans, and audio processing to increase stress.
Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms.
Rutkowski, Tomasz M
2016-01-01
The paper reviews nine robotic and virtual reality (VR) brain-computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI-lab research group during its association with University of Tsukuba, Japan. The nine novel approaches are discussed in applications to direct brain-robot and brain-virtual-reality-agent control interfaces using tactile and auditory BCI technologies. The BCI user intentions are decoded from the brainwaves in real time using non-invasive electroencephalography (EEG) and translated into thought-based control of a symbiotic robot or virtual reality agent. A communication protocol between the BCI output and the robot or the virtual environment is realized in a symbiotic communication scenario using the User Datagram Protocol (UDP), which constitutes an internet of things (IoT) control scenario. Results obtained from healthy users reproducing simple brain-robot and brain-virtual-agent control tasks in online experiments support the research goal of interacting with robotic devices and virtual reality agents using symbiotic thought-based BCI technologies. An offline BCI classification accuracy boosting method, using a previously proposed information geometry derived approach, is also discussed in order to further support the reviewed robotic and virtual reality thought-based control paradigms.
A review of existing and potential computer user interfaces for modern radiology.
Iannessi, Antoine; Marcy, Pierre-Yves; Clatz, Olivier; Bertrand, Anne-Sophie; Sugimoto, Maki
2018-05-16
The digitalization of modern imaging has led radiologists to become very familiar with computers and their user interfaces (UI). New options for display and command offer expanded possibilities, but the mouse and keyboard remain the most commonly utilized, for usability reasons. In this work, we review and discuss different UI and their possible application in radiology. We consider two-dimensional and three-dimensional imaging displays in the context of interventional radiology, and discuss interest in touchscreens, kinetic sensors, eye detection, and augmented or virtual reality. We show that UI design specifically for radiologists is key for future use and adoption of such new interfaces. Next-generation UI must fulfil professional needs, while considering contextual constraints. Key points: the mouse and keyboard remain the most utilized user interfaces for radiologists; touchscreens, holographic displays, kinetic sensors and eye tracking offer new possibilities for interaction; 3D and 2D imaging require specific user interfaces; holographic display and augmented reality provide a third dimension to volume imaging; good usability is essential for adoption of new user interfaces by radiologists.
NASA Astrophysics Data System (ADS)
Demir, I.
2015-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments, and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain and weather information, and water simulation, and offers an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes, including virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.
Mixed reality temporal bone surgical dissector: mechanical design.
Hochman, Jordan Brent; Sepehri, Nariman; Rampersad, Vivek; Kraut, Jay; Khazraee, Milad; Pisa, Justyn; Unger, Bertram
2014-08-08
The Development of a Novel Mixed Reality (MR) Simulation. An evolving training environment emphasizes the importance of simulation. Current haptic temporal bone simulators have difficulty representing realistic contact forces and while 3D printed models convincingly represent vibrational properties of bone, they cannot reproduce soft tissue. This paper introduces a mixed reality model, where the effective elements of both simulations are combined; haptic rendering of soft tissue directly interacts with a printed bone model. This paper addresses one aspect in a series of challenges, specifically the mechanical merger of a haptic device with an otic drill. This further necessitates gravity cancelation of the work assembly gripper mechanism. In this system, the haptic end-effector is replaced by a high-speed drill and the virtual contact forces need to be repositioned to the drill tip from the mid wand. Previous publications detail generation of both the requisite printed and haptic simulations. Custom software was developed to reposition the haptic interaction point to the drill tip. A custom fitting, to hold the otic drill, was developed and its weight was offset using the haptic device. The robustness of the system to disturbances and its stable performance during drilling were tested. The experiments were performed on a mixed reality model consisting of two drillable rapid-prototyped layers separated by a free-space. Within the free-space, a linear virtual force model is applied to simulate drill contact with soft tissue. Testing illustrated the effectiveness of gravity cancellation. Additionally, the system exhibited excellent performance given random inputs and during the drill's passage between real and virtual components of the model. No issues with registration at model boundaries were encountered. These tests provide a proof of concept for the initial stages in the development of a novel mixed-reality temporal bone simulator.
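The mechanical merger described above combines two force terms per servo tick: a constant gravity-cancellation force offsetting the weight of the drill and fitting, and a linear spring force applied at the interaction point relocated from mid-wand to the drill tip when the tip enters virtual soft tissue. A minimal sketch of such a computation follows; the tool weight, stiffness, and tip offset are illustrative values, not those of the authors' device.

```python
import numpy as np

# Illustrative parameters (the paper does not publish its actual values).
TOOL_WEIGHT = np.array([0.0, -0.45, 0.0])   # N, weight of drill + fitting
K_SOFT = 250.0                              # N/m, linear soft-tissue stiffness
TIP_OFFSET = np.array([0.0, 0.0, 0.11])     # m, drill tip relative to wand frame

def haptic_force(wand_pos, wand_rot, tissue_depth):
    """Force commanded to the haptic device each servo tick.

    wand_pos: wand (mid-handle) position in the world frame, metres
    wand_rot: 3x3 rotation matrix of the wand
    tissue_depth: penetration of the drill tip into the virtual
        soft-tissue layer, metres (0 in free space or in real bone)
    """
    # 1. Gravity cancellation: oppose the constant tool weight so the
    #    drill feels weightless to the operator.
    f = -TOOL_WEIGHT.copy()

    # 2. Reposition the haptic interaction point from mid-wand to the
    #    drill tip by transforming the tip offset into the world frame.
    tip_pos = wand_pos + wand_rot @ TIP_OFFSET

    # 3. Linear virtual force model for soft tissue: a spring pushing
    #    back along the tool axis, proportional to penetration depth.
    if tissue_depth > 0.0:
        tool_axis = wand_rot @ np.array([0.0, 0.0, 1.0])
        f += -K_SOFT * tissue_depth * tool_axis
    return f, tip_pos
```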
ERIC Educational Resources Information Center
Miller, Carmen
1992-01-01
The first of two articles discusses virtual reality (VR) and online databases; the second one reports on an interview with Thomas A. Furness III, who defines VR and explains work at the Human Interface Technology Laboratory (HIT). Sidebars contain a glossary of VR terms and a conversation with Toni Emerson, the HIT lab's librarian. (LRW)
Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.
Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea
2017-09-29
A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.
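The attractive force feedback described above can be sketched as a field that pulls the haptic handle toward the most intense detected source, scaled by the detected intensity and saturated at a device force limit. The gains `k` and `f_max` below are hypothetical; the paper does not report its actual force law.

```python
import numpy as np

def attraction_force(uav_pos, source_pos, peak_intensity,
                     k=0.8, f_max=3.0):
    """Attractive force (N) pulling the haptic handle toward the most
    intense detected radiation source.

    k, f_max: illustrative gain and saturation limit, not the paper's.
    """
    d = np.asarray(source_pos, float) - np.asarray(uav_pos, float)
    dist = np.linalg.norm(d)
    if dist < 1e-9:                      # already at the source
        return np.zeros(3)
    # Magnitude grows with intensity, decays with distance, saturates.
    mag = min(f_max, k * peak_intensity / (1.0 + dist))
    return mag * d / dist                # unit direction toward source
```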
NASA Astrophysics Data System (ADS)
Lee, Sam; Lucas, Nathan P.; Ellis, R. Darin; Pandya, Abhilash
2012-06-01
This paper presents a seamlessly controlled human multi-robot system comprised of semiautonomous ground and aerial robots for source localization tasks. The system combines augmented reality interface capabilities with a human supervisor's ability to control multiple robots. The role of this human multi-robot interface is to allow an operator to control groups of heterogeneous robots in real time in a collaborative manner. It uses advanced path-planning algorithms to ensure that obstacles are avoided and that operators are free for higher-level tasks. Each robot knows the environment and its obstacles and can automatically generate a collision-free path to any user-selected target. Sensor information from each individual robot is displayed directly on the robot in the video view. In addition, a sensor-data-fused AR view is displayed, which helps users pinpoint source information and supports the operator in the goals of the mission. The paper presents a preliminary human-factors evaluation of this system in which several interface conditions are tested for source detection tasks. Results show that the novel augmented reality multi-robot control (Point-and-Go and Path Planning) reduced mission completion times compared to traditional joystick control for target detection missions. Usability tests and operator workload analysis are also reported.
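Each robot's ability to generate a collision-free path to a user-selected target can be illustrated with a breadth-first search over an occupancy grid. This is a generic sketch of such a planner, not the paper's (unspecified) algorithm.

```python
from collections import deque

def collision_free_path(grid, start, goal):
    """Shortest collision-free path on a 4-connected occupancy grid.

    grid[r][c] == 1 marks an obstacle; start and goal are (row, col)
    tuples. Returns the list of cells from start to goal, or None if
    the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}                 # visited set + parent pointers
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                 # reconstruct path back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```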
Augmented Reality Implementation in Watch Catalog as e-Marketing Based on Mobile Application
NASA Astrophysics Data System (ADS)
Adrianto, D.; Luwinda, F. A.; Yesmaya, V.
2017-01-01
Augmented reality is an important method for providing users with a better interactive user interface. In this research, augmented reality in a mobile application is applied to provide users with useful information related to a watch catalog. The research focuses on the design and implementation of an application using augmented reality. The process model used is Extreme Programming, which has several steps: planning, design, coding, and testing. The result of this research is an Android-based augmented reality application. The research concludes that implementing augmented reality on Android in a watch catalog helps customers collect useful information related to a specific watch.
Presence within a mixed reality environment.
van Schaik, Paul; Turnbull, Triece; van Wersch, Anna; Drummond, Sarah
2004-10-01
Mixed reality environments represent a new approach to creating technology-mediated experiences. However, there is a lack of empirical research investigating users' actual experience. The aim of the current exploratory, non-experimental study was to establish levels of and identify factors associated with presence, within the framework of Schubert et al.'s model of presence. Using questionnaire and interview methods, the experience of the final performance of the Desert Rain mixed reality environment was investigated. Levels of general and spatial presence were relatively high, but levels of involvement and realness were not. Overall, intrinsic motivation, confidence and intention to re-visit Desert Rain were high. However, age was negatively associated with both spatial presence and confidence to play. Furthermore, various problems in navigating the environment were identified. Results are discussed in terms of Schubert's model and other theoretical perspectives. Implications for system design are presented.
Learning Molecular Structures in a Tangible Augmented Reality Environment
ERIC Educational Resources Information Center
Asai, Kikuo; Takase, Norio
2011-01-01
This article presents the characteristics of using a tangible tabletop environment produced by augmented reality (AR), aimed at improving the environment in which learners observe three-dimensional molecular structures. The authors perform two evaluation experiments. A performance test for a user interface demonstrates that learners with a…
Sensorimotor enhancement with a mixed reality system for balance and mobility rehabilitation.
Fung, Joyce; Perez, Claire F
2011-01-01
We have developed a mixed reality system incorporating virtual reality (VR), surface perturbations and light touch for gait rehabilitation. Haptic touch has emerged as a novel and efficient technique to improve postural control and dynamic stability. Our system combines visual display with the manipulation of physical environments and the addition of haptic feedback to enhance balance and mobility post stroke. A research study involving 9 participants with stroke and 9 age-matched healthy individuals shows that the haptic cue provided while walking is an effective means of improving gait stability in people post stroke, especially during challenging environmental conditions such as downslope walking.
Mixed virtual reality simulation--taking endoscopic simulation one step further.
Courteille, O; Felländer-Tsai, L; Hedman, L; Kjellin, A; Enochsson, L; Lindgren, G; Fors, U
2011-01-01
This pilot study aimed to assess medical students' appraisals of a "mixed" virtual reality simulation for endoscopic surgery (with a virtual patient case in addition to a virtual colonoscopy) as well as the impact of this simulation set-up on students' performance. Findings indicate that virtual patients can enhance contextualization of simulated endoscopy and thus facilitate an authentic learning environment, which is important in order to increase motivation.
Mixed reality framework for collective motion patterns of swarms with delay coupling
NASA Astrophysics Data System (ADS)
Szwaykowska, Klementyna; Schwartz, Ira
The formation of coherent patterns in swarms of interacting self-propelled autonomous agents is an important subject for many applications within the field of distributed robotic systems. However, there are significant logistical challenges associated with testing fully distributed systems in real-world settings. In this paper, we provide a rigorous theoretical justification for the use of mixed-reality experiments as a stepping stone to fully physical testing of distributed robotic systems. We also model and experimentally realize a mixed-reality large-scale swarm of delay-coupled agents. Our analyses, assuming agents communicating over an Erdős-Rényi network, demonstrate the existence of stable coherent patterns that can be achieved only with delay coupling and that are robust to decreasing network connectivity and heterogeneity in agent dynamics. We show how the bifurcation structure for emergence of different patterns changes with heterogeneity in agent acceleration capabilities and limited connectivity in the network as a function of coupling strength and delay. Our results are verified through simulation as well as preliminary experimental results of delay-induced pattern formation in a mixed-reality swarm. K. S. was a National Research Council postdoctoral fellow. I.B.S. was supported by the U.S. Naval Research Laboratory funding (N0001414WX00023) and Office of Naval Research (N0001414WX20610).
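A simplified version of such delay-coupled swarm dynamics, with self-propelled agents attracted to the delayed mean position of the swarm (all-to-all coupling here, rather than an Erdős-Rényi network), can be simulated as follows. The parameter values are illustrative, not the paper's.

```python
import numpy as np

def simulate_swarm(n=20, a=2.0, tau=1.0, dt=0.01, steps=4000, seed=0):
    """Euler integration of n self-propelled planar agents with
    delayed attraction to the swarm mean (a simplified swarm model).

    Acceleration: (1 - |v|^2) v  - a * (r_i(t) - mean_j r_j(t - tau))
    The first term drives speeds toward 1; the second is the delayed
    spring coupling that produces the coherent patterns.
    """
    rng = np.random.default_rng(seed)
    lag = int(tau / dt)
    pos = rng.uniform(-0.5, 0.5, (n, 2))
    vel = rng.uniform(-0.5, 0.5, (n, 2))
    hist = [pos.copy()] * (lag + 1)          # buffer of past positions
    for _ in range(steps):
        delayed_mean = hist[0].mean(axis=0)  # swarm mean at t - tau
        speed2 = np.sum(vel ** 2, axis=1, keepdims=True)
        acc = (1.0 - speed2) * vel - a * (pos - delayed_mean)
        vel = vel + dt * acc
        pos = pos + dt * vel
        hist.pop(0)
        hist.append(pos.copy())
    return pos, vel
```

Sweeping the coupling strength `a` and delay `tau` in such a sketch reproduces the qualitative transition between translating and rotating states that the paper analyzes rigorously.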
Virtual reality applications to automated rendezvous and capture
NASA Technical Reports Server (NTRS)
Hale, Joseph; Oneil, Daniel
1991-01-01
Virtual Reality (VR) is a rapidly developing Human/Computer Interface (HCI) technology. The evolution of high-speed graphics processors and the development of specialized anthropomorphic user interface devices, which more fully involve the human senses, have enabled VR technology. Recently, the maturity of this technology has reached a level where it can be used as a tool in a variety of applications. This paper provides an overview of VR technology, VR activities at Marshall Space Flight Center (MSFC), and applications of VR to Automated Rendezvous and Capture (AR&C), and identifies areas of VR technology that require further development.
Virtual reality in surgical training.
Lange, T; Indelicato, D J; Rosen, J M
2000-01-01
Virtual reality in surgery and, more specifically, in surgical training, faces a number of challenges in the future. These challenges are building realistic models of the human body, creating interface tools to view, hear, touch, feel, and manipulate these human body models, and integrating virtual reality systems into medical education and treatment. A final system would encompass simulators specifically for surgery, performance machines, telemedicine, and telesurgery. Each of these areas will need significant improvement for virtual reality to impact medicine successfully in the next century. This article gives an overview of, and the challenges faced by, current systems in the fast-changing field of virtual reality technology, and provides a set of specific milestones for a truly realistic virtual human body.
Visual Environment for Designing Interactive Learning Scenarios with Augmented Reality
ERIC Educational Resources Information Center
Mota, José Miguel; Ruiz-Rube, Iván; Dodero, Juan Manuel; Figueiredo, Mauro
2016-01-01
Augmented Reality (AR) technology allows the inclusion of virtual elements on a vision of the actual physical environment for the creation of a mixed reality in real time. This kind of technology can be used in educational settings. However, the current AR authoring tools present several drawbacks, such as the lack of a mechanism for tracking the…
Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality
2009-03-01
Linte, Cristian A.; Davenport, Katherine P.; Cleary, Kevin; Peters, Craig; Vosburgh, Kirby G.; Navab, Nassir; Edwards, Philip “Eddie”; Jannin, Pierre; Peters, Terry M.; Holmes, David R.; Robb, Richard A.
2013-01-01
Mixed reality environments for medical applications have been explored and developed over the past three decades in an effort to enhance the clinician’s view of anatomy and facilitate the performance of minimally invasive procedures. These environments must faithfully represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical instrument tracking, and display technology into a common framework centered around and registered to the patient. However, in spite of their reported benefits, few mixed reality environments have been successfully translated into clinical use. Several challenges that contribute to the difficulty in integrating such environments into clinical practice are presented here and discussed in terms of both technical and clinical limitations. This article should raise awareness among both developers and end-users toward facilitating a greater application of such environments in the surgical practice of the future. PMID:23632059
Virtual Reality: An Experiential Tool for Clinical Psychology
ERIC Educational Resources Information Center
Riva, Giuseppe
2009-01-01
Several Virtual Reality (VR) applications for the understanding, assessment and treatment of mental health problems have been developed in the last 15 years. Typically, in VR the patient learns to manipulate problematic situations related to his/her problem. In fact, VR can be described as an advanced form of human-computer interface that is able…
Mixed Reality Technology at NASA JPL
2016-05-16
NASA's JPL is a center of innovation in virtual and augmented reality, producing groundbreaking applications of these technologies to support a variety of missions. This video is a collection of unedited scenes released to the media.
Virtual reality for intelligent and interactive operating, training, and visualization systems
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Schluse, Michael
2000-10-01
Virtual reality methods allow a new and intuitive way of communication between man and machine. The basic idea of virtual reality (VR) is the generation of artificial, computer-simulated worlds that the user can not only look at but also actively interact with, using a data glove and data helmet. The main emphasis for the use of such techniques at the IRF is the development of a new generation of operator interfaces for the control of robots and other automation components, and of intelligent training systems for complex tasks. The basic idea of the methods developed at the IRF for the realization of Projective Virtual Reality is to let the user work in the virtual world as he would act in reality. The user's actions are recognized by the virtual reality system and, by means of new and intelligent control software, projected onto automation components such as robots, which then perform the actions necessary to execute the user's task in reality. In this mode of operation the user no longer has to be a robot expert to generate tasks for robots or to program them, because the intelligent control software recognizes the user's intention and automatically generates the commands for nearly every automation component. Virtual reality methods are thus ideally suited as universal man-machine interfaces for the control and supervision of a broad class of automation components, and for interactive training and visualization systems. The virtual reality system of the IRF, COSIMIR/VR, forms the basis for several projects, starting with the control of space automation systems in the projects CIROS, VITAL and GETEX, the realization of a comprehensive development tool for the International Space Station, and the realistic simulation of fire extinguishing, forest machines and excavators, which will be presented in the final paper in addition to the key ideas of this virtual reality system.
ChemPreview: an augmented reality-based molecular interface.
Zheng, Min; Waller, Mark P
2017-05-01
Human-computer interfaces make computational science more comprehensible and impactful. Complex 3D structures such as proteins or DNA are magnified by digital representations and displayed on two-dimensional monitors. Augmented reality has recently opened another door to access the virtual three-dimensional world. Herein, we present an augmented reality application called ChemPreview with the potential to manipulate bio-molecular structures at an atomistic level. ChemPreview is available at https://github.com/wallerlab/chem-preview/releases, and is built on top of the Meta 1 platform https://www.metavision.com/. ChemPreview can be used to interact with a protein in an intuitive way using natural hand gestures, thereby making it appealing to computational chemists or structural biologists. The ability to manipulate atoms in the real world could eventually provide new and more efficient ways of extracting structural knowledge, or designing new molecules in silico. Copyright © 2017 Elsevier Inc. All rights reserved.
STS-134 crew in Virtual Reality Lab during their MSS/EVAA SUPT2 Team training
2010-08-27
JSC2010-E-121049 (27 Aug. 2010) --- NASA astronaut Andrew Feustel (foreground), STS-134 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration
STS-133 crew during MSS/EVAA TEAM training in Virtual Reality Lab
2010-10-01
JSC2010-E-170878 (1 Oct. 2010) --- NASA astronaut Michael Barratt, STS-133 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration
STS-134 crew in Virtual Reality Lab during their MSS/EVAA SUPT2 Team training
2010-08-27
JSC2010-E-121056 (27 Aug. 2010) --- NASA astronaut Gregory H. Johnson, STS-134 pilot, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration
STS-133 crew during MSS/EVAA TEAM training in Virtual Reality Lab
2010-10-01
JSC2010-E-170888 (1 Oct. 2010) --- NASA astronaut Nicole Stott, STS-133 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration
STS-133 crew during MSS/EVAA TEAM training in Virtual Reality Lab
2010-10-01
JSC2010-E-170882 (1 Oct. 2010) --- NASA astronaut Nicole Stott, STS-133 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration
ERIC Educational Resources Information Center
Chang, Hsin-Yi; Wu, Hsin-Kai; Hsu, Ying-Shao
2013-01-01
virtual objects or information overlaying physical objects or environments, resulting in a mixed reality in which virtual objects and real environments coexist in a meaningful way to augment learning…
Augmenting the Thermal Flux Experiment: A Mixed Reality Approach with the HoloLens
ERIC Educational Resources Information Center
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-01-01
In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress during the last years and also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted…
The RoboCup Mixed Reality League - A Case Study
NASA Astrophysics Data System (ADS)
Gerndt, Reinhard; Bohnen, Matthias; da Silva Guerra, Rodrigo; Asada, Minoru
In typical mixed reality systems there is only a one-way interaction from real to virtual. A human user or the physics of a real object may influence the behavior of virtual objects, but real objects usually cannot be influenced by the virtual world. By introducing real robots into the mixed reality system, we allow a true two-way interaction between virtual and real worlds. Our system has been used since 2007 to implement the RoboCup mixed reality soccer games and other applications for research and edutainment. Our framework system is freely programmable to generate any virtual environment, which may then be further supplemented with virtual and real objects. The system allows for control of any real object based on differential drive robots. The robots may be adapted for different applications, e.g., with markers for identification or with covers to change shape and appearance. They may also be “equipped” with virtual tools. In this chapter we present the hardware and software architecture of our system and some applications. The authors believe this can be seen as a first implementation of Ivan Sutherland’s 1965 idea of the ultimate display: “The ultimate display would, of course, be a room within which the computer can control the existence of matter …” (Sutherland, 1965, Proceedings of IFIPS Congress 2:506-508).
ERIC Educational Resources Information Center
Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca
2009-01-01
The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…
Augmented Reality Imaging System: 3D Viewing of a Breast Cancer.
Douglas, David B; Boone, John M; Petricoin, Emanuel; Liotta, Lance; Wilson, Eugene
2016-01-01
To display images of breast cancer from a dedicated breast CT using Depth 3-Dimensional (D3D) augmented reality. A case of breast cancer imaged using contrast-enhanced breast CT (computed tomography) was viewed with the augmented reality imaging system, which uses a head display unit (HDU) and a joystick control interface. The augmented reality system demonstrated 3D viewing of the breast mass with head-position tracking, stereoscopic depth perception, and focal-point convergence; a 3D cursor and joystick enabled a fly-through with visualization of the spiculations extending from the breast cancer. The augmented reality system provided 3D visualization of the breast cancer with depth perception and visualization of the mass's spiculations. The augmented reality system should be researched further to determine its utility in clinical practice.
Augmenting the thermal flux experiment: A mixed reality approach with the HoloLens
NASA Astrophysics Data System (ADS)
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-09-01
In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress during the last years and also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted displays, which allow one to embed virtual objects into the real surroundings, leading to a Mixed Reality (MR) experience. In such an environment, digital and real objects do not only coexist, but moreover are also able to interact with each other in real time. These concepts can be used to merge human perception of reality with digitally visualized sensor data, thereby making the invisible visible. As a first example, in this paper we introduce alongside the basic idea of this column an MR experiment in thermodynamics for a laboratory course for freshman students in physics or other science and engineering subjects that uses physical data from mobile devices for analyzing and displaying physical phenomena to students.
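The "making the invisible visible" idea in this experiment rests on Fourier's law of heat conduction, q = -k dT/dx, which an MR overlay can evaluate from temperature readings along a heated rod. A sketch with illustrative values (not the authors' apparatus):

```python
import numpy as np

def heat_flux(temps, positions, conductivity):
    """Estimate local heat flux q = -k * dT/dx (Fourier's law) from
    temperature readings sampled along a rod.

    temps: temperatures in K (or degrees C), one per sample point
    positions: sample positions along the rod in metres
    conductivity: thermal conductivity k in W/(m K)
    Returns the flux in W/m^2 at each sample point.
    """
    grads = np.gradient(np.asarray(temps, float),
                        np.asarray(positions, float))
    return -conductivity * grads
```

For a linear profile from 100 °C to 20 °C over a 0.4 m copper rod (k ≈ 400 W/(m K)), the gradient is -200 K/m and the flux 80 kW/m² everywhere along the rod.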
A Modular Set of Mixed Reality Simulators for blind and Guided Procedures
2015-08-01
Award number: W81XWH-14-1-0113 (Year 1 report). Title: A Modular Set of Mixed Reality Simulators for "Blind" and Guided Procedures. Principal investigator: Samsun Lampotang. Contracting organization: University of Florida, Gainesville, FL.
Application of Virtual, Augmented, and Mixed Reality to Urology.
Hamacher, Alaric; Kim, Su Jin; Cho, Sung Tae; Pardeshi, Sunil; Lee, Seung Hyun; Eun, Sung-Jong; Whangbo, Taeg Keun
2016-09-01
Recent developments in virtual, augmented, and mixed reality have introduced a considerable number of new devices into the consumer market. This momentum is also affecting the medical and health care sector. Although many of the theoretical and practical foundations of virtual reality (VR) were already researched and experienced in the 1980s, the vastly improved features of displays, sensors, interactivity, and computing power currently available in devices offer a new field of applications to the medical sector and also to urology in particular. The purpose of this review article is to review the extent to which VR technology has already influenced certain aspects of medicine, the applications that are currently in use in urology, and the future development trends that could be expected.
Chen, Jiayin; Or, Calvin
2017-11-01
This study assessed the use of an immersive virtual reality (VR) interface, a mouse and a touchscreen for one-directional pointing, multi-directional pointing, and dragging-and-dropping tasks involving targets of smaller and larger widths by young (n = 18; 18-30 years), middle-aged (n = 18; 40-55 years) and older adults (n = 18; 65-75 years). A three-way, mixed-factorial design was used for data collection. The dependent variables were the movement time required and the error rate. Our main findings were that the participants took more time and made more errors in using the VR input interface than in using the mouse or the touchscreen. This pattern applied in all three age groups in all tasks, except for multi-directional pointing with a larger target width among the older group. Overall, older adults took longer to complete the tasks and made more errors than young or middle-aged adults. Larger target widths yielded shorter movement times and lower error rates in pointing tasks, but larger targets yielded higher rates of error in dragging-and-dropping tasks. Our study indicated that virtual environments similar to those we tested may be more suitable for displaying scenes than for manipulating objects that are small and require fine control. Although interacting with VR is relatively difficult, especially for older adults, there is still potential for older adults to adapt to that interface. Furthermore, adjusting the width of objects according to the type of manipulation required might be an effective way to promote performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
STS-134 crew in Virtual Reality Lab during their MSS/EVAA SUPT2 Team training
2010-08-27
JSC2010-E-121045 (27 Aug. 2010) --- NASA astronaut Andrew Feustel (right), STS-134 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. David Homan assisted Feustel. Photo credit: NASA or National Aeronautics and Space Administration
NASA Astrophysics Data System (ADS)
Tadokoro, Satoshi; Kitano, Hiroaki; Takahashi, Tomoichi; Noda, Itsuki; Matsubara, Hitoshi; Shinjoh, Atsushi; Koto, Tetsuo; Takeuchi, Ikuo; Takahashi, Hironao; Matsuno, Fumitoshi; Hatayama, Mitsunori; Nobe, Jun; Shimada, Susumu
2000-07-01
This paper introduces the RoboCup-Rescue Simulation Project, a contribution to the disaster mitigation, search and rescue problem. A comprehensive urban disaster simulator is constructed on distributed computers. Heterogeneous intelligent agents such as fire fighters, victims and volunteers conduct search and rescue activities in this virtual disaster world. A real-world interface integrates various sensor systems and controllers of infrastructures in real cities with this virtual world. Real-time simulation is synchronized with actual disasters, computing the complex relationships between various damage factors and agent behaviors. A mission-critical man-machine interface provides portability and robustness for disaster mitigation centers, and augmented-reality interfaces for rescue in real disasters. It also provides a virtual-reality training function for the public. This diverse spectrum of RoboCup-Rescue contributes to the creation of a safer social system.
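The agent-based structure this abstract describes — a shared disaster world advanced in real time, with heterogeneous agents acting on it — can be illustrated with a toy sketch. This is a minimal Python illustration with hypothetical names, not the actual RoboCup-Rescue kernel, which models far richer damage dynamics and agent perception.

```python
# Toy sketch of a heterogeneous-agent disaster simulation loop (hypothetical
# names): a shared world state advances each tick, and each agent type
# reacts to it independently, as in the RoboCup-Rescue architecture.
from dataclasses import dataclass

@dataclass
class World:
    fire_intensity: float    # aggregate damage factor
    trapped_civilians: int

class FireFighter:
    def act(self, w: World) -> None:
        w.fire_intensity = max(0.0, w.fire_intensity - 2.0)  # suppress fire

class Volunteer:
    def act(self, w: World) -> None:
        if w.trapped_civilians > 0:
            w.trapped_civilians -= 1                          # rescue one person

def simulate(world: World, agents, steps: int) -> World:
    for _ in range(steps):
        world.fire_intensity *= 1.1   # fire spreads each tick
        for a in agents:
            a.act(world)              # agents act on the shared state
    return world

w = simulate(World(fire_intensity=10.0, trapped_civilians=5),
             [FireFighter(), Volunteer()], steps=5)
```

In the real system this loop is distributed: damage simulators, agent processes and viewers communicate with a central kernel over the network rather than sharing one in-process object.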
STS-105 Crew Training in VR Lab
2001-03-15
JSC2001-00751 (15 March 2001) --- Astronaut Scott J. Horowitz, STS-105 mission commander, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Discovery. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with International Space Station (ISS) elements.
Photographic coverage of STS-112 during EVA 3 in VR Lab.
2002-08-21
JSC2002-E-34622 (21 August 2002) --- Astronaut David A. Wolf, STS-112 mission specialist, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Atlantis. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with ISS elements.
2005-06-07
JSC2005-E-21191 (7 June 2005) --- Astronaut Steven G. MacLean, STS-115 mission specialist representing the Canadian Space Agency, uses the virtual reality lab at the Johnson Space Center to train for his duties aboard the space shuttle. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-105 Crew Training in VR Lab
2001-03-15
JSC2001-00758 (15 March 2001) --- Astronaut Frederick W. Sturckow, STS-105 pilot, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Discovery. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with International Space Station (ISS) elements.
2005-06-07
JSC2005-E-21192 (7 June 2005) --- Astronauts Christopher J. Ferguson (left), STS-115 pilot, and Daniel C. Burbank, mission specialist, use the virtual reality lab at the Johnson Space Center to train for their duties aboard the space shuttle. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
Understanding Mixed Code and Classroom Code-Switching: Myths and Realities
ERIC Educational Resources Information Center
Li, David C. S.
2008-01-01
Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…
Virtual reality applied to teletesting
NASA Astrophysics Data System (ADS)
van den Berg, Thomas J.; Smeenk, Roland J. M.; Mazy, Alain; Jacques, Patrick; Arguello, Luis; Mills, Simon
2003-05-01
The activity "Virtual Reality applied to Teletesting" is part of a wider European Space Agency (ESA) initiative of cost reduction, in particular the reduction of test costs. For space-related projects, costs must be reduced both in test centre operations and on the customer company side. This can be accomplished by increasing the automation and remote testing ("teletesting") capabilities of the test centre. The main problems related to teletesting are a lack of situational awareness and the separation of control over the test environment. The objective of the activity is to evaluate the use of distributed computing and Virtual Reality technology to support the teletesting of a payload under vacuum conditions, and to provide a unified man-machine interface for the monitoring and control of the payload, vacuum chamber and robotics equipment. The activity includes the development and testing of a "Virtual Reality Teletesting System" (VRTS). The VRTS is deployed at one of the ESA-certified test centres to perform an evaluation and test campaign using a real payload. The VRTS is entirely written in the Java programming language, using the J2EE application model. The graphical user interface runs as an applet in a Web browser, enabling easy access from virtually any place.
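The VRTS itself is a Java/J2EE system; the underlying teletesting idea — a test-centre server publishing payload telemetry over the network so a remote operator can monitor the test — can be sketched in a few lines. The following Python sketch is only an analogue of that architecture; the telemetry field names are invented for illustration.

```python
# Minimal teletesting sketch (hypothetical telemetry names): a test-centre
# server publishes one JSON snapshot of payload/chamber readings over a
# socket, and a remote "teletesting" client connects and reads it.
import json
import socket
import threading

def telemetry_server(sock: socket.socket, readings: dict) -> None:
    conn, _ = sock.accept()
    with conn:
        conn.sendall(json.dumps(readings).encode() + b"\n")  # one JSON line

# Server side: bind to an ephemeral local port and serve one snapshot.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
readings = {"chamber_pressure_mbar": 1e-6, "payload_temp_C": -40.0}
threading.Thread(target=telemetry_server, args=(srv, readings)).start()

# Client side: connect from anywhere and decode the snapshot.
cli = socket.create_connection(srv.getsockname())
snapshot = json.loads(cli.makefile().readline())
cli.close()
srv.close()
```

A production system like the VRTS would of course add authentication, continuous streaming and control channels on top of such a monitoring link.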
NASA Astrophysics Data System (ADS)
Bernardet, Ulysses; Bermúdez I Badia, Sergi; Duff, Armin; Inderbitzin, Martin; Le Groux, Sylvain; Manzolli, Jônatas; Mathews, Zenon; Mura, Anna; Väljamäe, Aleksander; Verschure, Paul F. M. J.
The eXperience Induction Machine (XIM) is one of the most advanced mixed-reality spaces available today. XIM is an immersive space that consists of physical sensors and effectors and is conceptualized as a general-purpose infrastructure for research in the field of psychology and human-artifact interaction. In this chapter, we set out the epistemological rationale behind XIM by putting the installation in the context of psychological research. The design and implementation of XIM are based on principles and technologies of neuromorphic control. We give a detailed description of the hardware infrastructure and software architecture, including the logic of the overall behavioral control. To illustrate the approach toward psychological experimentation, we discuss a number of practical applications of XIM. These include the so-called persistent virtual community, research on the relationship between human experience and multi-modal stimulation, and an investigation of a mixed-reality social interaction paradigm.
Linte, Cristian A; Davenport, Katherine P; Cleary, Kevin; Peters, Craig; Vosburgh, Kirby G; Navab, Nassir; Edwards, Philip Eddie; Jannin, Pierre; Peters, Terry M; Holmes, David R; Robb, Richard A
2013-03-01
Mixed reality environments for medical applications have been explored and developed over the past three decades in an effort to enhance the clinician's view of anatomy and facilitate the performance of minimally invasive procedures. These environments must faithfully represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical instrument tracking, and display technology into a common framework centered around and registered to the patient. However, in spite of their reported benefits, few mixed reality environments have been successfully translated into clinical use. Several challenges that contribute to the difficulty in integrating such environments into clinical practice are presented here and discussed in terms of both technical and clinical limitations. This article should raise awareness among both developers and end-users toward facilitating a greater application of such environments in the surgical practice of the future. Copyright © 2013 Elsevier Ltd. All rights reserved.
LVC interaction within a mixed-reality training system
NASA Astrophysics Data System (ADS)
Pollock, Brice; Winer, Eliot; Gilbert, Stephen; de la Cruz, Julio
2012-03-01
The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. Combining the advantages of realistic training environments and virtual worlds, mixed reality LVC training systems can enable live and virtual trainees to interact as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to empower LVC interaction in a reconfigurable, mixed reality environment. This system was developed and tested in an immersive, reconfigurable, and mixed reality LVC training system for the dismounted warfighter at ISU, known as the Veldt, to overcome LVC interaction challenges and as a test bed for cutting-edge technology to meet future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and developed game engines. Evaluation involving military-trained personnel found this system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real time. This system achieves rare LVC interaction within multiple physical and virtual immersive environments for training in real time across many distributed systems.
Building the Joint Battlespace Infosphere. Volume 2: Interactive Information Technologies
1999-12-17
G. A. Vouros, “A Knowledge-Based Methodology for Supporting Multilingual and User-Tailored Interfaces,” Interacting With Computers, Vol. 9 (1998), p...project is to develop a two-handed user interface to the stereoscopic field analyzer, an interactive 3-D scientific visualization system. The...62 See http://www.hitl.washington.edu/research/vrd/. 63 R. Baumann and R. Clavel, “Haptic Interface for Virtual Reality Based
NASA Astrophysics Data System (ADS)
Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.
2000-08-01
We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eyetracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.
Programmable personality interface for the dynamic infrared scene generator (IRSG2)
NASA Astrophysics Data System (ADS)
Buford, James A., Jr.; Mobley, Scott B.; Mayhall, Anthony J.; Braselton, William J.
1998-07-01
As scene generator platforms begin to rely specifically on commercial off-the-shelf (COTS) hardware and software components, high-speed programmable personality interfaces (PPIs) are required for interfacing to infrared (IR) flight computers/processors and complex IR projectors in hardware-in-the-loop (HWIL) simulation facilities. Recent technological advances and innovative applications of established technologies are beginning to allow development of cost-effective PPIs to interface to COTS scene generators. At the U.S. Army Aviation and Missile Command (AMCOM) Missile Research, Development, and Engineering Center (MRDEC), researchers have developed such a PPI to reside between the AMCOM MRDEC IR Scene Generator (IRSG) and either a missile flight computer or the dynamic Laser Diode Array Projector (LDAP). AMCOM MRDEC has developed several PPIs for the first and second generation IRSGs (IRSG1 and IRSG2), which are based on Silicon Graphics Incorporated (SGI) Onyx and Onyx2 computers with Reality Engine 2 (RE2) and Infinite Reality (IR/IR2) graphics engines. This paper provides an overview of PPIs designed, integrated, tested, and verified at AMCOM MRDEC, specifically the IRSG2's PPI.
The Virtual Tablet: Virtual Reality as a Control System
NASA Technical Reports Server (NTRS)
Chronister, Andrew
2016-01-01
In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating the difficulty of visually determining the interface location and the lack of tactile feedback discovered in previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment.
My contribution was to combine the pieces of hardware, write software to incorporate each piece of position or orientation data into a coherent description of the object's location in space, place the virtual analogue accordingly, and project the control interface onto it, resulting in a functioning object which has both a physical and a virtual presence. Additionally, the virtual environment was enhanced with two live video feeds from cameras mounted on the robotic device being used as an example target of the virtual interface. The working VT allows users to naturally interact with a control interface with little to no training and without the issues found in previous efforts.
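The fusion step described above — an orientation sensor plus motion-capture marker positions yielding one coherent pose for the surrogate — can be sketched roughly as follows. This is a hypothetical Python illustration (marker values, a single yaw angle, and all function names are invented); the actual VT software is not reproduced in this record.

```python
# Hypothetical sketch of the Virtual Tablet's pose fusion: the sensor package
# supplies orientation (reduced here to a yaw angle), the motion-capture
# system supplies marker positions whose centroid gives the tablet's position;
# together they place tablet-local points in the virtual world.
import math

def centroid(markers):
    """Average the motion-capture marker positions (x, y, z triples)."""
    n = len(markers)
    return tuple(sum(m[i] for m in markers) / n for i in range(3))

def tablet_to_world(point, yaw, position):
    """Rotate a tablet-local point by the sensed yaw, then translate to the
    motion-capture position: p_world = R(yaw) @ p_local + t."""
    c, s = math.cos(yaw), math.sin(yaw)
    x, y, z = point
    rx, ry = c * x - s * y, s * x + c * y
    px, py, pz = position
    return (rx + px, ry + py, z + pz)

markers = [(1.0, 2.0, 0.9), (1.2, 2.0, 0.9), (1.1, 2.2, 0.9)]  # invented data
pos = centroid(markers)                              # where the tablet is
corner = tablet_to_world((0.1, 0.0, 0.0), math.pi / 2, pos)
```

A full implementation would use a quaternion or 3x3 rotation matrix from the IMU rather than a single yaw angle, and would express the result relative to the user's HMD frame.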
STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus
2007-08-09
JSC2007-E-41539 (9 Aug. 2007) --- Astronaut Pamela A. Melroy, STS-120 commander, uses the virtual reality lab at Johnson Space Center to train for her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-111 Training in VR lab with Expedition IV and V Crewmembers
2001-10-18
JSC2001-E-39090 (18 October 2001) --- Cosmonaut Valeri G. Korzun, Expedition Five mission commander representing Rosaviakosmos, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties on the International Space Station (ISS). This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with ISS elements.
STS-EVA Mass Ops training of the STS-117 EVA crewmembers
2006-11-01
JSC2006-E-47612 (1 Nov. 2006) --- Astronaut Steven R. Swanson, STS-117 mission specialist, uses the virtual reality lab at Johnson Space Center to train for his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus
2007-08-09
JSC2007-E-41532 (9 Aug. 2007) --- Astronaut Stephanie D. Wilson, STS-120 mission specialist, uses the virtual reality lab at Johnson Space Center to train for her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus
2007-08-09
JSC2007-E-41531 (9 Aug. 2007) --- Astronaut Pamela A. Melroy, STS-120 commander, uses the virtual reality lab at Johnson Space Center to train for her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
On the Usability and Likeability of Virtual Reality Games for Education: The Case of VR-ENGAGE
ERIC Educational Resources Information Center
Virvou, Maria; Katsionis, George
2008-01-01
Educational software games aim at increasing the students' motivation and engagement while they learn. However, if software games are targeted to school classrooms they have to be usable and likeable by all students. Usability of virtual reality games may be a problem because these games tend to have complex user interfaces so that they are more…
Augmented reality for anatomical education.
Thomas, Rhys Gethin; John, Nigel William; Delieu, John Michael
2010-03-01
The use of Virtual Environments has been widely reported as a method of teaching anatomy. Generally such environments only convey the shape of the anatomy to the student. We present the Bangor Augmented Reality Education Tool for Anatomy (BARETA), a system that combines Augmented Reality (AR) technology with models produced using Rapid Prototyping (RP) technology, to provide the student with stimulation for touch as well as sight. The principal aims of this work were to provide an interface more intuitive than a mouse and keyboard, and to evaluate such a system as a viable supplement to traditional cadaver based education.
Ultimate Realities: Deterministic and Evolutionary
Moxley, Roy A
2007-01-01
References to ultimate reality commonly turn up in the behavioral literature as references to determinism. However, this determinism is often difficult to interpret. There are different kinds of determinisms as well as different kinds of ultimate realities for a behaviorist to consider. To clarify some of the issues involved, the views of ultimate realities are treated as falling along a continuum, with extreme views of complete indeterminism and complete determinism at either end and various mixes in between. Doing so brings into play evolutionary realities and the movement from indeterminism to determinism, as in Peirce's evolutionary cosmology. In addition, this framework helps to show how the views of determinism by B. F. Skinner and other behaviorists have shifted over time. PMID:22478489
Predicting Upscaled Behavior of Aqueous Reactants in Heterogeneous Porous Media
NASA Astrophysics Data System (ADS)
Wright, E. E.; Hansen, S. K.; Bolster, D.; Richter, D. H.; Vesselinov, V. V.
2017-12-01
When modeling reactive transport, reaction rates are often overestimated due to the improper assumption of perfect mixing at the support scale of the transport model. In reality, fronts tend to form between participants in thermodynamically favorable reactions, leading to segregation of reactants into islands or fingers. When such a configuration arises, reactions are limited to the interface between the reactive solutes. Closure methods for estimating control-volume-effective reaction rates in terms of quantities defined at the control volume scale do not presently exist, but their development is crucial for effective field-scale modeling. We attack this problem through a combination of analytical and numerical means. Specifically, we numerically study reactive transport through an ensemble of realizations of two-dimensional heterogeneous porous media. We then employ regression analysis to calibrate an analytically-derived relationship between reaction rate and various dimensionless quantities representing conductivity-field heterogeneity and the respective strengths of diffusion, reaction and advection.
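The calibration step this abstract describes — regressing an effective reaction rate against dimensionless quantities computed from ensemble simulations — can be sketched with a deliberately simple closure. The model form and coefficient below are mine, chosen for illustration only; the paper calibrates against richer heterogeneity and transport statistics.

```python
# Hedged sketch of a mixing-limited closure (illustrative, not the authors'
# model): posit r_eff = r_wm / (1 + a * Da), where r_wm is the well-mixed
# rate and Da a Damkohler-like dimensionless group, then calibrate `a`
# against ensemble output by least squares through the origin.
def calibrate(da_values, r_eff_values, r_wm):
    # Rearranged model: (r_wm / r_eff - 1) = a * Da  ->  one-parameter OLS.
    y = [r_wm / r - 1.0 for r in r_eff_values]
    return (sum(x * yi for x, yi in zip(da_values, y))
            / sum(x * x for x in da_values))

# Synthetic noiseless "ensemble" output generated from a = 0.5.
da = [0.1, 1.0, 10.0]
r_eff = [1.0 / (1.0 + 0.5 * d) for d in da]   # r_wm = 1.0
a_hat = calibrate(da, r_eff, r_wm=1.0)
```

With noiseless synthetic data the regression recovers the generating coefficient exactly; with real ensemble output, additional regressors (Peclet number, log-conductivity variance) would enter the fit.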
Nakajima, Sawako; Ino, Shuichi; Ifukube, Tohru
2007-01-01
Mixed Reality (MR) technologies have recently been explored in many areas of Human-Machine Interface (HMI) such as medicine, manufacturing, entertainment and education. However, MR sickness, a kind of motion sickness, is caused by sensory conflicts between the real world and the virtual world. The purpose of this paper is to develop a new evaluation method for motion and MR sickness. This paper investigates the relationship between whole-body vibration related to MR technologies and the motion aftereffect (MAE) phenomenon in the human visual system. The MR environment is modeled after advanced driver assistance systems in near-future vehicles. Seated subjects in the MR simulator were shaken in the pitch direction at frequencies ranging from 0.1 to 2.0 Hz. Results show that the MAE is useful for evaluating the incidence of MR sickness. In addition, a method to reduce MR sickness by auditory stimulation is proposed.
Multi-degree of freedom joystick for virtual reality simulation.
Head, M J; Nelson, C A; Siu, K C
2013-11-01
A modular control interface and simulated virtual reality environment were designed and created in order to determine how the kinematic architecture of a control interface affects minimally invasive surgery training. A user is able to selectively determine the kinematic configuration of an input device (number, type and location of degrees of freedom) for a specific surgical simulation through the use of modular joints and constraint components. Furthermore, passive locking was designed and implemented through the use of inflated latex tubing around rotational joints in order to allow a user to step away from a simulation without unwanted tool motion. It is believed that these features will facilitate improved simulation of a variety of surgical procedures and, thus, improve surgical skills training.
2000-11-01
...Reality technology. Presentations discussed sensory interfaces, measures of effectiveness, importance of the sensation of presence, and cybersickness. The third day reviewed assessment methods and applications research. Speakers reviewed existing or...
Advanced Technology for Portable Personal Visualization
1993-01-01
have no cable to drag." We submitted a short article describing the ceiling tracker and the requirements demanded of trackers in see-through systems... Newspaper/Magazine Articles: "Virtual Reality: It's All in the Mind," Atlanta Constitution, 29 September 1992; "Virtual Reality: Exploring the Future...basic scientific investigation of the human haptic system or to serve as haptic interfaces for virtual environments and teleoperation. 2. Research
Preliminary development of augmented reality systems for spinal surgery
NASA Astrophysics Data System (ADS)
Nguyen, Nhu Q.; Ramjist, Joel M.; Jivraj, Jamil; Jakubovic, Raphael; Deorajh, Ryan; Yang, Victor X. D.
2017-02-01
Surgical navigation has been more actively deployed in open spinal surgeries due to the need for improved precision during procedures. Navigation is increasingly difficult in minimally invasive surgeries because smaller exposure sites provide fewer visual cues, which increases a surgeon's dependence on knowledge of anatomical landmarks as well as on CT or MRI images. The use of augmented reality (AR) systems and registration technologies in spinal surgeries could improve these techniques by overlaying a 3D reconstruction of patient anatomy in the surgeon's field of view, creating a mixed reality visualization. The AR system will be capable of projecting the 3D reconstruction onto a field and of performing preliminary object tracking on a phantom. Dimensional accuracy of the mixed media will also be quantified to account for distortions in tracking.
Surgery applications of virtual reality
NASA Technical Reports Server (NTRS)
Rosen, Joseph
1994-01-01
Virtual reality is a computer-generated technology which allows information to be displayed in a simulated, but lifelike, environment. In this simulated 'world', users can move and interact as if they were actually a part of that world. This new technology will be useful in many different fields, including the field of surgery. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and perform surgical procedures (telesurgery), and predict the outcomes of surgery. The authors of this paper describe the basic components of a virtual reality surgical system: the virtual world, the virtual tools, the anatomical model, the software platform, the host computer, the interface, and the head-coupled display. They also review the progress toward using virtual reality for surgical training, planning, telesurgery, and predicting outcomes. Finally, the authors present a training system being developed for the practice of new procedures in abdominal surgery.
Virtual reality and brain computer interface in neurorehabilitation
Dahdah, Marie; Driver, Simon; Parsons, Thomas D.; Richter, Kathleen M.
2016-01-01
The potential benefit of technology to enhance recovery after central nervous system injuries is an area of increasing interest and exploration. The primary emphasis to date has been motor recovery/augmentation and communication. This paper introduces two original studies to demonstrate how advanced technology may be integrated into subacute rehabilitation. The first study addresses the feasibility of brain computer interface with patients on an inpatient spinal cord injury unit. The second study explores the validity of two virtual environments with acquired brain injury as part of an intensive outpatient neurorehabilitation program. These preliminary studies support the feasibility of advanced technologies in the subacute stage of neurorehabilitation. These modalities were well tolerated by participants and could be incorporated into patients' inpatient and outpatient rehabilitation regimens without schedule disruptions. This paper expands the limited literature base regarding the use of advanced technologies in the early stages of recovery for neurorehabilitation populations and speaks favorably to the potential integration of brain computer interface and virtual reality technologies as part of a multidisciplinary treatment program. PMID:27034541
Multi-modal virtual environment research at Armstrong Laboratory
NASA Technical Reports Server (NTRS)
Eggleston, Robert G.
1995-01-01
One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.
78 FR 74057 - Disapproval of State Implementation Plan Revisions; Clark County, Nevada
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-10
... technology-based standards should account for the practical realities of technology supports EPA's view that... section 110(a)(2)(A) through (M). In developing SIPs, states have broad authority to develop the mix of... a SIP must be met on a ``continuous'' basis, practical realities or circumstances may create...
Kamel Boulos, Maged N; Lu, Zhihan; Guerrero, Paul; Jennett, Charlene; Steed, Anthony
2017-02-20
The latest generation of virtual and mixed reality hardware has rekindled interest in virtual reality GIS (VRGIS) and augmented reality GIS (ARGIS) applications in health, and opened up new and exciting opportunities and possibilities for using these technologies in the personal and public health arenas. From smart urban planning and emergency training to Pokémon Go, this article offers a snapshot of some of the most remarkable VRGIS and ARGIS solutions for tackling public and environmental health problems, and bringing about safer and healthier living options to individuals and communities. The article also covers the main technical foundations and issues underpinning these solutions.
On the Structure of the Mixing Zone at an Unstable Contact Boundary
NASA Astrophysics Data System (ADS)
Meshkov, E. E.
2018-01-01
The interface between two media of different densities (contact boundary) moving with an acceleration directed from the less dense medium to the more dense one is unstable (Rayleigh-Taylor instability) [1, 2]. The initial perturbations of the interface grow indefinitely and, as a result, a medium mixing zone growing with time is formed at the interface. The structure of such a mixing zone at gas-gas and gas-liquid interfaces is discussed on the basis of laboratory experiments on shock tubes of various types. It is concluded that the regions of turbulent and laminar flows are combined in the mixing zone.
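For reference, the linear-stage behavior behind this instability is a standard result (not stated in the abstract): a small interface perturbation of wavenumber k grows exponentially at the classical Rayleigh-Taylor rate, and the late-time turbulent mixing-zone width is commonly modeled with a quadratic-in-time law.

```latex
% Linear growth rate of a Rayleigh-Taylor perturbation of wavenumber k
% (inviscid, incompressible, sharp interface):
\eta(t) \propto e^{\sigma t}, \qquad
\sigma = \sqrt{A\,g\,k}, \qquad
A = \frac{\rho_2 - \rho_1}{\rho_2 + \rho_1},
% where g is the acceleration directed from the light fluid (rho_1)
% toward the heavy fluid (rho_2), and A is the Atwood number.
% Late-time self-similar mixing-zone width (alpha is an empirical constant):
h(t) \approx \alpha\, A\, g\, t^{2}
```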
Mixed Methods for Mixed Reality: Understanding Users' Avatar Activities in Virtual Worlds
ERIC Educational Resources Information Center
Feldon, David F.; Kafai, Yasmin B.
2008-01-01
This paper examines the use of mixed methods for analyzing users' avatar-related activities in a virtual world. Server logs recorded keystroke-level activity for 595 participants over a six-month period in Whyville.net, an informal science website. Participants also completed surveys and participated in interviews regarding their experiences.…
Interactive voxel graphics in virtual reality
NASA Astrophysics Data System (ADS)
Brody, Bill; Chappell, Glenn G.; Hartman, Chris
2002-06-01
Interactive voxel graphics in virtual reality poses significant research challenges in terms of interface, file I/O, and real-time algorithms. Voxel graphics is not so new, as it is the focus of a good deal of scientific visualization. Interactive voxel creation and manipulation is a more innovative concept. Scientists are understandably reluctant to manipulate data. They collect or model data. A scientific analogy to interactive graphics is the generation of initial conditions for some model. It is used as a method to test those models. We, however, are in the business of creating new data in the form of graphical imagery. In our endeavor, science is a tool and not an end. Nevertheless, there is a whole class of interactions and associated data generation scenarios that are natural to our way of working and that are also appropriate to scientific inquiry. Annotation by sketching or painting to point to and distinguish interesting and important information is very significant for science as well as art. Annotation in 3D is difficult without a good 3D interface. Interactive graphics in virtual reality is an appropriate approach to this problem.
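As a minimal illustration of the kind of voxel data structure such interactive creation and annotation manipulates (a generic sketch for illustration, not the authors' actual VR implementation), a dense 3-D grid with simple paint and erase "brush" operations can be written as:

```python
import numpy as np


class VoxelGrid:
    """Dense voxel volume supporting simple interactive edits.

    Illustrative sketch only: the paper does not describe its system
    at this level of detail.
    """

    def __init__(self, shape=(64, 64, 64)):
        self.data = np.zeros(shape, dtype=np.uint8)  # 0 = empty voxel

    def paint_sphere(self, center, radius, value=1):
        """Set all voxels within `radius` of `center` to `value` (a 3-D brush)."""
        zs, ys, xs = np.indices(self.data.shape)
        cz, cy, cx = center
        mask = (zs - cz) ** 2 + (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
        self.data[mask] = value

    def erase_sphere(self, center, radius):
        """Clear voxels with the same spherical brush."""
        self.paint_sphere(center, radius, value=0)

    def occupied(self):
        """Number of non-empty voxels."""
        return int(np.count_nonzero(self.data))


grid = VoxelGrid((32, 32, 32))
grid.paint_sphere((16, 16, 16), 5)   # sketch / annotate in 3-D
n_painted = grid.occupied()
grid.erase_sphere((16, 16, 16), 5)   # undo the annotation
```

A real-time VR system would typically swap the dense array for a sparse or hierarchical structure (e.g. an octree), but the edit operations keep the same shape.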
STS-134 crew and Expedition 24/25 crew member Shannon Walker
2010-03-25
JSC2010-E-043667 (25 March 2010) --- NASA astronaut Mark Kelly, STS-134 commander, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus
2007-08-09
JSC2007-E-41540 (9 Aug. 2007) --- Astronauts Pamela A. Melroy, STS-120 commander, and European Space Agency's (ESA) Paolo Nespoli, mission specialist, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-126 crew during preflight VR LAB MSS EVA2 training
2008-04-14
JSC2008-E-033771 (14 April 2008) --- Astronaut Eric A. Boe, STS-126 pilot, uses the virtual reality lab in the Space Vehicle Mockup Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus
2007-08-09
JSC2007-E-41541 (9 Aug. 2007) --- Astronauts Stephanie Wilson, STS-120 mission specialist, and Dan Tani, Expedition 16 flight engineer, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
Three-dimensional user interfaces for scientific visualization
NASA Technical Reports Server (NTRS)
VanDam, Andries (Principal Investigator)
1996-01-01
The focus of this grant was to experiment with novel user interfaces for scientific visualization applications using both desktop and virtual reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past three years, and subsumes all prior reports.
1999-01-01
twenty-first century. These papers illustrate topics such as a development of virtual environment applications, different uses of VRML in information system...interfaces, an examination of research in virtual reality environment interfaces, and five approaches to supporting changes in virtual environments...we get false negatives that contribute to the probability of false rejection (Prej). Taking these error probabilities into account, we define a
Mixed reality for robotic treatment of a splenic artery aneurysm.
Pietrabissa, Andrea; Morelli, Luca; Ferrari, Mauro; Peri, Andrea; Ferrari, Vincenzo; Moglia, Andrea; Pugliese, Luigi; Guarracino, Fabio; Mosca, Franco
2010-05-01
Techniques of mixed reality can successfully be used in preoperative planning of laparoscopic and robotic procedures and to guide surgical dissection and enhance its accuracy. A computer-generated three-dimensional (3D) model of the vascular anatomy of the spleen was obtained from the computed tomography (CT) dataset of a patient with a 3-cm splenic artery aneurysm. Using an environmental infrared localizer and a stereoscopic helmet, the surgeon can see the patient's anatomy in transparency (augmented or mixed reality). This arrangement simplifies correct positioning of trocars and locates surgical dissection directly on top of the aneurysm. In this way the surgeon limits unnecessary dissection, leaving intact the blood supply from the short gastric vessels and other collaterals. Based on preoperative planning, we were able to anticipate that the vascular exclusion of the aneurysm would result in partial splenic ischemia. To re-establish the flow to the spleen, end-to-end robotic anastomosis of the splenic artery with the Da Vinci surgical system was then performed. Finally, the aneurysm was fenestrated to exclude arterial refilling. The postoperative course was uneventful. A control CT scan 4 weeks after surgery showed a well-perfused and homogeneous splenic parenchyma. The final 3D model showed the fenestrated calcified aneurysm and patency of the re-anastomosed splenic artery. The described technique of robotic vascular exclusion of a splenic artery aneurysm, followed by re-anastomosis of the vessel, clearly demonstrates how this technology can reduce the invasiveness of the procedure, obviating an otherwise necessary splenectomy. Also, the use of intraoperative mixed-reality technology proved very useful in this case and is expected to play an increasing role in the operating room of the future.
A radiation-free mixed-reality training environment and assessment concept for C-arm-based surgery.
Stefan, Philipp; Habert, Séverine; Winkler, Alexander; Lazarovici, Marc; Fürmetz, Julian; Eck, Ulrich; Navab, Nassir
2018-06-25
The discrepancy between continuously decreasing opportunities for clinical training and assessment and the increasing complexity of surgical interventions has led to the development of different training and assessment options such as anatomical models, computer-based simulators and cadaver training. However, trainees still face a steep learning curve when progressing from training and assessment to patient treatment. To address this problem for C-arm-based surgery, we introduce a realistic radiation-free simulation system that combines patient-based 3D printed anatomy and simulated X-ray imaging using a physical C-arm. To explore the fidelity and usefulness of the proposed mixed-reality system for training and assessment, we conducted a user study with six surgical experts performing a facet joint injection on the simulator. In a technical evaluation, we show that our system simulates X-ray images accurately with an RMSE of 1.85 mm compared to real X-ray imaging. The participants expressed agreement with the overall realism of the simulation and the usefulness of the system for assessment, and strong agreement with the usefulness of such a mixed-reality system for training of novices and experts. In a quantitative analysis, we furthermore evaluated the suitability of the system for the assessment of surgical skills and gathered preliminary evidence for validity. The proposed mixed-reality simulation system facilitates the transition to C-arm-based surgery and has the potential to complement or even replace large parts of cadaver training, to provide a safe assessment environment and to reduce the risk of errors when proceeding to patient treatment. We propose an assessment concept and outline the steps necessary to expand the system into a test instrument that provides reliable and justified assessment scores indicative of surgical proficiency, with sufficient evidence for validity.
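The reported 1.85 mm figure is a root-mean-square error between simulated and real X-ray measurements. As a generic illustration of how such an RMSE over paired 3-D landmarks is computed (the paper's actual landmark set and evaluation pipeline are not given here, and the coordinates below are hypothetical):

```python
import numpy as np


def rmse(simulated, reference):
    """Root-mean-square Euclidean error between paired 3-D point sets (e.g. in mm)."""
    simulated = np.asarray(simulated, dtype=float)
    reference = np.asarray(reference, dtype=float)
    dists = np.linalg.norm(simulated - reference, axis=1)  # per-landmark error
    return float(np.sqrt(np.mean(dists ** 2)))


# Hypothetical landmark coordinates (mm), for illustration only.
sim = [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]]
ref = [[0.0, 0.0, 2.0], [10.0, 0.0, 0.0]]
err = rmse(sim, ref)  # sqrt((2^2 + 0^2) / 2) = sqrt(2)
```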
ERIC Educational Resources Information Center
Bressler, D. M.; Bodzin, A. M.
2013-01-01
Current studies have reported that secondary students are highly engaged while playing mobile augmented reality (AR) learning games. Some researchers have posited that players' engagement may indicate a flow experience, but no research results have confirmed this hypothesis with vision-based AR learning games. This study investigated factors…
STS-132 crew during their MSS/SIMP EVA3 OPS 4 training
2010-01-28
JSC2010-E-014952 (28 Jan. 2010) --- NASA astronauts Michael Good (seated) and Garrett Reisman, both STS-132 mission specialists, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-109 Crew Training in VR Lab, Building 9
2001-08-08
JSC2001-E-24452 (8 August 2001) --- Astronauts John M. Grunsfeld (left), STS-109 payload commander, and Nancy J. Currie, mission specialist, use the virtual reality lab at the Johnson Space Center (JSC) to train for some of their duties aboard the Space Shuttle Columbia. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team to perform its duties during the fourth Hubble Space Telescope (HST) servicing mission.
STS-134 crew and Expedition 24/25 crew member Shannon Walker
2010-03-25
JSC2010-E-043666 (25 March 2010) --- NASA astronauts Mark Kelly (background), STS-134 commander; and Andrew Feustel, mission specialist, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-134 crew and Expedition 24/25 crew member Shannon Walker
2010-03-25
JSC2010-E-043668 (25 March 2010) --- NASA astronauts Mark Kelly (background), STS-134 commander; and Andrew Feustel, mission specialist, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-111 Training in VR lab with Expedition IV and V Crewmembers
2001-10-18
JSC2001-E-39082 (18 October 2001) --- Cosmonaut Valeri G. Korzun (left), Expedition Five mission commander, and astronaut Carl E. Walz, Expedition Four flight engineer, use the virtual reality lab at the Johnson Space Center (JSC) to train for their duties on the International Space Station (ISS). This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with ISS elements. Korzun represents Rosaviakosmos.
STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus
2007-08-09
JSC2007-E-41533 (9 Aug. 2007) --- Astronauts Stephanie Wilson (left), STS-120 mission specialist; Sandra Magnus, Expedition 17 flight engineer; and Dan Tani, Expedition 16 flight engineer, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
Destination Mars Grand Opening
2016-09-18
Apollo 11 astronaut Buzz Aldrin, left, and Erisa Hines of NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California, try out the Microsoft HoloLens mixed reality headset during a preview of the new Destination: Mars experience at the Kennedy Space Center Visitor Complex. Destination: Mars gives guests an opportunity to "visit" several sites on Mars using real imagery from NASA's Curiosity Mars rover. Based on OnSight, a tool created by JPL, the experience brings guests together with a holographic version of Aldrin and Curiosity rover driver Hines as they are guided across Mars using the Microsoft HoloLens mixed reality headset. Photo credit: NASA/Charles Babir
A mixed reality simulator for feline abdominal palpation training in veterinary medicine.
Parkes, Rebecca; Forrest, Neil; Baillie, Sarah
2009-01-01
The opportunities for veterinary students to practice feline abdominal palpation are limited, as cats have a low tolerance for being examined. Therefore, a mixed reality simulator was developed to complement clinical training. Two PHANToM Premium haptic devices were positioned on either side of a modified toy cat. Virtual models of the chest and some abdominal contents were superimposed on the physical model. The haptic properties of the virtual models were set by seven veterinarians; values were adjusted while the simulation was being palpated until the representation was satisfactory. Feedback from the veterinarians was encouraging, suggesting that the simulator has a potential role in student training.
Utilizing media arts principles for developing effective interactive neurorehabilitation systems.
Rikakis, Thanassis
2011-01-01
This paper discusses how interactive neurorehabilitation systems can increase their effectiveness through systematic integration of media arts principles and practice. Media arts expertise can foster the development of complex yet intuitive extrinsic feedback displays that match the inherent complexity and intuitive nature of motor learning. Abstract, arts-based feedback displays can be powerful metaphors that provide re-contextualization, engagement and appropriate reward mechanisms for mature adults. Such virtual feedback displays must be seamlessly integrated with physical components to produce mixed reality training environments that promote active, generalizable learning. The proposed approaches are illustrated through examples from mixed reality rehabilitation systems developed by our team.
NASA Technical Reports Server (NTRS)
Johnson, David W.
1992-01-01
Virtual realities are a type of human-computer interface (HCI) and as such may be understood from a historical perspective. In the earliest era, the computer was a very simple, straightforward machine. Interaction was human manipulation of an inanimate object, little more than the provision of an explicit instruction set to be carried out without deviation. In short, control resided with the user. In the second era of HCI, some level of intelligence and control was imparted to the system to enable a dialogue with the user. Simple context sensitive help systems are early examples, while more sophisticated expert system designs typify this era. Control was shared more equally. In this, the third era of the HCI, the constructed system emulates a particular environment, constructed with rules and knowledge about 'reality'. Control is, in part, outside the realm of the human-computer dialogue. Virtual reality systems are discussed.
Strategies to combat poverty and their interface with health promotion.
dos Santos Oliveira, Simone Helena; Alves Monteiro, Maria Adelane; Vieira Lopes, Maria do Socorro; Silva de Brito, Daniele Mary; Vieira, Neiva Francenely Cunha; Barroso, Maria Grasiela Teixeira; Ximenes, Lorena Barbosa
2007-01-01
Population impoverishment is a social reality that must be overcome before health can be understood as a positive concept. This study reflects on the coping strategies adopted by Conjunto Palmeira, a community in Northeast Brazil, and their interface with health promotion. This community's experience offers an example of overcoming social exclusion for other regions of Brazil and for other countries. The history of the Conjunto and its collective strategies of empowerment for coping with poverty and pursuing human development are presented first. We then relate those strategies to the action fields of health promotion. Finally, we consider that the community's shared responsibility for its own health, and its relationship with the environment in which it lives, are means of promoting transformation toward the conquest of a worthy social space.
Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.
Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J
2011-11-01
To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.
New Visions of Reality: Multimedia and Education.
ERIC Educational Resources Information Center
Ambron, Sueann
1986-01-01
Multimedia is a powerful tool that will change both the way we look at knowledge and our vision of reality, as well as our educational system and the business world. Multimedia as used here refers to the innovation of mixing text, audio, and video through the use of a computer. Not only will there be new products emerging from multimedia uses, but…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayes, Birchard P; Michel, Kelly D; Few, Douglas A
From stereophonic, positional sound to high-definition imagery that is crisp and clean, high-fidelity computer graphics enhance our view, insight, and intuition regarding our environments and conditions. Contemporary 3-D modeling tools offer an open architecture framework that enables integration with other technologically innovative arenas. One innovation of great interest is Augmented Reality, the merging of virtual, digital environments with physical, real-world environments, creating a mixed reality in which relevant data and information augment the real experience in real time by spatial or semantic context. Pairing 3-D virtual immersive models with a dynamic platform such as semi-autonomous robotics or personnel odometry systems to create a mixed reality offers a new and innovative design information verification inspection capability, evaluation accuracy, and information gathering capability for nuclear facilities. Our paper discusses the integration of two innovative technologies, 3-D visualizations with inertial positioning systems, and the resulting augmented reality offered to the human inspector. The discussion in the paper includes an exploration of human and non-human (surrogate) inspections of a nuclear facility, integrated safeguards knowledge within a synchronized virtual model operated, or worn, by a human inspector, and the anticipated benefits to safeguards evaluations of facility operations.
STS-134 crew and Expedition 24/25 crew member Shannon Walker
2010-03-25
JSC2010-E-043673 (25 March 2010) --- NASA astronauts Gregory H. Johnson, STS-134 pilot; and Shannon Walker, Expedition 24/25 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-134 crew and Expedition 24/25 crew member Shannon Walker
2010-03-25
JSC2010-E-043661 (25 March 2010) --- NASA astronauts Gregory H. Johnson, STS-134 pilot; and Shannon Walker, Expedition 24/25 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-132 crew during their MSS/SIMP EVA3 OPS 4 training
2010-01-28
JSC2010-E-014953 (28 Jan. 2010) --- NASA astronauts Piers Sellers, STS-132 mission specialist; and Tracy Caldwell Dyson, Expedition 23/24 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-132 crew during their MSS/SIMP EVA3 OPS 4 training
2010-01-28
JSC2010-E-014949 (28 Jan. 2010) --- NASA astronauts Piers Sellers, STS-132 mission specialist; and Tracy Caldwell Dyson, Expedition 23/24 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
2000-07-01
acceptance is not as simple a matter as it may first appear. Several points must be kept in mind. (1) Risk is a fundamental reality. (2) Risk...(1) Proper preparation of an SSPP requires coming to grips with the hard realities of program execution. It involves the examination and...Interfaces. (32:48) Since the conduct of a system safety program will eventually touch on virtually every other element of a system development program, a
STS-132 crew during their MSS/SIMP EVA3 OPS 4 training
2010-01-28
JSC2010-E-014956 (28 Jan. 2010) --- NASA astronauts Ken Ham (left foreground), STS-132 commander; Michael Good, mission specialist; and Tony Antonelli (right), pilot, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-131 crew during VR Lab MSS/EVAB SUPT3 Team 91016 training
2009-09-25
JSC2009-E-214346 (25 Sept. 2009) --- Japan Aerospace Exploration Agency (JAXA) astronaut Naoko Yamazaki, STS-131 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-131 crew during VR Lab MSS/EVAB SUPT3 Team 91016 training
2009-09-25
JSC2009-E-214328 (25 Sept. 2009) --- Japan Aerospace Exploration Agency (JAXA) astronaut Naoko Yamazaki, STS-131 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-132 crew during their MSS/SIMP EVA3 OPS 4 training
2010-01-28
JSC2010-E-014951 (28 Jan. 2010) --- NASA astronauts Michael Good (seated), Garrett Reisman (right foreground), both STS-132 mission specialists; and Tony Antonelli, pilot, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-111 Training in VR lab with Expedition IV and V Crewmembers
2001-10-18
JSC2001-E-39085 (18 October 2001) --- Cosmonaut Valeri G. Korzun (left), Expedition Five mission commander, astronaut Peggy A. Whitson, Expedition Five flight engineer, and astronaut Carl E. Walz, Expedition Four flight engineer, use the virtual reality lab at the Johnson Space Center (JSC) to train for their duties on the International Space Station (ISS). This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with ISS elements. Korzun represents Rosaviakosmos.
STS-133 crew training in VR Lab with replacement crew member Steve Bowen
2011-01-24
JSC2011-E-006293 (24 Jan. 2011) --- NASA astronaut Michael Barratt, STS-133 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration
Photographic coverage of STS-112 during EVA 3 in VR Lab.
2002-08-21
JSC2002-E-34625 (21 Aug. 2002) --- Astronaut Sandra H. Magnus (left), STS-112 mission specialist, uses the virtual reality lab at NASA's Johnson Space Center (JSC) to train for her duties aboard the space shuttle Atlantis. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with ISS elements. Lead SSRMS instructor Elizabeth C. Bloomer assisted Magnus. Astronaut Ellen Ochoa (standing) looks on. Photo credit: NASA
STS-134 crew and Expedition 24/25 crew member Shannon Walker
2010-03-25
JSC2010-E-043662 (25 March 2010) --- NASA astronauts Gregory H. Johnson, STS-134 pilot; and Shannon Walker, Expedition 24/25 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.
STS-131 crew during VR Lab MSS/EVAB SUPT3 Team 91016 training
2009-09-25
JSC2009-E-214321 (25 Sept. 2009) --- NASA astronauts James P. Dutton Jr., STS-131 pilot; and Stephanie Wilson, mission specialist, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.
STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus
2007-08-09
JSC2007-E-41538 (9 Aug. 2007) --- Astronauts Stephanie Wilson, STS-120 mission specialist; Sandra Magnus, Expedition 17 flight engineer; and Dan Tani, Expedition 16 flight engineer, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements. A computer display is visible in the foreground.
Virtual reality and hallucination: a technoetic perspective
NASA Astrophysics Data System (ADS)
Slattery, Diana R.
2008-02-01
Virtual Reality (VR), especially in a technologically focused discourse, is defined by a class of hardware and software, among them head-mounted displays (HMDs), navigation and pointing devices, and stereoscopic imaging. This presentation examines the experiential aspect of VR. Putting "virtual" in front of "reality" modifies the ontological status of a class of experience: that of "reality." Reality has also been modified [by artists, new media theorists, technologists and philosophers] as augmented, mixed, simulated, artificial, layered, and enhanced. Modifications of reality are closely tied to modifications of perception. Media theorist Roy Ascott creates a model of three "VRs": Verifiable Reality, Virtual Reality, and Vegetal (entheogenically induced) Reality. The ways in which we shift our perceptual assumptions, create and verify illusions, and enter "the willing suspension of disbelief" that allows us entry into imaginal worlds are central to the experience of VR worlds, whether those worlds are explicitly representational (robotic manipulations by VR) or explicitly imaginal (VR artistic creations). The early rhetoric surrounding VR was interwoven with psychedelics, a perception amplified by Timothy Leary's presence on the historic SIGGRAPH panel and by the Wall Street Journal's tag of VR as "electronic LSD." This paper discusses the philosophical, social-historical, and psychological-perceptual connections between these two domains.
Can walking motions improve visually induced rotational self-motion illusions in virtual reality?
Riecke, Bernhard E; Freiberg, Jacob B; Grechkin, Timofey Y
2015-02-04
Illusions of self-motion (vection) can provide compelling sensations of moving through virtual environments without the need for complex motion simulators or large tracked physical walking spaces. Here we explore the interaction between biomechanical cues (stepping along a rotating circular treadmill) and visual cues (viewing simulated self-rotation) for providing stationary users a compelling sensation of rotational self-motion (circular vection). When tested individually, biomechanical and visual cues were similarly effective in eliciting self-motion illusions. However, in combination they yielded significantly more intense self-motion illusions. These findings provide the first compelling evidence that walking motions can be used to significantly enhance visually induced rotational self-motion perception in virtual environments (and vice versa) without having to provide for physical self-motion or motion platforms. This is noteworthy, as linear treadmills have been found to actually impair visually induced translational self-motion perception (Ash, Palmisano, Apthorp, & Allison, 2013). Given the predominant focus on linear walking interfaces for virtual-reality locomotion, our findings suggest that investigating circular and curvilinear walking interfaces offers a promising direction for future research and development and can help to enhance self-motion illusions, presence and immersion in virtual-reality systems. © 2015 ARVO.
Active tactile exploration using a brain-machine-brain interface.
O'Doherty, Joseph E; Lebedev, Mikhail A; Ifft, Peter J; Zhuang, Katie Z; Shokur, Solaiman; Bleuler, Hannes; Nicolelis, Miguel A L
2011-10-05
Brain-machine interfaces use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. It is hoped that brain-machine interfaces can be used to restore the normal sensorimotor functions of the limbs, but so far they have lacked tactile sensation. Here we report the operation of a brain-machine-brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and allows signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex. Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and distinguish one of three visually identical objects, using the virtual-reality arm to identify the unique artificial texture associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic or even virtual prostheses.
Kneeshaw, T.A.; McGuire, J.T.; Smith, E.W.; Cozzarelli, I.M.
2007-01-01
This paper presents small-scale push-pull tests designed to evaluate the kinetic controls on SO₄²⁻ reduction in situ at mixing interfaces between a wetland and an aquifer impacted by landfill leachate at the Norman Landfill research site, Norman, OK. Quantifying the rates of redox reactions initiated at interfaces is of great interest because interfaces have been shown to be zones of increased biogeochemical transformations and thus may play an important role in natural attenuation. To mimic the aquifer-wetland interface and evaluate reaction rates, SO₄²⁻-rich anaerobic aquifer water (~100 mg/L SO₄²⁻) was introduced into SO₄²⁻-depleted wetland porewater via push-pull tests. Results showed SO₄²⁻ reduction was stimulated by the mixing of these waters, and first-order rate coefficients were comparable to those measured in other push-pull studies. However, the rate data were complex, involving either multiple first-order rate coefficients or a more complex rate order. In addition, a lag phase was observed prior to SO₄²⁻ reduction that persisted until the mixing interface between test solution and native water was recovered, irrespective of temporal and spatial constraints. The lag phase was not eliminated by the addition of an electron donor (acetate) to the injected test solution. Subsequent push-pull tests designed to elucidate the nature of the lag phase support the importance of the mixing interface in controlling terminal electron accepting processes. These data suggest redox reactions may occur rapidly at the mixing interface between injected and native waters but not in the injected bulk water mass. Under these circumstances, push-pull test data should be evaluated to ensure the apparent rate is actually a function of time, and complexities in rate data should be considered. © 2007 Elsevier Ltd. All rights reserved.
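The first-order kinetics invoked in the abstract above can be illustrated with a short estimation sketch. This is a generic illustration, not code from the study; the function name and the synthetic data are invented for the example:

```python
import numpy as np

def first_order_rate(t, c, c0=None):
    """Estimate a first-order rate coefficient k, assuming C(t) = C0*exp(-k*t),
    from the slope of ln(C/C0) versus t (slope = -k)."""
    t = np.asarray(t, dtype=float)
    c = np.asarray(c, dtype=float)
    if c0 is None:
        c0 = c[0]                      # use the first sample as C0
    slope, _intercept = np.polyfit(t, np.log(c / c0), 1)
    return -slope

# Synthetic decay with a known coefficient k = 0.05 per day
t = np.linspace(0.0, 30.0, 16)
c = 100.0 * np.exp(-0.05 * t)
k = first_order_rate(t, c)             # recovers ~0.05
```

Systematic curvature in ln(C/C0) versus t is one symptom of the "multiple first-order rate coefficients or a more complex rate order" that the authors report.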
NASA Astrophysics Data System (ADS)
Minagawa, Masahiro; Takahashi, Noriko
2016-02-01
To investigate the lifetime improvement mechanism caused by mixing at the heterojunction interface, organic light-emitting diodes (OLEDs) with stacked and mixed 4,4′-bis[N-(1-naphthyl)-N-phenyl-amino]-biphenyl (α-NPD)/tris(8-hydroxyquinoline)aluminum (Alq3) interfaces were fabricated, and changes in their displacement current due to continuous operation were measured. A decrease in accumulated holes at the α-NPD/Alq3 interface was observed in the stacked-configuration devices over longer operations. These results indicate that the injected hole density was reduced during continuous operation, implying that the carrier balance became uneven in the emission region. However, few accumulated holes and changes in the displacement current due to continuous operation were observed in the devices having the mixed layer. Therefore, it was deduced that the number of holes concentrated between the α-NPD and Alq3 layers was decreased by mixing at the heterojunction interface, and that the change in the number of holes was smaller during continuous operation, resulting in less degradation.
Virtual Reality Simulation of the International Space Welding Experiment
NASA Technical Reports Server (NTRS)
Phillips, James A.
1996-01-01
Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) It involves 3-dimensional computer graphics; (2) It includes real-time feedback and response to user actions; and (3) It must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment (ISWE). My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) in collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) in collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted display and tested software filters to correct the problem; (5) in collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR workstation, described below.
Ranky, Richard G; Sivak, Mark L; Lewis, Jeffrey A; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos
2014-06-05
Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirement for efficacious cycling; specifically recruitment of both extremities and exercising at a high intensity. In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off the shelf electronics to monitor heart rate and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider's lower extremities. 
The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders.
Rodríguez Patino, Juan M; Cejudo Fernández, Marta; Carrera Sánchez, Cecilio; Rodríguez Niño, Ma Rosario
2007-09-01
The structural and shear characteristics of mixed monolayers formed by an adsorbed Na-caseinate film and a spread monoglyceride (monopalmitin or monoolein) on the previously adsorbed protein film have been analyzed. Measurements of the surface pressure (pi)-area (A) isotherm and surface shear viscosity (eta(s)) were obtained at 20 degrees C and at pH 7 in a modified Wilhelmy-type film balance. The structural and shear characteristics of the mixed films depend on the surface pressure and on the composition of the mixed film. At surface pressures lower than the equilibrium surface pressure of Na-caseinate (at pi
Assistant Principals and Reform: A Socialization Paradox?
ERIC Educational Resources Information Center
Best, Marguerita L.
2013-01-01
Framed in the critical race theory of structuration (CRTS), this sequential explanatory mixed methods study seeks to identify the socialization practices by examining the realities of practices of assistant principals and the ways in which they impact the disciplinary actions of assistant principals at middle and high schools. The mixed methods…
Efficient Verification of Holograms Using Mobile Augmented Reality.
Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter
2016-07-01
Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobile devices still takes a long time and provides lower accuracy than inspection by humans using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient goal-oriented user interface which supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s, with better decisions regarding the evaluated samples than what untrained users can achieve.
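The abstract does not name the similarity measures that were evaluated; zero-mean normalized cross-correlation (NCC) is one standard patch-matching measure and serves here as a hedged illustration (the function name and example patches are ours, not the authors'):

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-sized patches.
    Returns a score in [-1, 1]; 1 means identical up to brightness/contrast."""
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    a = a - a.mean()                   # remove brightness offset
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0                     # a flat patch matches nothing
    return float(np.dot(a, b) / denom)

patch = np.array([[1.0, 2.0], [3.0, 4.0]])
same = ncc(patch, patch)               # 1.0
opposite = ncc(patch, -patch)          # -1.0
```

Thresholding such a score against reference hologram views is one plausible basis for the "automatic decisions" the authors describe.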
Surgeon Design Interface for Patient-Specific Concentric Tube Robots
Morimoto, Tania K.; Greer, Joseph D.; Hsieh, Michael H.; Okamura, Allison M.
2017-01-01
Concentric tube robots have potential for use in a wide variety of surgical procedures due to their small size, dexterity, and ability to move in highly curved paths. Unlike most existing clinical robots, the design of these robots can be developed and manufactured on a patient- and procedure-specific basis. The design of concentric tube robots typically requires significant computation and optimization, and it remains unclear how the surgeon should be involved. We propose to use a virtual reality-based design environment for surgeons to easily and intuitively visualize and design a set of concentric tube robots for a specific patient and procedure. In this paper, we describe a novel patient-specific design process in the context of the virtual reality interface. We also show a resulting concentric tube robot design, created by a pediatric urologist to access a kidney stone in a pediatric patient. PMID:28656124
A Case-Based Study with Radiologists Performing Diagnosis Tasks in Virtual Reality.
Venson, José Eduardo; Albiero Berni, Jean Carlo; Edmilson da Silva Maia, Carlos; Marques da Silva, Ana Maria; Cordeiro d'Ornellas, Marcos; Maciel, Anderson
2017-01-01
In radiology diagnosis, medical images are most often visualized slice by slice. At the same time, the visualization based on 3D volumetric rendering of the data is considered useful and has increased its field of application. In this work, we present a case-based study with 16 medical specialists to assess the diagnostic effectiveness of a Virtual Reality interface in fracture identification over 3D volumetric reconstructions. We developed a VR volume viewer compatible with both the Oculus Rift and handheld-based head mounted displays (HMDs). We then performed user experiments to validate the approach in a diagnosis environment. In addition, we assessed the subjects' perception of the 3D reconstruction quality, ease of interaction and ergonomics, and also the users' opinions on how VR applications can be useful in healthcare. Among other results, we have found a high level of effectiveness of the VR interface in identifying superficial fractures on head CTs.
An augmented reality haptic training simulator for spinal needle procedures.
Sutherland, Colin; Hashtrudi-Zaad, Keyvan; Sellens, Rick; Abolmaesumi, Purang; Mousavi, Parvin
2013-11-01
This paper presents the prototype for an augmented reality haptic simulation system with potential for spinal needle insertion training. The proposed system is composed of a torso mannequin, a MicronTracker2 optical tracking system, a PHANToM haptic device, and a graphical user interface to provide visual feedback. The system allows users to perform simulated needle insertions on a physical mannequin overlaid with an augmented reality cutaway of patient anatomy. A tissue model based on a finite-element model provides force during the insertion. The system allows for training without the need for the presence of a trained clinician or access to live patients or cadavers. A pilot user study demonstrates the potential and functionality of the system.
Communication Architecture in Mixed-Reality Simulations of Unmanned Systems
Selecký, Martin; Faigl, Jan; Rollo, Milan
2018-01-01
Verification of the correct functionality of multi-vehicle systems in high-fidelity scenarios is required before any deployment of such a complex system, e.g., in missions of remote sensing or in mobile sensor networks. Mixed-reality simulations where both virtual and physical entities can coexist and interact have been shown to be beneficial for development, testing, and verification of such systems. This paper deals with the problems of designing a certain communication subsystem for such highly desirable realistic simulations. Requirements of this communication subsystem, including proper addressing, transparent routing, visibility modeling, or message management, are specified prior to designing an appropriate solution. Then, a suitable architecture of this communication subsystem is proposed together with solutions to the challenges that arise when simultaneous virtual and physical message transmissions occur. The proposed architecture can be utilized as a high-fidelity network simulator for vehicular systems with implicit mobility models that are given by real trajectories of the vehicles. The architecture has been utilized within multiple projects dealing with the development and practical deployment of multi-UAV systems, which support the architecture’s viability and advantages. The provided experimental results show the achieved similarity of the communication characteristics of the fully deployed hardware setup to the setup utilizing the proposed mixed-reality architecture. PMID:29538290
Mixed reality ventriculostomy simulation: experience in neurosurgical residency.
Hooten, Kristopher G; Lister, J Richard; Lombard, Gwen; Lizdas, David E; Lampotang, Samsun; Rajon, Didier A; Bova, Frank; Murad, Gregory J A
2014-12-01
Medicine and surgery are turning toward simulation to improve on limited patient interaction during residency training. Many simulators today use virtual reality with augmented haptic feedback with little to no physical elements. In a collaborative effort, the University of Florida Department of Neurosurgery and the Center for Safety, Simulation & Advanced Learning Technologies created a novel "mixed" physical and virtual simulator to mimic the ventriculostomy procedure. The simulator contains all the physical components encountered for the procedure with superimposed 3-D virtual elements for the neuroanatomical structures. To introduce the ventriculostomy simulator and its validation as a necessary training tool in neurosurgical residency. We tested the simulator in more than 260 residents. An algorithm combining time and accuracy was used to grade performance. Voluntary postperformance surveys were used to evaluate the experience. Results demonstrate that more experienced residents have statistically significant better scores and completed the procedure in less time than inexperienced residents. Survey results revealed that most residents agreed that practice on the simulator would help with future ventriculostomies. This mixed reality simulator provides a real-life experience, and will be an instrumental tool in training the next generation of neurosurgeons. We have now implemented a standard where incoming residents must prove efficiency and skill on the simulator before their first interaction with a patient.
Phase transition of LB films of mixed diblock copolymer at the air/water interface
NASA Astrophysics Data System (ADS)
Seo, Y. S.; Kim, K. S.; Samuilov, V.; Rafailovich, M. H.; Sokolov, J.; Lammertink, Rob G. H.; Vancso, G. J.
2000-03-01
We have studied the morphology of Langmuir-Blodgett films at the air/water interface of mixed diblock copolymer films. Solutions of poly(styrene-b-ferrocenyldimethylsilane) and PS-b-P2VP, mixed in a ratio of 20/80 in chloroform, were spread at the air/water interface. The morphology of the films was studied with AFM as a function of the surface pressure and the diblock copolymer molecular weight. The results show that the two diblock copolymers can be induced to mix at the air/water interface with increasing surface pressure. A reversible transition from spherical to cylindrical morphologies is induced in the mixture which cannot be observed in films formed of the two components separately. The effective surface phase diagram as a function of block copolymer composition and pressure will be presented.
Krasovsky, Tal; Weiss, Patrice L; Kizony, Rachel
2018-04-06
Texting while walking (TeWW) has become common among people of all ages, and mobile phone use during gait is increasingly associated with pedestrian injury. Although dual-task walking performance is known to decline with age, data regarding the effect of age on dual-task performance in ecological settings are limited. The objective of this study was to evaluate the effect of age, environment (indoors/outdoors), and mixed reality (merging of real and virtual environments) on TeWW performance. A cross-sectional design was used. Young (N = 30; 27.8 ± 4.4 years) and older (N = 20; 68.9 ± 3.9 years) adults performed single and dual-task texting and walking indoors and outdoors, with and without a mixed reality display. Participants also completed evaluations of visual scanning and cognitive flexibility (Trail Making Test) and functional mobility (Timed Up and Go). Indoors, similar interference to walking and texting occurred for both groups, but only older adults' gait variability increased under dual task conditions. Outdoors, TeWW was associated with larger age-related differences in gait variability, texting accuracy, and gait dual-task costs. Young adults with better visual scanning and cognitive flexibility performed TeWW with lower gait costs (r = 0.52 to r = 0.65). The mixed reality display was unhelpful and did not modify walking or texting. Older adults tested in this study were relatively high-functioning. Gaze of participants was not directly monitored. Although young and older adults possess the resources necessary for TeWW, older adults pay an additional "price" when dual-tasking, especially outdoors. TeWW may have potential as an ecologically-valid assessment and/or an intervention paradigm for dual task performance among older adults as well as for clinical populations.
Jourdain, Laureline S; Schmitt, Christophe; Leser, Martin E; Murray, Brent S; Dickinson, Eric
2009-09-01
We report on the interfacial properties of electrostatic complexes of protein (sodium caseinate) with a highly sulfated polysaccharide (dextran sulfate). Two routes were investigated for preparation of adsorbed layers at the n-tetradecane-water interface at pH = 6. Bilayers were made by the layer-by-layer deposition technique whereby polysaccharide was added to a previously established protein-stabilized interface. Mixed layers were made by the conventional one-step method in which soluble protein-polysaccharide complexes were adsorbed directly at the interface. Protein + polysaccharide systems gave a slower decay of interfacial tension and stronger dilatational viscoelastic properties than the protein alone, but there was no significant difference in dilatational properties between mixed layers and bilayers. Conversely, shear rheology experiments exhibited significant differences between the two kinds of interfacial layers, with the mixed system giving much stronger interfacial films than the bilayer system, i.e., shear viscosities and moduli at least an order of magnitude higher. The film shear viscoelasticity was further enhanced by acidification of the biopolymer mixture to pH = 2 prior to interface formation. Taken together, these measurements provide insight into the origin of previously reported differences in stability properties of oil-in-water emulsions made by the bilayer and mixed layer approaches. Addition of a proteolytic enzyme (trypsin) to both types of interfaces led to a significant increase in the elastic modulus of the film, suggesting that the enzyme was adsorbed at the interface via complexation with dextran sulfate. Overall, this study has confirmed the potential of shear rheology as a highly sensitive probe of associative electrostatic interactions and interfacial structure in mixed biopolymer layers.
Transforming an educational virtual reality simulation into a work of fine art.
Panaiotis; Addison, Laura; Vergara, Víctor M; Hakamata, Takeshi; Alverson, Dale C; Saiki, Stanley M; Caudell, Thomas Preston
2008-01-01
This paper outlines user interface and interaction issues, technical considerations, and problems encountered in transforming an educational VR simulation of a reified kidney nephron into an interactive artwork appropriate for a fine arts museum.
Single-interface Richtmyer-Meshkov turbulent mixing at the Los Alamos Vertical Shock Tube
Wilson, Brandon Merrill; Mejia Alvarez, Ricardo; Prestridge, Katherine Philomena
2016-04-12
We studied Mach number and initial conditions effects on Richtmyer–Meshkov (RM) mixing at the vertical shock tube (VST) at Los Alamos National Laboratory (LANL). At the VST, a perturbed stable light-to-heavy (air–SF6, A=0.64) interface is impulsively accelerated with a shock wave to induce RM mixing. We investigate changes to both large and small scales of mixing caused by changing the incident Mach number (Ma=1.3 and 1.45) and the three-dimensional (3D) perturbations on the interface. Simultaneous density (quantitative planar laser-induced fluorescence (PLIF)) and velocity (particle image velocimetry (PIV)) measurements are used to characterize preshock initial conditions and the dynamic shocked interface. Initial conditions and fluid properties are characterized before shock. Using two types of dynamic measurements, time series (N=5 realizations at ten locations) and statistics (N=100 realizations at a single location) of the density and velocity fields, we calculate several mixing quantities. Mix width, density-specific volume correlations, density–vorticity correlations, vorticity, enstrophy, strain, and instantaneous dissipation rate are examined at one downstream location. Results indicate that large-scale mixing, such as the mix width, is strongly dependent on Mach number, whereas small scales are strongly influenced by initial conditions. Lastly, the enstrophy and strain show focused mixing activity in the spike regions.
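Of the mixing quantities listed, the mix width admits a common integral definition, W = ∫ 4 f (1 - f) dy, with f the mean heavy-fluid fraction profile across the interface. The sketch below assumes that definition; the paper may use a different convention, and all names are illustrative:

```python
import numpy as np
from math import erf

def mix_width(y, f_mean):
    """Integral mix width W = integral of 4*f*(1-f) dy for a mean fraction
    profile f. The integrand peaks where the fluids are evenly mixed (f=0.5)."""
    y = np.asarray(y, dtype=float)
    f = np.clip(np.asarray(f_mean, dtype=float), 0.0, 1.0)
    g = 4.0 * f * (1.0 - f)
    # trapezoidal rule, written out for portability across NumPy versions
    return float(np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(y)))

# Error-function profile of unit characteristic thickness:
# analytically, W = 2*sqrt(2/pi) ~ 1.596
y = np.linspace(-5.0, 5.0, 2001)
f = 0.5 * (1.0 + np.array([erf(v) for v in y]))
W = mix_width(y, f)
```

Applying the same integral to mean PLIF density profiles at successive times would give the mix-width growth that the Mach-number comparison relies on.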
On the use of Augmented Reality techniques in learning and interpretation of cardiologic data.
Lamounier, Edgard; Bucioli, Arthur; Cardoso, Alexandre; Andrade, Adriano; Soares, Alcimar
2010-01-01
Augmented Reality is a technology which provides people with more intuitive ways of interaction and visualization, close to those in real world. The amount of applications using Augmented Reality is growing every day, and results can be already seen in several fields such as Education, Training, Entertainment and Medicine. The system proposed in this article intends to provide a friendly and intuitive interface based on Augmented Reality for heart beating evaluation and visualization. Cardiologic data is loaded from several distinct sources: simple standards of heart beating frequencies (for example situations like running or sleeping), files of heart beating signals, scanned electrocardiographs and real time data acquisition of patient's heart beating. All this data is processed to produce visualization within Augmented Reality environments. The results obtained in this research have shown that the developed system is able to simplify the understanding of concepts about heart beating and its functioning. Furthermore, the system can help health professionals in the task of retrieving, processing and converting data from all the sources handled by the system, with the support of an edition and visualization mode.
STS-116 and Expedition 12 Preflight Training, VR Lab Bldg. 9.
2005-05-06
JSC2005-E-18147 (6 May 2005) --- Astronauts Sunita L. Williams (left), Expedition 14 flight engineer, and Joan E. Higginbotham, STS-116 mission specialist, use the virtual reality lab at the Johnson Space Center to train for their duties aboard the space shuttle. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements. Williams will join Expedition 14 in progress and serve as a flight engineer after traveling to the station on space shuttle mission STS-116.
Passive scalar dynamics near the turbulent/nonturbulent interface in a jet
NASA Astrophysics Data System (ADS)
Taveira, Rodrigo R.; da Silva, Carlos
2011-11-01
The present work uses several direct numerical simulations (DNS) of turbulent planar jets at Reynolds numbers ranging from Reλ = 120 to Reλ = 160 and Schmidt numbers ranging from Sc = 0.7 to 7.0 to analyze the nature and properties of the ``scalar interface'' and to investigate the dynamics of turbulent mixing of a passive scalar. Specifically, we employ statistics conditioned on the distance from the T/NT interface in order to eliminate the intermittency that affects common turbulence statistics close to the jet edge. The physical mechanisms behind scalar mixing near the T/NT interface and the associated turbulent scales and topology are investigated. A sharp scalar interface separates the turbulent and the irrotational flow regions. The thickness of this scalar interface δθ is of the order of the Taylor micro-scale, λ. However, the thickness of the scalar-gradient variance layer ⟨G²⟩ (where Gj = ∂θ/∂xj) is much smaller. Very intense scalar-gradient sheet structures form along regions of intense strain, in particular at the T/NT interface. The scalar-gradient transport equation is analyzed in order to further investigate the physical mechanism of scalar turbulent mixing at the jet edge. Almost all mixing takes place in a confined region close to the interface, beyond which it reduces to an almost perfect balance between production and dissipation of scalar variance.
NASA Astrophysics Data System (ADS)
Huang, Wen Deng; Chen, Guang De; Yuan, Zhao Lin; Yang, Chuang Hua; Ye, Hong Gang; Wu, Ye Long
2016-02-01
Theoretical investigations of interface optical phonons, electron-phonon couplings and ternary mixing effects in zinc-blende spherical quantum dots are carried out using the dielectric continuum model and the modified random-element isodisplacement model. The dispersion curves, electron-phonon coupling strengths, and ternary mixing effects for interface optical phonons in a single zinc-blende GaN/AlxGa1-xN spherical quantum dot are calculated and discussed in detail. The numerical results show that there are three branches of interface optical phonons: one branch lies in the low-frequency region and the other two in the high-frequency region. Interface optical phonons with small quantum number l make the most important contributions to the electron-phonon interactions. It is also found that ternary mixing has an important influence on the interface optical phonon properties in a single zinc-blende GaN/AlxGa1-xN quantum dot. With increasing Al content, the interface optical phonon frequencies vary linearly and the electron-phonon coupling strengths vary non-linearly in the high-frequency region; in the low-frequency region, the frequencies vary non-linearly and the coupling strengths vary linearly.
Diffusive mixing through velocity profile variation in microchannels
NASA Astrophysics Data System (ADS)
Yakhshi-Tafti, Ehsan; Cho, Hyoung J.; Kumar, Ranganathan
2011-03-01
Rapid mixing does not readily occur at low Reynolds number flows encountered in microdevices; however, it can be enhanced by passive diffusive mixing schemes. This study of micromixing of two miscible fluids is based on the principle that (1) increased velocity at the interface of co-flowing fluids results in increased diffusive mass flux across their interface, and (2) diffusion interfaces between two liquids progress transversely as the flow proceeds downstream. A passive micromixer is proposed that takes advantage of the peak velocity variation, inducing diffusive mixing. The effect of flow variation on the enhancement of diffusive mixing is investigated analytically and experimentally. Variation of the flow profile is confirmed using micro-Particle Image Velocimetry (μPIV) and mixing is evaluated by color variations resulting from the mixing of pH indicator and basic solutions. Velocity profile variations obtained from μPIV show a shift in peak velocities. The mixing efficiency of the Σ-micromixer is expected to be higher than that for a T-junction channel and can be as high as 80%. The mixing efficiency decreases with Reynolds number and increases with downstream length, exhibiting a power law.
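The square-root-in-time growth of the diffusion interface invoked above can be sketched in a few lines; the diffusivity and velocity below are assumed, textbook-scale values for an aqueous microchannel, not measurements from this study:

```python
import math

def diffusive_penetration(D, U, x):
    """Transverse diffusion width (m) a distance x (m) downstream of the
    junction of two co-flowing streams, using the classic sqrt(2*D*t)
    scaling with interface residence time t = x / U."""
    t = x / U  # time a fluid element has spent at the interface
    return math.sqrt(2.0 * D * t)

# Assumed illustrative values: aqueous diffusivity, slow microchannel flow.
D = 1e-9   # m^2/s
U = 1e-3   # m/s
widths = [diffusive_penetration(D, U, x * 1e-3) for x in (1.0, 5.0, 10.0)]
```

This pure-diffusion estimate is the baseline the micromixer improves upon by varying the velocity profile along the channel.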
Secoli, R; Zondervan, D; Reinkensmeyer, D
2012-01-01
For children with a severe disability, such as can arise from cerebral palsy, becoming independent in mobility is a critical goal. Currently, however, driver's training for powered wheelchair use is labor intensive, requiring hand-over-hand assistance from a skilled therapist to keep the trainee safe. This paper describes the design of a mixed reality environment for semi-autonomous training of wheelchair driving skills. In this system, the wheelchair is used as the gaming input device, and users train driving skills by maneuvering through floor-projected games created with a multi-projector system and a multi-camera tracking system. A force feedback joystick assists in steering and enhances safety.
Interface dissolution control of the 14C profile in marine sediment
Keir, R.S.; Michel, R.L.
1993-01-01
The process of carbonate dissolution at the sediment-water interface has two possible endmember boundary conditions: either the carbonate particles dissolve mostly before they are incorporated into the sediment by bioturbation (interface dissolution), or vertical mixing is rapid relative to their extermination rate (homogeneous dissolution). In this study, a detailed radiocarbon profile was determined in deep equatorial Pacific sediment that receives a high rate of carbonate supply. In addition, a box model of sediment mixing was used to simulate the radiocarbon, carbonate-content and excess-thorium profiles that result from either boundary process following a dissolution increase. Results from homogeneous dissolution imply a strong, very recent erosional event, while interface dissolution suggests that moderately increased dissolution began about 10,000 years ago. In order to achieve the observed mixed-layer radiocarbon age, increased homogeneous dissolution would concentrate a greater amount of clay and 230Th than is observed, while for interface dissolution the predicted concentrations are too small. These results, together with small discontinuities beneath the mixed layer in 230Th profiles, suggest a two-stage increase in interface dissolution in the deep Pacific, the first occurring near the beginning of the Holocene and the second more recently, roughly 5000 years ago.
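The mixed-layer radiocarbon age the abstract refers to can be illustrated with the simplest possible box model, a single well-mixed reservoir at steady state; this is a sketch under assumed parameters, not the authors' model:

```python
import math

LAMBDA_14C = math.log(2) / 5730.0  # radiocarbon decay constant, 1/yr

def mixed_layer_age(residence_time_yr):
    """Steady-state radiocarbon age of a well-mixed sediment layer.

    Mass balance for the 14C/12C ratio r with input ratio r_in and
    carbonate rain rate R into a layer of mass M:
        R * r_in = R * r + LAMBDA_14C * M * r
    so r / r_in = 1 / (1 + LAMBDA_14C * tau) with tau = M / R, and the
    apparent age is -ln(r / r_in) / LAMBDA_14C."""
    return math.log(1.0 + LAMBDA_14C * residence_time_yr) / LAMBDA_14C
```

Qualitatively, increased interface dissolution reduces the rain of young carbonate into the mixed layer, lengthening the residence time tau and aging the layer, which is the kind of signature the measured profile is tested against.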
Chromium silicide formation by ion mixing
NASA Technical Reports Server (NTRS)
Shreter, U.; So, F. C. T.; Nicolet, M.-A.
1984-01-01
The formation of CrSi2 by ion mixing was studied as a function of temperature, silicide thickness and irradiated interface. Samples were prepared by annealing evaporated couples of Cr on Si and Si on Cr at 450 C for short times to form Si/CrSi2/Cr sandwiches. Xenon beams with energies up to 300 keV and fluences up to 8 × 10^15 cm^-2 were used for mixing at temperatures between 20 and 300 C. Irradiating only through the Cr/CrSi2 interface at temperatures above 150 C induces further growth of the silicide as a uniform stoichiometric layer. The growth rate does not depend on the thickness of the initially formed silicide, at least up to a thickness of 150 nm. The amount of growth depends linearly on the density of energy deposited at the interface. The growth is temperature dependent, with an apparent activation energy of 0.2 eV. Irradiating only through the Si/CrSi2 interface does not induce silicide growth. It is concluded that the formation of CrSi2 by ion beam mixing is an interface-limited process and that the limiting reaction occurs at the Cr/CrSi2 interface.
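The reported 0.2 eV apparent activation energy implies a modest but real temperature sensitivity, which a quick Arrhenius estimate makes concrete; the endpoint temperatures come from the abstract's range, while the comparison itself is illustrative:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def relative_growth(T_kelvin, Ea_eV=0.2):
    """Arrhenius factor exp(-Ea / kT) governing the thermally activated
    part of the ion-beam-induced silicide growth (Ea = 0.2 eV per the
    abstract)."""
    return math.exp(-Ea_eV / (K_B_EV * T_kelvin))

# Growth at 300 C (573 K) relative to room temperature (293 K):
ratio = relative_growth(573.0) / relative_growth(293.0)
```

Across the 20-300 C window studied, this factor changes by roughly fifty-fold, consistent with calling the process thermally activated but interface-limited rather than diffusion-dominated.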
Graphical user interface concepts for tactical augmented reality
NASA Astrophysics Data System (ADS)
Argenta, Chris; Murphy, Anne; Hinton, Jeremy; Cook, James; Sherrill, Todd; Snarski, Steve
2010-04-01
Applied Research Associates and BAE Systems are working together to develop a wearable augmented reality system under the DARPA ULTRA-Vis program. Our approach to achieving the objectives of ULTRA-Vis, called iLeader, incorporates a full-color 40° field-of-view (FOV) see-through holographic waveguide integrated with sensors for full position and head tracking, providing an unobtrusive information system for operational maneuvers. iLeader will enable warfighters to mark up the 3D battle-space with symbolic identification of graphical control measures, friendly-force positions and enemy/target locations. Our augmented reality display provides dynamic real-time painting of symbols on real objects, a pose-sensitive 360° representation of relevant object positions, and visual feedback for a variety of system activities. The iLeader user interface and situational-awareness graphical representations are highly intuitive, non-disruptive, and always tactically relevant. We used best human-factors practices, system-engineering expertise, and cognitive task analysis to design effective strategies for presenting real-time situational awareness to the military user without distorting their natural senses and perception. We present requirements identified for presenting information within a see-through display in combat environments, challenges in designing suitable visualization capabilities, and solutions that enable us to bring real-time iconic command and control to the tactical user community.
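A pose-sensitive 360° overlay of the kind described reduces, at its core, to computing each object's bearing relative to the wearer's head yaw; the sketch below shows that core step with assumed coordinate conventions, not ULTRA-Vis internals:

```python
import math

def relative_bearing(user_pos, user_heading_deg, target_pos):
    """Bearing of a target relative to the wearer's view direction, in
    [-180, 180) degrees: 0 = dead ahead, +90 = to the right.  Positions
    are (east, north) in meters; heading is compass degrees (0 = north)."""
    de = target_pos[0] - user_pos[0]
    dn = target_pos[1] - user_pos[1]
    absolute = math.degrees(math.atan2(de, dn))  # compass bearing to target
    return (absolute - user_heading_deg + 180.0) % 360.0 - 180.0
```

An icon whose relative bearing falls inside the display's ±20° half-FOV would be painted on the waveguide; anything outside it would appear on the 360° ring instead.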
NASA Astrophysics Data System (ADS)
Dastageeri, H.; Storz, M.; Koukofikis, A.; Knauth, S.; Coors, V.
2016-09-01
Providing mobile location-based information for pedestrians faces many challenges. On the one hand, the accuracy of localisation indoors and outdoors is restricted by technical limitations of GPS and beacons; on the other, only a small display is available for presenting information and building a user interface. In addition, the software has to take the hardware characteristics of mobile devices into account during implementation in order to achieve performance with minimum latency. This paper describes our approach, which combines image tracking with GPS or beacons to ensure orientation and precise localisation. To communicate information on Points of Interest (POIs), we chose Augmented Reality (AR). In this concept of operations, we used not only the display but also the acceleration and position sensors as a user interface. The paper goes into detail on the optimization of the image-tracking algorithms, the development of the video-based AR player for the Android platform, and the evaluation of videos as an AR element with a view to providing a good user experience. For setting up content for the POIs, or even generating a tour, we used and extended the Open Geospatial Consortium (OGC) standard Augmented Reality Markup Language (ARML).
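One common minimal way to combine a coarse GPS/beacon fix with a more precise image-tracking fix is inverse-variance weighting; the sketch below is an illustrative baseline, not the paper's actual fusion algorithm:

```python
def fuse(p_gps, var_gps, p_img, var_img):
    """Inverse-variance weighted fusion of two 1-D position estimates.
    The estimate with the smaller variance (here, image tracking)
    dominates the fused result."""
    w_gps = 1.0 / var_gps
    w_img = 1.0 / var_img
    return (w_gps * p_gps + w_img * p_img) / (w_gps + w_img)
```

With a GPS fix at 10 m (variance 4) and an image-tracking fix at 12 m (variance 1), the fused position lands at 11.6 m, close to the more trusted visual estimate.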
Use of display technologies for augmented reality enhancement
NASA Astrophysics Data System (ADS)
Harding, Kevin
2016-06-01
Augmented reality (AR) is seen as an important tool for the future of user interfaces as well as training applications. An important application area for AR is expected to be the digitization of training and worker instructions in the Brilliant Factory environment. The transition of work-instruction methods from printed pages in a book, or taped to a machine, to virtual simulations is a long step with many challenges along the way. A variety of augmented reality tools are being explored today for industrial applications, ranging from simple programmable projections in the workspace to 3D displays and head-mounted gear. This paper reviews where some of these tools are today and some of the pros and cons being considered for the future worker environment.
Virtual reality as a tool for cross-cultural communication: an example from military team training
NASA Astrophysics Data System (ADS)
Downes-Martin, Stephen; Long, Mark; Alexander, Joanna R.
1992-06-01
A major problem with communication across cultures, whether professional or national, is that simple language translation is often insufficient to communicate the concepts. This is especially true when the communicators come from highly specialized fields of knowledge or from national cultures with long histories of divergence. The problem becomes critical when the goal of the communication is national negotiation dealing with such high-risk items as arms negotiation or trade wars. Virtual reality technology has considerable potential for facilitating communication across cultures, by immersing the communicators within multiple visual representations of the concepts and providing control over those representations. Military distributed team training provides a model of virtual reality suitable for cross-cultural communication such as negotiation. In both team training and negotiation, the participants must cooperate, agree on a set of goals, and achieve mastery over the concepts being negotiated. Team-training technologies suitable for supporting cross-cultural negotiation exist (branch wargaming, computer image generation and visualization, distributed simulation) and have developed along different lines than traditional virtual reality technology. Team training de-emphasizes the realism of physiological interfaces between the human and the virtual reality, and emphasizes the interaction of humans with each other and with intelligent simulated agents within the virtual reality. This approach to virtual reality is suggested as being more fruitful for future work.
HTC Vive MeVisLab integration via OpenVR for medical applications
Egger, Jan; Gall, Markus; Wallner, Jürgen; Boechat, Pedro; Hann, Alexander; Li, Xing; Chen, Xiaojun; Schmalstieg, Dieter
2017-01-01
Virtual Reality, an immersive technology that replicates an environment via computer-simulated reality, gets a lot of attention in the entertainment industry. However, VR also has great potential in other areas, such as the medical domain; examples are intervention planning, training and simulation. This is especially useful for medical operations where an aesthetic outcome is important, such as facial surgeries. Alas, importing medical data into Virtual Reality devices is not necessarily trivial, in particular when a direct connection to a proprietary application is desired. Moreover, most researchers do not build their medical applications from scratch, but rather leverage platforms like MeVisLab, MITK, OsiriX or 3D Slicer. These platforms have in common that they use libraries like ITK and VTK, and provide a convenient graphical interface. However, ITK and VTK do not support Virtual Reality directly. In this study, the usage of a Virtual Reality device for medical data under the MeVisLab platform is presented. The OpenVR library is integrated into the MeVisLab platform, allowing direct and uncomplicated use of the HTC Vive head-mounted display inside MeVisLab. Medical data coming from other MeVisLab modules can be connected directly, per drag-and-drop, to the Virtual Reality module, rendering the data inside the HTC Vive for immersive virtual reality inspection. PMID:28323840
Numerical studies of the effects of jet-induced mixing on liquid-vapor interface condensation
NASA Technical Reports Server (NTRS)
Lin, Chin-Shun
1989-01-01
Numerical solutions of jet-induced mixing in a partially full cryogenic tank are presented. An axisymmetric laminar jet is discharged from the central part of the tank bottom toward the liquid-vapor interface, and liquid is withdrawn at the same volume flow rate from the outer part of the tank. The jet is at a temperature lower than that of the interface, which is maintained at a fixed saturation temperature. The interface is assumed to be flat and shear-free, and the condensation-induced velocity is assumed to be negligibly small compared with the radial interface velocity. A finite-difference method is used to solve the nondimensional form of the steady-state continuity, momentum, and energy equations. Calculations are conducted for jet Reynolds numbers ranging from 150 to 600 and Prandtl numbers ranging from 0.85 to 2.65. The effects of these parameters on the condensation Nusselt and Stanton numbers, which characterize the steady-state interface condensation process, are investigated. A detailed analysis is performed to gain a better understanding of the fundamentals of fluid mixing and interface condensation.
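The two condensation parameters are linked by the standard definition St = Nu / (Re · Pr); the helper below makes the relation explicit, with sample numbers that are arbitrary values inside the abstract's parameter ranges:

```python
def stanton(nu, re, pr):
    """Condensation Stanton number from the Nusselt, Reynolds and Prandtl
    numbers via the standard relation St = Nu / (Re * Pr)."""
    return nu / (re * pr)

# Arbitrary illustrative values inside the ranges studied
# (Re = 150-600, Pr = 0.85-2.65); Nu is hypothetical:
st = stanton(30.0, 600.0, 2.0)
```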
Virtual reality and robotics for stroke rehabilitation: where do we go from here?
Wade, Eric; Winstein, Carolee J
2011-01-01
Promoting functional recovery after stroke requires collaborative and innovative approaches to neurorehabilitation research. Task-oriented training (TOT) approaches that include challenging, adaptable, and meaningful activities have led to successful outcomes in several large-scale multisite definitive trials. This, along with recent technological advances of virtual reality and robotics, provides a fertile environment for furthering clinical research in neurorehabilitation. Both virtual reality and robotics make use of multimodal sensory interfaces to affect human behavior. In the therapeutic setting, these systems can be used to quantitatively monitor, manipulate, and augment the users' interaction with their environment, with the goal of promoting functional recovery. This article describes recent advances in virtual reality and robotics and the synergy with best clinical practice. Additionally, we describe the promise shown for automated assessments and in-home activity-based interventions. Finally, we propose a broader approach to ensuring that technology-based assessment and intervention complement evidence-based practice and maintain a patient-centered perspective.
Vertical variation of mixing within porous sediment beds below turbulent flows
Chandler, I. D.; Pearson, J. M.; van Egmond, R.
2016-01-01
River ecosystems are influenced by contaminants in the water column, in the pore water and adsorbed to sediment particles. When exchange across the sediment-water interface (hyporheic exchange) is included in modeling, the mixing coefficient is often assumed to be constant with depth below the interface. Novel fiber-optic fluorometers have been developed and combined with a modified EROSIMESS system to quantify the vertical variation in the mixing coefficient with depth below the sediment-water interface. The study considered a range of particle diameters and bed shear velocities, with the permeability Péclet number PeK between 1000 and 77,000 and the shear Reynolds number Re* between 5 and 600. Different parameterizations of both an interface exchange coefficient and a spatially variable in-sediment mixing coefficient are explored. The variation of in-sediment mixing is described by an exponential function applicable over the full range of parameter combinations tested. The empirical relationship enables estimates of the depth to which concentrations of pollutants will penetrate into the bed sediment, allowing the region where exchange occurs faster than molecular diffusion to be determined. PMID:27635104
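The exponential depth-dependence reported here can be written down directly; D0 and alpha below are placeholders for illustration, not the study's fitted coefficients:

```python
import math

def mixing_coefficient(z, D0, alpha):
    """In-sediment mixing coefficient decaying exponentially with depth z
    (m) below the sediment-water interface: D(z) = D0 * exp(-alpha * z)."""
    return D0 * math.exp(-alpha * z)

def enhanced_depth(D0, alpha, D_mol):
    """Depth at which mixing falls to the molecular diffusivity D_mol,
    i.e. the extent of the zone where hyporheic exchange beats molecular
    diffusion: solve D0 * exp(-alpha * z) = D_mol for z."""
    return math.log(D0 / D_mol) / alpha
```

With an assumed surface coefficient of 1e-6 m²/s, a decay constant of 100 /m and molecular diffusivity of 1e-9 m²/s, enhanced exchange would extend roughly 7 cm into the bed.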
ERIC Educational Resources Information Center
Perez, Ernest
1997-01-01
Examines the practical realities of upgrading Intel personal computers in libraries, considering budgets and technical personnel availability. Highlights include adding RAM; putting in faster processor chips, including clock multipliers; new hard disks; CD-ROM speed; motherboards and interface cards; cost limits and economic factors; and…
Garretson, Justin R [Albuquerque, NM; Parker, Eric P [Albuquerque, NM; Gladwell, T Scott [Albuquerque, NM; Rigdon, J Brian [Edgewood, NM; Oppel, III, Fred J.
2012-05-29
Apparatus and methods for modifying the operation of a robotic vehicle in a real environment to emulate its operation in a mixed reality environment include a vehicle sensing system having a communications module attached to the robotic vehicle for communicating operating parameters of the robotic vehicle in a real environment to a simulation controller that simulates the operation of the robotic vehicle in a mixed (live, virtual and constructive) environment, wherein the effects of virtual and constructive entities on the operation of the robotic vehicle (and vice versa) are simulated. These effects are communicated to the vehicle sensing system, which generates a modified control command for the robotic vehicle including the effects of virtual and constructive entities, causing the robot in the real environment to behave as if the virtual and constructive entities existed in the real environment.
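The core idea, making a real robot respond to entities that exist only in simulation, can be caricatured in a few lines; the slow-down rule below is a hypothetical stand-in, not the patented controller:

```python
def modified_command(real_speed, virtual_obstacle_range, stop_range=1.0):
    """Scale a robot's commanded speed by its proximity to a *virtual*
    obstacle so the real vehicle behaves as if the simulated entity were
    physically present.  Ranges in meters, speed in m/s."""
    if virtual_obstacle_range <= stop_range:
        return 0.0
    # Linear slow-down inside three times the stop range, full speed beyond.
    scale = min(1.0, (virtual_obstacle_range - stop_range) / (2.0 * stop_range))
    return real_speed * scale
```

In the patented architecture this modification happens in the vehicle sensing system after the simulation controller reports the virtual entity's effect; the rule above merely illustrates what such a modified command could look like.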
Schuster-Amft, Corina; Eng, Kynan; Lehmann, Isabelle; Schmid, Ludwig; Kobashi, Nagisa; Thaler, Irène; Verra, Martin L; Henneke, Andrea; Signer, Sandra; McCaskey, Michael; Kiper, Daniel
2014-09-06
In recent years, virtual reality has been introduced to neurorehabilitation, in particular with the intention of improving upper-limb training options and facilitating motor function recovery. The proposed study incorporates a quantitative part and a qualitative part, termed a mixed-methods approach: (1) a quantitative investigation of the efficacy of virtual reality training compared to conventional therapy for upper-limb motor function; (2a) a qualitative investigation of patients' experiences and expectations of virtual reality training; and (2b) a qualitative investigation of therapists' experiences using the virtual reality training system in the therapy setting. At three participating clinics, 60 patients at least 6 months after stroke onset will be randomly allocated to an experimental virtual reality group (EG) or to a control group that will receive conventional physiotherapy or occupational therapy (16 sessions, 45 minutes each, over the course of 4 weeks). Using custom data gloves, patients' finger and arm movements will be displayed in real time on a monitor, and they will move and manipulate objects in various virtual environments. A blinded assessor will test patients' motor and cognitive performance twice before, once during, and twice after the 4-week intervention. The primary outcome measure is the Box and Block Test. Secondary outcome measures are the Chedoke-McMaster Stroke Assessments (hand, arm and shoulder pain subscales), the Chedoke-McMaster Arm and Hand Activity Inventory, the Line Bisection Test, the Stroke Impact Scale, the Mini-Mental State Examination and the Extended Barthel Index. Semi-structured face-to-face interviews will be conducted with patients in the EG after the intervention, with a focus on the patients' expectations and experiences regarding the virtual reality training.
Therapists' perspectives on the virtual reality training will be reviewed in three focus groups comprising four to six occupational therapists and physiotherapists. The interviews will help to gain a deeper understanding of the phenomena under investigation and to provide sound recommendations for the implementation of the virtual reality training system for routine use in neurorehabilitation, complementing the quantitative clinical assessments. Clinicaltrials.gov identifier: NCT01774669 (15 January 2013).
Schlesinger, Matthew
2015-12-01
The interface theory offers a rich blend of logic and mathematical modeling with a dash of evolutionary story-telling, leading to the conclusion that perceptual experience and physical reality are only loosely related. Is the theory convincing? I would have to say "almost"; although it certainly has many elements working in its favor, ultimately, I also found that some important questions were ignored or left unanswered (e.g., a more fully articulated account of how evolutionary mechanisms operate on perception). I am quite optimistic that the next iteration of the theory will be able to address these issues.
2014-01-01
Background: Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirement for efficacious cycling: specifically, recruitment of both extremities and exercising at a high intensity. Methods: In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called the Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially available stationary bicycle systems and interfaces with a personal computer for simulation and data-acquisition processes. The complete bicycle system includes: (a) handlebars based on hydraulic pressure sensors; (b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; (c) off-the-shelf electronics to monitor heart rate; and (d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. Results: The modular mechatronic kit for exercise bicycles was evaluated in bench and rider tests. Bench tests performed on the sensorized handlebars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the riders' lower extremities.
Conclusions: The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders. PMID:24902780
Projection Mapping User Interface for Disabled People
Gelšvartas, Julius; Simutis, Rimvydas; Maskeliūnas, Rytis
2018-01-01
Difficulty in communicating is one of the key challenges for people suffering from severe motor and speech disabilities. Often such a person can communicate and interact with the environment only using assistive technologies. This paper presents a multifunctional user interface designed to improve communication efficiency and personal independence. The main component of this interface is a projection mapping technique used to highlight objects in the environment. Projection mapping makes it possible to create a natural augmented reality method of presenting information. The user interface combines a depth sensor and a projector to create a camera-projector system. We provide a detailed description of the camera-projector system calibration procedure. The described system performs tabletop object detection and automatic projection mapping. Multiple user input modalities have been integrated into the multifunctional user interface, and the system can be adapted to the needs of people with various disabilities. PMID:29686827
Ambient Intelligence in Multimedia and Virtual Reality Environments for Rehabilitation
NASA Astrophysics Data System (ADS)
Benko, Attila; Cecilia, Sik Lanyi
This chapter presents a general overview of the use of multimedia and virtual reality in rehabilitation and in assistive and preventive healthcare. It deals with AI-based multimedia and virtual reality applications intended for use by medical doctors, nurses, special-education teachers and other interested persons, and describes how multimedia and virtual reality can assist their work, including how these technologies can help patients' everyday lives and rehabilitation. In the second part of the chapter we present the Virtual Therapy Room (VTR), a realized application for aphasic patients that was created for practicing communication and expressing emotions in a group-therapy setting. The VTR shows a room that contains a virtual therapist and four virtual patients (avatars). The avatars use their knowledge base to answer the questions of the user, providing an AI environment for rehabilitation. The user of the VTR is the aphasic patient, who has to solve the exercises. The picture relevant to the current task appears on the virtual blackboard. The patient answers questions from the virtual therapist about pictures describing an activity or an object at different levels, and can ask an avatar for the answer. If the avatar knows the answer, the avatar's emotion changes to happy instead of sad. An avatar expresses its emotions in several dimensions: its behavior, facial mimicry, voice tone and responses all change. The emotion system can be described as a deterministic finite automaton whose states are emotion states and whose transition function is derived from the input-response reactions of an avatar. Natural language processing techniques were also implemented in order to establish high-quality human-computer interface windows for each of the avatars. Aphasic patients are able to interact with avatars via these interfaces.
At the end of the chapter we outline possible directions for future research.
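The emotion system described above as a deterministic finite automaton can be sketched in code. The following is a hypothetical illustration, not the authors' implementation: the state names ("neutral", "happy", "sad") and the events ("knows", "unknown") are assumptions standing in for the abstract's emotion-states and input-response reactions.

```python
# Hypothetical sketch of the VTR avatar emotion system as a DFA:
# states are emotions; transitions are driven by whether the avatar
# knew the answer ("knows") or did not ("unknown").

# Transition function delta: (emotion_state, event) -> emotion_state
TRANSITIONS = {
    ("neutral", "knows"): "happy",
    ("neutral", "unknown"): "sad",
    ("sad", "knows"): "happy",      # a known answer flips sad -> happy
    ("sad", "unknown"): "sad",
    ("happy", "knows"): "happy",
    ("happy", "unknown"): "sad",
}

def step(state: str, event: str) -> str:
    """Advance the avatar's emotion DFA by one input-response event."""
    return TRANSITIONS[(state, event)]

def run(events, start="neutral"):
    """Fold a sequence of events through the automaton."""
    state = start
    for ev in events:
        state = step(state, ev)
    return state
```

In a full system each transition would also trigger the corresponding change in behavior, facial mimicry, and voice tone.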
Zhu, Huaping; Sun, Yaoru; Zeng, Jinhua; Sun, Hongyu
2011-05-01
Previous studies have suggested that dysfunction of the human mirror neuron system (hMNS) plays an important role in autism spectrum disorder (ASD). In this work, we propose a novel training program, drawn from our interdisciplinary research, to improve the mirror neuron functions of autistic individuals using a BCI system combined with virtual reality technology. It is a promising approach for autistic individuals to learn and develop social communication in a VR environment. A method for testing this hypothesis is also provided. Copyright © 2011 Elsevier Ltd. All rights reserved.
Sensor fusion and augmented reality with the SAFIRE system
NASA Astrophysics Data System (ADS)
Saponaro, Philip; Treible, Wayne; Phelan, Brian; Sherbondy, Kelly; Kambhamettu, Chandra
2018-04-01
The Spectrally Agile Frequency-Incrementing Reconfigurable (SAFIRE) mobile radar system was developed and exercised at an arid U.S. test site. The system can detect hidden targets using radar, a global positioning system (GPS), dual stereo color cameras, and dual stereo thermal cameras. An Augmented Reality (AR) software interface allows the user to see a single fused video stream containing the SAR, color, and thermal imagery. The stereo sensors allow the AR system to display both fused 2D imagery and 3D metric reconstructions, in which the user can "fly" around the 3D model and switch between the modalities.
Interfaces for Advanced Computing.
ERIC Educational Resources Information Center
Foley, James D.
1987-01-01
Discusses the coming generation of supercomputers that will have the power to make elaborate "artificial realities" that facilitate user-computer communication. Illustrates these technological advancements with examples of the use of head-mounted monitors which are connected to position and orientation sensors, and gloves that track finger and…
Fernández-Caballero, Antonio; Navarro, Elena; Fernández-Sotos, Patricia; González, Pascual; Ricarte, Jorge J.; Latorre, José M.; Rodriguez-Jimenez, Roberto
2017-01-01
This perspective paper addresses the future of alternative treatments that take a social and cognitive approach, alongside pharmacological therapy, to auditory verbal hallucinations (AVH) in patients with schizophrenia. AVH, the perception of voices in the absence of auditory stimulation, are a severe mental health symptom. Virtual/augmented reality (VR/AR) and brain-computer interfaces (BCI) are technologies seeing rapid growth across medical and psychological applications. Our position is that their combined use in computer-based therapies offers still-unforeseen possibilities for the treatment of physical and mental disabilities. The paper therefore anticipates that researchers and clinicians will follow a pathway toward human-avatar symbiosis for AVH by taking full advantage of these new technologies. This outlook entails addressing challenging issues in the understanding of non-pharmacological treatment of schizophrenia-related disorders and in the exploitation of VR/AR and BCI to achieve a real human-avatar symbiosis. PMID:29209193
Brain-computer interface using P300 and virtual reality: a gaming approach for treating ADHD.
Rohani, Darius Adam; Sorensen, Helge B D; Puthusserypady, Sadasivan
2014-01-01
This paper presents a novel brain-computer interface (BCI) system aimed at the rehabilitation of attention-deficit/hyperactivity disorder in children. It uses the P300 potential in a series of feedback games to improve the subjects' attention. We applied a support vector machine (SVM) using temporal and template-based features to detect these P300 responses. In an experimental setup with five subjects, an average error below 30% was achieved. To make training more challenging, the BCI system was embedded in an immersive 3D virtual reality (VR) classroom with simulated distractions, created by combining a low-cost infrared camera with an "off-axis perspective projection" algorithm. The system is suited to children, operating with only four electrodes and a non-intrusive VR setup. Given the promising results and the simplicity of the scheme, we hope to encourage future studies to adopt the techniques presented here.
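The template-based feature idea behind P300 detection can be illustrated in a few lines. The following is a simplified sketch, not the authors' pipeline: the paper classifies temporal and template features with an SVM, whereas here a plain correlation threshold against an averaged P300 template stands in, and the toy signals are invented.

```python
# Sketch of template-based P300 detection: correlate an EEG epoch with
# an averaged P300 template and classify by thresholding the Pearson
# correlation. (The actual study fed such features into an SVM.)
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def detect_p300(epoch, template, threshold=0.5):
    """Return True if the epoch matches the P300 template."""
    return pearson(epoch, template) >= threshold

# Toy template: positive deflection around 300 ms post-stimulus.
template = [0, 0, 1, 3, 6, 3, 1, 0, 0]
target = [0.1, -0.2, 0.9, 2.8, 6.2, 3.1, 1.2, 0.0, -0.1]       # P300 present
nontarget = [0.5, -0.4, 0.2, -0.1, 0.3, -0.2, 0.1, 0.4, -0.3]  # noise only
```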
NASA Astrophysics Data System (ADS)
Pieper, Steven D.; McKenna, Michael; Chen, David; McDowall, Ian E.
1994-04-01
We are interested in the application of computer animation to surgery. Our current project, a navigation and visualization tool for knee arthroscopy, relies on real-time computer graphics and the human-interface technologies associated with virtual reality. We believe that this new combination of techniques will lead to improved surgical outcomes and decreased health care costs. To meet these expectations in the medical field, the system must be safe, usable, and cost-effective. In this paper, we outline some of the most important hardware and software specifications in the areas of video input and output, spatial tracking, stereoscopic displays, computer graphics models and libraries, mass storage and network interfaces, and operating systems. Since this is a fairly new combination of technologies and a new application, the justification for our specifications is drawn from the current generation of surgical technology and by analogy to other fields where virtual reality technology has been more extensively applied and studied.
The Dynamics of Turbulent Scalar Mixing near the Edge of a Shear Layer
NASA Astrophysics Data System (ADS)
Taveira, R. M. R.; da Silva, C. B.; Pereira, J. C. F.
2011-12-01
In free shear flows a sharp and convoluted turbulent/nonturbulent (T/NT) interface separates the outer fluid region, where the flow is essentially irrotational, from the turbulent shear-layer region. It was found recently that the entrainment mechanism is mainly caused by small-scale ("nibbling") motions (Westerweel et al. (2005)). The dynamics of this interface is crucial to understanding the important exchanges of enstrophy and scalars that can be conceived as a three-stage process of entrainment, dispersion, and diffusion (Dimotakis (2005)). A thorough understanding of scalar mixing and transport is of indisputable relevance to controlling turbulent combustion, propulsion, and contaminant dispersion (Stanley et al. (2002)). The present work uses several DNS of turbulent jets at Reynolds numbers ranging from Reλ = 120 to Reλ = 160 (da Silva & Taveira (2010)) and a Schmidt number Sc = 0.7 to analyze the "scalar interface" and the turbulent mixing of a passive scalar. Specifically, we employ conditional statistics, denoted ⟨⟩I, in order to eliminate the intermittency that affects statistics close to the jet edge. The physical mechanisms behind scalar mixing near the T/NT interface, their scales, and their topology are investigated in detail. Analysis of the instantaneous fields showed intense sheet-like scalar-gradient structures along regions of persistent strain, in particular at the T/NT interface. The scalar-gradient transport equation at the jet edge showed that almost all mixing mechanisms take place in a confined region, beyond which they reduce to an almost perfect balance between production and dissipation of scalar variance. At the T/NT interface, transport mechanisms are responsible for the growth of scalar fluctuations in the entrained fluid, where convection plays a dominant role, smoothing scalar gradients inside the interface and boosting them as far as
Challenges to the development of complex virtual reality surgical simulations.
Seymour, N E; Røtnes, J S
2006-11-01
Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.
NASA Astrophysics Data System (ADS)
Kal, S.; Kasko, I.; Ryssel, H.
1995-10-01
The influence of ion-beam mixing on ultra-thin cobalt silicide (CoSi2) formation was investigated by characterizing ion-beam mixed and unmixed CoSi2 films. A Ge+ ion implantation through the Co film prior to silicidation causes interface mixing of the cobalt film with the silicon substrate and results in improved silicide-to-silicon interface roughness. Rapid thermal annealing was used to form Ge+ ion-mixed and unmixed thin CoSi2 layers from a 10 nm sputter-deposited Co film. The silicide films were characterized by secondary neutral mass spectroscopy, x-ray diffraction, transmission electron microscopy (TEM), Rutherford backscattering, and sheet resistance measurements. The experimental results indicate that the final rapid thermal annealing temperature should not exceed 800°C for thin (<50 nm) CoSi2 preparation. A comparison of the plan-view and cross-section TEM micrographs of the ion-beam mixed and unmixed CoSi2 films reveals that Ge+ ion mixing (45 keV, 1 × 1015 cm-2) produces a homogeneous silicide with a smooth silicide-to-silicon interface.
NASA Technical Reports Server (NTRS)
Dumas, Joseph D., II
1998-01-01
Several virtual reality I/O peripherals were successfully configured and integrated as part of the author's 1997 Summer Faculty Fellowship work. These devices, which were not supported by the developers of VR software packages, use new software drivers and configuration files developed by the author to allow them to be used with simulations developed using those software packages. The successful integration of these devices has added significant capability to the ANVIL lab at MSFC. In addition, the author was able to complete the integration of a networked virtual reality simulation of the Space Shuttle Remote Manipulator System docking Space Station modules which was begun as part of his 1996 Fellowship. The successful integration of this simulation demonstrates the feasibility of using VR technology for ground-based training as well as on-orbit operations.
Exercise/recreation facility for a Lunar or Mars analog
NASA Technical Reports Server (NTRS)
1991-01-01
Discussed here is a project to design an exercise/recreation station for an Earth-based simulator of a lunar or Martian habitat. Specifically, researchers designed a stationary bicycle that will help people keep fit and prevent muscular atrophy while stationed in space. To help with motivation and provide an element of recreation during the workout, the bicycle is enhanced by a virtual reality system that simulates various riding situations and offers a choice of mountain bike or road bike. The bike employs a magnetic brake that provides continuously changing tension to simulate actual riding conditions; this braking system is interfaced directly with the virtual reality system. Also integrated into the virtual reality system is a monitoring system that tracks heart rate, work rate, and other functions during the course of the session.
ERIC Educational Resources Information Center
Smith, Thomas M.; Cannata, Marisa; Haynes, Katherine Taylor
2016-01-01
Background/Context: Mixed methods research conveys multiple advantages to the study of complex phenomena and large organizations or systems. The benefits are derived from drawing on the strengths of qualitative methods to answer questions about how and why a phenomenon occurs and those of quantitative methods to examine how often a phenomenon…
ERIC Educational Resources Information Center
Henderson, Joyce Herod
2013-01-01
Our schools are considered a place of safety for learning, however, the unfortunate reality is that schools may face crises and violence. Leadership styles vary among school leaders and provide the framework for handling daily challenges. This mixed-methods research design was used to investigate the individual leadership styles of public school…
Reality check: Shedding new light on the restoration needs of mixed-conifer forests
Marie Oliver; Thomas Spies; Andrew. Merschel
2014-01-01
Until recently, scientific understanding of the history and ecology of the Pacific Northwest's mixed-conifer forests east of the Cascade Range was minimal. As a result, forest managers have had limited ability to restore the health of publicly owned forests that show signs of acute stress caused by insects, disease, grazing, logging, and wildfire. A...
Plasma kinetic effects on atomistic mix in one dimension and at structured interfaces (I)
NASA Astrophysics Data System (ADS)
Yin, L.; Albright, B. J.; Vold, E. L.; Taitano, W.; Chacon, L.; Simakov, A.
2017-10-01
Kinetic effects on interfacial mix are examined using VPIC simulations. In 1D, comparisons are made to the results of analytic theory in the small Knudsen number limit. While the bulk mixing properties of interfaces are in general agreement, differences arise near the low-concentration fronts during the early evolution of a sharp interface when the species' perpendicular scattering rate dominates over the slowing-down rate. In kinetic simulations, the diffusion velocities can be larger than or comparable to the ion thermal speeds, and the Knudsen number can be large. Super-diffusive growth in mix widths (Δx ∝ t^a with a ≥ 1/2) is seen before the transition to the slow diffusive process predicted by theory (a = 1/2). Mixing at interfaces leads to persistent, bulk, hydrodynamic features in the center-of-mass flow profiles as a result of diffusion and momentum conservation. These conclusions are drawn from VPIC results together with simulations from the RAGE hydrodynamics code with an implementation of diffusion and viscosity from theory and an implicit Vlasov-Fokker-Planck code, iFP. For perturbed 2D and 3D interfaces, it is found that 1D ambipolarity still holds and that initial perturbations flatten out on a few-ps time scale, implying that finite diffusivity and viscosity can slow instability growth in ICF and HED settings. Work supported by the LANL ASC and Science programs.
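The distinction between super-diffusive (a > 1/2) and diffusive (a = 1/2) mix-width growth comes down to the exponent of a power law Δx ∝ t^a, which can be estimated by a least-squares fit in log-log space. The sketch below is an assumed post-processing step, not the VPIC analysis itself; the data are synthetic.

```python
# Estimate the mix-width growth exponent a in Δx ∝ t^a by fitting a
# straight line to log(width) vs log(time); the slope is a.
import math

def growth_exponent(times, widths):
    """Least-squares slope of log(width) against log(time)."""
    xs = [math.log(t) for t in times]
    ys = [math.log(w) for w in widths]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic super-diffusive early phase with a = 0.7:
times = [1, 2, 4, 8, 16]
widths = [2.0 * t ** 0.7 for t in times]
```

A fitted exponent above 1/2 over the early window, relaxing to 1/2 later, would reproduce the transition described in the abstract.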
Decohesion Elements using Two and Three-Parameter Mixed-Mode Criteria
NASA Technical Reports Server (NTRS)
Davila, Carlos G.; Camanho, Pedro P.
2001-01-01
An eight-node decohesion element implementing different criteria to predict delamination growth under mixed-mode loading is proposed. The element is used at the interface between solid finite elements to model the initiation and propagation of delamination. A single displacement-based damage parameter is used in a softening law to track the damage state of the interface. The power-law criterion and a three-parameter mixed-mode criterion are used to predict delamination growth. The accuracy of the predictions is evaluated in single-mode delamination tests and in mixed-mode bending tests.
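The power-law criterion mentioned above is commonly written in the following standard form; this is a sketch of the usual convention, since the abstract does not give the exponents or the exact three-parameter variant used by the authors.

```latex
% Power-law mixed-mode propagation criterion: delamination is predicted
% to grow when the combined energy-release-rate ratio reaches unity.
\left(\frac{G_{I}}{G_{Ic}}\right)^{\alpha}
  + \left(\frac{G_{II}}{G_{IIc}}\right)^{\alpha} = 1
```

Here G_I and G_II are the mode I and mode II energy release rates, G_Ic and G_IIc their critical values, and α a fitted exponent.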
Zarka, David; Cevallos, Carlos; Petieau, Mathieu; Hoellinger, Thomas; Dan, Bernard; Cheron, Guy
2014-01-01
Biological motion observation has been recognized to produce dynamic change in sensorimotor activation according to the observed kinematics. The physical plausibility of the spatial-kinematic relationship of human movement may play a major role in the top-down processing of human motion recognition. Here, we investigated the time course of scalp activation during observation of human gait, with a view to future use in an integrated brain-computer interface based on virtual reality (VR). We analyzed event-related potentials (ERP), event-related spectral perturbation (ERSP), and inter-trial coherence (ITC) from high-density EEG recordings around video display onset (−200–600 ms), and the steady-state visual evoked potentials (SSVEP) during a 3D animation of human walking, in three conditions: Normal; Upside-down (inverted images); and Uncoordinated (pseudo-randomly mixed images). We found that the early visual evoked response P120 was decreased in the Upside-down condition. The N170 and P300b amplitudes were decreased in the Uncoordinated condition. In the Upside-down and Uncoordinated conditions, we found decreased alpha power and theta phase-locking. As regards gamma oscillation, power was increased during the Upside-down animation and decreased during the Uncoordinated animation. An SSVEP-like response oscillating at about 10 Hz was also described, showing that the oscillating pattern is enhanced 300 ms after the heel-strike event in the Normal but not in the Upside-down condition. Our results are consistent with most previous point-light display studies, further supporting the possible use of virtual reality for neurofeedback applications. PMID:25278847
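Of the measures analyzed above, inter-trial coherence (ITC) has a particularly compact definition: the magnitude of the mean unit phase vector across trials at a given time-frequency point. The sketch below assumes the phases have already been extracted by a prior time-frequency decomposition (e.g., wavelets), which is not shown.

```python
# Inter-trial coherence: ITC = |mean over trials of exp(i*phase)|.
# ITC is 1 for perfectly phase-locked trials and near 0 for random phases.
import cmath
import math

def itc(phases):
    """ITC of a list of phase angles (radians), one per trial."""
    n = len(phases)
    return abs(sum(cmath.exp(1j * p) for p in phases) / n)

# Perfectly phase-locked trials ...
locked = [0.8] * 50
# ... versus phases spread uniformly around the circle.
spread = [2 * math.pi * k / 50 for k in range(50)]
```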
Mechanically Compliant Electronic Materials for Wearable Photovoltaics and Human-Machine Interfaces
NASA Astrophysics Data System (ADS)
O'Connor, Timothy Francis, III
Applications of stretchable electronic materials for human-machine interfaces are described herein. Intrinsically stretchable organic conjugated polymers and stretchable electronic composites were used to develop stretchable organic photovoltaics (OPVs), mechanically robust wearable OPVs, and human-machine interfaces for gesture recognition, American Sign Language translation, haptic control of robots, and touch emulation for virtual reality, augmented reality, and the transmission of touch. The stretchable and wearable OPVs comprise active layers of poly-3-alkylthiophene:phenyl-C61-butyric acid methyl ester (P3AT:PCBM) and transparent conductive electrodes of poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) (PEDOT:PSS); these devices could only be fabricated through a deep understanding of the connection between molecular structure and the co-engineering of electronic performance with mechanical resilience. The talk concludes with the use of composite piezoresistive sensors in two smart-glove prototypes. The first integrates stretchable strain sensors comprising a carbon-elastomer composite, a wearable microcontroller, low-energy Bluetooth, and a 6-axis accelerometer/gyroscope to construct a fully functional gesture-recognition glove capable of wirelessly translating American Sign Language to text on a cell phone screen. The second creates a system for the haptic control of a 3D-printed robot arm, as well as the transmission of touch and temperature information.
Generating Contextual Descriptions of Virtual Reality (VR) Spaces
NASA Astrophysics Data System (ADS)
Olson, D. M.; Zaman, C. H.; Sutherland, A.
2017-12-01
Virtual reality holds great potential for science communication, education, and research. However, interfaces for manipulating data and environments in virtual worlds are limited and idiosyncratic. Furthermore, speech and vision are the primary modalities by which humans collect information about the world, but the linking of visual and natural language domains is a relatively new pursuit in computer vision. Machine learning techniques have been shown to be effective at image and speech classification, as well as at describing images with language (Karpathy 2016), but have not yet been used to describe potential actions. We propose a technique for creating a library of possible context-specific actions associated with 3D objects in immersive virtual worlds based on a novel dataset generated natively in virtual reality containing speech, image, gaze, and acceleration data. We will discuss the design and execution of a user study in virtual reality that enabled the collection and the development of this dataset. We will also discuss the development of a hybrid machine learning algorithm linking vision data with environmental affordances in natural language. Our findings demonstrate that it is possible to develop a model which can generate interpretable verbal descriptions of possible actions associated with recognized 3D objects within immersive VR environments. This suggests promising applications for more intuitive user interfaces through voice interaction within 3D environments. It also demonstrates the potential to apply vast bodies of embodied and semantic knowledge to enrich user interaction within VR environments. This technology would allow for applications such as expert knowledge annotation of 3D environments, complex verbal data querying and object manipulation in virtual spaces, and computer-generated, dynamic 3D object affordances and functionality during simulations.
SU-C-209-06: Improving X-Ray Imaging with Computer Vision and Augmented Reality
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacDougall, R.D.; Scherrer, B; Don, S
Purpose: To determine the feasibility of using a computer vision algorithm and augmented reality interface to reduce repeat rates and improve consistency of image quality and patient exposure in general radiography. Methods: A prototype device, designed for use with commercially available hardware (Microsoft Kinect 2.0) capable of depth sensing and high resolution/frame rate video, was mounted to the x-ray tube housing as part of a Philips DigitalDiagnost digital radiography room. Depth data and video was streamed to a Windows 10 PC. Proprietary software created an augmented reality interface where overlays displayed selectable information projected over real-time video of the patient. The information displayed prior to and during x-ray acquisition included: recognition and position of ordered body part, position of image receptor, thickness of anatomy, location of AEC cells, collimated x-ray field, degree of patient motion and suggested x-ray technique. Pre-clinical data was collected in a volunteer study to validate patient thickness measurements and x-ray images were not acquired. Results: Proprietary software correctly identified ordered body part, measured patient motion, and calculated thickness of anatomy. Pre-clinical data demonstrated accuracy and precision of body part thickness measurement when compared with other methods (e.g. laser measurement tool). Thickness measurements provided the basis for developing a database of thickness-based technique charts that can be automatically displayed to the technologist. Conclusion: The utilization of computer vision and commercial hardware to create an augmented reality view of the patient and imaging equipment has the potential to drastically improve the quality and safety of x-ray imaging by reducing repeats and optimizing technique based on patient thickness. Society of Pediatric Radiology Pilot Grant; Washington University Bear Cub Fund.
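The anatomy-thickness measurement that anchors the proposed technique charts can be reduced to a simple geometric idea: with a depth camera mounted on the tube, thickness along the central ray is the depth to the table/receptor minus the depth to the patient's surface. The function below is a hypothetical illustration of that idea only; the names, units, and values are invented and the prototype's proprietary algorithm is not described in the abstract.

```python
# Hypothetical sketch: estimate patient thickness along the central ray
# from two depth-camera readings taken from the tube-mounted sensor.
def patient_thickness_cm(depth_to_table_cm, depth_to_patient_cm):
    """Anatomy thickness along the beam axis, in centimeters."""
    thickness = depth_to_table_cm - depth_to_patient_cm
    if thickness < 0:
        raise ValueError("patient surface cannot be behind the table")
    return thickness
```

A table of such thickness values could then index a technique chart (kVp/mAs) for display to the technologist.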
Development and application of virtual reality for man/systems integration
NASA Technical Reports Server (NTRS)
Brown, Marcus
1991-01-01
While the graphical presentation of computer models signified a quantum leap over presentations limited to text and numbers, it still presents an interface barrier between the human user and the computer model. The user must learn a command language in order to orient themselves in the model. For example, to move left from the current viewpoint of the model, they might be required to type 'LEFT' at a keyboard. This command is fairly intuitive, but if the viewpoint moves far enough that no visual cues overlap with the first view, the user does not know whether the viewpoint has moved inches, feet, or miles to the left, or has perhaps remained in the same position but rotated to the left. Until the user becomes quite familiar with the interface language of the computer model presentation, they will be prone to losing their bearings frequently; even a highly skilled user will occasionally get lost in the model. A new approach to presenting this type of information is to directly interpret the user's body motions as the input language for determining what view to present. When the user's head turns 45 degrees to the left, the viewpoint should be rotated 45 degrees to the left. Since the head moves through several intermediate angles between the original view and the final one, several intermediate views should be presented, providing the user with a sense of continuity between the original view and the final one. Since the hands are the primary way a human physically interacts with their environment, the system should also monitor the movements of the user's hands and alter objects in the virtual model in a way consistent with how an actual object would move when manipulated by the same hand movements. Because this approach to the man-computer interface closely models the interface humans have with the physical world, it is often called virtual reality, and the model is referred to as a virtual world. 
The task of this summer fellowship was to set up a virtual reality system at MSFC and begin applying it to some of the questions which concern scientists and engineers involved in space flight. A brief discussion of this work is presented.
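The head-tracked viewpoint rule described above (head turns θ degrees, view rotates θ degrees, with intermediate frames for continuity) reduces to a rotation of the view direction. This is a minimal sketch of the idea, not the MSFC system's code; a real renderer would apply the full 3D head orientation to a view matrix.

```python
# Rotate a 2D (x, z) view direction by the tracked head yaw, so the
# rendered view turns exactly as far as the user's head does.
import math

def rotate_view(direction, yaw_degrees):
    """Rotate the (x, z) view direction left by yaw_degrees."""
    t = math.radians(yaw_degrees)
    x, z = direction
    return (x * math.cos(t) - z * math.sin(t),
            x * math.sin(t) + z * math.cos(t))

# Rendering several small increments (e.g., 5-degree steps) between the
# start and end angles supplies the intermediate views mentioned above.
```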
Exploratory Application of Augmented Reality/Mixed Reality Devices for Acute Care Procedure Training
Kobayashi, Leo; Zhang, Xiao Chi; Collins, Scott A.; Karim, Naz; Merck, Derek L.
2018-01-01
Introduction Augmented reality (AR), mixed reality (MR), and virtual reality devices are enabling technologies that may facilitate effective communication in healthcare between those with information and knowledge (clinician/specialist; expert; educator) and those seeking understanding and insight (patient/family; non-expert; learner). Investigators initiated an exploratory program to enable the study of AR/MR use-cases in acute care clinical and instructional settings. Methods Academic clinician educators, computer scientists, and diagnostic imaging specialists conducted a proof-of-concept project to 1) implement a core holoimaging pipeline infrastructure and open-access repository at the study institution, and 2) use novel AR/MR techniques on off-the-shelf devices with holoimages generated by the infrastructure to demonstrate their potential role in the instructive communication of complex medical information. Results The study team successfully developed a medical holoimaging infrastructure methodology to identify, retrieve, and manipulate real patients’ de-identified computed tomography and magnetic resonance imagesets for rendering, packaging, transfer, and display of modular holoimages onto AR/MR headset devices and connected displays. Holoimages containing key segmentations of cervical and thoracic anatomic structures and pathology were overlaid and registered onto physical task trainers for simulation-based “blind insertion” invasive procedural training. During the session, learners experienced and used task-relevant anatomic holoimages for central venous catheter and tube thoracostomy insertion training with enhanced visual cues and haptic feedback. Direct instructor access into the learner’s AR/MR headset view of the task trainer was achieved for visual-axis interactive instructional guidance. 
Conclusion Investigators implemented a core holoimaging pipeline infrastructure and modular open-access repository to generate and enable access to modular holoimages during exploratory pilot stage applications for invasive procedure training that featured innovative AR/MR techniques on off-the-shelf headset devices. PMID:29383074
NASA Astrophysics Data System (ADS)
Su, Xiaotao; Garofalini, Stephen H.
2005-06-01
Molecular-dynamics simulations of intergranular films (IGF) containing Si, O, N, and Ca in contact with Si3N4 surfaces containing different levels of interface mixing of the species from the IGF with the crystal surfaces were performed using a multibody interatomic potential. This mixing is equivalent to the formation of a roughened silicon oxynitride crystal surface. With significant interphase mixing at the crystal surfaces, less ordering is induced in the IGF by the compositionally modified oxynitride interfaces. These results are in contrast to our earlier data, which showed significant ordering induced in the IGF by ideally terminated crystal surfaces with no interphase mixing. In all cases, the central position of the first peak in the Si-O pair distribution function (PDF) at the interface ranges from 1.62 to 1.64 Å, consistent with recent experimental findings. The central position of the first peak in the Si-N PDF ranges from 1.72 to 1.73 Å, also consistent with experimental results. With increased interphase mixing, the intensity as well as the area of the first peak of the Si-O and Si-N PDFs for Si attached to the crystal decreases, indicating a decrease in the coordination number of O or N with these silicon atoms. Such a combined decrease in coordination indicates a significant remnant of vacancies in the crystal surfaces due to the exchange process used here. The results imply a significant effect of interface roughness on the extent of ordering in the amorphous IGF induced by the crystal surface.
NASA Astrophysics Data System (ADS)
Zhu, Guo; Sun, Jiangping; Zhang, Libin; Gan, Zhiyin
2018-06-01
The effects of temperature on the growth of Cu thin films on Si (0 0 1) during magnetron sputtering deposition were systematically studied using the molecular dynamics (MD) method. To improve the comparability of simulation results at varying temperatures, the initial states of the incident Cu atoms used in all simulations were read from an identical file via the LAMMPS-Python interface. In particular, the crystalline microstructure, interface mixing, and internal stress of Cu thin films deposited at different temperatures were investigated in detail. As the substrate temperature was raised, the interspecies mixed volume and the proportion of face-centered cubic (fcc) structure in the deposited film both increased, while the internal compressive stress decreased. The fcc structure in the deposited Cu thin films was found to be 〈1 1 1〉 oriented, which is reasonably explained by surface energy minimization and the selectivity of bombardment energy with respect to the crystalline planes. Quantitative analysis of interface mixing revealed that the diffusion of Cu atoms dominated the mixing, and that the injection of incident Cu atoms densified the phase near the film-substrate interface. More importantly, the distribution of atomic stress indicated that the compressive stress originated mainly from the film-substrate interface, which may be attributed to the densification of the interfacial phase at the initial stage of film deposition.
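The reproducibility trick described here, reading identical incident-atom states in every run so that only the substrate temperature differs, can be sketched as follows. This is a standalone illustration, not the authors' LAMMPS-Python code; the deposition area, starting height, and velocity distribution are invented placeholders.

```python
import numpy as np

def make_incident_atoms(n, seed=2018):
    """Sample incident Cu atom states ONCE; every run then reuses the file."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.0, 40.0, size=(n, 3))            # lateral positions (Å)
    pos[:, 2] = 60.0                                     # start above the substrate
    vel = np.zeros((n, 3))
    vel[:, 2] = -np.abs(rng.normal(25.0, 2.0, size=n))   # downward speeds (Å/ps)
    return pos, vel

def save_incident_atoms(path, pos, vel):
    """Write the shared initial-state file consumed by every temperature run."""
    np.savez(path, pos=pos, vel=vel)

def load_incident_atoms(path):
    """Each simulation, regardless of substrate temperature, reads this file."""
    data = np.load(path)
    return data["pos"], data["vel"]
```

Each temperature run would call `load_incident_atoms` on the same file before inserting atoms, so the incident flux is bit-identical across simulations and only the substrate thermostat varies.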
Cyber entertainment system using an immersive networked virtual environment
NASA Astrophysics Data System (ADS)
Ihara, Masayuki; Honda, Shinkuro; Kobayashi, Minoru; Ishibashi, Satoshi
2002-05-01
The authors are examining a cyber entertainment system that applies IPT (Immersive Projection Technology) displays to the entertainment field. The system enables users in remote locations to communicate with each other so that they feel as if they are together, and gives them a high degree of presence through stereoscopic vision, a haptic interface, and stereo sound. This paper introduces the system from the viewpoint of space sharing across the network and elucidates its operation using the theme of golf. The system is developed by integrating avatar control, an I/O device, communication links, virtual interaction, mixed reality, and physical simulations. Pairs of these environments are connected across the network, allowing two players to compete. An avatar of each player is displayed on the other player's IPT display in the remote location and is driven by only two magnetic sensors. That is, in the proposed system, users need not wear a data suit with many sensors and can play golf unencumbered.
GLIMPSE: Google Glass interface for sensory feedback in myoelectric hand prostheses.
Markovic, Marko; Karnal, Hemanth; Graimann, Bernhard; Farina, Dario; Dosen, Strahinja
2017-06-01
Providing sensory feedback to the user of a prosthesis is an important challenge. The common approach is tactile stimulation, which is easy to implement but requires training and has limited information bandwidth. In this study, we propose an alternative approach based on augmented reality. We have developed the GLIMPSE, a Google Glass application which connects to the prosthesis via a Bluetooth interface and renders the prosthesis states (EMG signals, aperture, force and contact) using augmented reality (see-through display) and sound (bone conduction transducer). The interface was tested in healthy subjects who used the prosthesis with (FB group) and without (NFB group) feedback during a modified clothespins test that allowed us to vary the difficulty of the task. The outcome measures were the number of unsuccessful trials, the time to accomplish the task, and subjective ratings of the relevance of the feedback. There was no difference in performance between the FB and NFB groups in the simple task (basic, same-color clothespins test), but the feedback significantly improved performance in a more complex task (pins of different resistances). Importantly, the GLIMPSE feedback did not increase the time to accomplish the task. Therefore, supplemental feedback might be most useful in tasks that are more demanding, and thereby less likely to benefit from learning and feedforward control. The subjects integrated the supplemental feedback with the intrinsic sources (vision and muscle proprioception), developing their own idiosyncratic strategies to accomplish the task. The present study demonstrates a novel self-contained, ready-to-deploy, wearable feedback interface. The interface was successfully tested and proven to be feasible and functionally beneficial. The GLIMPSE can be used as a practical solution but also as a general and flexible instrument to investigate closed-loop prosthesis control.
Electronic effects and fundamental physics studied in molecular interfaces.
Pope, Thomas; Du, Shixuan; Gao, Hong-Jun; Hofer, Werner A
2018-05-29
Scanning probe instruments, in conjunction with very low temperature environments, have revolutionized our ability to build, functionalize, and analyse two-dimensional interfaces over the last twenty years. In addition, the availability of fast, reliable, and increasingly sophisticated methods to simulate the structure and dynamics of these interfaces allows us to capture even very small effects at the atomic and molecular level. In this review we focus largely on metal surfaces and organic molecular compounds and show that building systems from the bottom up and controlling their physical properties is no longer merely desirable, but has become day-to-day reality in our best laboratories.
The flotation and adsorption of mixed collectors on oxide and silicate minerals.
Xu, Longhua; Tian, Jia; Wu, Houqin; Lu, Zhongyuan; Sun, Wei; Hu, Yuehua
2017-12-01
The analysis of the flotation and adsorption of mixed collectors on oxide and silicate minerals is of great importance for both industrial applications and theoretical research. In recent years, significant progress has been achieved in understanding the adsorption of single collectors in micelles as well as at interfaces. By contrast, the self-assembly of mixed collectors at liquid/air and solid/liquid interfaces remains a developing area, owing to the complexity of the mixed systems involved and the limited availability of suitable analytical techniques. In this work, we systematically review the processes involved in the adsorption of mixed collectors onto micelles and at interfaces by examining four specific points, namely, theoretical background, factors that affect adsorption, analytical techniques, and self-assembly of mixed surfactants at the mineral/liquid interface. In the first part, the theoretical background of collector mixtures is introduced, together with several core solution theories, which are classified according to their application in the analysis of physicochemical properties of mixed collector systems. In the second part, we discuss the factors that can influence adsorption, including those related to the structure of collectors and to environmental conditions. We summarize their influence on the adsorption of mixed systems, with the objective of providing guidance on the progress achieved in this field to date. Advances in measurement techniques can greatly promote our understanding of adsorption processes. In the third part, therefore, modern techniques such as optical reflectometry, neutron scattering, neutron reflectometry, thermogravimetric analysis, fluorescence spectroscopy, ultrafiltration, atomic force microscopy, analytical ultracentrifugation, X-ray photoelectron spectroscopy, vibrational sum frequency generation spectroscopy, and molecular dynamics simulations are introduced by virtue of their application. Finally, focusing on oxide and silicate minerals, we review and summarize the flotation and adsorption of the three most widely used mixed surfactant systems (anionic-cationic, anionic-nonionic, and cationic-nonionic) at the liquid/mineral interface in order to fully understand the self-assembly process. The paper closes with a brief outlook on possible future developments in mixed surfactant systems. Copyright © 2017 Elsevier B.V. All rights reserved.
Expectations and Reality: Evaluating Patterns of Learning Behaviour Using Audit Trails
ERIC Educational Resources Information Center
Kennedy, Gregor E.; Judd, Terry S.
2007-01-01
Developers of educational multimedia programs have expectations about the way in which they will be used. These expectations can be broadly categorised as either functional (primarily related to the interface) or educational (related to learning designs, processes and outcomes). However, student users will not always engage with educational…
Inviting other professions to help reduce wildfire property losses
A. Fege; J. Absher
2007-01-01
Preventing structure loss has become a major focal point of wildland firefighting. Most days it feels as though wildland fire professionals and land managers are becoming more and more responsible for reducing property losses in the wildland/urban interface (WUI). What if this impression, and this reality, could change?
Multimedia Courseware in an Open Systems Environment: A Federal Strategy.
ERIC Educational Resources Information Center
Moline, Judi; And Others
The Portable Courseware Project (PORTCO) of the U.S. Department of Defense (DoD) is typical of projects worldwide that require standard software interfaces. This paper articulates the strategy whereby the federal multimedia courseware initiative leverages the open systems movement and the new realities of information technology. The federal…
Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality
Mehta, Dhwani; Siddiqui, Mohammad Faridul Haque
2018-01-01
Extensive application possibilities have made emotion recognition ineluctable and challenging in the field of computer science. Non-verbal cues such as gestures, body movement, and facial expressions convey feeling and feedback to the user. This discipline of Human–Computer Interaction relies on algorithmic robustness and the sensitivity of the sensor to improve recognition. Sensors play a significant role in accurate detection by providing very high-quality input, increasing the efficiency and reliability of the system. Automatic recognition of human emotions would help in teaching social intelligence to machines. This paper presents a brief study of the various approaches and techniques of emotion recognition. The survey covers a succinct review of the databases used as data sets for algorithms that detect emotions from facial expressions. Later, the mixed reality device Microsoft HoloLens (MHL) is introduced for observing emotion recognition in Augmented Reality (AR). A brief introduction of its sensors, their application in emotion recognition, and some preliminary results of emotion recognition using the MHL are presented. The paper concludes by comparing results of emotion recognition by the MHL and a regular webcam. PMID:29389845
Renaud, Patrice; Joyal, Christian; Stoleru, Serge; Goyette, Mathieu; Weiskopf, Nikolaus; Birbaumer, Niels
2011-01-01
This chapter proposes a prospective view on using a real-time functional magnetic resonance imaging (rt-fMRI) brain-computer interface (BCI) application as a new treatment for pedophilia. Neurofeedback mediated by interactive virtual stimuli is presented as the key process in this new BCI application. Results on the diagnostic discriminant power of virtual characters depicting sexual stimuli relevant to pedophilia are given. Finally, practical and ethical implications are briefly addressed. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Gutensohn, Michael
2018-01-01
The task for this project was to design, develop, test, and deploy a facial recognition system for the Kennedy Space Center Augmented/Virtual Reality Lab. This system will serve as a means of user authentication as part of the lab's natural user interface (NUI). The overarching goal is to create a seamless user interface that allows the user to initiate and interact with AR and VR experiences without ever needing a mouse or keyboard at any step in the process.
Nifakos, Sokratis; Zary, Nabil
2014-01-01
The research community has called for the development of effective educational interventions to address prescription behaviour, since antimicrobial resistance remains a global health issue. Examining the potential to move the educational process from personal computers to mobile devices, in this paper we investigated a new method of integrating Virtual Patients into mobile devices with augmented reality technology, enriching the practitioner's education in prescription behaviour. We also explored which information is critical during prescription-behaviour education, and visualized this information in a real context with augmented reality technology, simultaneously with a running Virtual Patient scenario. Following this process, we set the educational frame of experiential knowledge in a mixed (virtual and real) environment.
Exercise/recreation facility for a lunar or Mars analog
NASA Technical Reports Server (NTRS)
1991-01-01
The University of Idaho NASA/USRA project for the 1990-91 school year is an exercise/recreation station for an Earth-based simulator of a lunar or Martian habitat. Specifically, a stationary bicycle was designed to help people keep fit and prevent muscular atrophy while stationed in space. To aid motivation and provide an element of recreation during the workout, the bicycle is enhanced by a virtual reality system that simulates various riding situations, including the choice of a mountain bike or a road bike. The bike employs a magnetic brake that provides continuously changing tension to simulate actual riding conditions; this braking system is interfaced directly with the virtual reality system. Also integrated into the virtual reality display is a monitoring system that tracks heart rate, work rate, and other functions during the course of the session.
Development and human factors analysis of neuronavigation vs. augmented reality.
Pandya, Abhilash; Siadat, Mohammad-Reza; Auner, Greg; Kalash, Mohammad; Ellis, R Darin
2004-01-01
This paper is focused on a human factors analysis comparing a standard neuronavigation system with an augmented reality system. We use a passive articulated arm (Microscribe, Immersion Technology) to track a calibrated end-effector-mounted video camera. In real time, we superimpose the live video view with the synchronized graphical view of CT-derived segmented object(s) of interest within a phantom skull. Using the same robotic arm, we have developed a neuronavigation system able to show the end-effector of the arm on orthogonal CT scans. Both the AR and the neuronavigation systems have been shown to be accurate to within 3 mm. A human factors study was conducted in which subjects were asked to draw craniotomies and answer questions to gauge their understanding of the phantom objects. The study included 21 subjects and indicated that subjects performed faster, with more accuracy and fewer errors, using the augmented reality interface.
Mixing and transient interface condensation of a liquid hydrogen tank
NASA Technical Reports Server (NTRS)
Lin, C. S.; Hasan, M. M.; Nyland, T. W.
1993-01-01
Experiments were conducted to investigate the effect of axial jet-induced mixing on the pressure reduction of a thermally stratified liquid hydrogen tank. The tank was nearly cylindrical, with a volume of about 0.144 cu m, a diameter of 0.559 m, and a length of 0.711 m. A mixer/pump unit with a jet nozzle outlet 0.0221 m in diameter was located 0.178 m from the tank bottom and installed inside the tank to generate the axial jet mixing and tank fluid circulation. Mixing tests began at tank pressures at which the thermal stratification results in 4.9-6.2 K of liquid subcooling. The mixing time and transient vapor condensation rate at the liquid-vapor interface are determined. Two mixing time correlations, based on thermal equilibrium and pressure equilibrium, are developed and expressed as functions of system and buoyancy parameters. The limited liquid hydrogen data of the present study show that the modified steady-state condensation rate correlation may be used to predict the transient condensation rate in a mixing process if the instantaneous values of jet subcooling and turbulence intensity at the interface are employed.
De Mauro, Alessandro; Carrasco, Eduardo; Oyarzun, David; Ardanza, Aitor; Frizera Neto, Anselmo; Torricelli, Diego; Pons, José Luis; Gil, Angel; Florez, Julian
2011-01-01
Cerebrovascular accidents (CVA) and spinal cord injuries (SCI) are the most common causes of paralysis and paresis, with reported prevalences of 12,000 cases per million and 800 cases per million, respectively. Disabilities that follow CVA (hemiplegia) or SCI (paraplegia, tetraplegia) severely impair motor functions (e.g., standing, walking, reaching, and grasping) and prevent affected individuals from full and autonomous participation in daily activities. Our research focuses on the development of a new virtual reality (VR) system combined with wearable neurorobotics (NR), motor neuroprosthetics (MNP), and a brain neuro-machine interface (BNMI) to overcome the major limitations of current rehabilitation solutions.
The electronic-commerce-oriented virtual merchandise model
NASA Astrophysics Data System (ADS)
Fang, Xiaocui; Lu, Dongming
2004-03-01
Electronic commerce has become the trend of commercial activity. Provided with a virtual reality interface, electronic commerce gains better expressive capacity and means of interaction. But in most applications of virtual reality technology in EC, the 3D model is only an appearance description of the merchandise; it carries almost no commerce or interaction information. This results in a disjunction between the virtual model and commerce information. We therefore present the Electronic Commerce oriented Virtual Merchandise Model (ECVMM), which combines the model with commerce information, interaction information, and figure information of the virtual merchandise. With its abundant information, ECVMM provides better support for obtaining and communicating information in electronic commerce.
Ahn, Woojin; Dargar, Saurabh; Halic, Tansel; Lee, Jason; Li, Baichun; Pan, Junjun; Sankaranarayanan, Ganesh; Roberts, Kurt; De, Suvranu
2014-01-01
The first virtual-reality-based simulator for Natural Orifice Translumenal Endoscopic Surgery (NOTES) has been developed: the Virtual Translumenal Endoscopic Surgery Trainer (VTEST™). VTEST™ aims to simulate the hybrid NOTES cholecystectomy procedure using a rigid scope inserted through the vaginal port. The hardware interface is designed for accurate motion tracking of the scope and laparoscopic instruments to reproduce the procedure's unique hand-eye coordination. The haptic-enabled multimodal interactive simulation includes exposing the Calot's triangle and detaching the gall bladder while performing electrosurgery. The developed VTEST™ was demonstrated and validated at NOSCAR 2013.
Effectiveness of a Virtual Reality Forest on People With Dementia: A Mixed Methods Pilot Study.
Moyle, Wendy; Jones, Cindy; Dwan, Toni; Petrovich, Tanya
2018-05-08
To measure and describe the effectiveness of a Virtual Reality Forest (VRF) on engagement, apathy, and mood states of people with dementia, and to explore the experiences of staff, people with dementia, and their families. A mixed-methods study was conducted between February and May 2016. Ten residents with dementia, 10 family members, and 9 care staff were recruited from 2 residential aged care facilities, operated by one care provider, located in Victoria, Australia. Residents participated in one facilitated VRF session. Residents' mood, apathy, and engagement were measured with the Observed Emotion Rating Scale, the Person-Environment Apathy Rating Scale, and Types of Engagement. All participants were interviewed. Overall, the VRF was perceived by residents, family members, and staff to have a positive effect. During the VRF experience, residents experienced more pleasure (p = .008) and a greater level of alertness (p < .001). They also experienced a greater level of fear/anxiety during the forest experience than the comparative normative sample (p = .016). This initial, small-scale study is the first to introduce the VRF activity and describe its impact on people with dementia. The VRF was perceived to have a positive effect on people with dementia although, compared to the normative sample, a greater level of fear/anxiety was experienced during the VRF. This study suggests virtual reality may have the potential to improve quality of life, and the outcomes can be used to inform the development of future virtual reality activities for people with dementia.
Calibration of RGBD camera and cone-beam CT for 3D intra-operative mixed reality visualization.
Lee, Sing Chun; Fuerst, Bernhard; Fotouhi, Javad; Fischer, Marius; Osgood, Greg; Navab, Nassir
2016-06-01
This work proposes a novel algorithm to register cone-beam computed tomography (CBCT) volumes and 3D optical (RGBD) camera views. The co-registered real-time RGBD camera and CBCT imaging enable a novel augmented reality solution for orthopedic surgeries, which allows arbitrary views using digitally reconstructed radiographs overlaid on the patient's reconstructed surface without the need to move the C-arm. An RGBD camera is rigidly mounted on the C-arm near the detector. We introduce a calibration method based on the simultaneous reconstruction of the surface and the CBCT scan of an object. The transformation between the two coordinate spaces is recovered using Fast Point Feature Histogram descriptors and the Iterative Closest Point algorithm. Several experiments are performed to assess the repeatability and accuracy of this method. Target registration error is measured on multiple visual and radio-opaque landmarks to evaluate the accuracy of the registration. Mixed reality visualizations from arbitrary angles are also presented for simulated orthopedic surgeries. To the best of our knowledge, this is the first calibration method that uses only tomographic and RGBD reconstructions, which means it does not impose a particular shape on the phantom. We demonstrate a marker-less calibration of CBCT volumes and 3D depth cameras, achieving reasonable registration accuracy. This design requires a one-time factory calibration, is self-contained, and could be integrated into existing mobile C-arms to provide real-time augmented reality views from arbitrary angles.
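The Iterative Closest Point step at the heart of this calibration can be sketched in a few lines of numpy: a generic point-to-point ICP with a Kabsch/SVD pose update, not the authors' FPFH-initialized implementation, and with brute-force nearest-neighbour matching for brevity.

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rotation R and translation t mapping points P onto Q."""
    cp, cq = P.mean(0), Q.mean(0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centred clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def icp(source, target, iters=20):
    """Point-to-point ICP: nearest-neighbour matching + Kabsch pose update."""
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # brute-force nearest neighbour in the target for every source point
        idx = np.argmin(((src[:, None, :] - target[None, :, :])**2).sum(-1), axis=1)
        R, t = kabsch(src, target[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Given two reconstructions of the same object in different coordinate frames, `icp` returns the rigid transform between them; in the paper's setting, an FPFH-based coarse alignment would precede this refinement.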
Alahverdjieva, V S; Grigoriev, D O; Fainerman, V B; Aksenenko, E V; Miller, R; Möhwald, H
2008-02-21
The competitive adsorption at the air-water interface from mixed adsorption layers of hen egg-white lysozyme with a non-ionic surfactant (C10DMPO) was studied and compared to the mixture with an ionic surfactant (SDS) using bubble and drop shape analysis tensiometry, ellipsometry, and surface dilational rheology. The set of equilibrium and kinetic data of the mixed solutions is described by a thermodynamic model developed recently. The theoretical description of the mixed system is based on the model parameters for the individual components.
Doppler Lidar Vector Retrievals and Atmospheric Data Visualization in Mixed/Augmented Reality
NASA Astrophysics Data System (ADS)
Cherukuru, Nihanth Wagmi
Environmental remote sensing has seen rapid growth in recent years, and Doppler wind lidars have gained popularity primarily due to their non-intrusive, high spatial- and temporal-resolution measurement capabilities. While early lidar applications relied on radial velocity measurements alone, most practical applications in wind farm control and short-term wind prediction require knowledge of the vector wind field. Over the past few years, work on lidars has explored three primary methods of retrieving wind vectors: the homogeneous-wind-field assumption, computationally intensive variational methods, and the use of multiple Doppler lidars. Building on prior research, the current three-part study first demonstrates the capabilities of single- and dual-Doppler lidar retrievals in capturing downslope windstorm-type flows occurring at Arizona's Barringer Meteor Crater as part of the METCRAX II field experiment. Next, to address the need for a reliable and computationally efficient vector retrieval for adaptive wind farm control applications, a novel 2D vector retrieval based on a variational formulation was developed, applied to lidar scans from an offshore wind farm, and validated with data from a cup-and-vane anemometer installed on a nearby research platform. Finally, a novel data visualization technique using Mixed Reality (MR)/Augmented Reality (AR) technology is presented to visualize data from atmospheric sensors. MR is an environment in which the user's visual perception of the real world is enhanced with live, interactive, computer-generated sensory input (in this case, data from atmospheric sensors such as Doppler lidars). A methodology using modern game development platforms is presented and demonstrated with lidar-retrieved wind fields as well as several earth science datasets for education and outreach activities.
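As a toy illustration of the dual-Doppler idea (not the variational retrieval developed in this work), two radial velocities measured along different beam azimuths determine the horizontal wind by solving a 2x2 linear system. The geometry assumed here, azimuth clockwise from north with u eastward and v northward, is a conventional choice, not taken from the thesis.

```python
import numpy as np

def dual_doppler_uv(vr1, az1_deg, vr2, az2_deg):
    """Recover (u, v) from two radial velocities at beam azimuths az1, az2.

    Model: vr_i = u*sin(az_i) + v*cos(az_i), azimuth clockwise from north.
    """
    a1, a2 = np.radians(az1_deg), np.radians(az2_deg)
    A = np.array([[np.sin(a1), np.cos(a1)],
                  [np.sin(a2), np.cos(a2)]])
    if abs(np.linalg.det(A)) < 1e-6:
        raise ValueError("beams nearly parallel: retrieval is ill-conditioned")
    return np.linalg.solve(A, np.array([vr1, vr2]))
```

The determinant check reflects the well-known dual-Doppler limitation: when the two beams are nearly parallel, the cross-beam wind component is unobservable and the retrieval degrades.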
Perfect mixing of immiscible macromolecules at fluid interfaces
NASA Astrophysics Data System (ADS)
Sheiko, Sergei S.; Zhou, Jing; Arnold, Jamie; Neugebauer, Dorota; Matyjaszewski, Krzysztof; Tsitsilianis, Constantinos; Tsukruk, Vladimir V.; Carrillo, Jan-Michael Y.; Dobrynin, Andrey V.; Rubinstein, Michael
2013-08-01
The difficulty of mixing chemically incompatible substances—in particular macromolecules and colloidal particles—is a canonical problem limiting advances in fields ranging from health care to materials engineering. Although the self-assembly of chemically different moieties has been demonstrated in coordination complexes, supramolecular structures, and colloidal lattices among other systems, the mechanisms of mixing largely rely on specific interfacing of chemically, physically or geometrically complementary objects. Here, by taking advantage of the steric repulsion between brush-like polymers tethered to surface-active species, we obtained long-range arrays of perfectly mixed macromolecules with a variety of polymer architectures and a wide range of chemistries without the need of encoding specific complementarity. The net repulsion arises from the significant increase in the conformational entropy of the brush-like polymers with increasing distance between adjacent macromolecules at fluid interfaces. This entropic-templating assembly strategy enables long-range patterning of thin films on sub-100 nm length scales.
Synergistic interface behavior of strontium adsorption using mixed microorganisms.
Hu, Wenyuan; Dong, Faqin; Yang, Guangmin; Peng, Xin; Huang, Xiaojun; Liu, Mingxue; Zhang, Jing
2017-08-10
The proper handling of low-level radioactive waste is crucial to the sustainable development of nuclear power. Research into the mechanisms of interaction between bacteria and radionuclides is the starting point for achieving successful remediation of radionuclides with microorganisms. Using Sr(II) as a model radionuclide and a mixed culture of Saccharomyces cerevisiae and Bacillus subtilis as the biological adsorbent, this study investigates the behavior at the interface between Sr(II) and the microorganisms, as well as the mechanisms governing that behavior. The results show that the optimal ratio of mixed microorganisms is S. cerevisiae 2.0 g L⁻¹ to B. subtilis 0.05 g L⁻¹, and the optimal pH is about 6.3. Sr(II) biosorption onto the mixed microorganisms is spontaneous and endothermic in nature. The kinetics and equilibrium isotherm data of the biosorption process are described by the pseudo-second-order equation and the Langmuir isotherm equation, respectively. The key interaction between the biological adsorbent and Sr(II) involves shared electron pairs arising from chemical reactions via bond complexation or electron exchange, and spectral and energy-spectrum analyses show that functional groups (e.g., hydroxyl, carboxyl, amino, amide) at the interface between the radionuclide and the mixed microorganisms are the main active sites of the interface reactions.
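The two fits named in this abstract have standard linearized forms that are easy to sketch; the code below uses synthetic data with illustrative parameter values, not the paper's measurements.

```python
import numpy as np

def fit_pseudo_second_order(t, qt):
    """Linearized pseudo-second-order kinetics:
    t/qt = 1/(k2*qe**2) + t/qe, so fitting t/qt vs t yields qe and k2."""
    slope, intercept = np.polyfit(t, t / qt, 1)
    qe = 1.0 / slope
    k2 = 1.0 / (intercept * qe**2)
    return qe, k2

def fit_langmuir(Ce, qe):
    """Linearized Langmuir isotherm:
    Ce/qe = Ce/qmax + 1/(qmax*KL), so fitting Ce/qe vs Ce yields qmax and KL."""
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    qmax = 1.0 / slope
    KL = slope / intercept
    return qmax, KL
```

Applied to sorption data, `qe` and `qmax` carry the units of uptake (e.g. mg g⁻¹) and `k2`, `KL` the corresponding rate and affinity constants; on data generated exactly from the model equations, both fits recover the true parameters.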
Chow, Joyce A.; Törnros, Martin E.; Waltersson, Marie; Richard, Helen; Kusoffsky, Madeleine; Lundström, Claes F.; Kurti, Arianit
2017-01-01
Context: Within digital pathology, digitalization of the grossing procedure has been relatively underexplored in comparison to digitalization of pathology slides. Aims: Our investigation focuses on the interaction design of an augmented reality gross pathology workstation and refining the interface so that information and visualizations are easily recorded and displayed in a thoughtful view. Settings and Design: The work in this project occurred in two phases: the first phase focused on implementation of an augmented reality grossing workstation prototype while the second phase focused on the implementation of an incremental prototype in parallel with a deeper design study. Subjects and Methods: Our research institute focused on an experimental and “designerly” approach to create a digital gross pathology prototype as opposed to focusing on developing a system for immediate clinical deployment. Statistical Analysis Used: Evaluation has not been limited to user tests and interviews, but rather key insights were uncovered through design methods such as “rapid ethnography” and “conversation with materials”. Results: We developed an augmented reality enhanced digital grossing station prototype to assist pathology technicians in capturing data during examination. The prototype uses a magnetically tracked scalpel to annotate planned cuts and dimensions onto photographs taken of the work surface. This article focuses on the use of qualitative design methods to evaluate and refine the prototype. Our aims were to build on the strengths of the prototype's technology, improve the ergonomics of the digital/physical workstation by considering numerous alternative design directions, and to consider the effects of digitalization on personnel and the pathology diagnostics information flow from a wider perspective. 
A proposed interface design allows the pathology technician to place images in relation to their orientation, annotate directly on the image, and create linked information. Conclusions: The augmented reality magnetically tracked scalpel reduces tool switching, though limitations in today's augmented reality technology prevent an ideal immersive workflow by requiring the use of a monitor. While this technology catches up, we recommend focusing efforts on enabling the easy creation of layered, complex reports and on linking and viewing information across systems. Reflecting upon our results, we argue for digitalization to focus not only on how to record increasing amounts of data but also on how these data can be accessed in a more thoughtful way that draws upon the expertise and creativity of the pathology professionals using the systems. PMID:28966831
Bovine insulin-phosphatidylcholine mixed Langmuir monolayers: behavior at the air-water interface.
Pérez-López, S; Blanco-Vila, N M; Vila-Romeu, N
2011-08-04
The behavior of binary mixed Langmuir monolayers of bovine insulin (INS) and phosphatidylcholine (PC) spread at the air-water interface was investigated under various subphase conditions. Pure and mixed monolayers were spread on water, on NaOH and phosphate-buffered solutions of pH 7.4, and on Zn(2+)-containing solutions. Miscibility and interactions between the components were studied on the basis of analysis of the surface pressure (π)-mean molecular area (A) isotherms, surface compression modulus (C(s)(-1))-π curves, and plots of A versus mole fraction of INS (X(INS)). Our results indicate that intermolecular interactions between INS and PC depend on both the monolayer state and the structural characteristics of INS at the interface, which are strongly influenced by the subphase pH and salt content. Brewster angle microscopy (BAM) was applied to investigate the peptide aggregation pattern at the air-water interface in the presence of the studied lipid under all experimental conditions investigated. The influence of the lipid on the INS behavior at the interface strongly depends on the subphase conditions.
Optical architecture of HoloLens mixed reality headset
NASA Astrophysics Data System (ADS)
Kress, Bernard C.; Cummings, William J.
2017-06-01
HoloLens by Microsoft Corp. is the world's first untethered Mixed Reality (MR) Head Mounted Display (HMD) system, released to developers in March 2016 as a Development Kit. In this paper we review the various display requirements and the subsequent optical hardware choices made for HoloLens. Its main achievements concern performance and user comfort: it is the first fully untethered MR headset, with the highest angular resolution and the industry's largest eyebox. It has the first inside-out global sensor-fusion system, including precise head tracking and 3D mapping, all controlled by a fully custom on-board GPU. These achievements make HoloLens the most advanced MR system available today. Additional features may be implemented in next-generation MR headsets, leading to the ultimate user experience and helping secure the substantial AR/MR market predicted by most analysts.
A mixed reality approach for stereo-tomographic quantification of lung nodules.
Chen, Mianyi; Kalra, Mannudeep K; Yun, Wenbing; Cong, Wenxiang; Yang, Qingsong; Nguyen, Terry; Wei, Biao; Wang, Ge
2016-05-25
To reduce the radiation dose and the equipment cost associated with lung CT screening, in this paper we propose a mixed reality based nodule measurement method with an active shutter stereo imaging system. Without involving hundreds of projection views and subsequent image reconstruction, we generated two projections of an iteratively placed ellipsoidal volume in the field of view and merged these synthetic projections with two original CT projections. We then demonstrated the feasibility of measuring the position and size of a nodule by observing, through active shutter 3D vision glasses, whether the projections of the ellipsoidal volume and the nodule overlap in the observer's visual perception. The average errors of the measured nodule parameters were less than 1 mm in a simulated experiment with 8 viewers. Hence, the method could accurately measure real nodules in experiments with physically measured projections.
Kotranza, Aaron; Lind, D Scott; Lok, Benjamin
2012-07-01
We investigate the efficacy of incorporating real-time feedback of user performance within mixed-reality environments (MREs) for training real-world tasks with tightly coupled cognitive and psychomotor components. This paper presents an approach to providing real-time evaluation and visual feedback of learner performance in an MRE for training clinical breast examination (CBE). In a user study of experienced and novice CBE practitioners (n = 69), novices receiving real-time feedback performed equivalently or better than more experienced practitioners in the completeness and correctness of the exam. A second user study (n = 8) followed novices through repeated practice of CBE in the MRE. Results indicate that skills improvement in the MRE transfers to the real-world task of CBE of human patients. This initial case study demonstrates the efficacy of MREs incorporating real-time feedback for training real-world cognitive-psychomotor tasks.
Design of a home-based adaptive mixed reality rehabilitation system for stroke survivors.
Baran, Michael; Lehrer, Nicole; Siwiak, Diana; Chen, Yinpeng; Duff, Margaret; Ingalls, Todd; Rikakis, Thanassis
2011-01-01
This paper presents the design of a home-based adaptive mixed reality system (HAMRR) for upper extremity stroke rehabilitation. The goal of HAMRR is to help restore motor function to chronic stroke survivors by providing an engaging long-term reaching task therapy at home. The system uses an intelligent adaptation scheme to create a continuously challenging and unique multi-year therapy experience. The therapy is overseen by a physical therapist, but day-to-day use of the system can be independently set up and completed by a stroke survivor. The HAMRR system tracks movement of the wrist and torso and provides real-time, post-trial, and post-set feedback to encourage the stroke survivor to self-assess his or her movement and engage in active learning of new movement strategies. The HAMRR system consists of a custom table, chair, and media center, and is designed to integrate easily into any home.
NASA Astrophysics Data System (ADS)
Bandopadhyay, Aditya; Le Borgne, Tanguy; Méheust, Yves; Dentz, Marco
2017-02-01
Mixing fronts, where fluids of different chemical compositions mix with each other, are known to represent hotspots of chemical reaction in hydrological systems. These fronts are typically subjected to velocity gradients, ranging from the pore scale due to no-slip boundary conditions at fluid-solid interfaces, to the catchment scale due to permeability variations and complex geometry of the Darcy velocity streamlines. A common trait of these processes is that the mixing interface is strained by shear. Depending on the Péclet number Pe, which represents the ratio of the characteristic diffusion time to the characteristic shear time, and the Damköhler number Da, which represents the ratio of the characteristic diffusion time to the characteristic reaction time, the local reaction rates can be strongly impacted by the dynamics of the mixing interface. So far, this impact has been characterized mostly either in kinetics-limited or in mixing-limited conditions, that is, for either low or high Da. Here the coupling of shear flow and chemical reactivity is investigated for arbitrary Damköhler numbers, for a bimolecular reaction and an initial interface with separated reactants. Approximate analytical expressions for the global production rate and reactive mixing scale are derived based on a reactive lamella approach that allows for a general coupling between stretching-enhanced mixing and chemical reactions. While for Pe < Da, reaction kinetics and stretching effects are decoupled, a scenario which we name "weak stretching", for Pe > Da we uncover a "strong stretching" scenario where new scaling laws emerge from the interplay between reaction kinetics, diffusion, and stretching. The analytical results are validated against numerical simulations.
These findings shed light on the effect of flow heterogeneity on the enhancement of chemical reaction and the creation of spatially localized hotspots of reactivity for a broad range of systems ranging from kinetic limited to mixing limited situations.
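The Pe/Da comparison that separates the abstract's two regimes reduces to a comparison of characteristic timescales, which can be sketched as a simple check (function and variable names are illustrative, not from the paper):

```python
def mixing_regime(t_diffusion, t_shear, t_reaction):
    """Classify a shear-strained reactive mixing front by comparing the
    Peclet number Pe = t_diffusion / t_shear with the Damkohler number
    Da = t_diffusion / t_reaction, following the regime split described
    in the abstract:
      Pe < Da -> 'weak stretching'   (kinetics and stretching decouple)
      Pe > Da -> 'strong stretching' (new scaling laws from their interplay)
    """
    pe = t_diffusion / t_shear
    da = t_diffusion / t_reaction
    regime = "strong stretching" if pe > da else "weak stretching"
    return pe, da, regime
```

Note that Pe > Da is equivalent to t_reaction > t_shear: the front is restretched faster than the reaction can consume the reactants it brings into contact.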
ERIC Educational Resources Information Center
Barbanell, Patricia; Falco, John; Newman, Diana
As museums throughout the world enter the interactive arena of digital communications, a need has emerged to access strategies of program development that seamlessly interface with existing missions and resources. This paper describes how Project VIEW, a US Department of Education Technology Innovation Challenge Grant, collaborates with major…
NASA Technical Reports Server (NTRS)
Leifer, Larry; Michalowski, Stefan; Vanderloos, Machiel
1991-01-01
The Stanford/VA Interactive Robotics Laboratory set out in 1978 to test the hypothesis that industrial robotics technology could be applied to serve the manipulation needs of severely impaired individuals. Five generations of hardware, three generations of system software, and over 125 experimental subjects later, we believe that genuine utility is achievable. The experience includes development of over 65 task applications using voiced command, joystick control, natural language command and 3D object designation technology. A brief foray into virtual environments, using flight simulator technology, was instructive. If reality and virtuality come for comparable prices, you cannot beat reality. A detailed review of assistive robot anatomy and the performance specifications needed to achieve cost-beneficial utility will be used to support discussion of the future of rehabilitation telerobotics. Poised on the threshold of commercial viability, but constrained by the high cost of technically adequate manipulators, this worthy application domain flounders temporarily. In the long run, it will be the user interface that governs utility.
Availability of the OGC geoprocessing standard: March 2011 reality check
NASA Astrophysics Data System (ADS)
Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier
2012-10-01
This paper presents an investigation about the servers available in March 2011 conforming to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification gives support to standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test if the advances in the use of search engines and focused crawlers for finding Web services can be applied for finding geoscience processing systems. Research results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. These results also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
A Web-based cost-effective training tool with possible application to brain injury rehabilitation.
Wang, Peijun; Kreutzer, Ina Anna; Bjärnemo, Robert; Davies, Roy C
2004-06-01
Virtual reality (VR) has provoked enormous interest in the medical community. In particular, VR offers therapists new approaches for improving rehabilitation effects. However, most of these VR assistant tools are not very portable, extensible or economical. Due to the vast amount of 3D data, they are not suitable for Internet transfer. Furthermore, in order to run these VR systems smoothly, special hardware devices are needed. As a result, existing VR assistant tools tend to be available in hospitals but not in patients' homes. To overcome these disadvantages, as a case study, this paper proposes a Web-based Virtual Ticket Machine, called WBVTM, using VRML [VRML Consortium, The Virtual Reality Modeling Language: International Standard ISO/IEC DIS 14772-1, 1997], Java and the EAI (External Authoring Interface) [Silicon Graphics, Inc., The External Authoring Interface (EAI)] to help people with acquired brain injury (ABI) relearn basic living skills at home at a low cost. As these technologies are open standards and usable on the Internet, WBVTM achieves the goals of portability, easy accessibility and cost-effectiveness.
Yap, Hwa Jen; Taha, Zahari; Md Dawal, Siti Zawiah; Chang, Siow-Wee
2014-01-01
Traditional robotic work cell design and programming are considered inefficient and outdated in current industrial and market demands. In this research, virtual reality (VR) technology is used to improve human-robot interface, whereby complicated commands or programming knowledge is not required. The proposed solution, known as VR-based Programming of a Robotic Work Cell (VR-Rocell), consists of two sub-programmes, which are VR-Robotic Work Cell Layout (VR-RoWL) and VR-based Robot Teaching System (VR-RoT). VR-RoWL is developed to assign the layout design for an industrial robotic work cell, whereby VR-RoT is developed to overcome safety issues and lack of trained personnel in robot programming. Simple and user-friendly interfaces are designed for inexperienced users to generate robot commands without damaging the robot or interrupting the production line. The user is able to attempt numerous times to attain an optimum solution. A case study is conducted in the Robotics Laboratory to assemble an electronics casing and it is found that the output models are compatible with commercial software without loss of information. Furthermore, the generated KUKA commands are workable when loaded into a commercial simulator. The operation of the actual robotic work cell shows that the errors may be due to the dynamics of the KUKA robot rather than the accuracy of the generated programme. Therefore, it is concluded that the virtual reality based solution approach can be implemented in an industrial robotic work cell. PMID:25360663
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Mandal, Avikarsha; Javahiraly, Nicolas; Curticapean, Dan
2016-09-01
Practical exercises are a crucial part of many curricula. Even simple exercises can improve the understanding of the underlying subject. Most experimental setups require special hardware. To carry out, e.g., a lens experiment, the students need access to an optical bench, various lenses, light sources, apertures and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques to let the students prepare with a simulated experimental setup. Within the context of our intended blended learning concept we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility of allowing interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that tracks the user's hands and fingers in three dimensions. It is conceivable to let the user interact with the simulation's virtual elements through hand position, movement and gesture. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device's strengths and weaknesses and point out useful and less useful application scenarios.
NASA Astrophysics Data System (ADS)
Ogneva, T. S.; Lazurenko, D. V.; Bataev, I. A.; Mali, V. I.; Esikov, M. A.; Bataev, A. A.
2016-04-01
The Ni-Al multilayer composite was fabricated using explosive welding. Zones of mixing of Ni and Al are observed at the composite interfaces after welding, and the composition of these zones is inhomogeneous. Continuous homogeneous intermetallic layers are formed at the interface after heat treatment at 620 °C for 5 h. These intermetallic layers consist of NiAl3 and Ni2Al3 phases. The presence of mixed zones significantly accelerates the growth rate of intermetallic phases at the initial stages of heating.
The effectiveness of virtual reality distraction for pain reduction: a systematic review.
Malloy, Kevin M; Milling, Leonard S
2010-12-01
Virtual reality technology enables people to become immersed in a computer-simulated, three-dimensional environment. This article provides a comprehensive review of controlled research on the effectiveness of virtual reality (VR) distraction for reducing pain. To be included in the review, studies were required to use a between-subjects or mixed model design in which VR distraction was compared with a control condition or an alternative intervention in relieving pain. An exhaustive search identified 11 studies satisfying these criteria. VR distraction was shown to be effective for reducing experimental pain, as well as the discomfort associated with burn injury care. Studies of needle-related pain provided less consistent findings. Use of more sophisticated virtual reality technology capable of fully immersing the individual in a virtual environment was associated with greater relief. Overall, controlled research suggests that VR distraction may be a useful tool for clinicians who work with a variety of pain problems.
Heat of mixing and morphological stability
NASA Technical Reports Server (NTRS)
Nandapurkar, P.; Poirier, D. R.
1988-01-01
A mathematical model, which incorporates heat of mixing in the energy balance, has been developed to analyze the morphological stability of a planar solid-liquid interface during the directional solidification of a binary alloy. It is observed that the stability behavior is almost that predicted by the analysis of Mullins and Sekerka (1963) at low growth velocities, while deviations in the critical concentration of about 20-25 percent are observed under rapid solidification conditions for certain systems. The calculations indicate that a positive heat of mixing makes the planar interface more unstable, whereas a negative heat of mixing makes it more stable, in terms of the critical concentration.
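For reference, the baseline against which the abstract's heat-of-mixing deviations are measured is the classic constitutional-supercooling estimate of the critical concentration (closely related to the low-velocity Mullins-Sekerka limit). A minimal sketch of that standard textbook form, not the paper's modified model; the function name and numbers are illustrative:

```python
def critical_concentration(G, V, m_liq, k, D):
    """Classic constitutional-supercooling estimate (no heat-of-mixing
    correction): the planar solid-liquid front stays stable while the
    alloy concentration C0 is below this critical value.

    G:     thermal gradient in the liquid (K/m)
    V:     growth velocity (m/s)
    m_liq: magnitude of the liquidus slope (K per wt%)
    k:     solute partition coefficient (0 < k < 1 assumed)
    D:     solute diffusivity in the liquid (m^2/s)
    """
    return G * k * D / (V * m_liq * (1.0 - k))

# Illustrative numbers: faster growth lowers the critical concentration,
# i.e. destabilizes the planar front.
c_crit = critical_concentration(G=1e4, V=1e-4, m_liq=5.0, k=0.5, D=1e-9)
```

In the abstract's terms, a positive heat of mixing shifts this critical concentration downward (more unstable) and a negative heat of mixing shifts it upward (more stable), with deviations of about 20-25 percent under rapid solidification for certain systems.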
NASA Astrophysics Data System (ADS)
Laminack, William; Gole, James
2015-12-01
A unique MEMS/NEMS approach is presented for modeling a detection platform for mixed gas interactions. Mixed gas analytes interact with nanostructured metal oxide island sites decorating a microporous silicon substrate. The Inverse Hard/Soft Acid/Base (IHSAB) concept is used to assess the diversity of conductometric responses for mixed gas interactions as a function of these nanostructured metal oxides. The analyte conductometric responses are well represented using a combined diffusion/absorption-based model for multi-gas interactions, in which a newly developed response absorption isotherm based on the Fermi distribution function is applied. A further coupling of this model with the IHSAB concept describes the considerations in modeling multi-gas analyte-interface and analyte-analyte interactions. Taking into account the molecular electronic interactions of the analytes both with each other and with an extrinsic semiconductor interface, we demonstrate how the presence of one gas can enhance or diminish the reversible interaction of a second gas with that interface. These concepts demonstrate important considerations for array-based formats for multi-gas sensing and its applications.
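The abstract does not give the functional form of its Fermi-distribution-based response isotherm; as an illustration only, a generic Fermi (logistic) function of the kind it references can be sketched as follows (all names and parameters here are hypothetical):

```python
import math

def fermi_response(p, p0, w, r_max=1.0):
    """Generic Fermi-function response isotherm (illustrative form, not
    the paper's expression): fractional conductometric response versus
    analyte partial pressure p, centered at p0 with transition width w.
    The response saturates at r_max for p >> p0 and vanishes for p << p0."""
    return r_max / (1.0 + math.exp(-(p - p0) / w))
```

The appeal of such a form for mixed-gas modeling is that the center p0 and width w can be made to depend on a co-adsorbed analyte, capturing the enhancement or suppression of one gas's response by another.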
Effect of a mixed reality-based intervention on arm, hand, and finger function on chronic stroke.
Colomer, Carolina; Llorens, Roberto; Noé, Enrique; Alcañiz, Mariano
2016-05-11
Virtual and mixed reality systems have been suggested to promote motor recovery after stroke. Building on the existing evidence on motor learning, we have developed a portable and low-cost mixed reality tabletop system that transforms a conventional table into a virtual environment for upper limb rehabilitation. The system allows intensive and customized training of a wide range of arm, hand, and finger movements and enables interaction with tangible objects, while providing audiovisual feedback on the participants' performance in gamified tasks. This study evaluates the clinical effectiveness and the acceptance of an experimental intervention with the system in chronic stroke survivors. Thirty individuals with stroke were included in a reversal (A-B-A) study. Phase A consisted of 30 sessions of conventional physical therapy. Phase B consisted of 30 training sessions with the experimental system. Both interventions involved flexion and extension of the elbow, wrist, and fingers, and grasping of different objects. Sessions were 45-min long and were administered three to five days a week. The body structures (Modified Ashworth Scale), functions (Motricity Index, Fugl-Meyer Assessment Scale), activities (Manual Function Test, Wolf Motor Function Test, Box and Blocks Test, Nine Hole Peg Test), and participation (Motor Activity Log) were assessed before and after each phase. Acceptance of the system was also assessed after phase B (System Usability Scale, Intrinsic Motivation Inventory). Significant improvement was detected after the intervention with the system in activity measures, both in arm function measured by the Wolf Motor Function Test (p < 0.01) and finger dexterity measured by the Box and Blocks Test (p < 0.01) and the Nine Hole Peg Test (p < 0.01); and in participation (p < 0.01), which was maintained to the end of the study. The experimental system was reported as highly usable, enjoyable, and motivating.
Our results support the clinical effectiveness of mixed reality interventions that satisfy the motor learning principles for upper limb rehabilitation in chronic stroke survivors. This characteristic, together with the low cost of the system, its portability, and its acceptance could promote the integration of these systems in the clinical practice as an alternative to more expensive systems, such as robotic instruments.
Visualizing the process of interaction in a 3D environment
NASA Astrophysics Data System (ADS)
Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh
2007-03-01
As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze these data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we attempt to show some methods by which user interaction in a virtual reality environment can be visualized, and how this can allow us to gain greater insight into the process of interaction and learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.
Ryason, Adam; Sankaranarayanan, Ganesh; Butler, Kathryn L; DeMoya, Marc; De, Suvranu
2016-08-01
Emergency Cricothyroidotomy (CCT) is a surgical procedure performed to secure a patient's airway. This high-stakes, but seldom-performed procedure is an ideal candidate for a virtual reality simulator to enhance physician training. For the first time, this study characterizes the force/torque characteristics of the cricothyroidotomy procedure, to guide development of a virtual reality CCT simulator for use in medical training. We analyze the upper force and torque thresholds experienced at the human-scalpel interface. We then group individual surgical cuts based on style of cut and cut medium and perform a regression analysis to create two models that allow us to predict the style of cut performed and the cut medium.
Perception and Haptic Rendering of Friction Moments.
Kawasaki, H; Ohtuka, Y; Koide, S; Mouri, T
2011-01-01
This paper considers moments due to friction forces on the human fingertip. A computational technique called the friction moment arc method is presented. The method computes the static and/or dynamic friction moment independently of a friction force calculation. In addition, a new finger holder to display friction moment is presented. This device incorporates a small brushless motor and disk, and connects the user's finger to an interface finger of the five-fingered haptic interface robot HIRO II. Subjects' perception of friction moment while wearing the finger holder, as well as during object manipulation in a virtual reality environment, was evaluated experimentally.
Top-down causation regarding the chemistry-physics interface: a sceptical view.
Scerri, Eric R
2012-02-06
This article examines two influential authors who have addressed the interface between the fields of chemistry and physics and have reached opposite conclusions about whether or not emergence and downward causation represent genuine phenomena. While McLaughlin concludes that emergence is impossible in the light of quantum mechanics, Hendry regards issues connected with the status of molecular structure as supporting emergence. The present author suggests that one should not be persuaded by either of these arguments and pleads for a form of agnosticism over the reality of emergence and downward causation until further studies might be carried out.
VEVI: A Virtual Reality Tool For Robotic Planetary Explorations
NASA Technical Reports Server (NTRS)
Piguet, Laurent; Fong, Terry; Hine, Butler; Hontalas, Phil; Nygren, Erik
1994-01-01
The Virtual Environment Vehicle Interface (VEVI), developed by the NASA Ames Research Center's Intelligent Mechanisms Group, is a modular operator interface for direct teleoperation and supervisory control of robotic vehicles. Virtual environments enable the efficient display and visualization of complex data. This characteristic allows operators to perceive and control complex systems in a natural fashion, utilizing the highly evolved human sensory system. VEVI utilizes real-time, interactive, 3D graphics and position/orientation sensors to produce a range of interface modalities, from flat-panel (windowed or stereoscopic) screen displays to head-mounted, head-tracking stereo displays. The interface provides generic video control capability and has been used to control wheeled, legged, air-bearing, and underwater vehicles in a variety of different environments. VEVI was designed and implemented to be modular, distributed, and easily operated over long-distance communication links, using a communication paradigm called SYNERGY.
The application of autostereoscopic display in smart home system based on mobile devices
NASA Astrophysics Data System (ADS)
Zhang, Yongjun; Ling, Zhi
2015-03-01
Smart home systems, which control home devices, are becoming more and more popular in daily life. Mobile intelligent terminals for smart homes have been developed, making remote control and monitoring possible with smartphones or tablets. Meanwhile, 3D stereoscopic display technology has developed rapidly in recent years. An iPad-based smart home system that adopts an autostereoscopic display as its control interface is therefore proposed to improve the user-friendliness of the experience. In consideration of the iPad's limited hardware capabilities, we introduce a 3D image synthesis method based on parallel processing with the Graphics Processing Unit (GPU), implemented with the OpenGL ES Application Programming Interface (API) library on the iOS platform for real-time autostereoscopic display. Compared to a traditional smart home system, the proposed system, by applying an autostereoscopic display to the smart home control interface, enhances the realism, user-friendliness and visual comfort of the interface.
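A two-view autostereoscopic panel is typically driven by spatially interleaving the left- and right-eye images, per pixel column for a simple parallax-barrier display. The abstract does not give the synthesis code; the sketch below is a minimal CPU-side illustration in Python/NumPy (the function name is hypothetical — the paper's real-time version would do this per subpixel in a GPU shader):

```python
import numpy as np

def interleave_views(left, right):
    """Column-interleave two views for a two-view parallax-barrier
    autostereoscopic display: even pixel columns come from the left
    view, odd columns from the right view."""
    assert left.shape == right.shape
    out = np.empty_like(left)
    out[:, 0::2] = left[:, 0::2]
    out[:, 1::2] = right[:, 1::2]
    return out
```

On-device, the same indexing would run inside an OpenGL ES fragment shader so that the interleaving happens at display rates rather than on the CPU.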
Fusion interfaces for tactical environments: An application of virtual reality technology
NASA Technical Reports Server (NTRS)
Haas, Michael W.
1994-01-01
The term Fusion Interface is defined as a class of interface which integrally incorporates both virtual and nonvirtual concepts and devices across the visual, auditory, and haptic sensory modalities. A fusion interface is a multisensory virtually-augmented synthetic environment. A new facility has been developed within the Human Engineering Division of the Armstrong Laboratory dedicated to exploratory development of fusion interface concepts. This new facility, the Fusion Interfaces for Tactical Environments (FITE) Facility, is a specialized flight simulator enabling efficient concept development through rapid prototyping and direct experience of new fusion concepts. The FITE Facility also supports evaluation of fusion concepts by operational fighter pilots in an air combat environment. The facility is utilized by a multidisciplinary design team composed of human factors engineers, electronics engineers, computer scientists, experimental psychologists, and operational pilots. The FITE computational architecture is composed of twenty-five 80486-based microcomputers operating in real-time. The microcomputers generate out-the-window visuals, in-cockpit and head-mounted visuals, localized auditory presentations, and haptic displays on the stick and rudder pedals, as well as executing weapons models, aerodynamic models, and threat models.
A Hybrid 2D/3D User Interface for Radiological Diagnosis.
Mandalika, Veera Bhadra Harish; Chernoglazov, Alexander I; Billinghurst, Mark; Bartneck, Christoph; Hurrell, Michael A; Ruiter, Niels de; Butler, Anthony P H; Butler, Philip H
2018-02-01
This paper presents a novel 2D/3D desktop virtual reality hybrid user interface for radiology that focuses on improving 3D manipulation required in some diagnostic tasks. An evaluation of our system revealed that our hybrid interface is more efficient for novice users and more accurate for both novice and experienced users when compared to traditional 2D only interfaces. This is a significant finding because it indicates, as the techniques mature, that hybrid interfaces can provide significant benefit to image evaluation. Our hybrid system combines a zSpace stereoscopic display with 2D displays, and mouse and keyboard input. It allows the use of 2D and 3D components interchangeably, or simultaneously. The system was evaluated against a 2D only interface with a user study that involved performing a scoliosis diagnosis task. There were two user groups: medical students and radiology residents. We found improvements in completion time for medical students, and in accuracy for both groups. In particular, the accuracy of medical students improved to match that of the residents.
Transforming Polar Research with Google Glass Augmented Reality (Invited)
NASA Astrophysics Data System (ADS)
Ruthkoski, T.
2013-12-01
Augmented reality is a new technology with the potential to accelerate the advancement of science, particularly in geophysical research. Augmented reality is defined as a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. When paired with advanced computing techniques on cloud resources, augmented reality has the potential to improve data collection techniques, visualizations, as well as in-situ analysis for many areas of research. Google is currently a pioneer of augmented reality technology and has released beta versions of their wearable computing device, Google Glass, to a select number of developers and beta testers. This community of 'Glass Explorers' is the vehicle from which Google shapes the future of their augmented reality device. Example applications of Google Glass in geophysical research range from use as a data gathering interface in harsh climates to an on-site visualization and analysis tool. Early participation in the shaping of the Google Glass device is an opportunity for researchers to tailor this new technology to their specific needs. The purpose of this presentation is to provide geophysical researchers with a hands-on first look at Google Glass and its potential as a scientific tool. Attendees will be given an overview of the technical specifications as well as a live demonstration of the device. Potential applications to geophysical research in polar regions will be the primary focus. The presentation will conclude with an open call to participate, during which attendees may indicate interest in developing projects that integrate Google Glass into their research. [Application mockup: 'Penguin Counter' Google Glass augmented reality device]
Transforming Polar Research with Google Glass Augmented Reality (Invited)
NASA Astrophysics Data System (ADS)
Ramachandran, R.; McEniry, M.; Maskey, M.
2011-12-01
Chemical mixing at “Al on Fe” and “Fe on Al” interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Süle, P.; Horváth, Z. E.; Kaptás, D.
2015-10-07
The chemical mixing at the “Al on Fe” and “Fe on Al” interfaces was studied by molecular dynamics simulations of the layer growth and by ⁵⁷Fe Mössbauer spectroscopy. The concentration distribution along the layer growth direction was calculated for different crystallographic orientations, and atomically sharp “Al on Fe” interfaces were found when Al grows over (001) and (110) oriented Fe layers. The Al/Fe(111) interface is also narrow as compared to the intermixing found at the “Fe on Al” interfaces for any orientation. Conversion electron Mössbauer measurements of trilayers—Al/⁵⁷Fe/Al and Al/⁵⁷Fe/Ag grown simultaneously over Si(111) substrate by vacuum evaporation—support the results of the molecular dynamics calculations.
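A concentration distribution along the growth direction, and an interface-width measure derived from it, can be sketched from MD snapshot data as follows. This is a hypothetical Python/NumPy helper, not the authors' code, and the 10-90% width criterion is one common convention, not necessarily the one used in the paper:

```python
import numpy as np

def concentration_profile(z, species, bins=50):
    """Layer-resolved Al fraction along the growth direction z.
    species is an array of 0 (Fe) or 1 (Al) per atom."""
    edges = np.linspace(z.min(), z.max(), bins + 1)
    idx = np.digitize(z, edges[1:-1])          # bin index 0..bins-1 per atom
    frac = np.array([species[idx == i].mean() if np.any(idx == i) else np.nan
                     for i in range(bins)])
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, frac

def interface_width(centers, frac, lo=0.1, hi=0.9):
    """10-90% width of the Al-fraction profile; a sharp interface
    gives a width comparable to the bin spacing."""
    ok = ~np.isnan(frac)
    c, f = centers[ok], frac[ok]
    order = np.argsort(f, kind="stable")       # profile is ~monotone in z
    return abs(np.interp(hi, f[order], c[order]) - np.interp(lo, f[order], c[order]))
```

A narrow width for “Al on Fe” and a broad one for “Fe on Al” would reproduce, in this simplified picture, the asymmetry the paper reports.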
Perform light and optic experiments in Augmented Reality
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Curticapean, Dan; Javahiraly, Nicolas; Israel, Kai
2015-10-01
In many scientific studies, lens experiments are part of the curriculum. The conducted experiments are meant to give students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware such as an optical bench, light sources, apertures and different lens types. It is therefore not possible for students to conduct any of the experiments outside the university's laboratory. Simple optical software simulators enabling students to virtually perform lens experiments already exist, but are mostly desktop or web-browser based. Augmented Reality (AR) is a special case of mediated and mixed reality concepts, in which computers are used to add, subtract or modify one's perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality can readily be used to visualize a simulated optical bench. The students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index and the positions of the instruments in space. Light rays can be visualized to promote an additional understanding of the laws of optics. An AR application like this is ideally suited to prepare for the actual laboratory sessions and/or recap the teaching content. The authors will present their experience with handheld augmented reality applications and their possibilities for light and optics experiments without the need for specialized optical hardware.
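The core of any such virtual optical bench is the thin-lens equation, 1/f = 1/d_o + 1/d_i. A minimal Python sketch of the calculation an AR simulator would perform per lens (function names are ours; a full simulator would trace rays through each element in sequence):

```python
def thin_lens_image_distance(f, d_o):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image
    distance d_i. All distances in the same units; d_o > 0 is the
    object distance in front of the lens."""
    if d_o == f:
        return float('inf')   # object at the focal point: image at infinity
    return 1.0 / (1.0 / f - 1.0 / d_o)

def magnification(d_o, d_i):
    """Transverse magnification m = -d_i / d_o (negative: inverted image)."""
    return -d_i / d_o
```

For example, a converging lens with f = 10 cm and an object at 30 cm yields a real, inverted image at 15 cm with magnification -0.5.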
Ingress in Geography: Portals to Academic Success?
ERIC Educational Resources Information Center
Davis, Michael
2017-01-01
Niantic Labs has developed an augmented virtual reality mobile app game called Ingress in which agents must seek out and control locations for their designated factions. The app uses the Google Maps interface along with GPS to enhance a geocaching-like experience with elements of other classical games such as capture-the-flag. This study aims to…
A Head in Virtual Reality: Development of A Dynamic Head and Neck Model
ERIC Educational Resources Information Center
Nguyen, Ngan; Wilson, Timothy D.
2009-01-01
Advances in computer and interface technologies have made it possible to create three-dimensional (3D) computerized models of anatomical structures for visualization, manipulation, and interaction in a virtual 3D environment. In the past few decades, a multitude of digital models have been developed to facilitate complex spatial learning of the…
Nonconvective mixing of miscible ionic liquids.
Frost, Denzil S; Machas, Michael; Perea, Brian; Dai, Lenore L
2013-08-13
Ionic liquids (ILs) are ionic compounds that are liquid at room temperature. We studied the spontaneous mixing behavior between two ILs, ethylammonium nitrate (EAN) and 1-butyl-3-methylimidazolium hexafluorophosphate ([BMIM][PF6]), and observed notable phenomena. Experimental studies showed that the interface between the two ILs was unusually long-lived, despite the ILs being miscible with one another. Molecular dynamics (MD) simulations supported these findings and provided insight into the micromixing behavior of the ILs. We found that not only did the ions experience slow diffusion as they mixed, but they also exhibited significant ordering into distinct regions. We suspect that this ordering disrupted concentration gradients in the direction normal to the interface, thus hindering diffusion in this direction and allowing the macroscopic interface to remain for long periods of time. Intermolecular interactions responsible for this behavior included the O-NH interaction between the EAN ions and the carbon chain-carbon chain interactions between the [BMIM](+) cations, which associate more strongly in the mixed state than in the pure IL state.
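The macroscopic picture of hindered interdiffusion can be illustrated with a one-dimensional diffusion model: a smaller effective diffusivity normal to the interface keeps the concentration step sharp for longer. A minimal explicit finite-difference sketch in Python (purely illustrative of Fickian spreading, not the MD methodology of the paper):

```python
import numpy as np

def diffuse_1d(c, D, dx, dt, steps):
    """Explicit finite-difference solution of dc/dt = D d2c/dx2 with
    no-flux boundaries. A smaller D keeps an initial concentration
    step (the 'interface') sharp for longer."""
    c = np.array(c, dtype=float)
    r = D * dt / dx**2
    assert r <= 0.5, "stability limit of the explicit scheme"
    for _ in range(steps):
        # RHS is evaluated from the current field before the in-place add
        c[1:-1] += r * (c[2:] - 2.0 * c[1:-1] + c[:-2])
        c[0], c[-1] = c[1], c[-2]   # zero-gradient (no-flux) boundaries
    return c
```

Starting from a sharp step, the profile relaxes toward the familiar error-function shape with width growing roughly as the square root of D·t; the paper's observation is that the effective D normal to the interface is anomalously small.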
Using HT and DT gamma rays to diagnose mix in Omega capsule implosions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, M. J.; Herrmann, H. W.; Kim, Y. H.
Experimental evidence [1] indicates that shell material can be driven into the core of Omega capsule implosions on the same time scale as the initial convergent shock. It has been hypothesized that shock-generated temperatures at the fuel/shell interface in thin exploding pusher capsules diffusively drive shell material into the gas core between the time of shock passage and bang time. Here, we propose a method to temporally resolve and observe the evolution of shell material into the capsule core as a function of fuel/shell interface temperature (which can be varied by varying the capsule shell thickness). Our proposed method uses a CD plastic capsule filled with 50/50 HT gas and diagnosed using gas Cherenkov detection (GCD) to temporally resolve both the HT "clean" and DT "mix" gamma ray burn histories. Simulations using Hydra [2] for an Omega CD-lined capsule with a sub-micron layer of the inside surface of the shell pre-mixed into a fraction of the gas region produce gamma reaction history profiles that are sensitive to the depth to which this material is mixed. Furthermore, an experiment to observe these differences as a function of capsule shell thickness is proposed to determine whether interface mixing is consistent with thermal diffusion (λ_ii ~ T²/(Z²ρ)) at the gas/shell interface. Finally, since hydrodynamic mixing from shell perturbations, such as the mounting stalk and glue, could complicate these types of capsule-averaged temporal measurements, simulations including their effects have also been performed, showing minimal perturbation of the hot spot geometry.
Detection and use of HT and DT gamma rays to diagnose mix in ICF capsules
NASA Astrophysics Data System (ADS)
Schmitt, M. J.; Kim, Y. H.; Herrmann, H. W.; McEvoy, A. M.; Zylstra, A.; Leatherland, A.; Gales, S.
2015-11-01
Recent results from Omega capsule implosion experiments containing HT-rich gas mixtures indicate that the 19.8 MeV gamma ray from aneutronic HT fusion can be measured using existing time-resolved gas Cherenkov detectors (GCDs). Additional dedicated experiments to characterize HT-γ emission in ICF experiments have already been planned. The concurrent temporally-resolved measurement of both HT-γs and DT-γs opens the door for in-depth exploration of interface mix in gas-filled ICF capsules. We propose a method to temporally resolve and observe the evolution of shell material into the capsule core as a function of fuel/shell interface temperature (which can be varied by varying the capsule shell thickness). Our proposed method uses a CD-lined plastic capsule filled with 50/50 HT gas and diagnosed using GCDs to temporally resolve both the HT "clean" and DT "mix" gamma ray burn histories. It will be shown that these burn history profiles are sensitive to the depth to which shell material mixes into the gas region. An experiment to observe these differences as a function of capsule shell thickness is proposed to determine whether interface mixing is consistent with thermal diffusion (λ_ion ~ T_ion²/(Z_ion²ρ)) at the gas/shell interface. Since hydrodynamic mixing from shell perturbations, such as the mounting stalk and glue, could complicate these types of capsule-averaged temporal measurements, simulations including their effects will also be shown. This research was supported by the US DOE/NNSA and performed in part at LANL, operated by LANS LLC under contract DE-AC52-06NA25396.
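The thermal-diffusion criterion quoted in these abstracts scales the ion-ion mean free path as λ_ion ∝ T²/(Z²ρ). A small helper makes the scaling explicit relative to a reference state (purely illustrative; all numerical prefactors are omitted, so only ratios are meaningful):

```python
def ion_mfp_scaling(T, Z, rho, T0, Z0, rho0):
    """Relative ion-ion mean free path under the scaling
    lambda_ii ~ T^2 / (Z^2 * rho), normalized to a reference state
    (T0, Z0, rho0). Returns lambda / lambda_ref."""
    return (T / T0) ** 2 * (Z0 / Z) ** 2 * (rho0 / rho)
```

For instance, doubling the interface temperature at fixed charge state and density quadruples the relative mean free path, which is why varying the shell thickness (and hence the interface temperature) is proposed as the experimental lever.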
ERIC Educational Resources Information Center
Fitzgibbons, Megan; Meert, Deborah
2010-01-01
The use of bibliographic management software and its internal search interfaces is now pervasive among researchers. This study compares the results between searches conducted in academic databases' search interfaces versus the EndNote search interface. The results show mixed search reliability, depending on the database and type of search…
Biocybrid systems and the re-engineering of life
NASA Astrophysics Data System (ADS)
Domingues, Diana; Ferreira da Rocha, Adson; Hamdan, Camila; Augusto, Leci; Miosso, Cristiano Jacques
2011-03-01
The re-engineering of life, expanded by perceptual experiences of presence in Virtual Reality and Augmented Reality, is the theme of our investigation into collaborative practices, confirming that the artist's creativity is close to the inventiveness of the scientist, with a mutual capacity for the generation of biocybrid systems. We consider enactive bodily interfaces for human existence co-located in the continuum and symbiotic zone between body and flesh, cyberspace and data, and the hybrid properties of the physical world. That continuum generates a biocybrid (bio + cyber + hybrid) zone in which life is reinvented. Results reaffirm the creative reality of the coupled body and its mutual influences with environmental information, extending James Gibson's ecological perception theory. Ecosystem life, in its dynamic relations between humans, animals, plants, landscapes, urban life and objects, raises questions and challenges for artworks and for the re-engineering of life discussed in our technoscience artworks. Finally, we describe an implementation in which the immersion experience is enhanced by the data visualization of biological audio signals and by wearable miniaturized devices for biofeedback.
Implementing virtual reality interfaces for the geosciences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, W.; Jacobsen, J.; Austin, A.
1996-06-01
For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum, running the gamut from command-line interfaces to completely immersive environments. In the former, one uses the keyboard to enter three- or six-dimensional parameters. In the latter, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g. attached to a head-mounted display. Rich, extensible and often complex languages are a vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is the obstacle in that typing is cumbersome, error-prone and typically slow. By contrast, in an immersive environment the user can interact with these parameters by means of motor skills which are highly developed. Two specific geoscience application areas will be highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.
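The rake tool described above seeds several streamlines along a line segment positioned by the user, each integrated through the velocity field. A generic sketch (not the LBNL implementation) using classical fourth-order Runge-Kutta in Python:

```python
import numpy as np

def streamline(velocity, seed, h=0.05, n_steps=200):
    """Integrate one streamline through a steady velocity field using
    classical 4th-order Runge-Kutta. velocity(p) -> np.array of the
    same dimension as p."""
    p = np.asarray(seed, dtype=float)
    path = [p.copy()]
    for _ in range(n_steps):
        k1 = velocity(p)
        k2 = velocity(p + 0.5 * h * k1)
        k3 = velocity(p + 0.5 * h * k2)
        k4 = velocity(p + h * k3)
        p = p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        path.append(p.copy())
    return np.array(path)

def rake(velocity, start, end, n_seeds=5, **kw):
    """A 'rake' seeds several streamlines along a line segment, as when
    the user positions the virtual-well icon in the VR scene."""
    seeds = np.linspace(start, end, n_seeds)
    return [streamline(velocity, s, **kw) for s in seeds]
```

In the VR setting, the six-degree-of-freedom tracker supplies `start` and `end` directly from the position and orientation of the hand-held device, replacing keyboard entry of the same parameters.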
GLIMPSE: Google Glass interface for sensory feedback in myoelectric hand prostheses
NASA Astrophysics Data System (ADS)
Markovic, Marko; Karnal, Hemanth; Graimann, Bernhard; Farina, Dario; Dosen, Strahinja
2017-06-01
Objective. Providing sensory feedback to the user of the prosthesis is an important challenge. The common approach is to use tactile stimulation, which is easy to implement but requires training and has limited information bandwidth. In this study, we propose an alternative approach based on augmented reality. Approach. We have developed the GLIMPSE, a Google Glass application which connects to the prosthesis via a Bluetooth interface and renders the prosthesis states (EMG signals, aperture, force and contact) using augmented reality (see-through display) and sound (bone conduction transducer). The interface was tested in healthy subjects that used the prosthesis with (FB group) and without (NFB group) feedback during a modified clothespins test that allowed us to vary the difficulty of the task. The outcome measures were the number of unsuccessful trials, the time to accomplish the task, and the subjective ratings of the relevance of the feedback. Main results. There was no difference in performance between FB and NFB groups in the case of a simple task (basic, same-color clothespins test), but the feedback significantly improved the performance in a more complex task (pins of different resistances). Importantly, the GLIMPSE feedback did not increase the time to accomplish the task. Therefore, the supplemental feedback might be useful in the tasks which are more demanding, and thereby less likely to benefit from learning and feedforward control. The subjects integrated the supplemental feedback with the intrinsic sources (vision and muscle proprioception), developing their own idiosyncratic strategies to accomplish the task. Significance. The present study demonstrates a novel self-contained, ready-to-deploy, wearable feedback interface. The interface was successfully tested and was proven to be feasible and functionally beneficial. 
The GLIMPSE can be used as a practical solution but also as a general and flexible instrument to investigate closed-loop prosthesis control.
Mouraviev, Vladimir; Klein, Martina; Schommer, Eric; Thiel, David D; Samavedi, Srinivas; Kumar, Anup; Leveillee, Raymond J; Thomas, Raju; Pow-Sang, Julio M; Su, Li-Ming; Mui, Engy; Smith, Roger; Patel, Vipul
2016-03-01
In pursuit of improving the quality of residents' education, the Southeastern Section of the American Urological Association (SES AUA) hosts an annual robotic training course for its residents. The workshop involves performing a robotic live porcine nephrectomy as well as virtual reality robotic training modules. The aim of this study was to evaluate workload levels of urology residents when performing a live porcine nephrectomy and the virtual reality robotic surgery training modules employed during this workshop. Twenty-one residents from 14 SES AUA programs participated in 2015. On the first day, residents were taught through didactic lectures by faculty. On the second day, trainees were divided into two groups. Half were asked to perform training modules on the Mimic da Vinci-Trainer (MdVT, Mimic Technologies, Inc., Seattle, WA, USA) for 4 h, while the other half performed nephrectomy procedures on a live porcine model using the da Vinci Si robot (Intuitive Surgical Inc., Sunnyvale, CA, USA). After the first 4 h the groups changed places for another 4-h session. All trainees were asked to complete the one-page NASA-TLX questionnaire following both the MdVT simulation and live animal model sessions. A significant interface-by-TLX interaction was observed, and was further analyzed to determine whether the scores of each of the six TLX scales varied across the two interfaces. The means of the TLX scores observed at the two interfaces were similar. The only significant difference was observed for frustration, which was significantly higher for the simulation than for the animal model, t(20) = 4.12, p = 0.001. This could be due to trainees' familiarity with live anatomical structures compared with skill-set simulations, which remain a real challenge to novice surgeons. Another reason might be that the simulator provides performance metrics for specific performance traits as well as composite scores for entire exercises.
Novice trainees experienced substantial mental workload while performing tasks on both the simulator and the live animal model during the robotics course. The NASA-TLX profiles demonstrated that the live animal model and the MdVT were similar in difficulty, as indicated by their comparable workload profiles.
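The NASA-TLX instrument used above combines six subscale ratings (mental demand, physical demand, temporal demand, performance, effort, frustration), either unweighted ("Raw TLX") or weighted by the 15 pairwise comparisons of the full procedure. A small Python sketch of both variants (helper names are ours, not from the study):

```python
SCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def raw_tlx(ratings):
    """Raw NASA-TLX: unweighted mean of the six 0-100 subscale ratings."""
    return sum(ratings[s] for s in SCALES) / len(SCALES)

def weighted_tlx(ratings, weights):
    """Weighted NASA-TLX: each subscale is weighted by the number of
    times it was chosen in the 15 pairwise comparisons, so the weights
    must sum to 15."""
    assert sum(weights.values()) == 15
    return sum(ratings[s] * weights[s] for s in SCALES) / 15.0
```

Comparing per-scale scores across the two interfaces, as the study does for frustration, amounts to a paired comparison of the individual `ratings[s]` values rather than of the composite score.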
Energy and water vapor transport across a simplified cloud-clear air interface
NASA Astrophysics Data System (ADS)
Gallana, L.; Di Savino, S.; De Santi, F.; Iovieno, M.; Tordella, D.
2014-11-01
We consider a simplified physics of the cloud interface in which condensation, evaporation and radiation are neglected, and momentum, thermal energy and water vapor transport are represented in terms of the Boussinesq model coupled to a passive scalar transport equation for the vapor. The interface is modeled as a layer separating two isotropic turbulent regions with different kinetic energy and vapor concentration. In particular, we focus on the small-scale part of the inertial range of the atmospheric boundary layer as well as on the dissipative range of scales, which are important to the microphysics of warm clouds. We have numerically investigated stably stratified interfaces by locally perturbing the standard temperature lapse rate at the cloud interface at an initial instant and then observing the temporal evolution of the system. When the buoyancy term becomes of the same order as the inertial one, we observe a spatial redistribution of the kinetic energy which produces a concomitant pit of kinetic energy within the mixing layer. In this situation, the mixing layer contains two interfacial regions with opposite kinetic energy gradients, which in turn produce two intermittent sublayers in the velocity fluctuation field. This changes the structure of the field with respect to the corresponding non-stratified shearless mixing: the communication between the two turbulent regions is weak, and the growth of the mixing layer stops. These results are discussed with respect to Large Eddy Simulation data for planetary boundary layers.
Yetisen, Ali K; Martinez-Hurtado, Juan Leonardo; Ünal, Barış; Khademhosseini, Ali; Butt, Haider
2018-06-11
Wearables as medical technologies are becoming an integral part of personal analytics, measuring physical status, recording physiological parameters, or informing medication schedules. These continuously evolving technology platforms not only promise to help people pursue a healthier lifestyle, but also provide continuous medical data for actively tracking metabolic status, diagnosis, and treatment. Advances in the miniaturization of flexible electronics, electrochemical biosensors, microfluidics, and artificial intelligence algorithms have led to wearable devices that can generate real-time medical data within the Internet of Things. These flexible devices can be configured to make conformal contact with epidermal, ocular, intracochlear, and dental interfaces to collect biochemical or electrophysiological signals. This article discusses consumer trends in wearable electronics, commercial and emerging devices, and fabrication methods. It also reviews real-time monitoring of vital signs using biosensors, stimuli-responsive materials for drug delivery, and closed-loop theranostic systems. It covers future challenges in augmented, virtual, and mixed reality, communication modes, energy management, displays, conformity, and data safety. The development of patient-oriented wearable technologies and their incorporation in randomized clinical trials will facilitate the design of safe and effective approaches. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
An Examination of the Evolution of Radiation and Advection Fogs
1993-01-01
and fog diagnostic and prediction models have developed in sophistication so that they can reproduce fairly accurate one- or two-dimensional...occurred only by molecular diffusion near the interface created between the species during the mixing process. The rate of homogenization is minimal until...of excess vapor by molecular diffusion at the interfaces of nearly saturated air mixing in eddies is faster than the relaxation time of droplet
Microstructure and Hydrogen-Induced Failure Mechanisms in Fe and Ni Alloy Weldments
NASA Astrophysics Data System (ADS)
Fenske, J. A.; Robertson, I. M.; Ayer, Raghavan; Hukle, Martin; Lillig, Dan; Newbury, Brian
2012-09-01
The microstructure and fracture morphology of AISI 8630-IN625 and ASTM A182-F22-IN625 dissimilar metal weld interfaces were compared and contrasted as a function of postweld heat treatment (PWHT) duration. For both systems, the microstructure along the weld interface consisted of a coarse grain heat-affected zone in the Fe-base metal followed by discontinuous martensitic partially mixed zones and a continuous partially mixed zone on the Ni side of the fusion line. Within the partially mixed zone on the Ni side, there exists a 200-nm-wide transition zone within a 20- μm-wide planar solidification region followed by a cellular dendritic region with Nb-Mo-rich carbides decorating the dendrite boundaries. Although there were differences in the volume of the partially mixed zones, the major difference in the metal weld interfaces was the presence of M7C3 precipitates in the planar solidification region, which had formed in AISI 8630-IN625 but not in ASTM A182-F22-IN625. These precipitates make the weldment more susceptible to hydrogen embrittlement and provide a low energy fracture path between the discontinuous partially mixed zones.
2011-01-18
JSC2011-E-003204 (18 Jan. 2011) --- NASA astronauts Rex Walheim, STS-135 mission specialist; and Mike Fossum (foreground), Expedition 28 flight engineer and Expedition 29 commander; use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. STS-135 is planned to be the final mission of the space shuttle program. Photo credit: NASA
Aharon, S; Robb, R A
1997-01-01
Virtual reality environments provide highly interactive, natural control of the visualization process, significantly enhancing the scientific value of the data produced by medical imaging systems. Due to the computational and real-time display update requirements of virtual reality interfaces, however, the complexity of organ and tissue surfaces which can be displayed is limited. In this paper, we present a new algorithm for the production of a polygonal surface containing a pre-specified number of polygons from patient- or subject-specific volumetric image data. The advantage of this new algorithm is that it effectively tiles complex structures with a specified number of polygons selected to optimize the trade-off between surface detail and real-time display rates.
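The detail-versus-frame-rate trade-off described above can be illustrated with a generic mesh-simplification technique. The following sketch is not the authors' algorithm; it is a minimal vertex-clustering decimation in Python, which reduces the polygon count by merging vertices that fall in the same grid cell (a coarser cell yields fewer polygons and faster display at the cost of surface detail).

```python
import numpy as np

def cluster_decimate(verts, tris, cell):
    # Snap vertices to a uniform grid of spacing `cell`, merge vertices that
    # share a cell (replacing each cluster by its centroid), and drop
    # triangles made degenerate by the merge.
    keys = np.floor((verts - verts.min(axis=0)) / cell).astype(int)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    n = inv.max() + 1
    counts = np.bincount(inv, minlength=n).astype(float)
    new_verts = np.stack(
        [np.bincount(inv, weights=verts[:, d], minlength=n) / counts
         for d in range(3)], axis=1)
    new_tris = inv[tris]
    keep = ((new_tris[:, 0] != new_tris[:, 1]) &
            (new_tris[:, 1] != new_tris[:, 2]) &
            (new_tris[:, 0] != new_tris[:, 2]))
    return new_verts, new_tris[keep]

# demo: a 5x5 planar grid (25 vertices, 32 triangles) reduced with a coarse cell
verts = np.array([[x, y, 0.0] for y in range(5) for x in range(5)])
tris = np.array([[y * 5 + x, y * 5 + x + 1, y * 5 + x + 5]
                 for y in range(4) for x in range(4)] +
                [[y * 5 + x + 1, y * 5 + x + 6, y * 5 + x + 5]
                 for y in range(4) for x in range(4)])
coarse_verts, coarse_tris = cluster_decimate(verts, tris, cell=2.0)
```

In practice, `cell` would be tuned until the output polygon count meets the display budget, which is the knob the abstract's algorithm sets directly.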
A virtual reality environment for telescope operation
NASA Astrophysics Data System (ADS)
Martínez, Luis A.; Villarreal, José L.; Ángeles, Fernando; Bernal, Abel
2010-07-01
Astronomical observatories and telescopes are becoming increasingly large and complex systems, demanding to any potential user the acquirement of great amount of information previous to access them. At present, the most common way to overcome that information is through the implementation of larger graphical user interfaces and computer monitors to increase the display area. Tonantzintla Observatory has a 1-m telescope with a remote observing system. As a step forward in the improvement of the telescope software, we have designed a Virtual Reality (VR) environment that works as an extension of the remote system and allows us to operate the telescope. In this work we explore this alternative technology that is being suggested here as a software platform for the operation of the 1-m telescope.
ERIC Educational Resources Information Center
Friedenthal-Haase, Martha
1993-01-01
A literature review of adult education and Anglo-German connections, 1880-1933, showed that interculturality played an essential role in identity formation at the interface of culture and politics. Adult education developed as an autonomous area of international communication between British and German citizens despite political realities. (SK)
Virtual Teleoperation for Unmanned Aerial Vehicles
2012-01-24
Gilbert, S., “Wayfinder: Evaluating Multitouch Interaction in Supervisory Control of Unmanned Vehicles,” Proceedings of ASME 2nd World Conference on... interactive virtual reality environment that fuses available information into a coherent picture that can be viewed from multiple perspectives and scales...for multimodal interaction • Generally abstracted controller hardware and graphical interfaces facilitating deployment on a variety of VR platform
Luu, Trieu Phat; He, Yongtian; Brown, Samuel; Nakagome, Sho; Contreras-Vidal, Jose L.
2016-01-01
The control of human bipedal locomotion is of great interest to the field of lower-body brain computer interfaces (BCIs) for rehabilitation of gait. While the feasibility of a closed-loop BCI system for the control of a lower body exoskeleton has been recently shown, multi-day closed-loop neural decoding of human gait in a virtual reality (BCI-VR) environment has yet to be demonstrated. In this study, we propose a real-time closed-loop BCI that decodes lower limb joint angles from scalp electroencephalography (EEG) during treadmill walking to control the walking movements of a virtual avatar. Moreover, virtual kinematic perturbations resulting in asymmetric walking gait patterns of the avatar were also introduced to investigate gait adaptation using the closed-loop BCI-VR system over a period of eight days. Our results demonstrate the feasibility of using a closed-loop BCI to learn to control a walking avatar under normal and altered visuomotor perturbations, which involved cortical adaptations. These findings have implications for the development of BCI-VR systems for gait rehabilitation after stroke and for understanding cortical plasticity induced by a closed-loop BCI system. PMID:27713915
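The decoding step above (scalp EEG to lower limb joint angles) can be sketched, in spirit only, as a linear regression from multichannel signals to a continuous kinematic variable. The paper's actual decoder may differ substantially; this is a minimal ridge-regression example on synthetic data, with all names (`eeg`, `knee_angle`) illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_ch = 2000, 8                        # time samples x EEG channels (synthetic)
eeg = rng.standard_normal((T, n_ch))
w_true = rng.standard_normal(n_ch)       # hidden linear mapping for the demo
knee_angle = eeg @ w_true + 0.1 * rng.standard_normal(T)

def ridge_fit(X, y, lam=1e-2):
    # Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w = ridge_fit(eeg, knee_angle)
decoded = eeg @ w
r = np.corrcoef(decoded, knee_angle)[0, 1]   # decoding accuracy as correlation
```

In a closed-loop BCI-VR system, `decoded` would drive the avatar's joint in real time, and the visuomotor perturbations described above would be applied between the decoder output and the avatar.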
An elevated plus-maze in mixed reality for studying human anxiety-related behavior.
Biedermann, Sarah V; Biedermann, Daniel G; Wenzlaff, Frederike; Kurjak, Tim; Nouri, Sawis; Auer, Matthias K; Wiedemann, Klaus; Briken, Peer; Haaker, Jan; Lonsdorf, Tina B; Fuss, Johannes
2017-12-21
A dearth of laboratory tests to study actual human approach-avoidance behavior has complicated translational research on anxiety. The elevated plus-maze (EPM) is the gold standard to assess approach-avoidance behavior in rodents. Here, we translated the EPM to humans using mixed reality through a combination of virtual and real-world elements. In two validation studies, we observed participants' anxiety on a behavioral, physiological, and subjective level. Participants reported higher anxiety on open arms, avoided open arms, and showed an activation of endogenous stress systems. Participants with high anxiety exhibited greater avoidance. Moreover, open arm avoidance was moderately predicted by participants' acrophobia and sensation seeking, with opposing influences. In a randomized, double-blind, placebo-controlled experiment, GABAergic stimulation decreased avoidance of open arms while alpha-2-adrenergic antagonism increased avoidance. These findings demonstrate cross-species validity of open arm avoidance as a translational measure of anxiety. We thus introduce the first ecologically valid assay to track actual human approach-avoidance behavior under laboratory conditions.
Collective motion patterns of swarms with delay coupling: Theory and experiment.
Szwaykowska, Klementyna; Schwartz, Ira B; Mier-Y-Teran Romero, Luis; Heckman, Christoffer R; Mox, Dan; Hsieh, M Ani
2016-03-01
The formation of coherent patterns in swarms of interacting self-propelled autonomous agents is a subject of great interest in a wide range of application areas, ranging from engineering and physics to biology. In this paper, we model and experimentally realize a mixed-reality large-scale swarm of delay-coupled agents. The coupling term is modeled as a delayed communication relay of position. Our analyses, assuming agents communicating over an Erdös-Renyi network, demonstrate the existence of stable coherent patterns that can be achieved only with delay coupling and that are robust to decreasing network connectivity and heterogeneity in agent dynamics. We also show how the bifurcation structure for emergence of different patterns changes with heterogeneity in agent acceleration capabilities and limited connectivity in the network as a function of coupling strength and delay. Our results are verified through simulation as well as preliminary experimental results of delay-induced pattern formation in a mixed-reality swarm.
Tele-auscultation support system with mixed reality navigation.
Hori, Kenta; Uchida, Yusuke; Kan, Tsukasa; Minami, Maya; Naito, Chisako; Kuroda, Tomohiro; Takahashi, Hideya; Ando, Masahiko; Kawamura, Takashi; Kume, Naoto; Okamoto, Kazuya; Takemura, Tadamasa; Yoshihara, Hiroyuki
2013-01-01
The aim of this research is to develop an information support system for tele-auscultation. In auscultation, a doctor needs to understand the conditions under which the stethoscope is applied, in addition to hearing the auscultatory sounds. The proposed system adds an intuitive navigation system for stethoscope operation to the conventional audio streaming of auscultatory sounds and the conventional video conferencing system used for telecommunication. Mixed reality technology is applied for intuitive navigation of the stethoscope. Information such as position, contact condition, and breath is overlaid on a view of the patient's chest. The contact condition of the stethoscope is measured by e-textile contact sensors, and the breath by a band-type breath sensor. In a simulated tele-auscultation experiment, the stethoscope with the contact sensors and the breath sensor was evaluated. The results show that the presentation of the contact condition was not clear enough to guide stethoscope handling, whereas the time series of breath phases did allow the remote doctor to understand the patient's breathing condition.
Koppel, Ross; Kuziemsky, Craig
2017-01-01
Usability of health information technology (HIT), if considered at all, is usually focused on individual providers, settings and vendors. However, in light of transformative models of healthcare delivery such as collaborative care delivery that crosses providers and settings, we need to think of usability as a collective and constantly emerging process. To address this new reality we develop a matrix of usability that spans several dimensions and contexts, incorporating differing vendors, users, settings, disciplines, and display configurations. The matrix, while conceptual, extends existing work by providing the means for discussion of usability issues and needs beyond one setting and one user type.
Modeling and Analysis of Mixed Synchronous/Asynchronous Systems
NASA Technical Reports Server (NTRS)
Driscoll, Kevin R.; Madl, Gabor; Hall, Brendan
2012-01-01
Practical safety-critical distributed systems must integrate safety-critical and non-critical data in a common platform. Safety-critical systems almost always consist of isochronous components that have synchronous or asynchronous interfaces with other components. Many of these systems also support a mix of synchronous and asynchronous interfaces. This report presents a study on the modeling and analysis of asynchronous, synchronous, and mixed synchronous/asynchronous systems. We build on the SAE Architecture Analysis and Design Language (AADL) to capture architectures for analysis. We present preliminary work targeted to capture mixed low- and high-criticality data, as well as real-time properties in a common Model of Computation (MoC). An abstract, but representative, test specimen system was created as the system to be modeled.
Turbulent mixing induced by Richtmyer-Meshkov instability
NASA Astrophysics Data System (ADS)
Krivets, V. V.; Ferguson, K. J.; Jacobs, J. W.
2017-01-01
Richtmyer-Meshkov instability is studied in shock tube experiments with an Atwood number of 0.7. The interface is formed in a vertical shock tube using opposed gas flows, and three-dimensional random initial interface perturbations are generated by vertical oscillation of the gas column, producing Faraday waves. Planar laser Mie scattering is used for flow visualization and for measurements of the mixing process. Experimental image sequences are recorded at 6 kHz and processed to obtain the time-dependent variation of the integral mixing layer width. Measurements of the mixing layer width are compared with Mikaelian's [1] model in order to extract the growth exponent θ; a fairly wide range of values is found, varying from θ ≈ 0.2 to 0.6.
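Extracting a growth exponent of the kind reported above is commonly done by fitting a power law h(t) = a·t^θ to the measured mixing-layer width. The paper compares against Mikaelian's model rather than a bare power law, so the following is only an illustrative log-log least-squares fit on synthetic data.

```python
import numpy as np

def growth_exponent(t, h):
    # Fit h(t) = a * t**theta by least squares in log-log space:
    # log h = theta * log t + log a
    A = np.vstack([np.log(t), np.ones_like(t)]).T
    theta, log_a = np.linalg.lstsq(A, np.log(h), rcond=None)[0]
    return theta, np.exp(log_a)

# synthetic mixing-layer record with theta = 0.4 (within the reported 0.2-0.6 range)
t = np.linspace(1.0, 10.0, 50)
h = 2.0 * t ** 0.4
theta, a = growth_exponent(t, h)
```

With noisy experimental widths, the fitted θ becomes sensitive to the chosen time origin and fit window, which is one reason the reported values span a fairly wide range.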
de Miguel, Gustavo; Martín-Romero, María T; Pedrosa, José M; Muñoz, Eulogia; Pérez-Morales, Marta; Richardson, Tim H; Camacho, Luis
2008-03-21
In this paper, the different aggregation modes of a water-insoluble porphyrin (EHO) mixed with an amphiphilic calix[8]arene (C8A), at the air-water interface and in Langmuir-Blodgett (LB) film form, are analyzed as a function of the mixed composition. The strategy used to control the EHO aggregation has consisted of preparing mixed thin films containing EHO and C8A, in different ratios, at the air-water interface. Therefore, the increase of the C8A molar ratio in the mixed film diminishes the aggregation of the EHO molecules, although such an effect must be exclusively related to the dilution of the porphyrin. The reflection spectra of the mixed C8A-EHO films registered at the air-water interface, show a complex Soret band exhibiting splitting, hypochromicity and broadening features. Also, during the transfer process at high surface pressure, it has been shown that the EHO molecules are ejected from the C8A monolayer and only a fraction of porphyrin is transferred to the solid support, in spite of a complete transfer for the C8A matrix. The complex structure of the reflection spectra at the air-water interface, as well as the polarization dependence of the absorption spectra for the mixed LB films, indicate the existence of four different arrangements for the EHO hosted in the C8A matrix. The aggregate formation is governed by two factors: the attraction between the porphyrin rings which minimizes their separation, and the alkyl chain interactions, that is, hydrophobic effect and/or steric hindrance which determine and restrict the possible aggregation structures. By using the extended dipole model, the assignment of the spectral peaks observed to different EHO aggregates is shown.
NASA Astrophysics Data System (ADS)
Yeckel, Andrew; Derby, Jeffrey J.
2000-02-01
Three-dimensional axisymmetric, time-dependent simulations of the high-pressure vertical Bridgman growth of large-diameter cadmium zinc telluride are performed to study the effect of accelerated crucible rotation (ACRT) on crystal growth dynamics. The model includes details of heat transfer, melt convection, solid-liquid interface shape, and dilute zinc segregation. Application of ACRT greatly improves mixing in the melt, but causes an overall increased deflection of the solid-liquid interface. The flow exhibits a Taylor-Görtler instability at the crucible sidewall, which further enhances melt mixing. The rate of mixing depends strongly on the length of the ACRT cycle, with an optimum half-cycle length between 2 and 4 Ekman time units. Significant melting of the crystal occurs during a portion of the rotation cycle, caused by periodic reversal of the secondary flow at the solid-liquid interface, indicating the possibility of compositional striations.
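The optimum half-cycle length quoted above is expressed in Ekman time units. As a rough sketch, one common definition of the Ekman (spin-up) time in rotating-fluid problems is t_E = H / sqrt(ν·Ω); the paper may use a different normalization, and the numbers here are placeholders, not CZT melt properties.

```python
import math

def ekman_time(H, nu, omega):
    # Classic rotating-fluid spin-up timescale: t_E = H / sqrt(nu * omega)
    # H: melt height (m), nu: kinematic viscosity (m^2/s), omega: rotation rate (rad/s)
    return H / math.sqrt(nu * omega)

def acrt_half_cycle_window(H, nu, omega):
    # The optimum reported above: half-cycle length between 2 and 4 Ekman units
    tE = ekman_time(H, nu, omega)
    return 2.0 * tE, 4.0 * tE

# placeholder values for illustration only
tE = ekman_time(0.1, 1e-7, 1.0)
lo, hi = acrt_half_cycle_window(0.1, 1e-7, 1.0)
```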
Mixed-initiative control of intelligent systems
NASA Technical Reports Server (NTRS)
Borchardt, G. C.
1987-01-01
Mixed-initiative user interfaces provide a means by which a human operator and an intelligent system may collectively share the task of deciding what to do next. Such interfaces are important to the effective utilization of real-time expert systems as assistants in the execution of critical tasks. Presented here is the Incremental Inference algorithm, a symbolic reasoning mechanism based on propositional logic and suited to the construction of mixed-initiative interfaces. The algorithm is similar in some respects to the Truth Maintenance System, but replaces the notion of 'justifications' with a notion of recency, allowing newer values to override older values yet permitting various interested parties to refresh these values as they become older and thus more vulnerable to change. A simple example of the use of the Incremental Inference algorithm is given, along with an overview of the integration of this mechanism within the SPECTRUM expert system for geological interpretation of imaging spectrometer data.
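The recency notion described above (newer values override older ones, and any party may refresh a value to protect it) can be sketched as a small value store. This is not the Incremental Inference algorithm itself, only a toy Python illustration of its recency semantics; all proposition names are hypothetical.

```python
class RecencyStore:
    # Toy store in the spirit of recency-based inference: a new assertion
    # overrides the stored value only if its stamp is at least as recent,
    # and refresh() re-stamps a value so it wins over stale updates.
    def __init__(self):
        self._facts = {}   # prop -> (value, stamp)
        self._clock = 0

    def _tick(self):
        self._clock += 1
        return self._clock

    def assert_fact(self, prop, value, stamp=None):
        stamp = self._tick() if stamp is None else stamp
        cur = self._facts.get(prop)
        if cur is None or stamp >= cur[1]:
            self._facts[prop] = (value, stamp)

    def refresh(self, prop):
        if prop in self._facts:
            value, _ = self._facts[prop]
            self._facts[prop] = (value, self._tick())

    def value(self, prop):
        entry = self._facts.get(prop)
        return entry[0] if entry else None

store = RecencyStore()
store.assert_fact('camera_on', True)          # operator asserts
store.assert_fact('camera_on', False)         # newer system reading overrides
store.refresh('camera_on')                    # system re-stamps its value
store.assert_fact('camera_on', True, stamp=1) # stale assertion is ignored
```

In a mixed-initiative interface, both the operator and the expert system would write into such a store, with recency rather than explicit justifications arbitrating conflicts.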
A 'mixed reality' simulator concept for future Medical Emergency Response Team training.
Stone, Robert J; Guest, R; Mahoney, P; Lamb, D; Gibson, C
2017-08-01
The UK Defence Medical Service's Pre-Hospital Emergency Care (PHEC) capability includes rapid-deployment Medical Emergency Response Teams (MERTs) comprising tri-service trauma consultants, paramedics and specialised nurses, all of whom are qualified to administer emergency care under extreme conditions to improve the survival prospects of combat casualties. The pre-deployment training of MERT personnel is designed to foster individual knowledge, skills and abilities in PHEC and in small team performance and cohesion in 'mission-specific' contexts. Until now, the provision of airborne pre-deployment MERT training had been dependent on either the availability of an operational aircraft (eg, the CH-47 Chinook helicopter) or access to one of only two ground-based facsimiles of the Chinook's rear cargo/passenger cabin. Although MERT training has high priority, there will always be competition with other military taskings for access to helicopter assets (and for other platforms in other branches of the Armed Forces). This paper describes the development of an inexpensive, reconfigurable and transportable MERT training concept based on 'mixed reality' technologies: in effect, the 'blending' of real-world objects of training relevance with virtual reality reconstructions of operational contexts. Published by the BMJ Publishing Group Limited.
Heyn, Patricia C; Baumgardner, Chad A; McLachlan, Leslie; Bodine, Cathy
2014-01-01
The purpose of this pilot study was to investigate the effectiveness of a mixed-reality (MR) exercise environment on engagement and enjoyment levels of individuals with spinal cord injury (SCI) and intellectual and developmental disabilities (IDD). Six people participated in this cross-sectional, observational pilot study involving one MR exercise trial. The augmented reality environment was based on a first-person perspective video of a scenic biking/walking trail in Colorado. Males and females (mean age, 43.3 ± 13.7 years) were recruited from a research database for their participation in previous clinical studies. Of the 6 participants, 2 had SCI, 2 had IDD, and 2 were without disability. The primary outcome measurement of this pilot study was the self-reported engagement and enjoyment level of each participant after the exercise trial. All participants reported increased levels of engagement, enjoyment, and immersion involving the MR exercise environment as well as positive feedback recommending this type of exercise approach to peers with similar disabilities. All the participants reported higher than normal levels of enjoyment, and 66.7% reported a higher than normal sense of being on a real trail. Participants' feedback suggested that the MR environment could be entertaining, motivating, and engaging for users with disabilities, resulting in a foundation for further development of this technology for use in individuals with cognitive and physical disabilities.
Training for planning tumour resection: augmented reality and human factors.
Abhari, Kamyar; Baxter, John S H; Chen, Elvis C S; Khan, Ali R; Peters, Terry M; de Ribaupierre, Sandrine; Eagleson, Roy
2015-06-01
Planning surgical interventions is a complex task, demanding a high degree of perceptual, cognitive, and sensorimotor skills to reduce intra- and post-operative complications. This process requires spatial reasoning to coordinate between the preoperatively acquired medical images and patient reference frames. In the case of neurosurgical interventions, traditional approaches to planning tend to focus on providing a means for visualizing medical images, but rarely support transformation between different spatial reference frames. Thus, surgeons often rely on their previous experience and intuition as their sole guide to performing these mental transformations. In the case of junior residents, this may lead to longer operation times or increased chance of error under additional cognitive demands. In this paper, we introduce a mixed augmented-/virtual-reality system to facilitate training for planning a common neurosurgical procedure, brain tumour resection. The proposed system is designed and evaluated with human factors explicitly in mind, alleviating the difficulty of mental transformation. Our results indicate that, compared to conventional planning environments, the proposed system greatly improves the nonclinicians' performance, independent of the sensorimotor tasks performed. Furthermore, the use of the proposed system by clinicians resulted in a significant reduction in time to perform clinically relevant tasks. These results demonstrate the role of mixed-reality systems in assisting residents to develop necessary spatial reasoning skills needed for planning brain tumour resection, improving patient outcomes.
Price, Matthew; Anderson, Page L.
2012-01-01
Outcome expectancy, the extent that clients anticipate benefiting from therapy, is theorized to be an important predictor of treatment response for cognitive-behavioral therapy. However, there is a relatively small body of empirical research on outcome expectancy and the treatment of social anxiety disorder. This literature, which has examined the association mostly in group-based interventions, has yielded mixed findings. The current study sought to further evaluate the effect of outcome expectancy as a predictor of treatment response for public-speaking fears across both individual virtual reality and group-based cognitive-behavioral therapies. The findings supported outcome expectancy as a predictor of the rate of change in public-speaking anxiety during both individual virtual reality exposure therapy and group cognitive-behavioral therapy. Furthermore, there was no evidence to suggest that the impact of outcome expectancy differed across virtual reality or group treatments. PMID:21967073
In-Factory Learning - Qualification For The Factory Of The Future
NASA Astrophysics Data System (ADS)
Quint, Fabian; Mura, Katharina; Gorecky, Dominic
2015-07-01
The Industry 4.0 vision anticipates that internet technologies will find their way into future factories, replacing traditional components with dynamic and intelligent cyber-physical systems (CPS) that combine physical objects with their digital representation. Reducing the gap between the real and digital world makes the factory environment more flexible and more adaptive, but also more complex for the human workers. Future workers require interdisciplinary competencies from engineering, information technology, and computer science in order to understand and manage the diverse interrelations between physical objects and their digital counterparts. This paper proposes a mixed-reality-based learning environment, which combines physical objects and visualisation of digital content via Augmented Reality. It uses reality-based interaction in order to make the dynamic interrelations between real and digital factory visible and tangible. We argue that our learning system should not stand alone, but should fit into existing academic and advanced training curricula.
[VR and AR Applications in Medical Practice and Education].
Hsieh, Min-Chai; Lin, Yu-Hsuan
2017-12-01
As technology advances, mobile devices have gradually turned into wearable devices. Furthermore, virtual reality (VR), augmented reality (AR), and mixed reality (MR) are being increasingly applied in medical fields such as medical education and training, surgical simulation, neurological rehabilitation, psychotherapy, and telemedicine. Research results demonstrate the ability of VR, AR, and MR to ameliorate the inconveniences that are often associated with traditional medical care, reduce incidents of medical malpractice caused by unskilled operations, and reduce the cost of medical education and training. What is more, the application of these technologies has enhanced the effectiveness of medical education and training, raised the level of diagnosis and treatment, improved the doctor-patient relationship, and boosted the efficiency of medical execution. The present study introduces VR, AR, and MR applications in medical practice and education with the aim of helping health professionals better understand the applications and use these technologies to improve the quality of medical care.
Depth resolution and preferential sputtering in depth profiling of sharp interfaces
NASA Astrophysics Data System (ADS)
Hofmann, S.; Han, Y. S.; Wang, J. Y.
2017-07-01
The influence of preferential sputtering on depth resolution of sputter depth profiles is studied for different sputtering rates of the two components at an A/B interface. Surface concentration and intensity depth profiles on both the sputtering time scale (as measured) and the depth scale are obtained by calculations with an extended Mixing-Roughness-Information depth (MRI)-model. The results show a clear difference for the two extreme cases (a) preponderant roughness and (b) preponderant atomic mixing. In case (a), the interface width on the time scale (Δt(16-84%)) increases with preferential sputtering if the faster sputtering component is on top of the slower sputtering component, but the true resolution on the depth scale (Δz(16-84%)) stays constant. In case (b), the interface width on the time scale stays constant but the true resolution on the depth scale varies with preferential sputtering. For similar order of magnitude of the atomic mixing and the roughness parameters, a transition state between the two extremes is obtained. While the normalized intensity profile of SIMS represents that of the surface concentration, an additional broadening effect is encountered in XPS or AES by the influence of the mean electron escape depth which may even cause an additional matrix effect at the interface.
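The Δz(16-84%) criterion used above measures interface width as the depth interval between 84% and 16% of the normalized step height. The sketch below is not the MRI model (which convolves separate mixing, roughness, and information-depth contributions); it only illustrates the 16-84% evaluation on an assumed single-Gaussian broadening of a sharp A/B interface, for which the width comes out close to 2σ.

```python
import numpy as np
from math import erf, sqrt

sigma = 2.0  # nm: assumed total Gaussian broadening at the interface
z = np.linspace(-15.0, 15.0, 3001)
# normalized surface concentration of the top component across a sharp interface at z = 0
profile = np.array([0.5 * (1.0 - erf(x / (sqrt(2.0) * sigma))) for x in z])

def delta_z_16_84(z, p):
    # Depth interval between the 84% and 16% levels of the normalized profile.
    # p decreases with z here, so flip both arrays so np.interp sees increasing x.
    zf, pf = z[::-1], p[::-1]
    return np.interp(0.16, pf, zf) - np.interp(0.84, pf, zf)

width = delta_z_16_84(z, profile)   # ~ 2 * sigma for a Gaussian response
```

On the sputtering time scale, the same criterion is applied to intensity versus time, which is where preferential sputtering makes the apparent width differ from the true depth resolution.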
Numerical simulation of a non-equilibrium electrokinetic micro/nano fluidic mixer
NASA Astrophysics Data System (ADS)
Hadidi, H.; Kamali, R.
2016-03-01
In this study we numerically simulate a novel micromixer that utilizes vortex generation from the non-equilibrium electrokinetics near the micro/nanochannels interface. Comparing mixing in combined pressure-driven and electroosmotic flows with mixing in a pure pressure-driven flow made the superior mixing performance of the former evident: for a specific case study, 90% mixing of two fluid streams was achieved over a short mixing length. The results of our numerical study were very similar to those of previously reported experiments. In this paper we explain the phenomenon occurring adjacent to the nano-junctions by plotting the electrical field components, the velocity contours and the concentration distribution in the micromixer. The vortices at the micro/nanochannel interface were clear indicators of non-equilibrium behaviour in these regions. Finally, the mixing performance was evaluated for different applied voltages, Reynolds numbers and surface charge densities using the mixing index parameter; the results showed that mixing was more efficient when the applied voltage and surface charge density magnitude were increased and the Reynolds number was decreased.
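A mixing index of the kind used above is commonly defined from the standard deviation of concentration over a cross-section, normalized by its fully segregated value. The paper's exact definition may differ; this is a minimal sketch of one standard form, M = 1 - σ/σ_max, where M = 0 for segregated streams and M = 1 (100% mixing) for a uniform concentration.

```python
import numpy as np

def mixing_index(c, c_mean=0.5):
    # 1 - sigma/sigma_max over sampled concentrations on an outlet cross-section:
    # 0 = fully segregated streams, 1 = perfectly mixed.
    c = np.asarray(c, dtype=float)
    sigma = np.sqrt(np.mean((c - c_mean) ** 2))
    sigma_max = np.sqrt(c_mean * (1.0 - c_mean))
    return 1.0 - sigma / sigma_max

segregated = [0.0] * 50 + [1.0] * 50   # two unmixed streams side by side
well_mixed = [0.5] * 100               # uniform concentration at the outlet
partial = [0.3] * 50 + [0.7] * 50      # partially homogenized
```

Evaluating this index at the outlet for each applied voltage, Reynolds number, and surface charge density reproduces the kind of parameter sweep described in the abstract.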
Transformative Mixed Methods Research
ERIC Educational Resources Information Center
Mertens, Donna M.
2010-01-01
Paradigms serve as metaphysical frameworks that guide researchers in the identification and clarification of their beliefs with regard to ethics, reality, knowledge, and methodology. The transformative paradigm is explained and illustrated as a framework for researchers who place a priority on social justice and the furtherance of human rights.…
Mixed Reality on a Virtual Globe
2011-01-01
devices which can be inaccurate. However in a feature-based tracking system such as simultaneous localization and mapping (SLAM) (Durrant-Whyte & Bailey...or as complex as reconstruction from Light Detection and Ranging (LIDAR) sensing may be used to generate such a model. Many studies have been done to
Mission Specific Embedded Training Using Mixed Reality
2011-10-01
3] Mark A. Livingston, J. Edward Swan II, Simon J. Julier, Yohan Baillot, Dennis G. Brown, Lawrence J. Rosenblum, Joseph L. Gabbard , Tobias H...Mark A. Livingston, Lawrence J. Rosenblum, Simon J. Julier, Dennis Brown, Yohan Baillot, Edward Swan, Joseph L. Gabbard , and Deb- orah Hix. An Augmented
Sensor supervision and multiagent commanding by means of projective virtual reality
NASA Astrophysics Data System (ADS)
Rossmann, Juergen
1998-10-01
When autonomous systems with multiple agents are considered, conventional control and supervision technologies are often inadequate because the available information is presented in a way that effectively overwhelms the user with displayed data. New virtual reality (VR) techniques can help to cope with this problem, because VR offers the chance to convey information in an intuitive manner and can combine supervision capabilities with new, intuitive approaches to the control of autonomous systems. In the approach taken, control and supervision issues were equally stressed and finally led to the new ideas and the general framework for Projective Virtual Reality. The key idea of this new approach for an intuitively operable man-machine interface for decentrally controlled multi-agent systems is to let the user act in the virtual world, detect the changes, and have an action planning component automatically generate task descriptions for the agents involved, thereby projecting actions carried out by users in the virtual world into the physical world, e.g. with the help of robots. Thus the Projective Virtual Reality approach is to split the job between task deduction in the VR and task 'projection' onto the physical automation components by the automatic action planning component. Besides describing the realized projective virtual reality system, the paper also describes in detail the metaphors and visualization aids used to present different types of (e.g. sensor) information in an intuitively comprehensible manner.
Mixing and transient interface condensation of a liquid hydrogen tank
NASA Technical Reports Server (NTRS)
Lin, C. S.; Hasan, M. M.; Nyland, T. W.
1993-01-01
Experiments were conducted to investigate the effect of axial jet-induced mixing on the pressure reduction of a thermally stratified liquid hydrogen tank. The tank was nearly cylindrical, with a volume of about 0.144 cu m, a diameter of 0.559 m, and a length of 0.711 m. A mixer/pump unit with a jet nozzle outlet 0.0221 m in diameter was installed inside the tank, 0.178 m from the tank bottom, to generate the axial jet mixing and tank fluid circulation. The liquid fill and jet flow rate ranged from 42 to 85 percent (by volume) and 0.409 to 2.43 cu m/hr, respectively. Mixing tests began with the tank pressure ranging from 187.5 to 238.5 kPa, at which the thermal stratification results in 4.9 to 6.2 K liquid subcooling. The mixing time and transient vapor condensation rate at the liquid-vapor interface are determined. Two mixing time correlations, based on thermal equilibrium and pressure equilibrium, are developed. Both mixing time correlations are expressed as functions of system and buoyancy parameters and compare well with other experimental data. The steady state condensation rate correlation of Sonin et al., based on steam-water data, is modified and expressed as a function of jet subcooling. The limited liquid hydrogen data of the present study show that the modified steady state condensation rate correlation may be used to predict the transient condensation rate in a mixing process if the instantaneous values of jet subcooling and turbulence intensity at the interface are employed.
Web GIS in practice VI: a demo playlist of geo-mashups for public health neogeographers
Boulos, Maged N Kamel; Scotch, Matthew; Cheung, Kei-Hoi; Burden, David
2008-01-01
'Mashup' was originally used to describe the mixing together of musical tracks to create a new piece of music. The term now refers to Web sites or services that weave data from different sources into a new data source or service. Using a musical metaphor that builds on the origin of the word 'mashup', this paper presents a demonstration "playlist" of four geo-mashup vignettes that make use of a range of Web 2.0, Semantic Web, and 3-D Internet methods, with outputs/end-user interfaces spanning the flat Web (two-dimensional – 2-D maps), a three-dimensional – 3-D mirror world (Google Earth) and a 3-D virtual world (Second Life ®). The four geo-mashup "songs" in this "playlist" are: 'Web 2.0 and GIS (Geographic Information Systems) for infectious disease surveillance', 'Web 2.0 and GIS for molecular epidemiology', 'Semantic Web for GIS mashup', and 'From Yahoo! Pipes to 3-D, avatar-inhabited geo-mashups'. It is hoped that this showcase of examples and ideas, and the pointers we are providing to the many online tools that are freely available today for creating, sharing and reusing geo-mashups with minimal or no coding, will ultimately spark the imagination of many public health practitioners and stimulate them to start exploring the use of these methods and tools in their day-to-day practice. The paper also discusses how today's Web is rapidly evolving into a much more intensely immersive, mixed-reality and ubiquitous socio-experiential Metaverse that is heavily interconnected through various kinds of user-created mashups. PMID:18638385
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and to interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides scenarios customized to fit the age and education level of different users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities in the various visualization and interaction modes.
Christ, Roxie; Guevar, Julien; Poyade, Matthieu; Rea, Paul M
2018-01-01
Neuroanatomy can be challenging to both teach and learn within the undergraduate veterinary medicine and surgery curriculum. Traditional techniques have been used for many years, but there has now been a progression towards alternative digital models and interactive 3D models to engage the learner. However, digital innovations in the curriculum have typically involved the medical rather than the veterinary curriculum. Therefore, we aimed to create a simple workflow methodology to highlight how straightforward it is to create a mobile augmented reality application of basic canine head anatomy. Using canine CT and MRI scans and widely available software programs, we demonstrate how to create an interactive model of head anatomy. This was applied to augmented reality for a popular Android mobile device to demonstrate the user-friendly interface. Here we present the processes, challenges and resolutions for the creation of a highly accurate, data-based anatomical model that could potentially be used in the veterinary curriculum. This proof-of-concept study provides an excellent framework for the creation of augmented reality training products for veterinary education. The lack of similar resources within this field provides the ideal platform to extend this into other areas of veterinary education and beyond.
Astronauts Prepare for Mission With Virtual Reality Hardware
NASA Technical Reports Server (NTRS)
2001-01-01
Astronauts John M. Grunsfeld (left), STS-109 payload commander, and Nancy J. Currie, mission specialist, use the virtual reality lab at Johnson Space Center to train for upcoming duties aboard the Space Shuttle Columbia. This type of computer interface, paired with virtual reality training hardware and software, helps prepare the entire team to perform its duties for the fourth Hubble Space Telescope servicing mission. The most familiar form of virtual reality technology is a headpiece that fits over your eyes and displays a three-dimensional computerized image of another place. Turn your head left and right, and you see what would be to your sides; turn around, and you see what might be sneaking up on you. An important part of the technology is a data glove used to propel yourself through the virtual world. Currently, the medical community is using the new technologies in four major ways: to see parts of the body more accurately, for study, to make better diagnoses of disease and to plan surgery in more detail; to obtain a more accurate picture of a procedure during surgery; to perform more types of surgery with the most noninvasive, accurate methods possible; and to model interactions among molecules at a molecular level.
Laboratory simulations of the atmospheric mixed-layer in flow ...
A laboratory study of the influence of complex terrain on the interface between a well-mixed boundary layer and an elevated stratified layer was conducted in the towing-tank facility of the U.S. Environmental Protection Agency. The height of the mixed layer in the daytime boundary layer can have a strong influence on the concentration of pollutants within this layer. Deflections of streamlines at the height of the interface are primarily a function of the hill Froude number (Fr), the ratio of mixed-layer height (zi) to terrain height (h), and the crosswind dimension of the terrain. The magnitude of the deflections increases as Fr increases and zi/h decreases. For mixing-height streamlines that are initially below the terrain top, the response is linear with Fr; for those initially above the terrain feature, the response to Fr is more complex. Once Fr exceeds about 2, the terrain-related response of the mixed-layer interface decreases somewhat with increasing Fr (toward more neutral flow). Deflections are also shown to increase as the crosswind dimension of the terrain increases. Comparisons with numerical modeling, limited field data and other laboratory measurements reported in the literature are favorable. Additionally, visual observations of dye streamers suggest that the flow structure exhibited for our elevated inversions passing over three-dimensional hills is similar to that reported in the literature for continuously stratified flow over two-dimensional hills.
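The hill Froude number that organizes these results can be sketched for a salt-stratified towing tank. The tank parameters below (density gradient, towing speed, hill height, mixed-layer height) are illustrative assumptions, not values from the study:

```python
import math

def buoyancy_frequency(drho_dz: float, rho0: float, g: float = 9.81) -> float:
    """Brunt-Vaisala frequency N (1/s) for a linear density gradient (drho_dz < 0 is stable)."""
    return math.sqrt(-g * drho_dz / rho0)

def hill_froude(u: float, n: float, h: float) -> float:
    """Hill Froude number Fr = U / (N h)."""
    return u / (n * h)

# Illustrative salt-stratified towing-tank values (assumed, not from the study).
N = buoyancy_frequency(drho_dz=-100.0, rho0=1000.0)   # ~1 1/s
Fr = hill_froude(u=0.30, n=N, h=0.15)                 # towing speed 0.30 m/s, hill 0.15 m
zi_over_h = 0.30 / 0.15                               # mixed-layer height / hill height
print(f"N = {N:.2f} 1/s, Fr = {Fr:.2f}, zi/h = {zi_over_h:.1f}")
```

With these assumed values Fr sits near 2, the regime where the abstract reports the terrain-related response of the interface beginning to weaken.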
Mathematical modeling of two phase stratified flow in a microchannel with curved interface
NASA Astrophysics Data System (ADS)
Dandekar, Rajat; Picardo, Jason R.; Pushpavanam, S.
2017-11-01
Stratified or layered two-phase flows are encountered in several applications of microchannels, such as solvent extraction. Assuming steady, unidirectional creeping flow, it is possible to solve the Stokes equations by the method of eigenfunctions, provided the interface is flat and meets the wall with a 90 degree contact angle. However, in reality the contact angle depends on the pair of liquids and the material of the channel, and differs significantly from 90 degrees in many practical cases. For unidirectional flow, this implies that the interface is a circular arc (of constant curvature). We solve this problem within the framework of eigenfunctions, using the procedure developed by Shankar. We consider two distinct cases: (a) the interface meets the wall with the equilibrium contact angle; (b) the interface is pinned by surface treatment of the walls, so that the flow rates determine the apparent contact angle. We show that the contact angle appreciably affects the velocity profile and the volume fractions of the liquids, while limiting the range of flow rates that can be sustained without the interface touching the top/bottom walls. Non-intuitively, we find that the pressure drop is reduced when the more viscous liquid wets the wall.
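For unidirectional flow the constant-curvature interface is a circular arc, so its shape follows from elementary geometry. The sketch below is my own derivation, assuming the contact angle θ is measured so that a flat interface corresponds to θ = 90° and that the arc spans a channel of width w:

```python
import math

def arc_radius(w: float, theta_deg: float) -> float:
    """Radius of the circular-arc interface spanning width w, meeting the walls at theta."""
    c = math.cos(math.radians(theta_deg))
    if abs(c) < 1e-12:
        return math.inf  # flat interface at 90 degrees
    return w / (2.0 * abs(c))

def midpoint_deflection(w: float, theta_deg: float) -> float:
    """Sagitta: deviation of the interface midpoint from the flat chord."""
    r = arc_radius(w, theta_deg)
    if math.isinf(r):
        return 0.0
    return r - math.sqrt(r * r - (w / 2.0) ** 2)

W = 1e-3  # 1 mm channel width (illustrative)
bowed = midpoint_deflection(W, 60.0)   # curved interface
flat = midpoint_deflection(W, 90.0)    # flat interface
```

For a 60° contact angle in a 1 mm channel the interface midpoint deviates by roughly 0.13 mm from the flat chord, which is why the flat-interface eigenfunction solution breaks down in such cases.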
Mixed-Mode Decohesion Elements for Analyses of Progressive Delamination
NASA Technical Reports Server (NTRS)
Davila, Carlos G.; Camanho, Pedro P.; deMoura, Marcelo F.
2001-01-01
A new 8-node decohesion element with mixed-mode capability is proposed and demonstrated. The element is used at the interface between solid finite elements to model the initiation and propagation of delamination. A single displacement-based damage parameter is used in a strain-softening law to track the damage state of the interface. The method can be used in conjunction with conventional material degradation procedures to account for in-plane and intralaminar damage modes. The accuracy of the predictions is evaluated in single-mode delamination tests, in the mixed-mode bending test, and in a structural configuration consisting of the debonding of a stiffener flange from its skin.
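A single-parameter strain-softening law of the kind described can be sketched as a bilinear traction-separation relation. The specific form below, with a damage parameter d interpolating between an onset separation δ0 and a final separation δf, is a common choice in the decohesion-element literature and is shown as an illustration rather than the element's exact formulation:

```python
def damage(delta_max: float, delta0: float, deltaf: float) -> float:
    """Displacement-based damage parameter for a bilinear softening law.

    delta_max is the largest separation seen so far; d = 0 before onset
    (delta_max <= delta0) and d = 1 at full decohesion (delta_max >= deltaf).
    """
    if delta_max <= delta0:
        return 0.0
    if delta_max >= deltaf:
        return 1.0
    return deltaf * (delta_max - delta0) / (delta_max * (deltaf - delta0))

def traction(delta: float, delta_max: float, k: float, delta0: float, deltaf: float) -> float:
    """Interface traction: penalty stiffness k degraded by the damage parameter."""
    return (1.0 - damage(delta_max, delta0, deltaf)) * k * delta

# Illustrative values (assumed): onset at 1 um, full decohesion at 10 um.
d_half = damage(5e-6, 1e-6, 1e-5)   # partially damaged interface
```

Because d depends only on the maximum separation reached, unloading from a damaged state follows a reduced secant stiffness, which is how such laws avoid healing the interface.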
Virtual Rehabilitation with Children: Challenges for Clinical Adoption [From the Field].
Glegg, Stephanie
2017-01-01
Virtual, augmented, and mixed reality environments are increasingly being developed and used to address functional rehabilitation goals related to physical, cognitive, social, and psychological impairments. For example, a child with an acquired brain injury may participate in virtual rehabilitation to address impairments in balance, attention, turn taking, and engagement in therapy. The trend toward virtual rehabilitation first gained momentum with the adoption of commercial off-the-shelf active video gaming consoles (e.g., Nintendo Wii and Xbox). Now, we are seeing the rapid emergence of customized rehabilitation-specific systems that integrate technological advances in virtual reality, visual effects, motion tracking, physiological monitoring, and robotics.
Distributed Planning in a Mixed-Initiative Environment
2008-06-01
Knowledge Sources, Control, Remote Blackboard, Remote Knowledge Sources, Remote Data, Java Distributed Blackboard (Figure 3 - Distributed...). The first type is an interface agent or planning agent and the second type is a critic agent. Agents in the DEEP architecture extend and use the Java Agent... chosen because it is fully implemented in Java and supports these requirements. 2.3.3 Interface Agents: Interface agents are the interfaces through...
Augmented reality and haptic interfaces for robot-assisted surgery.
Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M; Judkins, Timothy N
2012-03-01
Current teleoperated robot-assisted minimally invasive surgical systems do not take full advantage of the potential performance enhancements offered by various forms of haptic feedback to the surgeon. Direct and graphical haptic feedback systems can be integrated with vision and robot control systems in order to provide haptic feedback to improve safety and tissue mechanical property identification. An interoperable interface for teleoperated robot-assisted minimally invasive surgery was developed to provide haptic feedback and augmented visual feedback using three-dimensional (3D) graphical overlays. The software framework consists of control and command software, robot plug-ins, image processing plug-ins and 3D surface reconstructions. The feasibility of the interface was demonstrated in two tasks performed with artificial tissue: palpation to detect hard lumps and surface tracing, using vision-based forbidden-region virtual fixtures to prevent the patient-side manipulator from entering unwanted regions of the workspace. The interoperable interface enables fast development and successful implementation of effective haptic feedback methods in teleoperation. Copyright © 2011 John Wiley & Sons, Ltd.
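A forbidden-region virtual fixture of the kind used in the palpation and surface-tracing tasks can be sketched as a simple geometric constraint: when the commanded tool tip enters the forbidden region, a proxy is projected back to the boundary and a spring force opposes penetration. The spherical region and the stiffness value below are assumptions for illustration; the paper's fixtures are vision-based and defined over the reconstructed surface.

```python
import math

def apply_fixture(p, center, radius, stiffness):
    """Clamp a commanded tip position p outside a forbidden sphere.

    Returns (proxy_position, feedback_force): the proxy is the projection of p
    onto the sphere surface when p penetrates; the force is a spring pushing
    the tool back toward the boundary.
    """
    d = [pi - ci for pi, ci in zip(p, center)]
    dist = math.sqrt(sum(x * x for x in d))
    if dist >= radius or dist == 0.0:
        return list(p), [0.0, 0.0, 0.0]   # outside (or degenerate): no constraint
    n = [x / dist for x in d]             # outward surface normal
    proxy = [ci + radius * ni for ci, ni in zip(center, n)]
    depth = radius - dist                 # penetration depth
    force = [stiffness * depth * ni for ni in n]
    return proxy, force

# Tip halfway inside a 1 cm forbidden sphere, 500 N/m fixture stiffness (assumed).
proxy, force = apply_fixture([0.0, 0.0, 0.005], [0.0, 0.0, 0.0], 0.01, 500.0)
outside_proxy, outside_force = apply_fixture([0.0, 0.0, 0.02], [0.0, 0.0, 0.0], 0.01, 500.0)
```

Running this in the servo loop yields zero force in free space and a restoring force that grows with penetration, which is the basic behavior that keeps the patient-side manipulator out of unwanted regions of the workspace.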
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eric A. Wernert; William R. Sherman; Patrick O'Leary
Immersive visualization makes use of the medium of virtual reality (VR) - it is a subset of virtual reality focused on the application of VR technologies to scientific and information visualization. As the name implies, there is a particular focus on the physically immersive aspect of VR that more fully engages the perceptual and kinesthetic capabilities of the scientist, with the goal of producing greater insight. The immersive visualization community is uniquely positioned to address the analysis needs of the wide spectrum of domain scientists who are becoming increasingly overwhelmed by data. The outputs of computational science simulations and high-resolution sensors are creating a data deluge. Data is coming in faster than it can be analyzed, and there are countless opportunities for discovery that are missed as the data speeds by. By more fully utilizing the scientist's visual and other sensory systems, and by offering a more natural user interface with which to interact with computer-generated representations, immersive visualization offers great promise in taming this data torrent. However, increasing the adoption of immersive visualization in scientific research communities can only happen by simultaneously lowering the engagement threshold and raising the measurable benefits of adoption. Scientists' time spent immersed with their data will thus be rewarded with higher productivity, deeper insight, and improved creativity. Immersive visualization ties together technologies and methodologies from a variety of related but frequently disjoint areas, including hardware, software and human-computer interaction (HCI) disciplines. In many ways, hardware is a solved problem. There are well established technologies, including large walk-in systems such as the CAVE™ and head-based systems such as the Wide-5™.
The advent of new consumer-level technologies now enables an entirely new generation of immersive displays, with smaller footprints and costs, widening the potential consumer base. While one would be hard-pressed to call software a solved problem, we now understand considerably more about best practices for designing and developing sustainable, scalable software systems, and we have useful software examples that illuminate the way to even better implementations. As with any research endeavour, HCI will always be exploring new topics in interface design, but we now have a sizable knowledge base of the strengths and weaknesses of the human perceptual systems, and we know how to design effective interfaces for immersive systems. So, in a research landscape with a clear need for better visualization and analysis tools, a methodology in immersive visualization that has been shown to effectively address some of those needs, and vastly improved supporting technologies and knowledge of hardware, software, and HCI, why hasn't immersive visualization 'caught on' more with scientists? What can we do as a community of immersive visualization researchers and practitioners to facilitate greater adoption by scientific communities, so as to make the transition from 'the promise of virtual reality' to 'the reality of virtual reality'?
NASA Astrophysics Data System (ADS)
Alam, Md. Sayem; Siddiq, A. Mohammed; Mandal, Asit Baran
2018-01-01
The influence of the halide ions of (sodium salt) electrolytes on the mixed micellization of a cationic gemini (dimeric) surfactant, hexanediyl-1,6-bis(dimethylcetylammonium) bromide (16-6-16), and a cationic conventional (monomeric) surfactant, cetyltrimethylammonium bromide (CTAB), has been investigated. The critical micelle concentration (CMC) of the mixed (16-6-16 + CTAB) surfactant system was measured by surface tension measurements. The surface properties, viz. the surfactant concentration required to reduce the surface tension by 20 mN/m (C20), the surface pressure at the CMC (ΠCMC), the maximum surface excess concentration at the air/water interface (Γmax), and the minimum area per surfactant molecule at the air/water interface (Amin), of the mixed micellar surfactant systems were evaluated. The thermodynamic parameters of the mixed micellar surfactant systems were also evaluated, both in the absence and in the presence of electrolytes.
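Γmax and Amin are conventionally obtained from the Gibbs adsorption isotherm applied to the pre-CMC slope of the γ versus log C curve. The sketch below uses an assumed slope and an assumed Gibbs prefactor n (which depends on the surfactant's dissociation and differs between monomeric and gemini surfactants); neither value is taken from the study.

```python
R = 8.314          # gas constant, J/(mol K)
N_A = 6.022e23     # Avogadro's number, 1/mol

def gamma_max(slope_mN_per_decade: float, n: float, t_kelvin: float) -> float:
    """Maximum surface excess (mol/m^2) from the Gibbs adsorption isotherm.

    slope_mN_per_decade: d(gamma)/d(log10 C) in mN/m per decade (negative
    below the CMC).
    """
    return -(slope_mN_per_decade * 1e-3) / (2.303 * n * R * t_kelvin)

def a_min(gmax: float) -> float:
    """Minimum area per molecule at the interface (m^2)."""
    return 1.0 / (N_A * gmax)

# Assumed illustrative values: slope of -20 mN/m per decade, n = 2, 298 K.
g = gamma_max(-20.0, 2.0, 298.0)
area_nm2 = a_min(g) * 1e18
print(f"Gamma_max = {g:.2e} mol/m^2, A_min = {area_nm2:.2f} nm^2")
```

With these assumptions the molecule occupies just under 1 nm² at the air/water interface; added electrolyte typically steepens the pre-CMC slope and shrinks Amin by screening head-group repulsion.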
The importance of fluctuations in fluid mixing.
Kadau, Kai; Rosenblatt, Charles; Barber, John L; Germann, Timothy C; Huang, Zhibin; Carlès, Pierre; Alder, Berni J
2007-05-08
A ubiquitous example of fluid mixing is the Rayleigh-Taylor instability, in which a heavy fluid initially sits atop a light fluid in a gravitational field. The subsequent development of the unstable interface between the two fluids is marked by several stages. At first, each interface mode grows exponentially with time before transitioning to a nonlinear regime characterized by more complex hydrodynamic mixing. Unfortunately, traditional continuum modeling of this process has generally been in poor agreement with experiment. Here, we indicate that the natural, random fluctuations of the flow field present in any fluid, which are neglected in continuum models, can lead to qualitatively and quantitatively better agreement with experiment. We performed billion-particle atomistic simulations and magnetic levitation experiments with unprecedented control of initial interface conditions. A comparison between our simulations and experiments reveals good agreement in terms of the growth rate of the mixing front as well as the new observation of droplet breakup at later times. These results improve our understanding of many fluid processes, including interface phenomena that occur, for example, in supernovae, the detachment of droplets from a faucet, and ink jet printing. Such instabilities are also relevant to the possible energy source of inertial confinement fusion, in which a millimeter-sized capsule is imploded to initiate nuclear fusion reactions between deuterium and tritium. Our results suggest that the applicability of continuum models would be greatly enhanced by explicitly including the effects of random fluctuations.
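The exponential early-time stage mentioned in the abstract follows the classical linearized result for an inviscid interface without surface tension: a mode of wavenumber k grows as exp(σt) with σ = sqrt(A g k), where A is the Atwood number. A minimal sketch with illustrative values (not the paper's experimental parameters):

```python
import math

def atwood(rho_heavy: float, rho_light: float) -> float:
    """Atwood number A = (rho2 - rho1) / (rho2 + rho1)."""
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

def rt_growth_rate(a: float, g: float, k: float) -> float:
    """Linear Rayleigh-Taylor growth rate sigma (1/s), inviscid, no surface tension."""
    return math.sqrt(a * g * k)

A = atwood(3.0, 1.0)                        # illustrative 3:1 density ratio -> A = 0.5
k = 2.0 * math.pi / 0.01                    # 1 cm wavelength
sigma = rt_growth_rate(A, 9.81, k)
amplification_1ms = math.exp(sigma * 1e-3)  # linear-stage growth factor over 1 ms
```

This linear rate is exactly what deterministic continuum models reproduce; the paper's point is that thermal fluctuations seed and modify the subsequent nonlinear mixing in ways such models miss.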
Computational parametric study of a Richtmyer-Meshkov instability for an inclined interface.
McFarland, Jacob A; Greenough, Jeffrey A; Ranjan, Devesh
2011-08-01
A computational study of the Richtmyer-Meshkov instability for an inclined interface is presented. The study covers experiments to be performed in the Texas A&M University inclined shock tube facility. Incident shock wave Mach numbers from 1.2 to 2.5, inclination angles from 30° to 60°, and gas pair Atwood numbers of ∼0.67 and ∼0.95 are used in this parametric study containing 15 unique combinations of these parameters. Qualitative results are examined through a time series of density plots for multiple combinations of these parameters, and the qualitative effects of each of the parameters are discussed. Pressure, density, and vorticity fields are presented in animations available online to supplement the discussion of the qualitative results. These density plots show the evolution of two main regions in the flow field: a mixing region containing driver and test gas that is dominated by large vortical structures, and a more homogeneous region of unmixed fluid which can separate away from the mixing region in some cases. The interface mixing width is determined for various combinations of the parameters listed at the beginning of the Abstract. A scaling method for the mixing width is proposed using the interface geometry and wave velocities calculated using one-dimensional gas dynamic equations. This model uses the transmitted wave velocity for the characteristic velocity and an initial offset time based on the travel time of strong reflected waves. It is compared to an adapted Richtmyer impulsive model scaling and shown to scale the initial mixing width growth rate more effectively for fixed Atwood number.
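The adapted Richtmyer impulsive-model scaling mentioned above estimates the post-shock growth rate of a sinusoidal perturbation from the interface jump velocity. A minimal sketch of the classical form, da/dt = k a0 A Δv, using illustrative numbers (assumed, not the study's computed values):

```python
import math

def impulsive_growth_rate(wavelength: float, a0: float, atwood: float, dv: float) -> float:
    """Richtmyer impulsive-model growth rate da/dt (m/s) for a sinusoidal perturbation.

    wavelength, a0 in meters; dv is the shock-imparted interface velocity jump (m/s).
    Post-shock amplitude and Atwood number should be used where available.
    """
    k = 2.0 * math.pi / wavelength
    return k * a0 * atwood * dv

# Illustrative values loosely in the range of shock-tube experiments (assumed):
rate = impulsive_growth_rate(wavelength=0.1, a0=0.005, atwood=0.67, dv=100.0)
print(f"initial mixing-width growth rate ~ {rate:.1f} m/s")
```

The paper's proposed scaling replaces Δv here with the transmitted wave velocity and adds an offset time for strong reflected waves, which is what lets it collapse the inclined-interface cases better than the plain impulsive form.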
Effects of initial condition spectral content on shock-driven turbulent mixing.
Nelson, Nicholas J; Grinstein, Fernando F
2015-07-01
The mixing of materials due to the Richtmyer-Meshkov instability and the ensuing turbulent behavior is of intense interest in a variety of physical systems including inertial confinement fusion, combustion, and the final stages of stellar evolution. Extensive numerical and laboratory studies of shock-driven mixing have demonstrated the rich behavior associated with the onset of turbulence due to the shocks. Here we report on progress in understanding shock-driven mixing at interfaces between fluids of differing densities through three-dimensional (3D) numerical simulations using the RAGE code in the implicit large eddy simulation context. We consider a shock-tube configuration with a band of high density gas (SF6) embedded in low density gas (air). Shocks with a Mach number of 1.26 are passed through the SF6 band, resulting in transition to turbulence driven by the Richtmyer-Meshkov instability. The system is followed as a rarefaction wave and a reflected secondary shock from the back wall pass through the SF6 band. We apply a variety of initial perturbations to the interfaces between the two fluids in which the physical standard deviation, wave number range, and the spectral slope of the perturbations are held constant, but the number of modes initially present is varied. By thus decreasing the density of initial spectral modes of the interface, we find that we can achieve as much as 25% less total mixing at late times. This has potential direct implications for the treatment of initial conditions applied to material interfaces in both 3D and reduced dimensionality simulation models.
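The initial-condition construction described (fixed standard deviation, wavenumber range, and spectral slope, with only the number of modes varied) can be sketched in one dimension. The slope convention and normalization below are assumptions for illustration, not the study's exact recipe:

```python
import math
import random

def make_perturbation(n_modes, k_min, k_max, slope, target_std, n_x=2048, seed=0):
    """1D interface perturbation: n_modes random modes in [k_min, k_max],
    amplitudes ~ k**(slope/2), random phases, rescaled to the target std."""
    rng = random.Random(seed)
    xs = [i / n_x for i in range(n_x)]
    modes = [(rng.randint(k_min, k_max), rng.uniform(0.0, 2.0 * math.pi))
             for _ in range(n_modes)]
    h = [sum((k ** (slope / 2.0)) * math.cos(2.0 * math.pi * k * x + p)
             for k, p in modes) for x in xs]
    mean = sum(h) / n_x
    std = math.sqrt(sum((v - mean) ** 2 for v in h) / n_x)
    return [(v - mean) * target_std / std for v in h]

# Same statistics, different mode density: only n_modes changes between the two.
sparse = make_perturbation(n_modes=4, k_min=8, k_max=64, slope=-2.0, target_std=1e-3)
dense = make_perturbation(n_modes=32, k_min=8, k_max=64, slope=-2.0, target_std=1e-3)
```

Both interfaces have identical RMS amplitude, wavenumber band, and slope; the sparse one is the analogue of the low-mode-density cases that produced up to 25% less late-time mixing.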
Using HT and DT gamma rays to diagnose mix in Omega capsule implosions
NASA Astrophysics Data System (ADS)
Schmitt, M. J.; Herrmann, H. W.; Kim, Y. H.; McEvoy, A. M.; Zylstra, A.; Hammel, B. A.; Sepke, S. M.; Leatherland, A.; Gales, S.
2016-05-01
Experimental evidence [1] indicates that shell material can be driven into the core of Omega capsule implosions on the same time scale as the initial convergent shock. It has been hypothesized that shock-generated temperatures at the fuel/shell interface in thin exploding-pusher capsules diffusively drive shell material into the gas core between the time of shock passage and bang time. We propose a method to temporally resolve and observe the evolution of shell material into the capsule core as a function of fuel/shell interface temperature (which can be varied by varying the capsule shell thickness). Our proposed method uses a CD plastic capsule filled with 50/50 HT gas and diagnosed using gas Cherenkov detection (GCD) to temporally resolve both the HT "clean" and DT "mix" gamma-ray burn histories. Simulations using Hydra [2] for an Omega CD-lined capsule, with a sub-micron layer of the inside surface of the shell pre-mixed into a fraction of the gas region, produce gamma reaction history profiles that are sensitive to the depth to which this material is mixed. An experiment to observe these differences as a function of capsule shell thickness is proposed to determine whether interface mixing is consistent with thermal diffusion, λ_ii ∼ T²/(Z²ρ), at the gas/shell interface. Since hydrodynamic mixing from shell perturbations, such as the mounting stalk and glue, could complicate these types of capsule-averaged temporal measurements, simulations including their effects have also been performed, showing minimal perturbation of the hot-spot geometry.
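The thermal-diffusion criterion quoted, λ_ii ∼ T²/(Z²ρ), can be used to compare interface conditions between shell thicknesses. The sketch below evaluates only the relative scaling; absolute mean free paths require Coulomb-logarithm and composition factors not given in the abstract, and the interface states are assumed values for illustration.

```python
def mfp_scaling(t_kev: float, z: float, rho: float) -> float:
    """Relative ion-ion mean free path, lambda_ii ~ T^2 / (Z^2 rho).

    t_kev: interface temperature (keV); z: effective charge state;
    rho: mass density (g/cm^3). Returns an uncalibrated relative value.
    """
    return t_kev ** 2 / (z ** 2 * rho)

# Compare two illustrative interface states (assumed, not from the simulations):
thin_shell = mfp_scaling(t_kev=3.0, z=3.5, rho=1.0)    # thinner shell: hotter interface
thick_shell = mfp_scaling(t_kev=1.5, z=3.5, rho=2.0)   # thicker shell: cooler, denser
print(f"mean free path ratio (thin/thick): {thin_shell / thick_shell:.1f}")
```

The strong T² dependence is the point of the proposed shell-thickness scan: a modest change in interface temperature produces a large change in the diffusive mixing depth.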
Adherence of sputtered titanium carbides
NASA Technical Reports Server (NTRS)
Brainard, W. A.; Wheeler, D. R.
1979-01-01
The study searches for an interface treatment that would increase the adhesion of TiC coatings to nickel- and titanium-base alloys. Rene 41 (19 wt percent Cr, 11 wt percent Mo, 3 wt percent Ti, balance Ni) and Ti-6Al-4V (6 wt percent Al, 4 wt percent V, balance Ti) are considered. Adhesion of the coatings is evaluated in pin-on-disk friction tests. The coatings and interface regions are examined by X-ray photoelectron spectroscopy. The results suggest that sputtered refractory compound coatings adhere best when a mixed compound of the coating and substrate metals is formed in the interfacial region. The most effective type of refractory compound interface appears to depend on both the substrate and the coating material. A combination of metallic interlayer deposition and mixed compound interface formation may be more effective for some substrate-coating combinations than either alone.
Achieving Solution Success: An Investigation of User Participation Approaches
ERIC Educational Resources Information Center
Mattia, Angela Marie
2009-01-01
User participation and its relationship to system success have been discussed in the information systems (IS) literature from many theoretical and practical perspectives. In reality, most of this discussion is grounded in empirical research that has yielded mixed results on the importance of user participation and its relationship to system…
When Worlds Collide: An Augmented Reality Check
ERIC Educational Resources Information Center
Villano, Matt
2008-01-01
The technology is simple: Mobile technologies such as handheld computers and global positioning systems work in sync to create an alternate, hybrid world that mixes virtual characters with the actual physical environment. The result is a digital simulation that offers powerful game-playing opportunities and allows students to become more engaged…
"In Reality It's Almost Impossible": CLT-Oriented Curriculum Change
ERIC Educational Resources Information Center
Humphries, Simon; Burns, Anne
2015-01-01
Curriculum innovation is challenging and, as several commentators have reported, moves to introduce communicative language teaching in many contexts internationally have resulted in mixed outcomes, or even failure. In an effort to shed some light on this complex problem, this article focuses on curriculum change through the introduction of new…
Language Teacher Noticing: A Socio-Cognitive Window on Classroom Realities
ERIC Educational Resources Information Center
Jackson, Daniel O.; Cho, Minyoung
2018-01-01
This article introduces the construct of teacher noticing, situates it in research on second language teacher cognition, and considers its implications for research on second language teacher training, acknowledging socio-cognitive perspectives on language learning and teaching. We then present a mixed-methods observational study that utilized…
Immersive environment technologies for planetary exploration with applications for mixed reality
NASA Technical Reports Server (NTRS)
Wright, J.; Hartman, F.; Cooper, B.
2002-01-01
Immersive environments are successfully being used to support mission operations at JPL. This technology contributed to the Mars Pathfinder Mission in planning sorties for the Sojourner rover. Results and operational experiences with these tools are being incorporated into the development of the second generation of mission planning tools.
Digital Education: Opportunities for Social Collaboration. Digital Education and Learning
ERIC Educational Resources Information Center
Thomas, Michael, Ed.
2011-01-01
This timely collection of theoretical and applied studies examines the pedagogical potential and realities of digital technologies in a wide range of disciplinary contexts across the educational spectrum. By mixing content-based chapters with a theoretical perspective with case studies detailing actual teaching approaches utilizing digital…
ERIC Educational Resources Information Center
Seifert, Tami
2014-01-01
As the disparity between educational standards and reality outside educational institutions is increasing, alternative learning infrastructure such as mobile technologies are becoming more common, and are challenging long held, traditional modes of teaching. Educators' attitudes toward wireless devices are mixed. Wireless devices are perceived by…
The Impact of Simulated Interviews for Individuals with Intellectual Disability
ERIC Educational Resources Information Center
Walker, Zachary; Vasquez, Eleazar; Wienke, Wilfred
2016-01-01
The purpose of this research study was to explore the efficacy of role-playing and coaching in mixed-reality environments for the acquisition and generalization of social skills leading to successful job interview performance. Using a multiple baseline across participants design, five young adults with intellectual disability practiced…
ERIC Educational Resources Information Center
Bower, Beverly L.
1998-01-01
Reviews research on the instructional benefits of computer technology. Discusses the computer readiness of students, faculty, and institutions, and suggests that despite mixed findings, political and organizational realities indicate computer-based instruction is a feasible alternative for community colleges. Therefore, educators should continue…
Fast ion transport at a gas-metal interface
McDevitt, Christopher J.; Tang, Xian-Zhu; Guo, Zehua
2017-11-06
Fast ion transport and the resulting fusion yield reduction are computed at a gas-metal interface. The extent of fusion yield reduction is observed to depend sensitively on the charge state of the surrounding pusher material and the width of the atomically mixed region. These sensitivities suggest that idealized boundary conditions often implemented at the gas-pusher interface for the purpose of estimating fast ion loss will likely overestimate fusion reactivity reduction in several important limits. Additionally, the impact of a spatially complex material interface is investigated by considering a collection of droplets of the pusher material immersed in a DT plasma. It is found that for small Knudsen numbers, the extent of fusion yield reduction scales with the surface area of the material interface. As the Knudsen number is increased, however, the simple surface area scaling breaks down, suggesting that hydrodynamic mix has a nontrivial impact on the extent of fast ion losses.
Tuning Interfacial States Using Organic Molecules as Spin Filters
NASA Astrophysics Data System (ADS)
Deloach, Andrew; Wang, Jingying; Papa, Christopher M.; Myahkostupov, Mykhaylo; Castellano, Felix N.; Dougherty, Daniel B.; Jiang, Wei; Liu, Feng
Organic semiconductors are known to have long spin relaxation times, which makes them good candidates for spintronics. However, an issue with these materials is that at metal-organic interfaces a conductivity mismatch suppresses spin injection. To overcome this, orbital mixing at the interface can be tuned with an organic spacer layer to promote the formation of spin polarized interface states. These states act as ``spin filters'' and have been proposed as an explanation for the large tunneling magnetoresistance seen in devices using tris-(8-hydroxyquinolate)-aluminum (Alq3). Here, we show that the spin polarized interface states can be tuned from metallic to resistive by subtle changes in molecular orbitals. This is done using spin polarized scanning tunneling microscopy with three different tris-(8-hydroxyquinolate) compounds: aluminum, chromium, and iron. Differences in d-orbital mixing result in different mechanisms of interfacial coupling, giving rise to metallic or resistive interface states. Supported by the U.S. DoE award No. DE-SC0010324.
Virtual reality laparoscopic simulator for assessment in gynaecology.
Gor, Mounna; McCloy, Rory; Stone, Robert; Smith, Anthony
2003-02-01
A validated virtual reality laparoscopic simulator, the minimally invasive surgical trainer (MIST) 2, was used to assess the psychomotor skills of 21 gynaecologists (2 consultants, 8 registrars and 11 senior house officers). Nine gynaecologists failed to complete the VR tasks at the first attempt and were excluded from sequential evaluation. Each of the remaining 12 gynaecologists was tested on MIST 2 on four occasions within four weeks. The MIST 2 simulator provided quantitative data on time to complete tasks, errors, economy of movement and economy of diathermy use, for both right- and left-hand performance. The results show a significant early learning curve for the majority of tasks, which plateaued by the third session. This suggests a high quality surgeon-computer interface. MIST 2 provides objective assessment of laparoscopic skills in gynaecologists.
The effects of virtual experience on attitudes toward real brands.
Dobrowolski, Pawel; Pochwatko, Grzegorz; Skorko, Maciek; Bielecki, Maksymilian
2014-02-01
Although the commercial availability and implementation of virtual reality interfaces has seen rapid growth in recent years, little research has been conducted on the potential for virtual reality to affect consumer behavior. One unaddressed issue is how our real world attitudes are affected when we have a virtual experience with the target of those attitudes. This study compared participant (N=60) attitudes toward car brands before and after a virtual test drive of those cars was provided. Results indicated that attitudes toward test brands changed after experience with virtual representations of those brands. Furthermore, manipulation of the quality of this experience (in this case modification of driving difficulty) was reflected in the direction of attitude change. We discuss these results in the context of the associative-propositional evaluation model.
Seeing the World in a Grain of Sand
NASA Astrophysics Data System (ADS)
Clucas, T.; Wirth, G. S.
2015-12-01
Enabling people to trigger and to witness landscape change is a powerful method of communicating scientific concepts. Alaska EPSCoR and GINA have found an effective tool for this effort in their "Augmented-Reality Sandbox," an engaging hands-on interface that can be used to teach about topography, hydrology, natural hazards, and landscape change. People are consistently excited about the sandbox, the success of which has led EPSCoR to construct mobile versions which have traveled to remote Alaskan communities. EPSCoR has also developed model curricula that use the sandbox to teach basic topography and hydrology skills, and is working on advanced lessons based around hydrologic and landscape hazards. Instructions on building a mobile sandbox, curricula, and video of the sandbox in action are available at www.alaska.edu/epscor/Augmented-Reality%20Sandbox/
Experimental growth of inertial forced Richtmyer-Meshkov instabilities for different Atwood numbers
NASA Astrophysics Data System (ADS)
Redondo, J. M.; Castilla, R.
2009-04-01
Richtmyer-Meshkov instability occurs when a shock wave impinges on an interface separating two fluids having different densities [1,2]. The instability causes perturbations on the interface to grow, bubbles and spikes, producing vortical structures which potentially result in a turbulent mixing layer. In addition to shock tube experiments, the incompressible Richtmyer-Meshkov instability has also been studied by impulsively accelerating containers of incompressible fluids. Castilla and Redondo (1994) [3] first exploited this technique by dropping tanks containing a liquid and air or two liquids onto a cushioned surface. This technique was improved upon by Niederhaus and Jacobs (2003) [4] by mounting the tank onto a rail system and then allowing it to bounce off of a fixed spring. A range of both miscible and immiscible liquids were used, giving a wide range of Atwood numbers using the combinations of air, water, alcohol, oil and mercury. Experimental results show the different pattern selection of both the bubbles and spikes for the different Atwood numbers. Visual analysis of the marked interfaces makes it possible to distinguish the regions of strong mixing and to compare the self-similar growth of the mixing region. [1] Meshkov, E. E. 1969 Instability of the interface of two gases accelerated by a shock wave. Fluid Dynamics 4, 101-104. [2] Brouillette, M. & Sturtevant, B. 1994 Experiments on the Richtmyer-Meshkov instability: single-scale perturbations on a continuous interface. Journal of Fluid Mechanics 263, 271-292. [3] Castilla, R. & Redondo, J. M. 1994 Mixing Front Growth in RT and RM Instabilities. Proceedings of the Fourth International Workshop on the Physics of Compressible Turbulent Mixing, Cambridge, United Kingdom, edited by P. F. Linden, D. L. Youngs, and S. B. Dalziel, 11-31. [4] Niederhaus, C. E. & Jacobs, J. W. 2003 Experimental study of the Richtmyer-Meshkov instability of incompressible fluids. Journal of Fluid Mechanics 485, 243-277.
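The Atwood number referenced in the abstract above is the standard dimensionless density contrast A = (rho_heavy - rho_light) / (rho_heavy + rho_light). A minimal sketch for the fluid combinations mentioned, using nominal textbook densities rather than the experiments' measured values:

```python
# Atwood number for a pair of fluids: A -> 1 for extreme contrast (e.g. water/air),
# A -> 0 for nearly matched densities.

def atwood(rho_heavy: float, rho_light: float) -> float:
    """Dimensionless density contrast across the interface."""
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

# Nominal densities in kg/m^3 (illustrative values, not the paper's data)
fluids = {"air": 1.2, "alcohol": 789.0, "oil": 900.0, "water": 1000.0, "mercury": 13546.0}

for heavy, light in [("water", "air"), ("water", "alcohol"), ("mercury", "water")]:
    print(f"{heavy}/{light}: A = {atwood(fluids[heavy], fluids[light]):.3f}")
```

A water/air drop tank sits near A = 1, while water/alcohol pairs probe the low-Atwood regime, which is why combining these liquids spans a wide range of Atwood numbers.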
Dan, Abhijit; Gochev, Georgi; Miller, Reinhard
2015-07-01
Oscillating drop tensiometry was applied to study adsorbed interfacial layers at water/air and water/hexane interfaces formed from mixed solutions of β-lactoglobulin (BLG, 1 μM in 10 mM buffer, pH 7 - negative net charge) and the anionic surfactant SDS or the cationic DoTAB. The interfacial pressure Π and the dilational viscoelasticity modulus |E| of the mixed layers were measured for mixtures of varying surfactant concentrations. The double capillary technique was employed, which enables exchange of the protein solution in the drop bulk by surfactant solution (sequential adsorption) or by pure buffer (washing out). The first protocol allows probing the influence of the surfactant on a pre-adsorbed protein layer, thus studying the protein/surfactant interactions at the interface. The second protocol gives access to the residual values of Π and |E| measured after the washing-out procedure, thus providing information about the process of protein desorption. The DoTAB/BLG complexes exhibit higher surface activity and higher resistance to desorption than the SDS/BLG complexes due to hydrophobization via electrostatic binding of surfactant molecules. The neutral DoTAB/BLG complexes achieve the maximum elastic response of the mixed layer. Mixed BLG/surfactant layers at the water/oil interface are found to reach higher surface pressure and lower maximum dilational elasticity than those at the water/air surface. The sequential adsorption experiments and the desorption study reveal that binding of DoTAB to pre-adsorbed BLG globules is somewhat restricted at the water/air surface in comparison with the case where complexes form in the solution bulk and subsequently adsorb at the water/air surface. Maximum elasticity is achieved with washed-out layers obtained after simultaneous adsorption, i.e. isolation of the most surface-active DoTAB/BLG complex. These specific effects are much less pronounced at the W/H interface. Copyright © 2015 Elsevier Inc. All rights reserved.
Thompson, Katherine C; Jones, Stephanie H; Rennie, Adrian R; King, Martin D; Ward, Andrew D; Hughes, Brian R; Lucas, Claire O M; Campbell, Richard A; Hughes, Arwel V
2013-04-09
The presence of unsaturated lipids in lung surfactant is important for proper respiratory function. In this work, we have used neutron reflection and surface pressure measurements to study the reaction of the ubiquitous pollutant gas-phase ozone, O3, with pure and mixed phospholipid monolayers at the air-water interface. The results reveal that the reaction of the unsaturated lipid 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine, POPC, with ozone leads to the rapid loss of the terminal C9 portion of the oleoyl strand of POPC from the air-water interface. The loss of the C9 portion from the interface is accompanied by an increase in the surface pressure (decrease in surface tension) of the film at the air-water interface. The results suggest that the portion of the oxidized oleoyl strand that is still attached to the lipid headgroup rapidly reverses its orientation and penetrates the air-water interface alongside the original headgroup, thus increasing the surface pressure. The reaction of POPC with ozone also leads to a loss of material from the palmitoyl strand, but the loss of palmitoyl material occurs after the loss of the terminal C9 portion from the oleoyl strand of the molecule, suggesting that the palmitoyl material is lost in a secondary reaction step. Further experiments studying the reaction of mixed monolayers composed of unsaturated lipid POPC and saturated lipid dipalmitoyl-sn-glycero-3-phosphocholine, DPPC, revealed that no loss of DPPC from the air-water interface occurs, eliminating the possibility that a reactive species such as an OH radical is formed and is able to attack nearby lipid chains. The reaction of ozone with the mixed films does cause a significant change in the surface pressure of the air-water interface. Thus, the reaction of unsaturated lipids in lung surfactant changes and impairs the physical properties of the film at the air-water interface.
Instability and turbulent mixing of shocked `V' shaped interface
NASA Astrophysics Data System (ADS)
Li, Long; Sun, Yutao
Based on the mass fraction model of a multicomponent mixture, the interaction between a weak shock wave and a `V' shaped air/ interface with different vertex angles is numerically simulated using a high resolution finite volume method with a minimized dispersion and controllable dissipation (MDCD) scheme. It is observed that baroclinic vorticity is deposited near the interface due to the misalignment of the density and pressure gradients, leading to the formation of vortical structures along the interface. The predicted leftmost interface displacement and interface width growth rate in the early stage of interface evolution agree well with experimental results. The numerical results indicate that with the evolution of the interfacial vortical structures, the array of vortices begins to merge. As a result, the vortices accumulate at several distinct regions. It is in these regions that multi-scale structures are generated by the interaction between vortices. Because the upper and lower bounds scale differently with Reynolds number, an uncoupled inertial range appears, and the mixing transition occurs with the appearance of an inertial range of scales. The classical Kolmogorov -5/3 power law appears in the energy fluctuation spectrum, which means the inertial range is just beginning to form and the flow field near the material interface will develop into turbulence.
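The baroclinic deposition mechanism described in the abstract above comes from the vorticity-equation source term (1/rho^2)(grad rho x grad p), which vanishes when the density and pressure gradients are aligned and is largest when they are perpendicular, as when a shock crosses an inclined interface. A minimal NumPy sketch on an illustrative grid (not the paper's simulation setup):

```python
import numpy as np

def baroclinic_torque(rho, p, dx, dy):
    """z-component of (1/rho^2) * (grad rho x grad p) on a 2D grid."""
    drho_dy, drho_dx = np.gradient(rho, dy, dx)  # axis 0 is y, axis 1 is x
    dp_dy, dp_dx = np.gradient(p, dy, dx)
    return (drho_dx * dp_dy - drho_dy * dp_dx) / rho**2

x = np.linspace(0.0, 1.0, 64)
y = np.linspace(0.0, 1.0, 64)
X, Y = np.meshgrid(x, y)
dx, dy = x[1] - x[0], y[1] - y[0]

rho = 1.0 + X          # density varying across a vertical interface
p_aligned = 2.0 + X    # pressure gradient parallel to the density gradient
p_crossed = 2.0 + Y    # pressure gradient normal to it (shock direction)

print(np.abs(baroclinic_torque(rho, p_aligned, dx, dy)).max())  # ~0: no vorticity deposited
print(np.abs(baroclinic_torque(rho, p_crossed, dx, dy)).max())  # nonzero: vortex sheet forms
```

Only the misaligned case deposits vorticity, which is the origin of the vortex array that later merges along the `V' shaped interface.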
STS-88 crew use simulators and virtual reality in preflight training
1998-04-08
S98-05075 (8 Apr. 1998) --- Astronaut Nancy J. Currie, assigned as a mission specialist for the mission, uses hardware in the virtual reality lab at the Johnson Space Center (JSC) to train for her duties aboard the Space Shuttle Endeavour. This type of computer interface, paired with virtual reality training hardware for the assigned space-walking astronauts -- in this case, Jerry L. Ross and James H. Newman -- helps to prepare the entire team for dealing with International Space Station (ISS) elements. One of those elements will be the Functional Cargo Block (FGB), which will have been launched a couple of weeks prior to STS-88. Once the FGB is captured using the Remote Manipulator System (RMS) of the Endeavour, Currie will maneuver the robot arm to dock the FGB to the conical mating adapter at the top of Node 1, to be carried in the Endeavour's cargo bay. In ensuing days, three extravehicular activities (EVAs) by Ross and Newman will be performed to make power, data and utility connections between the two modules.
Virtual reality as a new trend in mechanical and electrical engineering education
NASA Astrophysics Data System (ADS)
Kamińska, Dorota; Sapiński, Tomasz; Aitken, Nicola; Rocca, Andreas Della; Barańska, Maja; Wietsma, Remco
2017-12-01
In their daily practice, academics frequently face a lack of access to the modern equipment and devices currently in use on the market. Moreover, many students have problems understanding issues connected to mechanical and electrical engineering due to their complexity, the necessity of abstract thinking, and the fact that these concepts are not fully tangible. Many studies indicate that virtual reality can be successfully used as a training tool in various domains, such as development, health-care, the military or school education. In this paper, an interactive training strategy for mechanical and electrical engineering education is proposed. The prototype of the software consists of a simple interface, making it easy to understand and use. Additionally, the main part of the prototype allows the user to virtually manipulate a 3D object to be analyzed and studied. Initial studies indicate that the use of virtual reality can contribute to improving the quality and efficiency of higher education, as well as the qualifications, competencies and skills of graduates, and increase their competitiveness in the labour market.
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
Interrater Reliability of the Power Mobility Road Test in the Virtual Reality-Based Simulator-2.
Kamaraj, Deepan C; Dicianno, Brad E; Mahajan, Harshal P; Buhari, Alhaji M; Cooper, Rory A
2016-07-01
To assess interrater reliability of the Power Mobility Road Test (PMRT) when administered through the Virtual Reality-based SIMulator-version 2 (VRSIM-2). Within-subjects repeated-measures design. Participants interacted with VRSIM-2 through 2 display options (desktop monitor vs immersive virtual reality screens) using 2 control interfaces (roller system vs conventional movement-sensing joystick), providing 4 different driving scenarios (driving conditions 1-4). Participants performed 3 virtual driving sessions for each of the 2 display screens and 1 session through a real-world driving course (driving condition 5). The virtual PMRT was conducted in a simulated indoor office space, and an equivalent course was charted in an open space for the real-world assessment. After every change in driving condition, participants completed a self-reported workload assessment questionnaire, the Task Load Index, developed by the National Aeronautics and Space Administration. A convenience sample of electric-powered wheelchair (EPW) athletes (N=21) recruited at the 31st National Veterans Wheelchair Games. Not applicable. Total composite PMRT score. The PMRT had high interrater reliability (intraclass correlation coefficient [ICC]>.75) between the 2 raters in all 5 driving conditions. Post hoc analyses revealed that the reliability analyses had >80% power to detect high ICCs in driving conditions 1 and 4. The PMRT has high interrater reliability in conditions 1 and 4 and could be used to assess EPW driving performance virtually in VRSIM-2. However, further psychometric assessment is necessary to assess the feasibility of administering the PMRT using the different interfaces of VRSIM-2. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
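The interrater reliability reported above can be computed directly from a subjects-by-raters score matrix. A minimal sketch of ICC(2,1), the two-way random-effects, absolute-agreement, single-rater form; the scores below are invented for illustration and are not the study's data:

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings: (n_subjects, k_raters) matrix of scores."""
    n, k = ratings.shape
    grand = ratings.mean()
    rows = ratings.mean(axis=1, keepdims=True)   # per-subject means
    cols = ratings.mean(axis=0, keepdims=True)   # per-rater means
    msr = k * ((rows - grand) ** 2).sum() / (n - 1)              # between subjects
    msc = n * ((cols - grand) ** 2).sum() / (k - 1)              # between raters
    mse = ((ratings - rows - cols + grand) ** 2).sum() / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two hypothetical raters scoring 5 hypothetical driving trials
scores = np.array([[8, 9], [6, 6], [7, 8], [9, 9], [5, 6]], dtype=float)
print(round(icc2_1(scores), 3))  # exceeds the 0.75 "high reliability" threshold
```

An ICC above .75, the threshold used in the abstract, is conventionally read as high agreement between raters.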
Combined rTMS and virtual reality brain-computer interface training for motor recovery after stroke
NASA Astrophysics Data System (ADS)
Johnson, N. N.; Carey, J.; Edelman, B. J.; Doud, A.; Grande, A.; Lakshminarayan, K.; He, B.
2018-02-01
Objective. Combining repetitive transcranial magnetic stimulation (rTMS) with brain-computer interface (BCI) training can address motor impairment after stroke by down-regulating exaggerated inhibition from the contralesional hemisphere and encouraging ipsilesional activation. The objective was to evaluate the efficacy of combined rTMS + BCI, compared to sham rTMS + BCI, on motor recovery after stroke in subjects with lasting motor paresis. Approach. Three stroke subjects approximately one year post-stroke participated in three weeks of combined rTMS (real or sham) and BCI, followed by three weeks of BCI alone. Behavioral and electrophysiological differences were evaluated at baseline, after three weeks, and after six weeks of treatment. Main results. Motor improvements were observed in both real rTMS + BCI and sham groups, but only the former showed significant alterations in inter-hemispheric inhibition in the desired direction and increased relative ipsilesional cortical activation from fMRI. In addition, significant improvements in BCI performance over time and adequate control of the virtual reality BCI paradigm were observed only in the former group. Significance. When combined, the results highlight the feasibility and efficacy of combined rTMS + BCI for motor recovery, demonstrated by increased ipsilesional motor activity and improvements in behavioral function for the real rTMS + BCI condition in particular. Our findings also demonstrate the utility of BCI training alone, as shown by behavioral improvements for the sham rTMS + BCI condition. This study is the first to evaluate combined rTMS and BCI training for motor rehabilitation and provides a foundation for continued work to evaluate the potential of both rTMS and virtual reality BCI training for motor recovery after stroke.
Vroom: designing an augmented environment for remote collaboration in digital cinema production
NASA Astrophysics Data System (ADS)
Margolis, Todd; Cornish, Tracy
2013-03-01
As media technologies become increasingly affordable, compact and inherently networked, new generations of telecollaborative platforms continue to arise which integrate these new affordances. Virtual reality has been primarily concerned with creating simulations of environments that can transport participants to real or imagined spaces that replace the "real world". Meanwhile Augmented Reality systems have evolved to interleave objects from Virtual Reality environments into the physical landscape. Perhaps now there is a new class of systems that reverse this precept to enhance dynamic media landscapes and immersive physical display environments to enable intuitive data exploration through collaboration. Vroom (Virtual Room) is a next-generation reconfigurable tiled display environment in development at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. Vroom enables freely scalable digital collaboratories, connecting distributed, high-resolution visualization resources for collaborative work in the sciences, engineering and the arts. Vroom transforms a physical space into an immersive media environment with large format interactive display surfaces, video teleconferencing and spatialized audio built on a highspeed optical network backbone. Vroom enables group collaboration for local and remote participants to share knowledge and experiences. Possible applications include: remote learning, command and control, storyboarding, post-production editorial review, high resolution video playback, 3D visualization, screencasting and image, video and multimedia file sharing. To support these various scenarios, Vroom features support for multiple user interfaces (optical tracking, touch UI, gesture interface, etc.), support for directional and spatialized audio, giga-pixel image interactivity, 4K video streaming, 3D visualization and telematic production. 
This paper explains the design process that has been utilized to make Vroom an accessible and intuitive immersive environment for remote collaboration specifically for digital cinema production.
Systematic review on the effectiveness of augmented reality applications in medical training.
Barsom, E Z; Graafland, M; Schijven, M P
2016-10-01
Computer-based applications are increasingly used to support the training of medical professionals. Augmented reality applications (ARAs) render an interactive virtual layer on top of reality. The use of ARAs is of real interest to medical education because they blend digital elements with the physical learning environment, resulting in new educational opportunities. The aim of this systematic review is to investigate to what extent augmented reality applications are currently used to validly support medical professionals' training. PubMed, Embase, INSPEC and PsychInfo were searched using predefined inclusion criteria for relevant articles up to August 2015. All study types were considered eligible. Articles concerning AR applications used to train or educate medical professionals were evaluated. Twenty-seven studies were found relevant, describing a total of seven augmented reality applications. Applications were assigned to three different categories. The first category is directed toward laparoscopic surgical training, the second toward mixed reality training of neurosurgical procedures, and the third toward training echocardiography. Statistical pooling of data could not be performed due to heterogeneity of study designs. Face, construct and concurrent validity were demonstrated for two applications directed at laparoscopic training, face and construct validity for neurosurgical procedures, and face, content and construct validity in echocardiography training. In the literature, none of the ARAs completed a full validation process for their intended purpose of use. Augmented reality applications that support blended learning in medical training have gained public and scientific interest. In order to be of value, applications must be able to transfer information to the user. Although promising, the literature to date lacks the evidence to support this.
NASA Technical Reports Server (NTRS)
Ding, R. Jeffrey (Inventor)
2012-01-01
A welding apparatus is provided for forming a weld joint between first and second elements of a workpiece. The apparatus heats the first and second elements to form an interface of material between the elements in a plasticized or melted state. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example, using a grinding/extruding mixer, to remove any dendritic-type weld microstructures introduced into the interface material during heating.
Langlois, Gary N.
1983-09-13
Measurement of the relative and actual value of acoustic characteristic impedances of an unknown substance, location of the interfaces of vertically-layered materials, and the determination of the concentration of a first material mixed in a second material. A highly damped ultrasonic pulse is transmitted into one side of a reference plate, such as a tank wall, where the other side of the reference plate is in physical contact with the medium to be measured. The amplitude of a return signal, which is the reflection of the transmitted pulse from the interface between the other side of the reference plate and the medium, is measured. The amplitude value indicates the acoustic characteristic impedance of the substance relative to that of the reference plate or relative to that of other tested materials. Discontinuities in amplitude with repeated measurements for various heights indicate the location of interfaces in vertically-layered materials. Standardization techniques permit the relative acoustic characteristic impedance of a substance to be converted to an actual value. Calibration techniques for mixtures permit the amplitude to be converted to the concentration of a first material mixed in a second material.
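The amplitude-to-impedance relation the patent relies on is the normal-incidence pressure reflection coefficient, R = (Z_medium - Z_plate)/(Z_medium + Z_plate), which can be inverted to recover the medium's acoustic impedance from a calibrated echo amplitude. A minimal sketch with nominal impedance values in MRayl (this illustrates the physics, not the patent's calibration procedure):

```python
def reflection_coeff(z_plate: float, z_medium: float) -> float:
    """Pressure reflection coefficient at the plate/medium interface (normal incidence)."""
    return (z_medium - z_plate) / (z_medium + z_plate)

def medium_impedance(z_plate: float, r: float) -> float:
    """Invert a measured reflection coefficient to recover the medium's impedance."""
    return z_plate * (1.0 + r) / (1.0 - r)

Z_STEEL, Z_WATER = 45.0, 1.48   # nominal values, MRayl

r_water = reflection_coeff(Z_STEEL, Z_WATER)
print(round(r_water, 3))                              # strong negative reflection
print(round(medium_impedance(Z_STEEL, r_water), 2))   # recovers 1.48
```

A discontinuity in r_water-style amplitudes as the transducer height changes is exactly what locates a layer interface in a vertically stratified tank.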
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vandenboomgaerde, M.; Bonnefille, M.; Gauthier, P.
Highly resolved radiation-hydrodynamics FCI2 simulations have been performed to model laser experiments on the National Ignition Facility. In these experiments, cylindrical gas-filled hohlraums with gold walls are driven by a 20 ns laser pulse. For the first time, simulations show the appearance of Kelvin-Helmholtz (KH) vortices at the interface between the expanding wall material and the gas fill. In this paper, we determine the mechanisms which generate this instability: the increase of the gas pressure around the expanding gold plasma leads to the aggregation of an over-dense gold layer simultaneously with shear flows. At the surface of this layer, all the conditions are met for a KH instability to grow. Later on, as the interface decelerates, the Rayleigh-Taylor instability also comes into play. A potential scenario for the generation of a mixing zone at the gold-gas interface due to the KH instability is presented. Our estimates of the Reynolds number and the plasma diffusion width at the interface support the possibility of such a mix. The key role of the first nanosecond of the laser pulse in the instability occurrence is also underlined.
NASA Astrophysics Data System (ADS)
Zhang, Y.; Yan, X. H.; Guo, Y. D.; Xiao, Y.
2017-08-01
Motivated by a recent tunneling magnetoresistance (TMR) measurement in which negative TMR was observed in MgO/NiO-based magnetic tunnel junctions (MTJs), we have performed systematic calculations of the transmission, current, and TMR of Fe/MgO/NiO/Fe MTJs with different thicknesses of the NiO and MgO layers based on noncollinear density functional theory and non-equilibrium Green's function theory. The calculations show that, when the NiO and MgO layers are thin, negative TMR is obtained, which is attributed to the spin mixing effect and an interface state. In thicker MTJs, however, the spin-flipping scattering becomes weaker, and thus the MTJs recover positive TMR. Based on our theoretical results, we believe that the interface state at the Fe/NiO interface and the spin mixing effect induced by the noncollinear interfacial magnetization play an important role in determining the transmission and current of the Fe/MgO/NiO/Fe MTJ. The results reported here will be important in understanding electron tunneling in MTJs with barriers made of transition metal oxides.
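The sign convention behind the "negative TMR" discussed above is the usual ratio TMR = (R_AP - R_P)/R_P: it is negative whenever the antiparallel configuration is the less resistive one. A minimal sketch with made-up junction resistances (not values computed in the paper):

```python
def tmr_ratio(r_parallel: float, r_antiparallel: float) -> float:
    """Tunneling magnetoresistance ratio; negative when R_AP < R_P."""
    return (r_antiparallel - r_parallel) / r_parallel

# Hypothetical junction resistances in arbitrary units
print(tmr_ratio(2.0, 6.0))   # 2.0  -> conventional positive TMR
print(tmr_ratio(2.0, 1.0))   # -0.5 -> negative TMR, as when spin mixing dominates
```

Spin-flip scattering at the Fe/NiO interface effectively lowers R_AP, which is how the calculations above arrive at a negative ratio for thin barriers.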
Delamination modeling of laminate plate made of sublaminates
NASA Astrophysics Data System (ADS)
Kormaníková, Eva; Kotrasová, Kamila
2017-07-01
The paper presents the mixed-mode delamination of plates made of sublaminates. For this purpose, an opening-load mode of delamination is proposed as the failure model. The failure model is implemented in the ANSYS code to calculate the mixed-mode delamination response as an energy release rate. The analysis is based on interface techniques. Within the interface finite element model, the individual damage parameters, namely the spring reaction forces, relative displacements, and energy release rates, are calculated along the delamination front.
Magezi, David A
2015-01-01
Linear mixed-effects models (LMMs) are increasingly being used for data analysis in cognitive neuroscience and experimental psychology, where within-participant designs are common. The current article provides an introductory review of the use of LMMs for within-participant data analysis and describes a free, simple, graphical user interface (LMMgui). LMMgui uses the package lme4 (Bates et al., 2014a,b) in the statistical environment R (R Core Team).
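The abstract's models are fitted with lme4 in R via LMMgui; as a language-neutral sketch of what a random-intercept LMM estimates in the simplest two-condition within-participant design, the fixed condition effect reduces to the average of per-participant contrasts (by-participant intercepts cancel in the difference). The reaction-time data below are invented for illustration:

```python
# Hypothetical reaction times (ms) for two conditions, four participants.
rt = {
    "p1": {"A": 510.0, "B": 540.0},
    "p2": {"A": 480.0, "B": 505.0},
    "p3": {"A": 530.0, "B": 562.0},
    "p4": {"A": 500.0, "B": 527.0},
}

# Per-participant condition effects: subtracting within a participant removes
# that participant's random intercept, leaving the condition contrast.
effects = [cond["B"] - cond["A"] for cond in rt.values()]

# The average contrast is the fixed-effect estimate a random-intercept LMM
# returns for this balanced design; full LMMs generalize this to random
# slopes, unbalanced data, and crossed participant/item effects.
fixed_effect = sum(effects) / len(effects)
```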
Cameirão, Mónica S; Badia, Sergi Bermúdez i; Duarte, Esther; Frisoli, Antonio; Verschure, Paul F M J
2012-10-01
Although there is strong evidence on the beneficial effects of virtual reality (VR)-based rehabilitation, it is not yet well understood how the different aspects of these systems affect recovery. Consequently, we do not exactly know what features of VR neurorehabilitation systems are decisive in conveying their beneficial effects. To specifically address this issue, we developed 3 different configurations of the same VR-based rehabilitation system, the Rehabilitation Gaming System, using 3 different interface technologies: vision-based tracking, haptics, and a passive exoskeleton. Forty-four patients with chronic stroke were randomly allocated to one of the configurations and used the system for 35 minutes a day, 5 days a week, for 4 weeks. Our results revealed significant within-subject improvements on most of the standard clinical evaluation scales for all groups. Specifically, we observed that the beneficial effects of VR-based training are modulated by the use/nonuse of compensatory movement strategies and the specific sensorimotor contingencies presented to the user, that is, visual feedback versus combined visual-haptic feedback. Our findings suggest that the beneficial effects of VR-based neurorehabilitation systems such as the Rehabilitation Gaming System for the treatment of chronic stroke depend on the specific interface systems used. These results have strong implications for the design of future VR rehabilitation strategies that aim at maximizing functional outcomes and their retention. Clinical Trial Registration: This trial was not registered because it is a small clinical study that evaluates the feasibility of prototype devices.
NASA Astrophysics Data System (ADS)
Gorham, Caroline S.; Hattar, Khalid; Cheaito, Ramez; Duda, John C.; Gaskins, John T.; Beechem, Thomas E.; Ihlefeld, Jon F.; Biedermann, Laura B.; Piekos, Edward S.; Medlin, Douglas L.; Hopkins, Patrick E.
2014-07-01
The thermal boundary conductance across solid-solid interfaces can be affected by the physical properties of the solid boundary. Atomic composition, disorder, and bonding between materials can result in large deviations in the phonon scattering mechanisms contributing to thermal boundary conductance. Theoretical and computational studies have suggested that the mixing of atoms around an interface can lead to an increase in thermal boundary conductance by creating a region with an averaged vibrational spectrum of the two materials forming the interface. In this paper, we experimentally demonstrate that ion irradiation and the subsequent modification of atoms at solid surfaces can increase the thermal boundary conductance across solid interfaces due to a change in the acoustic impedance of the surface. We measure the thermal boundary conductance between thin aluminum films and silicon substrates with native silicon dioxide layers that have been subjected to proton irradiation and post-irradiation surface cleaning procedures. The thermal boundary conductance across the Al/native oxide/Si interfacial region increases with increasing proton dose. Supported by statistical simulations, we hypothesize that ion beam mixing of the native oxide and silicon substrate within ~2.2 nm of the silicon surface results in the observed increase in thermal boundary conductance. This ion mixing leads to the spatial gradation of the silicon native oxide into the silicon substrate, which alters the acoustic impedance and vibrational characteristics at the interface of the aluminum film and the native oxide/silicon substrate. We confirm this assertion with picosecond acoustic analyses. Our results demonstrate that under specific conditions, a "more disordered and defected" interfacial region can have a lower resistance than a more "perfect" interface.
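The role of acoustic impedance can be sketched with the normal-incidence acoustic-mismatch energy transmission coefficient; this is a simplified continuum picture (it ignores multiple reflections and the full phonon spectrum), with arbitrary impedance units:

```python
def transmission(z1, z2):
    """Acoustic-mismatch energy transmission coefficient for a wave at
    normal incidence crossing an interface between impedances z1 and z2:
    T = 4*z1*z2 / (z1 + z2)^2."""
    return 4.0 * z1 * z2 / (z1 + z2) ** 2

# One abrupt step from impedance 1 to 4.
t_abrupt = transmission(1.0, 4.0)

# The same total mismatch taken in two smaller steps (1 -> 2 -> 4);
# multiplying the two single-interface coefficients neglects multiple
# reflections but shows the trend.
t_graded = transmission(1.0, 2.0) * transmission(2.0, 4.0)
```

Grading the impedance in steps transmits more energy than one abrupt step, which is the intuition behind the result that a spatially graded oxide/silicon region can conduct heat better than a sharp interface.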
Morphology of the D/A interface in vapor deposited bilayer organic photovoltaics
NASA Astrophysics Data System (ADS)
Erwin, Patrick; Dimitriou, Michael; Thompson, Mark E.
2017-08-01
A series of bilayer films were prepared by vacuum deposition onto silicon substrates. These films consisted of either Si/SiO2/donor/C60 or Si/SiO2/C60/donor, where the organic films were 20-40 nm thick and the donors were 7,7-difluoro-14-phenyl-7H-6l4,7l4-[1,3,2]diazaborinino[4,3-a:6,1-a']diisoindole (bDIP), copper phthalocyanine (CuPC), 3,6,11,14-tetraphenyldiindeno[1,2,3-cd:1',2',3'-lm]perylene (DBP) and 2-(4-(diphenylamino)-2,6-dihydroxyphenyl)-4-(4-(diphenyliminio)-2,6-dihydroxycyclohexa-2,5-dien-1-ylidene)-3-oxocyclobut-1-en-1-olate (DPSQ). The donors chosen here have been reported to give good power efficiencies when incorporated into bilayer photovoltaic cells with a C60 acceptor. These bilayer films were examined by neutron reflectometry to characterize the interface between the donor and C60. In the SiO2/donor/C60 films, DPSQ, CuPC, and DBP show a discrete interface with C60, while bDIP shows substantial spontaneous mixing at the interface, consistent with a donor/(donor + C60)/C60 structure in which the mixed layer is 14 nm thick. In the SiO2/C60/donor films, all four donors show negligible mixing at the D/A interface, consistent with a discrete D/A junction.
Atomistic study of mixing at high Z / low Z interfaces at Warm Dense Matter Conditions
NASA Astrophysics Data System (ADS)
Haxhimali, Tomorr; Glosli, James; Rudd, Robert; Lawrence Livermore National Laboratory Team
2016-10-01
We use atomistic simulations to study different aspects of the mixing that occurs at an initially sharp interface between high-Z and low-Z plasmas in the warm/hot dense matter regime. We consider a system of diamond (the low-Z component) in contact with Ag (the high-Z component), which undergoes rapid isochoric heating from room temperature up to 10 eV, rapidly changing the solids into warm dense matter at solid density. We simulate the motion of the ions via a screened Coulomb potential. The electric field, the electron density, and the ionization levels are computed on the fly by solving the Poisson equation. The spatially varying screening lengths computed from the electron cloud are included in this effective interaction; the electrons are not simulated explicitly. We compute the electric field generated at the Ag-C interface as well as the dynamics of the ions during the mixing process occurring at the plasma interface. Preliminary results indicate an anomalous transport of high-Z ions (Ag) into the low-Z component (C), a phenomenon that is partially related to the enhanced transport of ions due to the generated electric field. These results are in agreement with recent experimental observations on the Au-diamond plasma interface. This work was performed under the auspices of the US Dept. of Energy by Lawrence Livermore National Security, LLC under Contract DE-AC52-07NA27344.
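A minimal sketch of the screened Coulomb (Yukawa) pair interaction that drives the ion dynamics; here the screening length is a fixed parameter rather than computed on the fly from the electron cloud as in the paper, and the units are arbitrary:

```python
import math

def screened_coulomb(r, z1, z2, lam, e2=1.0):
    """Screened (Yukawa) ion-ion pair potential:
    V(r) = Z1 * Z2 * e^2 / r * exp(-r / lambda),
    where lambda is the screening length; e2 sets the unit system."""
    return z1 * z2 * e2 / r * math.exp(-r / lam)

# With a very large screening length the bare Coulomb form is recovered;
# shorter screening lengths suppress the interaction at large separation.
v_coulomb_limit = screened_coulomb(1.0, 47.0, 6.0, 1e9)  # Ag (Z=47), C (Z=6)
```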
Effect of dry air on interface smoothening in reactive sputter deposited Co/Ti multilayer
NASA Astrophysics Data System (ADS)
Biswas, A.; Porwal, A.; Bhattacharya, Debarati; Prajapat, C. L.; Ghosh, Arnab; Nand, Mangla; Nayak, C.; Rai, S.; Jha, S. N.; Singh, M. R.; Bhattacharyya, D.; Basu, S.; Sahoo, N. K.
2017-09-01
Top surface roughness and interface roughness are among the key factors that determine the performance of X-ray and neutron thin-film multilayer devices. It has been observed that by mixing air with argon in the sputtering ambience during deposition of the Co layers, the polarized neutron reflectivity (PNR) of Co/Ti supermirror polarizers can be improved substantially. Cross-sectional HRTEM measurements reveal that sharper interfaces in the supermirror can be achieved when the multilayer is deposited under a mixed ambience of argon and air. To investigate this interface modification mechanism further, in this communication two sets of tri-layer Co/Ti/Co samples and 20-layer Co/Ti periodic multilayer samples have been prepared; in one set all the layers are deposited only under an argon ambience, and in the other set the Co layers are deposited under a mixed ambience of argon and air. These samples have been characterized by measuring specular and non-specular X-ray reflectivities (GIXR) with X-rays of 1.54 Å wavelength and polarized neutron reflectivity (PNR) with neutrons of 2.5 Å wavelength at grazing incidence. It has been observed that the X-ray and neutron specular reflectivities at the Bragg peaks of the 20-layer periodic multilayer increase when the Co layers are deposited under a mixed ambience of argon and air. Detailed information regarding the effect of air on the interfaces and on the magnetic properties has been obtained by fitting the measured spectra. This information has subsequently been supplemented by XRD and magnetic measurements on the samples. XPS and XANES measurements have also been carried out to investigate whether cobalt oxide or cobalt nitride layers are formed due to the use of air in the sputtering ambience.
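The Bragg peaks used to characterize the periodic multilayer follow the multilayer Bragg condition m*lambda = 2*d*sin(theta); a sketch with a hypothetical 10 nm bilayer period (refraction corrections, which matter at grazing incidence, are neglected):

```python
import math

def bragg_angle_deg(period_nm, wavelength_nm, order=1):
    """Grazing angle of the m-th multilayer Bragg peak from
    m * lambda = 2 * d * sin(theta); returns theta in degrees."""
    return math.degrees(math.asin(order * wavelength_nm / (2.0 * period_nm)))

# First-order peak for an assumed 10 nm period at Cu K-alpha (1.54 A = 0.154 nm).
theta1 = bragg_angle_deg(10.0, 0.154)
theta2 = bragg_angle_deg(10.0, 0.154, order=2)
```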
Mixing driven by transient buoyancy flows. I. Kinematics
NASA Astrophysics Data System (ADS)
Duval, W. M. B.; Zhong, H.; Batur, C.
2018-05-01
Mixing of two miscible liquids juxtaposed inside a cavity, initially separated by a divider whose buoyancy-driven motion is initiated via an impulsive perturbation of the divider motion that can generate the Richtmyer-Meshkov instability, is investigated experimentally. The measured Lagrangian history of interface motion, which contains the continuum mechanics of mixing, shows a self-similar, nearly Gaussian length stretch distribution for a wide range of control parameters encompassing an approximate Hele-Shaw cell to a three-dimensional cavity. Because of the initial configuration of the interface, which is parallel to the gravitational field, we show that at a critical initial potential energy mixing occurs through the stretching of the interface, which shows frontogenesis, and through folding, owing to an overturning motion that results in unstable density stratification and produces an ideal condition for the growth of the single-wavelength Rayleigh-Taylor instability. The initial perturbation of the interface and flow field generates the Kelvin-Helmholtz instability and causes kinks at the interface, which grow into deep fingers during the overturning motion and unfold into local whorl structures that merge and self-organize into the Rayleigh-Taylor morphology (RTM) structure. For a range of the parametric space that yields two-dimensional flows, the unfolding of the instability through a supercritical bifurcation yields an asymmetric pairwise structure exhibiting smooth RTM that transitions to RTM fronts with fractal structures containing small length scales for increasing Peclet numbers. The late stage of the RTM structure unfolds into an internal breakwave that breaks down through wall and internal collisions and sets up the condition for self-induced sloshing that decays exponentially as the two fluids become stably stratified, with a diffusive region indicating local molecular diffusion.
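The Peclet number invoked above is simply the ratio of advective to diffusive transport rates; a one-line sketch (the numbers are illustrative):

```python
def peclet(velocity, length, diffusivity):
    """Peclet number Pe = U * L / D: ratio of advective transport
    (velocity U over length scale L) to molecular diffusion D."""
    return velocity * length / diffusivity

# Pe >> 1 means stirring outruns diffusion, favoring fine-scale fronts.
pe = peclet(2.0, 3.0, 6.0)
```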
The Benjamin Shock Tube Problem in KULL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ulitsky, M
2005-08-26
The goal of the EZturb mix model in KULL is to predict the turbulent mixing process as it evolves from Rayleigh-Taylor, Richtmyer-Meshkov, or Kelvin-Helmholtz instabilities. In this report we focus on a simple example of the Richtmyer-Meshkov instability (which occurs when a shock hits an interface between fluids of different densities) without the complication of reshock. The experiment by Benjamin et al., involving a Mach 1.21 incident shock striking an air/SF6 interface, is a good one to model and understand before moving on to shock tubes that follow the growth of the turbulent mixing zone from first shock through well after reshock.
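The initial Richtmyer-Meshkov growth in such shock-tube problems is often estimated with Richtmyer's impulsive model; the sketch below states that generic model, not the EZturb closure itself, and the input values are hypothetical:

```python
def richtmyer_growth_rate(k, a0, atwood_post, delta_v):
    """Richtmyer's impulsive model for the initial perturbation growth rate
    after shock passage: da/dt = k * a0 * A+ * dv, where k is the perturbation
    wavenumber, a0 the post-shock amplitude, A+ the post-shock Atwood number,
    and dv the velocity jump imparted by the shock."""
    return k * a0 * atwood_post * delta_v

# Hypothetical inputs: wavenumber 2 /cm, 0.5 cm amplitude, Atwood ~0.67
# (roughly air/SF6), 100 cm/s velocity jump.
growth = richtmyer_growth_rate(2.0, 0.5, 0.67, 100.0)
```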
Laboratory simulations of the atmospheric mixed-layer in flow over complex topography
A laboratory study of the influence of complex terrain on the interface between a well-mixed boundary layer and an elevated stratified layer was conducted in the towing-tank facility of the U.S. Environmental Protection Agency. The height of the mixed layer in the daytime boundar...
Tack coat optimization for HMA overlays laboratory testing.
DOT National Transportation Integrated Search
2008-09-01
Interface bonding between hot-mix asphalt (HMA) overlays and Portland cement concrete (PCC) pavements can be one of the most : significant factors affecting overlay service life. Various factors may affect the bonding condition at the interface, incl...
Mixed-mode fracture mechanics parameters of elliptical interface cracks in anisotropic bimaterials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xue, Y.; Qu, J.
1999-07-01
Two-dimensional interface cracks in anisotropic bimaterials have been studied extensively in the literature. However, solutions to three-dimensional interface cracks in anisotropic bimaterials are not available, except for circular (penny-shaped) cracks. In this paper, an elliptical crack on the interface between two anisotropic elastic half-spaces is considered. A formal solution is obtained by using the Stroh method of two-dimensional elasticity in conjunction with the Fourier transform method. To illustrate the solution procedure, an elliptical delamination in a cross-ply composite is solved. Numerical results for the stress intensity factors and energy release rate along the crack front are obtained in terms of the interfacial matrix M. It is found that the fields near the crack front are often in mixed mode, due to material anisotropy and the three-dimensional nature of the crack front.
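For orientation, the notion of mixed-mode fields can be sketched with the classical isotropic relations; the paper's anisotropic bimaterial case replaces the effective modulus E' with expressions involving the interfacial matrix M, so this is only the simplified single-material limit:

```python
import math

def mode_mixity_deg(k1, k2):
    """Mode-mixity phase angle psi = atan(K_II / K_I), in degrees;
    psi = 0 is pure opening (mode I), psi = 90 is pure shear (mode II)."""
    return math.degrees(math.atan2(k2, k1))

def energy_release_rate(k1, k2, e_prime):
    """Irwin relation G = (K_I^2 + K_II^2) / E' for the isotropic case,
    with E' = E/(1 - nu^2) in plane strain."""
    return (k1 ** 2 + k2 ** 2) / e_prime

psi = mode_mixity_deg(1.0, 1.0)   # equal K_I and K_II -> 45 degrees
```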
Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research
SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN
2015-01-01
Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073
a Framework for Distributed Mixed Language Scientific Applications
NASA Astrophysics Data System (ADS)
Quarrie, D. R.
The Object Management Group has defined an architecture (CORBA) for distributed object applications based on an Object Request Broker and Interface Definition Language. This project builds upon this architecture to establish a framework for the creation of mixed language scientific applications. A prototype compiler has been written that generates FORTRAN 90 or Eiffel stubs and skeletons and the required C++ glue code from an input IDL file that specifies object interfaces. This generated code can be used directly for non-distributed mixed language applications or in conjunction with the C++ code generated from a commercial IDL compiler for distributed applications. A feasibility study is presently underway to see whether a fully integrated software development environment for distributed, mixed-language applications can be created by modifying the back-end code generator of a commercial CASE tool to emit IDL.
Mixed-Mode Decohesion Finite Elements for the Simulation of Delamination in Composite Materials
NASA Technical Reports Server (NTRS)
Camanho, Pedro P.; Davila, Carlos G.
2002-01-01
A new decohesion element with mixed-mode capability is proposed and demonstrated. The element is used at the interface between solid finite elements to model the initiation and non-self-similar growth of delaminations. A single relative displacement-based damage parameter is applied in a softening law to track the damage state of the interface and to prevent the restoration of the cohesive state during unloading. The softening law for mixed-mode delamination propagation can be applied to any mode interaction criterion such as the two-parameter power law or the three-parameter Benzeggagh-Kenane criterion. To demonstrate the accuracy of the predictions and the irreversibility capability of the constitutive law, steady-state delamination growth is simulated for quasistatic loading-unloading cycles of various single mode and mixed-mode delamination test specimens.
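The three-parameter Benzeggagh-Kenane criterion mentioned above can be stated directly; the sketch below gives its standard form, with hypothetical toughness values:

```python
def bk_toughness(g_ic, g_iic, g_ii, g_i, eta):
    """Benzeggagh-Kenane mixed-mode fracture toughness:
    G_c = G_Ic + (G_IIc - G_Ic) * (G_II / (G_I + G_II))**eta,
    interpolating between the pure mode I and mode II toughnesses
    as a function of the mode ratio, with material exponent eta."""
    ratio = g_ii / (g_i + g_ii)
    return g_ic + (g_iic - g_ic) * ratio ** eta

# Hypothetical toughnesses: G_Ic = 0.2, G_IIc = 1.0 (kJ/m^2), eta = 2.
g_pure_i = bk_toughness(0.2, 1.0, 0.0, 1.0, 2.0)   # recovers G_Ic
g_pure_ii = bk_toughness(0.2, 1.0, 1.0, 0.0, 2.0)  # recovers G_IIc
```

Delamination advances once the total energy release rate G_I + G_II reaches this mode-ratio-dependent G_c.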
Equilibrium of adsorption of mixed milk protein/surfactant solutions at the water/air interface.
Kotsmar, C; Grigoriev, D O; Xu, F; Aksenenko, E V; Fainerman, V B; Leser, M E; Miller, R
2008-12-16
Ellipsometry and surface profile analysis tensiometry were used to study and compare the adsorption behavior of beta-lactoglobulin (BLG)/C10DMPO, beta-casein (BCS)/C10DMPO and BCS/C12DMPO mixtures at the air/solution interface. Adsorption from mixed protein/surfactant solutions is competitive in nature. The obtained adsorption isotherms suggest a gradual replacement of the protein molecules at the interface with increasing surfactant concentration for all of the studied mixed systems. The thickness, refractive index, and adsorbed amount of the respective adsorption layers, determined by ellipsometry, decrease monotonically and reach values close to those for a surface covered only by surfactant molecules, indicating that protein is absent from the interface above a certain surfactant concentration. These results correlate with the surface tension data. Up to this concentration, a continuous increase of the adsorption layer thickness was observed, caused by the desorption of segments of the protein, which transforms the thin surface layer into a rather diffuse and thick one. The replacement and structural changes of the protein molecules are discussed in terms of protein structure and the surface activity of the surfactant molecules. Recently derived theoretical models were used for the quantitative description of the equilibrium state of the mixed surface layers.
Transport and Mixing Induced by Beating Cilia in Human Airways
Chateau, Sylvain; D'Ortona, Umberto; Poncet, Sébastien; Favier, Julien
2018-01-01
The fluid transport and mixing induced by beating cilia, present in the bronchial airways, are studied using a coupled lattice Boltzmann-Immersed Boundary solver. This solver allows the simulation of both single- and multi-component fluid flows around moving solid boundaries. The cilia are modeled by a set of Lagrangian points, and Immersed Boundary forces are computed at these points in order to enforce the no-slip velocity condition between the cilia and the fluids. The cilia are immersed in a two-layer environment: the periciliary layer (PCL) and the mucus above it. The motion of the cilia is prescribed, as well as the phase lag between neighboring cilia, in order to obtain a typical collective motion of cilia known as metachronal waves. The results of a parametric study show that antiplectic metachronal waves are the most efficient for fluid transport. A specific value of the phase lag, which generates the largest mucus transport, is identified. The mixing is studied using several populations of tracers initially seeded in the periciliary liquid, in the mucus just above the PCL-mucus interface, and in the mucus far away from the interface. We observe that each zone exhibits different chaotic mixing properties. The strongest mixing is obtained in the PCL layer, where only a few beating cycles of the cilia are required to obtain full mixing, while above the interface the mixing is weaker and takes more time. Almost no mixing is observed within the mucus, and almost none of the tracers penetrate the PCL layer. Lyapunov exponents are also computed at specific locations to assess how the mixing is performed locally. Two time scales are introduced to allow a comparison between mixing induced by fluid advection and by molecular diffusion. These results are relevant in the context of respiratory flows for investigating the transport of drugs for patients suffering from chronic respiratory diseases. PMID:29559920
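The finite-time Lyapunov estimate used in such tracer studies can be sketched from the growth of a tracer-pair separation; the exponential separation below is synthetic, not simulation output:

```python
import math

def lyapunov_estimate(d0, d_t, t):
    """Finite-time Lyapunov exponent estimated from the separation of a
    tracer pair: lambda = (1/t) * ln(d(t) / d(0))."""
    return math.log(d_t / d0) / t

# Synthetic pair whose separation grows as d(t) = d0 * exp(0.8 * t);
# the estimator should recover the exponent 0.8 exactly.
lam = lyapunov_estimate(1e-3, 1e-3 * math.exp(0.8 * 5.0), 5.0)
```

A positive exponent marks chaotic advection (as in the PCL), while near-zero values correspond to the essentially unmixed mucus far from the interface.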
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, N.; Ye, Z.
This report documents part of a multiyear research program dedicated to the development of requirements to support the definition, design, and demonstration of a distributed generation-electric power system interconnection interface concept. The report focuses on the dynamic behavior of power systems when a significant portion of the total energy resource is distributed generation. It also focuses on the near-term reality that the majority of new DG relies on rotating synchronous generators for energy conversion.
Hybrid 2-D and 3-D Immersive and Interactive User Interface for Scientific Data Visualization
2017-08-01
Keywords: visualization, 3-D interactive visualization, scientific visualization, virtual reality, real-time ray tracing. …scientists to employ in the real world. Other than a user-friendly software and hardware setup, scientists also need to be able to perform their usual… The VR and scientific visualization communities mostly have different research priorities. For the VR community, the ability to support real-time user…
Virtual reality hardware and graphic display options for brain-machine interfaces
Marathe, Amar R.; Carey, Holle L.; Taylor, Dawn M.
2009-01-01
Virtual reality hardware and graphic displays are reviewed here as a development environment for brain-machine interfaces (BMIs). Two desktop stereoscopic monitors and one 2D monitor were compared in a visual depth discrimination task and in a 3D target-matching task where able-bodied individuals used actual hand movements to match a virtual hand to different target hands. Three graphic representations of the hand were compared: a plain sphere, a sphere attached to the fingertip of a realistic hand and arm, and a stylized pacman-like hand. Several subjects had great difficulty using either stereo monitor for depth perception when perspective size cues were removed. A mismatch in stereo and size cues generated inappropriate depth illusions. This phenomenon has implications for choosing target and virtual hand sizes in BMI experiments. Target matching accuracy was about as good with the 2D monitor as with either 3D monitor. However, users achieved this accuracy by exploring the boundaries of the hand in the target with carefully controlled movements. This method of determining relative depth may not be possible in BMI experiments if movement control is more limited. Intuitive depth cues, such as including a virtual arm, can significantly improve depth perception accuracy with or without stereo viewing. PMID:18006069
Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T
2007-07-01
Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical resident operative exposure. There is need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient safety constraints. Computer-based, virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic, three-dimensional stereoscopic visualization, and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof-of-concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.
The importance of fluctuations in fluid mixing
Kadau, Kai; Rosenblatt, Charles; Barber, John L.; Germann, Timothy C.; Huang, Zhibin; Carlès, Pierre; Alder, Berni J.
2007-01-01
A ubiquitous example of fluid mixing is the Rayleigh–Taylor instability, in which a heavy fluid initially sits atop a light fluid in a gravitational field. The subsequent development of the unstable interface between the two fluids is marked by several stages. At first, each interface mode grows exponentially with time before transitioning to a nonlinear regime characterized by more complex hydrodynamic mixing. Unfortunately, traditional continuum modeling of this process has generally been in poor agreement with experiment. Here, we indicate that the natural, random fluctuations of the flow field present in any fluid, which are neglected in continuum models, can lead to qualitatively and quantitatively better agreement with experiment. We performed billion-particle atomistic simulations and magnetic levitation experiments with unprecedented control of initial interface conditions. A comparison between our simulations and experiments reveals good agreement in terms of the growth rate of the mixing front as well as the new observation of droplet breakup at later times. These results improve our understanding of many fluid processes, including interface phenomena that occur, for example, in supernovae, the detachment of droplets from a faucet, and ink jet printing. Such instabilities are also relevant to the possible energy source of inertial confinement fusion, in which a millimeter-sized capsule is imploded to initiate nuclear fusion reactions between deuterium and tritium. Our results suggest that the applicability of continuum models would be greatly enhanced by explicitly including the effects of random fluctuations. PMID:17470811
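The early exponential stage described above is governed by the classical inviscid linear growth rate, gamma = sqrt(A * g * k), with A the Atwood number of the fluid pair; a minimal sketch:

```python
import math

def atwood(rho_heavy, rho_light):
    """Atwood number A = (rho2 - rho1) / (rho2 + rho1) for a heavy fluid
    (rho2) sitting atop a light fluid (rho1)."""
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

def rt_growth_rate(a, g, k):
    """Classical inviscid linear Rayleigh-Taylor growth rate for a mode of
    wavenumber k under gravity g: gamma = sqrt(A * g * k). Each mode grows
    as exp(gamma * t) until nonlinear mixing takes over."""
    return math.sqrt(a * g * k)

# Density ratio 3:1 gives A = 0.5; g and k in SI units for illustration.
a = atwood(3.0, 1.0)
gamma = rt_growth_rate(a, 9.81, 2.0)
```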
HST STIS Observations of the Mixing Layer in the Cat’s Eye Nebula
NASA Astrophysics Data System (ADS)
Fang, Xuan; Guerrero, Martín A.; Toalá, Jesús A.; Chu, You-Hua; Gruendl, Robert A.
2016-05-01
Planetary nebulae (PNe) are expected to have a ~10^5 K interface layer between the ≥10^6 K inner hot bubble and the ~10^4 K optical nebular shell. The PN structure and evolution, and the X-ray emission, depend critically on the efficiency of the mixing of material at this interface layer. However, neither its location nor its spatial extent have ever been determined. Using high-spatial-resolution HST STIS spectroscopic observations of the N V λλ1239,1243 lines in the Cat’s Eye Nebula (NGC 6543), we have detected this interface layer and determined its location, extent, and physical properties for the first time in a PN. We confirm that this interface layer, as revealed by the spatial distribution of the N V λ1239 line emission, is located between the hot bubble and the optical nebular shell. We estimate a thickness of 1.5 × 10^16 cm and an electron density of ~200 cm^-3 for the mixing layer. With a thermal pressure of ~2 × 10^-8 dyn cm^-2, the mixing layer is in pressure equilibrium with the hot bubble and ionized nebular rim of NGC 6543. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555. The observations are associated with program #12509.
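The pressure-equilibrium argument can be checked with the ideal-gas thermal pressure in cgs units; the factor of ~2 for electrons plus ions is an assumption here, and with the abstract's quoted density and temperature this recovers the stated pressure only to order of magnitude (the exact value depends on the temperature and ionization state assumed):

```python
K_B = 1.380649e-16  # Boltzmann constant, erg/K

def thermal_pressure(n_e, temperature, ion_factor=2.0):
    """Ideal-gas thermal pressure of an ionized gas in cgs units:
    P ~ ion_factor * n_e * k_B * T, where ion_factor ~ 2 counts
    electrons plus ions for a mostly ionized hydrogen plasma."""
    return ion_factor * n_e * K_B * temperature

# Mixing-layer values from the abstract: n_e ~ 200 cm^-3, T ~ 1e5 K.
p_mix = thermal_pressure(200.0, 1.0e5)  # a few 1e-9 dyn cm^-2
```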
Declarative Knowledge Acquisition in Immersive Virtual Learning Environments
ERIC Educational Resources Information Center
Webster, Rustin
2016-01-01
The author investigated the interaction effect of immersive virtual reality (VR) in the classroom. The objective of the project was to develop and provide a low-cost, scalable, and portable VR system containing purposely designed and developed immersive virtual learning environments for the US Army. The purpose of the mixed design experiment was…
The University and the Voluntary Work Culture: Reality and Perspective
ERIC Educational Resources Information Center
Almaraee, Mohammed Abdullah
2016-01-01
To explore the present role of universities in propagating the culture of voluntary work in the Saudi community, a mixed research design was used along with descriptive statistics to derive outcomes. The research design was implemented in order to evaluate the concept of voluntary work culture among the university staff and…
The Value of Team-Based Mixed-Reality (TBMR) Games in Higher Education
ERIC Educational Resources Information Center
Denholm, John A.; Protopsaltis, Aristidis; de Freitas, Sara
2013-01-01
This paper reports on a study measuring the perceptions of post-graduate students of the effectiveness of serious games in the classroom. Four games were used (Project Management Exercise, "Winning Margin" Business Simulation, Management of Change, and Management of Product Design and Development) with scenarios ranging from…
From Pipe Dream to Reality: Creating a Technology-Rich School Environment.
ERIC Educational Resources Information Center
Crafton, John A.
1998-01-01
Methuen Public Schools, Massachusetts, has become a wired school system with computers in every classroom, Internet access, and state-of-the-art mixed-media. Five citizens who work in the technology industry formed a steering committee to drive the project. A long-term partnership with a private vendor, Lucent Technologies, addresses the…
Comprehension for What? Preparing Students for Their Meaningful Future
ERIC Educational Resources Information Center
Conley, Mark W.; Wise, Antoinette
2011-01-01
Researchers, policymakers, and educators face a daunting task these days concerning literacy education for the here and now and literacy for the future. Even though one clings to the romantic notion that education provides the building blocks in a straight line to a meaningful future, the reality is that mixed goals and instructional messages…
E-Learning System Using Segmentation-Based MR Technique for Learning Circuit Construction
ERIC Educational Resources Information Center
Takemura, Atsushi
2016-01-01
This paper proposes a novel e-Learning system using the mixed reality (MR) technique for technical experiments involving the construction of electronic circuits. The proposed system comprises experimenters' mobile computers and a remote analysis system. When constructing circuits, each learner uses a mobile computer to transmit image data from the…
ERIC Educational Resources Information Center
Kirkley, Sonny E.; Kirkley, Jamie R.
2005-01-01
In this article, the challenges and issues of designing next generation learning environments using current and emerging technologies are addressed. An overview of the issues is provided as well as design principles that support the design of instruction and the overall learning environment. Specific methods for creating cognitively complex,…
ERIC Educational Resources Information Center
Dieker, Lisa; Hynes, Michael; Hughes, Charles; Smith, Eileen
2008-01-01
As technology evolves, so does its impact on people's lives. These changes clearly affect people's daily activities, but how might they also impact education, teachers, and the lives of students with disabilities? This article focuses on technological innovations and their potential implications for students and teachers in schools. This article…
Starting Somewhere: Folks with Unique Communication Needs Make Their Way at Work
ERIC Educational Resources Information Center
Murphy, Patti
2009-01-01
A mix of technologies and human dynamics can make good communication a workplace reality when workers cannot take for granted that they'll be understood. As more people using augmentative and alternative communication (AAC) because of significant speech impairment pursue traditional paid, volunteer, and self-employment, their concerns reflect…
ERIC Educational Resources Information Center
Marty, Jean-Charles; Carron, Thibault; Pernelle, Philippe; Talbot, Stéphane; Houzet, Gregory
2015-01-01
The authors' research work deals with the development of new game-based learning (gbl) environments. They think that the way of acquiring knowledge during a learning session is similar to following an adventure in a role-playing game and they apply the metaphor of exploring a virtual world, where each student embarks on a quest in order to collect…
Impact of RFID on Retail Value Chain: A Mixed Method Study
ERIC Educational Resources Information Center
Bhattacharya, Mithu
2011-01-01
Radio Frequency Identification (RFID) mandates by large retailers and various government agencies have driven a large number of organizations to roll out the technology. Despite these commitments, the business case for RFID is far from settled and is still in its infancy. This dissertation work aims at providing a realistic perspective on the…
Human Service Administrator Perceptions of Online MSW Degree Programs
ERIC Educational Resources Information Center
Curran, Laura; Sanchez Mayers, Ray; Fulghum, Fontaine
2017-01-01
Online programs have proliferated rapidly in higher education, and this reality holds true for social work education as well. Employing a mixed methods design, this study looked at employer perceptions of online degrees compared to traditional degrees. Data were collected through an online survey that included Likert-type and open-ended questions…
Phenomenological Influences in Minority Attitudes toward School Desegregation.
ERIC Educational Resources Information Center
Sobol, Marion Gross; Beck, William W.
1980-01-01
Examines whether Black parents' satisfaction with schools reflects their perception that their children's schools are integrated or is based on the schools' actual integration. Concludes that perception is more important than reality: if the parent thinks the child attends a mixed school, s/he will be more positive toward that school. (Author/GC)
Perception Versus Reality in Educational Attitudes Toward School Desegregation.
ERIC Educational Resources Information Center
Beck, William W.; Sobol, Marion Gross
After a Federal court ordered school desegregation in Dallas, Texas, a study was conducted to determine factors influencing black parents' attitudes toward their children's school. Parents who said their children were in racially mixed schools were shown to be far more satisfied than those who said their children were in all black schools. Broad…
Embodied information behavior, mixed reality and big data
NASA Astrophysics Data System (ADS)
West, Ruth; Parola, Max J.; Jaycen, Amelia R.; Lueg, Christopher P.
2015-03-01
A renaissance in the development of virtual (VR), augmented (AR), and mixed reality (MR) technologies with a focus on consumer and industrial applications is underway. As data becomes ubiquitous in our lives, a need arises to revisit the role of our bodies, explicitly in relation to data or information. Our observation is that VR/AR/MR technology development is a vision of the future framed in terms of promissory narratives. These narratives develop alongside the underlying enabling technologies and create new use contexts for virtual experiences. It is a vision rooted in the combination of responsive, interactive, dynamic, sharable data streams, and augmentation of the physical senses for capabilities beyond those normally humanly possible. In parallel to the varied definitions of information and approaches to elucidating information behavior, a myriad of definitions and methods of measuring and understanding presence in virtual experiences exist. These and other ideas will be tested by designers, developers and technology adopters as the broader ecology of head-worn devices for virtual experiences evolves in order to reap the full potential and benefits of these emerging technologies.
Optoelectronic devices utilizing materials having enhanced electronic transitions
Black, Marcie R [Newton, MA]
2011-02-22
An optoelectronic device that includes a material having enhanced electronic transitions. The electronic transitions are enhanced by mixing electronic states at an interface. The interface may be formed by a nano-well, a nano-dot, or a nano-wire.
Optoelectronic devices utilizing materials having enhanced electronic transitions
Black, Marcie R.
2013-04-09
An optoelectronic device that includes a material having enhanced electronic transitions. The electronic transitions are enhanced by mixing electronic states at an interface. The interface may be formed by a nano-well, a nano-dot, or a nano-wire.
Experimental study of an isochorically heated heterogeneous interface. A progress report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandez, Juan Carlos
2015-08-20
Outline of the presentation: studying possible mix/interface motion between heterogeneous low-/high-Z interfaces driven by two-fluid or kinetic plasma effects (heated to a few eV; sharp, sub-µm interface); isochoric heating to initialize the interface is done with Al quasi-monoenergetic ion beams on Trident; isochoric heating has been measured in the individual materials intended for compound targets; experiments were fielded on Trident to measure interface motion (gold-diamond, tin-aluminium); heated-sample temperature was measured with streaked optical pyrometry (SOP), led by UT Austin under a research contract, with SOP tests probing heating uniformity versus thickness on Al foils. Results are being analyzed.
NASA Technical Reports Server (NTRS)
Lindsey, Patricia F.
1993-01-01
In its search for higher-level computer interfaces and more realistic electronic simulations for measurement and spatial analysis in human factors design, NASA at MSFC is evaluating the functionality of virtual reality (VR) technology. Virtual reality simulation generates a three-dimensional environment in which the participant appears to be enveloped. It is a type of interactive simulation in which humans are not only involved, but included. Virtual reality technology is still in the experimental phase, but it appears to be the next logical step after computer-aided three-dimensional animation in transferring the viewer from a passive to an active role in experiencing and evaluating an environment. There is great potential for using this new technology when designing environments for more successful interaction, both with the environment and with another participant in a remote location. At the University of North Carolina, a VR simulation of the planned Sitterson Hall revealed a flaw in the building's design that had not been observed during examination of the more traditional building plan simulations on paper and on computer-aided design (CAD) workstations. The virtual environment enables multiple participants in remote locations to come together and interact with one another and with the environment. Each participant is capable of seeing herself and the other participants and of interacting with them within the simulated environment.
The interface between blood preparation and use in Uganda.
Kajja, I; Bimenya, G; Smit Sibinga, C
2010-04-01
The interface between preparation and use of blood impacts directly on the outcome of hemotherapy. The present study explores the knowledge and opinions of key players at this interface, the practical realities there, and strategies for its quality improvement. We surveyed clinicians (n = 81) and blood bank staff (n = 25) to assess their knowledge of key issues in their counterparts' working domains, the turnaround time for effecting a blood order from a hospital transfusion laboratory, and strategies to improve communication of blood needs to blood banks. Out of 81 clinicians, 20 knew the four available blood products while only 17 knew the three uses of these products. Twenty-three blood bank staff reported the patient's condition as the main factor on which blood orders are based. Forty-four (54.3%) clinicians reported reception of a blood product within an hour of placing the order. Addressing infrastructure and human resources were among the strategies suggested to improve this step of the transfusion chain. The knowledge of staff at the extreme ends of the clinical interface of their counterparts' working domain is far from adequate. However, they have well-formed opinions on strategies to improve this interface.
Observation of Compressible Plasma Mix in Cylindrically Convergent Implosions
NASA Astrophysics Data System (ADS)
Barnes, Cris W.; Batha, Steven H.; Lanier, Nicholas E.; Magelssen, Glenn R.; Tubbs, David L.; Dunne, A. M.; Rothman, Steven R.; Youngs, David L.
2000-10-01
An understanding of hydrodynamic mix in convergent geometry will be of key importance in the development of a robust ignition/burn capability on NIF, LMJ and future pulsed power machines. We have made use of the OMEGA laser facility at the University of Rochester to investigate directly the mix evolution in a convergent geometry, compressible plasma regime. The experiments comprise a plastic cylindrical shell imploded by direct laser irradiation. The cylindrical shell surrounds a lower density plastic foam which provides sufficient back pressure to allow the implosion to stagnate at a sufficiently high radius to permit quantitative radiographic diagnosis of the interface evolution near turnaround. The susceptibility to mix of the shell-foam interface is varied by choosing different density material for the inner shell surface (thus varying the Atwood number). This allows the study of shock-induced Richtmyer-Meshkov growth during the coasting phase, and Rayleigh-Taylor growth during the stagnation phase. The experimental results will be described along with calculational predictions using various radiation hydrodynamics codes and turbulent mix models.
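The Atwood number mentioned above quantifies the density contrast that sets the susceptibility of the shell-foam interface to Richtmyer-Meshkov and Rayleigh-Taylor growth. A minimal sketch, with hypothetical densities (the abstract does not give the actual target parameters):

```python
def atwood_number(rho_heavy, rho_light):
    """Atwood number A = (rho_h - rho_l) / (rho_h + rho_l); A -> 0 means
    a weak density contrast, A -> 1 a strong one."""
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

# Illustrative values only: a plastic shell liner over a lower-density foam.
shell = 1.2   # g/cm^3, hypothetical inner-shell-surface density
foam = 0.1    # g/cm^3, hypothetical foam density
print(f"A = {atwood_number(shell, foam):.2f}")
```

Varying the inner-shell-surface material, as the experiments do, changes rho_heavy and hence A, which is how the mix susceptibility is tuned.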
Molecular assembly, interfacial rheology and foaming properties of oligofructose fatty acid esters.
van Kempen, Silvia E H J; Schols, Henk A; van der Linden, Erik; Sagis, Leonard M C
2014-01-01
Two major types of food-grade surfactants used to stabilize foams are proteins and low molecular weight (LMW) surfactants. Proteins lower the surface tension of interfaces and tend to unfold and stabilize the interface by the formation of a visco-elastic network, which leads to high surface moduli. In contrast, LMW surfactants lower the surface tension more than proteins, but do not form interfaces with a high modulus. Instead, they stabilize the interface through the Gibbs-Marangoni mechanism that relies on rapid diffusion of surfactants, when surface tension gradients develop as a result of deformations of the interface. A molecule that can lower the surface tension considerably, like a LMW surfactant, but also provide the interface with a high modulus, like a protein, would be an excellent foam stabilizer. In this article we will discuss molecules with those properties: oligofructose fatty acid esters, both in pure and mixed systems. First, we will address the synthesis and structural characterization of the esters. Next, we will address self-assembly and rheological properties of air/water interfaces stabilized by the esters. Subsequently, this paper will deal with mixed systems of mono-esters with either di-esters and lauric acid, or proteins. Then, the foaming functionality of the esters is discussed.
Price, Matthew; Anderson, Page L
2012-06-01
Outcome expectancy, the extent that clients anticipate benefiting from therapy, is theorized to be an important predictor of treatment response for cognitive-behavioral therapy. However, there is a relatively small body of empirical research on outcome expectancy and the treatment of social anxiety disorder. This literature, which has examined the association mostly in group-based interventions, has yielded mixed findings. The current study sought to further evaluate the effect of outcome expectancy as a predictor of treatment response for public-speaking fears across both individual virtual reality and group-based cognitive-behavioral therapies. The findings supported outcome expectancy as a predictor of the rate of change in public-speaking anxiety during both individual virtual reality exposure therapy and group cognitive-behavioral therapy. Furthermore, there was no evidence to suggest that the impact of outcome expectancy differed across virtual reality or group treatments. PsycINFO Database Record (c) 2012 APA, all rights reserved.
A Content Analysis of Teen Parenthood in "Teen Mom" Reality Programming.
Martins, Nicole; Malacane, Mona; Lewis, Nicky; Kraus, Ashley
2016-12-01
A content analysis of the MTV shows 16 and Pregnant (n = 59), Teen Mom (n = 20), and Teen Mom 2 (n = 20) was conducted to determine whether these programs accurately portray teen pregnancy. The results revealed that teen mothers on 16 and Pregnant were younger, more often White, and had more healthy babies as compared to national averages. The babies' fathers were more involved in the daily care of their child as compared to reality. Medical insurance or receipt of government assistance was almost never discussed. Teen mothers in the Teen Mom shows were significantly more likely to achieve a high school diploma as compared to reality. Finally, mothers on Teen Mom and Teen Mom 2 were significantly less likely to voice concern about finances and had more active social lives than mothers on 16 and Pregnant. Using social learning theory as a theoretical framework, we argue that these shows provide mixed messages to young audiences about teen pregnancy and parenthood.
van den Berg, Nynke S; Engelen, Thijs; Brouwer, Oscar R; Mathéron, Hanna M; Valdés-Olmos, Renato A; Nieweg, Omgo E; van Leeuwen, Fijs W B
2016-08-01
To explore the feasibility of an intraoperative navigation technology based on preoperatively acquired single photon emission computed tomography combined with computed tomography (SPECT/CT) images during sentinel node (SN) biopsy in patients with melanoma or Merkel cell carcinoma. Patients with a melanoma (n=4) or Merkel cell carcinoma (n=1) of a lower extremity scheduled for wide re-excision of the primary lesion site and SN biopsy were studied. Following a Tc-nanocolloid injection and lymphoscintigraphy, SPECT/CT images were acquired with a reference target (ReTp) fixed on the leg or the iliac spine. Intraoperatively, a sterile ReTp was placed at the same site to enable SPECT/CT-based mixed-reality navigation of a gamma ray detection probe also containing a reference target (ReTgp). The accuracy of the navigation procedure was determined in the coronal plane (x, y-axis) by measuring the discrepancy between standard gamma probe-based SN localization and mixed-reality-based navigation to the SN. To determine the depth accuracy (z-axis), the depth estimation provided by the navigation system was compared to the skin surface-to-node distance measured in the computed tomography component of the SPECT/CT images. In four of five patients, it was possible to navigate towards the preoperatively defined SN. The average navigational error was 8.0 mm in the sagittal direction and 8.5 mm in the coronal direction. Intraoperative sterile ReTp positioning and tissue movement during surgery exerted a distinct influence on the accuracy of navigation. Intraoperative navigation during melanoma or Merkel cell carcinoma surgery is feasible and can provide the surgeon with an interactive 3D roadmap towards the SN or SNs in the groin. However, further technical optimization of the modality is required before this technology can become routine practice.
NASA Technical Reports Server (NTRS)
Ding, R. Jeffrey (Inventor)
2012-01-01
A welding method is provided for forming a weld joint between first and second elements of a workpiece. The method includes heating the first and second elements to form an interface of material in a plasticized or melted state between the elements. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example, using a grinding/extruding process, to remove any dendritic-type weld microstructures introduced into the interface material during the heating process.
Thermal stir welding apparatus
NASA Technical Reports Server (NTRS)
Ding, R. Jeffrey (Inventor)
2011-01-01
A welding method and apparatus are provided for forming a weld joint between first and second elements of a workpiece. The method includes heating the first and second elements to form an interface of material in a plasticized or melted state between the elements. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example, using a grinding/extruding process, to remove any dendritic-type weld microstructures introduced into the interface material during the heating process.
NASA Astrophysics Data System (ADS)
Romanyuk, O.; Supplie, O.; Susi, T.; May, M. M.; Hannappel, T.
2016-10-01
The atomic and electronic band structures of GaP/Si(001) heterointerfaces were investigated by ab initio density functional theory calculations. Relative total energies of abrupt interfaces and mixed interfaces with Si substitutional sites within a few GaP layers were derived. It was found that Si diffusion into GaP layers above the first interface layer is energetically unfavorable. An interface with Si/Ga substitution sites in the first layer above the Si substrate is energetically the most stable one in thermodynamic equilibrium. The electronic band structure of the epitaxial GaP/Si(001) heterostructure terminated by the (2×2) surface reconstruction consists of surface and interface electronic states in the common band gap of two semiconductors. The dispersion of the states is anisotropic and differs for the abrupt Si-Ga, Si-P, and mixed interfaces. Ga 2p, P 2p, and Si 2p core-level binding-energy shifts were computed for the abrupt and the lowest-energy heterointerface structures. Negative and positive core-level shifts due to heterovalent bonds at the interface are predicted for the abrupt Si-Ga and Si-P interfaces, respectively. The distinct features in the heterointerface electronic structure and in the core-level shifts open new perspectives in the experimental characterization of buried polar-on-nonpolar semiconductor heterointerfaces.
Fischbach, Martin; Wiebusch, Dennis; Latoschik, Marc Erich
2017-04-01
Modularity, modifiability, reusability, and API usability are important software qualities that determine the maintainability of software architectures. Virtual, Augmented, and Mixed Reality (VR, AR, MR) systems, modern computer games, as well as interactive human-robot systems often include various dedicated input-, output-, and processing subsystems. These subsystems collectively maintain a real-time simulation of a coherent application state. The resulting interdependencies between individual state representations, mutual state access, overall synchronization, and flow of control imply a conceptually close coupling, whereas software quality asks for a decoupling to develop maintainable solutions. This article presents five semantics-based software techniques that address this contradiction: semantic grounding, code from semantics, grounded actions, semantic queries, and decoupling by semantics. These techniques are applied to extend the well-established entity-component-system (ECS) pattern to overcome some of this pattern's deficits with respect to the implied state access. A walk-through of central implementation aspects of a multimodal (speech and gesture) VR interface is used to highlight the techniques' benefits. This use case is chosen as a prototypical example of complex architectures with multiple interacting subsystems found in many VR, AR and MR architectures. Finally, implementation hints are given, lessons learned regarding maintainability are pointed out, and performance implications are discussed.
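The entity-component-system pattern that the article extends can be sketched minimally. The toy below is illustrative Python, not the authors' semantics-based framework: entities are bare ids, components are plain data records, and systems iterate over whichever entities hold the components they require, which is the decoupling the pattern provides.

```python
# Minimal entity-component-system (ECS) sketch: entities = ids,
# components = data, systems = functions over component queries.
from dataclasses import dataclass

@dataclass
class Position:
    x: float = 0.0
    y: float = 0.0

@dataclass
class Velocity:
    dx: float = 0.0
    dy: float = 0.0

class World:
    def __init__(self):
        self.next_id = 0
        self.components = {}  # component type -> {entity id: component}

    def create_entity(self, *components):
        eid = self.next_id
        self.next_id += 1
        for comp in components:
            self.components.setdefault(type(comp), {})[eid] = comp
        return eid

    def query(self, *types):
        # Entities that own every requested component type.
        ids = set.intersection(*(set(self.components.get(t, {})) for t in types))
        for eid in sorted(ids):
            yield eid, tuple(self.components[t][eid] for t in types)

def movement_system(world, dt):
    # A system only declares which components it needs; it never touches
    # entities lacking them, which keeps subsystems decoupled.
    for _, (pos, vel) in world.query(Position, Velocity):
        pos.x += vel.dx * dt
        pos.y += vel.dy * dt

world = World()
e = world.create_entity(Position(0, 0), Velocity(1, 2))
movement_system(world, 0.5)
```

The article's critique targets exactly the mutual state access visible here (systems reach directly into shared component stores); its semantic-query and grounding techniques are proposed to mediate that access.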
Thomas, J Graham; Spitalnick, Josh S; Hadley, Wendy; Bond, Dale S; Wing, Rena R
2015-01-01
Virtual reality (VR) technology can provide a safe environment for observing, learning, and practicing use of behavioral weight management skills, which could be particularly useful in enhancing minimal contact online weight management programs. The Experience Success (ES) project developed a system for creating and deploying VR scenarios for online weight management skills training. Virtual environments populated with virtual actors allow users to experiment with implementing behavioral skills via a PC-based point and click interface. A culturally sensitive virtual coach guides the experience, including planning for real-world skill use. Thirty-seven overweight/obese women provided feedback on a test scenario focused on social eating situations. They reported that the scenario gave them greater skills, confidence, and commitment for controlling eating in social situations. © 2014 Diabetes Technology Society.
Spitalnick, Josh S.; Hadley, Wendy; Bond, Dale S.; Wing, Rena R.
2014-01-01
Virtual reality (VR) technology can provide a safe environment for observing, learning, and practicing use of behavioral weight management skills, which could be particularly useful in enhancing minimal contact online weight management programs. The Experience Success (ES) project developed a system for creating and deploying VR scenarios for online weight management skills training. Virtual environments populated with virtual actors allow users to experiment with implementing behavioral skills via a PC-based point and click interface. A culturally sensitive virtual coach guides the experience, including planning for real-world skill use. Thirty-seven overweight/obese women provided feedback on a test scenario focused on social eating situations. They reported that the scenario gave them greater skills, confidence, and commitment for controlling eating in social situations. PMID:25367014
The Application of Modeling and Simulation to the Behavioral Deficit of Autism
NASA Technical Reports Server (NTRS)
Anton, John J.
2010-01-01
This abstract describes a research effort to apply technological advances in virtual reality simulation and computer-based games to create behavioral modification programs for individuals with Autism Spectrum Disorder (ASD). The research investigates virtual social skills training within a 3D game environment to diminish the impact of ASD social impairments and to increase learning capacity for optimal intellectual capability. Individuals with autism will encounter prototypical social contexts via computer interface and will interact with 3D avatars with predefined roles within a game-like environment. Incremental learning objectives will combine to form a collaborative social environment. A secondary goal of the effort is to begin the research and development of virtual reality exercises aimed at triggering the release of neurotransmitters to promote critical aspects of synaptic maturation at an early age to change the course of the disease.
NASA Astrophysics Data System (ADS)
Xue, Qin; Liu, Shouyin; Zhang, Shiming; Chen, Ping; Zhao, Yi; Liu, Shiyong
2013-01-01
We fabricated organic light-emitting devices (OLEDs) employing 2-methyl-9,10-di(2-naphthyl)-anthracene (MADN) as hole-transport material (HTM) instead of the commonly used N,N'-bis-(1-naphthyl)-N,N'-diphenyl,1,1'-biphenyl-4,4'-diamine (NPB). After inserting a 0.9 nm thick molybdenum oxide (MoOx) layer at the indium tin oxide (ITO)/MADN interface and a 5 nm thick mixed layer at the organic/organic heterojunction interface, the power conversion efficiency of the device can be increased fourfold.
Freshwater-Brine Mixing Zone Hydrodynamics in Salt Flats (Salar de Atacama)
NASA Astrophysics Data System (ADS)
Marazuela, M. A.; Vázquez-Suñé, E.; Custodio, E.; Palma, T.; García-Gil, A.
2017-12-01
The increasing demand for strategic minerals used in medicines and batteries requires detailed knowledge of the freshwater-brine interface of salt flats to make their exploitation efficient. The interface zone is the result of a physical balance between recharged and evaporated water. The sharp-interface approach assumes the immiscibility of the fluids and thus neglects the mixing between them. Consequently, for miscible fluids it is more accurate, and often necessary, to use the mixing-zone concept, which results from the dynamic equilibrium of flowing freshwater and brine. In this study, we consider two- and three-dimensional scale approaches for the management of the mixing zone. The two-dimensional approach is used to understand the dynamics and characteristics of the salt flat mixing zone, especially in the case of the Salar de Atacama (Atacama salt flat). Using this model, we analyze and quantify the effects of the aquitards on the mixing-zone geometry. However, understanding the complex physical processes occurring in salt flats and managing these environments requires three-dimensional regional-scale numerical models. Models that take into account the effects of variable density represent the best management tool, but they require large computational resources, especially in the three-dimensional case. To avoid these computational limitations in the modeling of salt flats and their valuable ecosystems, we propose a three-step methodology: (1) collection, validation and interpretation of the hydrogeochemical data; (2) identification and three-dimensional mapping of the mixing zone at the land surface and at depth; and (3) application of a water head correction to the freshwater and mixed-water heads in order to compensate for the density variations and transform them into brine water heads. Finally, an evaluation of the sensitivity of the mixing zone to anthropogenic and climate changes is included.
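The head-correction step can be sketched with the standard variable-density conversion between point-water heads measured in fluids of different density; the abstract does not give the authors' exact formula, so the function, parameter names, and densities below are assumptions for illustration.

```python
def equivalent_brine_head(h_f, z, rho_f=1000.0, rho_b=1230.0):
    """Convert a freshwater point-water head h_f [m], measured at screen
    elevation z [m], into an equivalent brine head, using the standard
    variable-density relation h_b = z + (rho_f/rho_b) * (h_f - z).
    Densities in kg/m^3; rho_b ~ 1230 is a typical brine value (assumed)."""
    return z + (rho_f / rho_b) * (h_f - z)

# Hypothetical observation: freshwater head of 2305 m at 2300 m elevation.
print(f"{equivalent_brine_head(2305.0, 2300.0):.3f} m")
```

Expressing all heads in a single reference fluid this way is what makes freshwater and brine observations directly comparable on one regional flow map.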
Entropy of adsorption of mixed surfactants from solutions onto the air/water interface
Chen, L.-W.; Chen, J.-H.; Zhou, N.-F.
1995-01-01
The partial molar entropy change for mixed surfactant molecules adsorbed from solution at the air/water interface has been investigated by surface thermodynamics based upon the experimental surface tension isotherms at various temperatures. Results for different surfactant mixtures of sodium dodecyl sulfate and sodium tetradecyl sulfate, decylpyridinium chloride and sodium alkylsulfonates have shown that the partial molar entropy changes for adsorption of the mixed surfactants were generally negative and decreased with increasing adsorption to a minimum near the maximum adsorption and then increased abruptly. The entropy decrease can be explained by the adsorption-orientation of surfactant molecules in the adsorbed monolayer, and the abrupt entropy increase at the maximum adsorption is possibly due to the strong repulsion between the adsorbed molecules.
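The thermodynamic route described above starts from surface tension isotherms γ(T) measured at several temperatures: the surface excess entropy per unit area follows from s_surf = -(dγ/dT). A minimal finite-difference sketch, using made-up γ(T) values (illustrative only, not the paper's data):

```python
# Surface excess entropy per unit area from gamma(T) isotherms,
# s_surf = -(d gamma / d T), via central differences at interior points.
temperatures = [288.15, 298.15, 308.15]   # K
gamma = [0.0740, 0.0720, 0.0698]          # N/m, hypothetical isotherm values

def surface_entropy(T, g):
    """Central-difference estimate of -(d gamma/dT), in J m^-2 K^-1."""
    return [-(g[i + 1] - g[i - 1]) / (T[i + 1] - T[i - 1])
            for i in range(1, len(T) - 1)]

print(surface_entropy(temperatures, gamma))
```

The paper's quantity is the partial molar entropy of the adsorbed surfactant, which additionally requires the adsorption (surface excess) from the isotherms; the derivative above is only the first step of that analysis.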
Droser, Mary L.; Jensen, Sören; Gehling, James G.
2002-01-01
The trace fossil record is important in determining the timing of the appearance of bilaterian animals. A conservative estimate puts this time at ≈555 million years ago. The preservational potential of traces made close to the sediment–water interface is crucial to detecting early benthic activity. Our studies on earliest Cambrian sediments suggest that shallow tiers were preserved to a greater extent than typical for most of the Phanerozoic, which can be attributed both directly and indirectly to the low levels of sediment mixing. The low levels of sediment mixing meant that thin event beds were preserved. The shallow depth of sediment mixing also meant that muddy sediments were firm close to the sediment–water interface, increasing the likelihood of recording shallow-tier trace fossils in muddy sediments. Overall, trace fossils can provide a sound record of the onset of bilaterian benthic activity. PMID:12271130
iHand: an interactive bare-hand-based augmented reality interface on commercial mobile phones
NASA Astrophysics Data System (ADS)
Choi, Junyeong; Park, Jungsik; Park, Hanhoon; Park, Jong-Il
2013-02-01
The performance of mobile phones has rapidly improved, and they are emerging as a powerful platform. In many vision-based applications, human hands play a key role in natural interaction; however, relatively little attention has been paid to the interaction between human hands and the mobile phone. Thus, we propose a vision- and hand-gesture-based interface in which the user holds a mobile phone in one hand while viewing the other hand's palm through the built-in camera. Virtual contents are faithfully rendered on the user's palm through palm pose estimation, and interaction via hand and finger movements is achieved through hand shape recognition. Since the proposed interface is based on hand gestures familiar to humans and does not require any additional sensors or markers, the user can freely interact with virtual contents anytime and anywhere without any training. We demonstrate that the proposed interface runs at over 15 fps on a commercial mobile phone with a 1.2-GHz dual-core processor and 1 GB RAM.
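The abstract names a pipeline (hand segmentation, palm pose estimation, rendering) without detailing it. The first stage can be sketched with a common RGB skin-color heuristic on a synthetic frame; the thresholds, the frame, and the heuristic itself are assumptions for illustration, not iHand's actual classifier:

```python
import numpy as np

# Synthetic RGB frame: a skin-toned block (stand-in for the palm) on black.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[60:180, 100:220] = (205, 150, 120)   # illustrative skin tone (R, G, B)

# Step 1: a crude skin classifier in RGB space (illustrative thresholds).
r = frame[..., 0].astype(int)
g = frame[..., 1].astype(int)
b = frame[..., 2].astype(int)
mask = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & ((r - g) > 15)

# Step 2: bounding box of the detected skin pixels. In a full pipeline,
# this region would seed palm pose estimation and anchor the rendering
# of virtual content onto the palm.
ys, xs = np.nonzero(mask)
x0, x1, y0, y1 = xs.min(), xs.max(), ys.min(), ys.max()
print(f"palm region: x=[{x0},{x1}] y=[{y0},{y1}]")
```

A real mobile implementation would operate per camera frame, typically in a chroma space such as YCrCb for lighting robustness, and would track the region across frames rather than re-detecting from scratch.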
NASA Astrophysics Data System (ADS)
Bhandarkar, Y. V.; Ghaisas, S. V.; Ogale, S. B.
1988-07-01
Ion-beam mixing at an Fe:metallic glass (Fe67Co18B14Si1) interface is studied by employing the technique of conversion electron Mössbauer spectroscopy (CEMS). A 230-Å-thick overlayer of iron (enriched to 33% in the concentration of the 57Fe Mössbauer isotope) was deposited on the shiny surface of metallic glass, and such composites were bombarded with 100-keV Kr+ ions at doses in the range between 1×10^15 and 2×10^16 ions/cm^2. The transformations in the local atomic arrangements across the interface were investigated by monitoring the changes in the hyperfine-interaction parameters. It is shown that mixing leads to significant changes in composition in the vicinity of the interface as a function of ion dose. At low dose (1×10^15 ions/cm^2) the local atomic coordination is found to be rich in transition metal, while at a higher dose (2×10^16 ions/cm^2) it is observed to be rich in boron. Interestingly, at an intermediate dose (1×10^16 ions/cm^2) the composite near the interface region partially crystallizes, and this structural state is found to revert to the amorphous state upon thermal annealing at 300 °C. The observations made on the basis of CEMS are well supported by x-ray diffraction measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mancuso, C.A.
The INEL Database of BNCT Information and Treatment (TIDBIT) has been under development for several years. Late in 1993, a new software development team took over the project, assessed the current implementation status, and determined that the user interface was unsatisfactory for the expected users and that the data structures were out of step with current reality. The team evaluated several tools that would improve the user interface and make the system easier to use; Uniface turned out to be the product of choice. During 1994, TIDBIT got its name, underwent a complete change of appearance, had a major overhaul of the data structures that support the application, and system documentation was begun. A prototype of the system was demonstrated in September 1994.
Illustrative visualization of 3D city models
NASA Astrophysics Data System (ADS)
Doellner, Juergen; Buchholz, Henrik; Nienhaus, Marc; Kirsch, Florian
2005-03-01
This paper presents an illustrative visualization technique that provides expressive representations of large-scale 3D city models, inspired by the tradition of artistic and cartographic visualizations typically found in bird's-eye view and panoramic maps. We define a collection of city model components and a real-time multi-pass rendering algorithm that achieves comprehensible, abstract 3D city model depictions based on edge enhancement, color-based and shadow-based depth cues, and procedural facade texturing. Illustrative visualization provides an effective visual interface to urban spatial information and associated thematic information, complementing visual interfaces based on the Virtual Reality paradigm, and offers a huge potential for graphics design. Primary application areas include city and landscape planning, cartoon worlds in computer games, and tourist information systems.
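One of the named passes, edge enhancement, typically works by darkening pixels where the depth buffer is discontinuous, so building silhouettes read clearly. A toy screen-space version on a synthetic depth buffer (the buffer, threshold, and gradient operator are all illustrative stand-ins, not the paper's GPU algorithm):

```python
import numpy as np

# Toy 8x8 "depth buffer": a raised building footprint on a flat ground plane.
depth = np.ones((8, 8))
depth[2:6, 2:6] = 0.5   # closer to the camera than the ground

# Screen-space edge pass: large depth gradients mark silhouettes,
# which an illustrative renderer would darken or outline.
gy, gx = np.gradient(depth)
edges = np.hypot(gx, gy) > 0.1   # illustrative threshold

print(edges.astype(int))
```

A production multi-pass renderer would compute this per frame in a fragment shader, often combining depth and normal discontinuities before compositing the edge layer over the shaded city model.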
The complex fluid dynamics of simple diffusion
NASA Astrophysics Data System (ADS)
Vold, Erik
2017-11-01
Diffusion, the mass transport process responsible for mixing fluids at the atomic level, is often underestimated in its complexity. An initial discontinuity between two species of different atomic masses exhibits a mass density discontinuity under isothermal pressure equilibrium, which implies equal species molar densities. The self-consistent kinetic transport processes across such an interface lead to a zero net mass flux relative to the center of mass, so diffusion alone can neither relax an initially stationary mass discontinuity nor broaden the density profile at the interface. The diffusive mixing leads to a molar imbalance, which drives a center-of-mass velocity that moves the heavier species toward the lighter species, producing the interfacial density relaxation. Simultaneously, the non-zero species molar flux modifies the pressure profile, both in a transient wave and in a local perturbation. The resulting center-of-mass velocity has two components: the first, associated with the divergence of the flow, persists in the diffusive mixing region throughout the mixing process; the second consists of travelling waves at the front of the pressure perturbations, propagating away from the mixing region. The momentum in these waves is necessary to maintain momentum conservation in the center-of-mass frame. Thus, in a number of ways, diffusive mixing feeds back into the small-scale advective motions. Numerical methods that diffuse all species assuming P-T equilibrium may not recover the subtle dynamics of mass transport at an interface. Work performed by LANS, LLC, under USDOE Contract No. DE-AC52-06NA25396, funded by the Advanced Simulation and Computing (ASC) Program.
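The core setup, equal molar densities but unequal masses across an interface, is easy to reproduce in a minimal sketch. The 1D explicit finite-difference model below is an assumption-laden toy (Fickian diffusion of mole fraction, fixed-composition ends, illustrative parameters), not the paper's kinetic treatment; it only shows how the mass density discontinuity relaxes as mole fractions interdiffuse:

```python
import numpy as np

# 1D interdiffusion of two species with different molar masses under
# isothermal, isobaric conditions (uniform total molar density).
nx, dt, dx, D = 200, 0.2, 1.0, 1.0       # illustrative, stable: D*dt/dx^2 < 0.5
m1, m2 = 1.0, 10.0                        # light / heavy molar masses (toy values)
x1 = np.where(np.arange(nx) < nx // 2, 1.0, 0.0)   # mole fraction of species 1

for _ in range(500):
    lap = np.roll(x1, 1) - 2 * x1 + np.roll(x1, -1)
    lap[0] = lap[-1] = 0.0                # hold end compositions fixed
    x1 = x1 + D * dt / dx**2 * lap

# Mass density is discontinuous initially (equal molar density, unequal
# mass) and smooths as diffusion proceeds; the mass-weighted imbalance of
# the molar fluxes is what drives the center-of-mass velocity in the paper.
rho = m1 * x1 + m2 * (1.0 - x1)
print(f"density range after mixing: {rho.min():.2f} .. {rho.max():.2f}")
```

Note what the toy omits: it diffuses mole fraction on a fixed grid, so it cannot show the pressure waves or the advective center-of-mass motion that the abstract argues are essential; that omission is exactly the paper's caution about P-T-equilibrium diffusion schemes.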
NASA Astrophysics Data System (ADS)
Johnson, Bradley; May, Gayle L.; Korn, Paula
A recent symposium produced papers in the areas of solar system exploration, man-machine interfaces, cybernetics, virtual reality, telerobotics, life support systems, and scientific and technological spinoff from the NASA space program. A number of papers also addressed the social and economic impacts of the space program. For individual titles, see A95-87468 through A95-87479.
NASA Technical Reports Server (NTRS)
Orr, Joel N.
1995-01-01
This reflection on the human-computer interface and its requirements as virtual technology advances proposes a new term: 'pezonomics'. The term replaces ergonomics ('the law of work') with a definition pointing to 'the law of play.' The necessity of this term, the author reasons, comes from the need to 'capture the essence of play and calibrate our computer systems to its cadences.' Pezonomics will ensure that artificial environments, in particular virtual reality, are user-friendly.
Our Theme for 2016: Sustaining Momentum
2016-03-01
of design architectures and interfaces to make both open sys- tems and modularity a reality. This is “owning the technical baseline,” and the devil... write this year, although we will be implementing the changes required in the Fiscal Year (FY) 2016 National Defense Au- thorization Act. We still...Defense for Acquisition, Technology, and Logistics on the momentum we have gained as we get ready for a new administration next year. Promote Technical
The Optokinetic Cervical Reflex (OKCR) in Pilots of High-Performance Aircraft.
1997-04-01
Coupled System virtual reality - the attempt to create a realistic, three-dimensional environment or synthetic immersive environment in which the user ...factors interface between the pilot and the flight environment. The final section is a case study of head- and helmet-mounted displays (HMD) and the impact...themselves as actually moving (flying) through a virtual environment. However, in the studies of Held, et al. (1975) and Young, et al. (1975) the
NASA Technical Reports Server (NTRS)
2004-01-01
I/NET, Inc., is making the dream of natural human-computer conversation a practical reality. Through a combination of advanced artificial intelligence research and practical software design, I/NET has taken the complexity out of developing advanced natural language interfaces. Conversational capabilities like pronoun resolution, anaphora and ellipsis processing, and dialog management that were once available only in the laboratory can now be brought to any application, with any speech recognition system, using I/NET's conversational engine middleware.
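Pronoun resolution, one of the capabilities listed, can be caricatured with a recency heuristic: bind a pronoun to the most recently mentioned candidate noun. Everything below is an illustrative toy, not I/NET's engine; the token stream and the capitalization heuristic are assumptions:

```python
# Naive recency-based pronoun resolution: replace a pronoun with the most
# recent preceding candidate noun. Real dialog managers use far richer
# syntactic and semantic models; this only shows the bookkeeping involved.
PRONOUNS = {"it", "he", "she", "they"}

def resolve(tokens):
    """Return tokens with pronouns replaced by the last candidate noun seen."""
    resolved, last_noun = [], None
    for tok in tokens:
        if tok.lower() in PRONOUNS and last_noun is not None:
            resolved.append(last_noun)          # bind to most recent candidate
        else:
            resolved.append(tok)
            # Crude candidate test: a capitalized non-pronoun token.
            if tok[0].isupper() and tok.lower() not in PRONOUNS:
                last_noun = tok
    return resolved

print(resolve(["Open", "the", "Spectrometer", "then", "calibrate", "it"]))
```

Even this toy shows why the problem is hard: the heuristic would happily bind across sentence boundaries and treats any capitalized word, including a sentence-initial verb, as a candidate antecedent.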
Perceptions vs Realities: How High School Principals View and Utilize Professional School Counselors
ERIC Educational Resources Information Center
Webb-Rea, Jennifer S.
2012-01-01
This mixed methods study used an explanatory sequential design to examine what principals of high schools with large low-income and racial minority student populations view as appropriate roles for the school counselors in their buildings in raising student academic achievement and how those role perceptions align with best practices and with the…
Mothering across Colour Lines: Decisions and Dilemmas of White Birth Mothers of Mixed-Race Children
ERIC Educational Resources Information Center
Kouritzin, Sandra G.
2016-01-01
Conceptions of identity in multilingual multicultural societies still seem to be dominated by the perception that human beings are born into social locations and categories of ethnicity that are pre-existing. This fails to acknowledge the current reality for the progeny of interracial marriages, who may find themselves belonging neither to their…