Sample records for virtual diagnostics interface

  1. VRUSE--a computerised diagnostic tool: for usability evaluation of virtual/synthetic environment systems.

    PubMed

    Kalawsky, R S

    1999-02-01

    A special questionnaire (VRUSE) has been designed to measure the usability of a VR system according to the attitudes and perceptions of its users. Important aspects of VR systems were carefully derived to produce key usability factors for the questionnaire. Unlike questionnaires designed for generic interfaces, VRUSE is specifically designed for evaluating virtual environments: it is a diagnostic tool that provides a wealth of information about a user's view of the interface. VRUSE can be used to great effect with other evaluation techniques to pinpoint problem areas of a VR interface. Other applications include benchmarking of competitor VR systems.

  2. Real-time functional magnetic imaging-brain-computer interface and virtual reality: promising tools for the treatment of pedophilia.

    PubMed

    Renaud, Patrice; Joyal, Christian; Stoleru, Serge; Goyette, Mathieu; Weiskopf, Nikolaus; Birbaumer, Niels

    2011-01-01

    This chapter proposes a prospective view on using a real-time functional magnetic resonance imaging (rt-fMRI) brain-computer interface (BCI) application as a new treatment for pedophilia. Neurofeedback mediated by interactive virtual stimuli is presented as the key process in this new BCI application. Results on the diagnostic discriminant power of virtual characters depicting sexual stimuli relevant to pedophilia are given. Finally, practical and ethical implications are briefly addressed.

  3. ViDI: Virtual Diagnostics Interface. Volume 1; The Future of Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A. (Technical Monitor); Schwartz, Richard J.

    2004-01-01

    The quality of data acquired in a given test facility ultimately resides within the fidelity and implementation of the instrumentation systems. Over the last decade, the emergence of robust optical techniques has vastly expanded the envelope of measurement possibilities. At the same time the capabilities for data processing, data archiving and data visualization required to extract the highest level of knowledge from these global, on and off body measurement techniques have equally expanded. Yet today, while the instrumentation has matured to the production stage, an optimized solution for gaining knowledge from the gigabytes of data acquired per test (or even per test point) is lacking. A technological void has to be filled in order to possess a mechanism for near-real time knowledge extraction during wind tunnel experiments. Under these auspices, the Virtual Diagnostics Interface, or ViDI, was developed.

  4. Virtual Diagnostic Interface: Aerospace Experimentation in the Synthetic Environment

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; McCrea, Andrew C.

    2009-01-01

    The Virtual Diagnostics Interface (ViDI) methodology combines two-dimensional image processing and three-dimensional computer modeling to provide comprehensive in-situ visualizations commonly utilized for in-depth planning of wind tunnel and flight testing, real time data visualization of experimental data, and unique merging of experimental and computational data sets in both real-time and post-test analysis. The preparation of such visualizations encompasses the realm of interactive three-dimensional environments, traditional and state of the art image processing techniques, database management and development of toolsets with user friendly graphical user interfaces. ViDI has been under development at the NASA Langley Research Center for over 15 years, and has a long track record of providing unique and insightful solutions to a wide variety of experimental testing techniques and validation of computational simulations. This report will address the various aspects of ViDI and how it has been applied to test programs as varied as NASCAR race car testing in NASA wind tunnels to real-time operations concerning Space Shuttle aerodynamic flight testing. In addition, future trends and applications will be outlined in the paper.

  5. Virtual Diagnostic Interface: Aerospace Experimentation in the Synthetic Environment

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; McCrea, Andrew C.

    2010-01-01

    The Virtual Diagnostics Interface (ViDI) methodology combines two-dimensional image processing and three-dimensional computer modeling to provide comprehensive in-situ visualizations commonly utilized for in-depth planning of wind tunnel and flight testing, real time data visualization of experimental data, and unique merging of experimental and computational data sets in both real-time and post-test analysis. The preparation of such visualizations encompasses the realm of interactive three-dimensional environments, traditional and state of the art image processing techniques, database management and development of toolsets with user friendly graphical user interfaces. ViDI has been under development at the NASA Langley Research Center for over 15 years, and has a long track record of providing unique and insightful solutions to a wide variety of experimental testing techniques and validation of computational simulations. This report will address the various aspects of ViDI and how it has been applied to test programs as varied as NASCAR race car testing in NASA wind tunnels to real-time operations concerning Space Shuttle aerodynamic flight testing. In addition, future trends and applications will be outlined in the paper.

  6. Virtual Diagnostics Interface: Real Time Comparison of Experimental Data and CFD Predictions for a NASA Ares I-Like Vehicle

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; Fleming, Gary A.

    2007-01-01

    Virtual Diagnostics Interface technology, or ViDI, is a suite of techniques utilizing image processing, data handling and three-dimensional computer graphics. These techniques aid in the design, implementation, and analysis of complex aerospace experiments. LiveView3D is a software application component of ViDI used to display experimental wind tunnel data in real time within an interactive, three-dimensional virtual environment. The LiveView3D software application was under development at NASA Langley Research Center (LaRC) for nearly three years. LiveView3D was recently upgraded to perform real-time (as well as post-test) comparisons of experimental data with pre-computed Computational Fluid Dynamics (CFD) predictions. This capability was utilized to compare experimental measurements with CFD predictions of the surface pressure distribution of a NASA Ares I Crew Launch Vehicle (CLV)-like vehicle tested in the NASA LaRC Unitary Plan Wind Tunnel (UPWT) in the December 2006 - January 2007 timeframe. The wind tunnel tests were conducted to develop a database of experimentally measured aerodynamic performance of the CLV-like configuration for validation of CFD predictive codes.
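    The core comparison LiveView3D performs (experimental surface pressures matched against pre-computed CFD values at the same locations) can be illustrated with a minimal sketch. All names, data values, and the nearest-neighbor lookup here are illustrative assumptions, not the actual LiveView3D implementation.

    ```python
    # Hypothetical sketch: compare pressure-tap measurements against
    # pre-computed CFD predictions at the nearest surface grid point.
    import math

    def nearest_cfd_value(tap_xyz, cfd_points):
        """Return the CFD-predicted pressure at the grid point nearest the tap."""
        return min(cfd_points, key=lambda p: math.dist(tap_xyz, p["xyz"]))["cp"]

    def compare(taps, cfd_points):
        """Yield (tap id, measured Cp, predicted Cp, delta) for each tap."""
        for tap in taps:
            predicted = nearest_cfd_value(tap["xyz"], cfd_points)
            yield tap["id"], tap["cp"], predicted, tap["cp"] - predicted

    cfd = [{"xyz": (0.0, 0.0, 0.0), "cp": -0.42},
           {"xyz": (0.1, 0.0, 0.0), "cp": -0.38}]
    taps = [{"id": "T01", "xyz": (0.01, 0.0, 0.0), "cp": -0.45}]
    for tap_id, measured, predicted, delta in compare(taps, cfd):
        print(tap_id, measured, predicted, round(delta, 3))
    ```

    In a real-time setting this loop would run per data frame, with the deltas color-mapped onto the 3D model surface.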

  7. Software architecture and design of the web services facilitating climate model diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.

    2015-12-01

    Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process, called the Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, hence platform compatibility; (2) co-location of computation and big data on the server side, with only small results and plots downloaded on the client side, hence high data efficiency; (3) a multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) cloud deployment so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. CMDA was successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave generally positive feedback, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School.
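    The wrapping pattern described (a Python interface around an existing science routine, exposed as a web service that returns only small JSON results) might look like the following minimal WSGI sketch. CMDA itself uses Flask, Gunicorn, and Tornado; the analysis function, parameter names, and defaults here are illustrative stand-ins, not CMDA's actual API.

    ```python
    # Minimal WSGI sketch of wrapping an analysis routine as a web service.
    # Heavy computation stays server-side; only small JSON results go back.
    import json
    from urllib.parse import parse_qs

    def run_diagnostic(varname, season):
        """Stand-in for a wrapped science code; returns a small result dict."""
        return {"variable": varname, "season": season, "status": "ok"}

    def app(environ, start_response):
        # Parse query parameters such as ?var=ta&season=DJF (illustrative names).
        params = parse_qs(environ.get("QUERY_STRING", ""))
        result = run_diagnostic(params.get("var", ["ta"])[0],
                                params.get("season", ["DJF"])[0])
        body = json.dumps(result).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    ```

    Any WSGI server (e.g. Gunicorn, as the abstract mentions) could serve this `app` callable directly.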

  8. ReportTutor – An Intelligent Tutoring System that Uses a Natural Language Interface

    PubMed Central

    Crowley, Rebecca S.; Tseytlin, Eugene; Jukic, Drazen

    2005-01-01

    ReportTutor is an extension to our work on Intelligent Tutoring Systems for visual diagnosis. ReportTutor combines a virtual microscope and a natural language interface to allow students to visually inspect a virtual slide as they type a diagnostic report on the case. The system monitors both actions in the virtual microscope interface as well as text created by the student in the reporting interface. It provides feedback about the correctness, completeness, and style of the report. ReportTutor uses MMTx with a custom data-source created with the NCI Metathesaurus. A separate ontology of cancer specific concepts is used to structure the domain knowledge needed for evaluation of the student’s input including co-reference resolution. As part of the early evaluation of the system, we collected data from 4 pathology residents who typed in their reports without the tutoring aspects of the system, and compared responses to an expert dermatopathologist. We analyzed the resulting reports to (1) identify the error rates and distribution among student reports, (2) determine the performance of the system in identifying features within student reports, and (3) measure the accuracy of the system in distinguishing between correct and incorrect report elements. PMID:16779024

  9. Intelligent approach to prognostic enhancements of diagnostic systems

    NASA Astrophysics Data System (ADS)

    Vachtsevanos, George; Wang, Peng; Khiripet, Noppadon; Thakker, Ash; Galie, Thomas R.

    2001-07-01

    This paper introduces a novel methodology to prognostics based on a dynamic wavelet neural network construct and notions from the virtual sensor area. This research has been motivated and supported by the U.S. Navy's active interest in integrating advanced diagnostic and prognostic algorithms in existing Naval digital control and monitoring systems. A rudimentary diagnostic platform is assumed to be available providing timely information about incipient or impending failure conditions. We focus on the development of a prognostic algorithm capable of predicting accurately and reliably the remaining useful lifetime of a failing machine or component. The prognostic module consists of a virtual sensor and a dynamic wavelet neural network as the predictor. The virtual sensor employs process data to map real measurements into difficult to monitor fault quantities. The prognosticator uses a dynamic wavelet neural network as a nonlinear predictor. Means to manage uncertainty and performance metrics are suggested for comparison purposes. An interface to an available shipboard Integrated Condition Assessment System is described and applications to shipboard equipment are discussed. Typical results from pump failures are presented to illustrate the effectiveness of the methodology.
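    The two-stage idea in this record (a virtual sensor mapping raw measurements to a hard-to-monitor fault quantity, and a predictor extrapolating that quantity to a failure threshold) can be sketched as follows. The paper's predictor is a dynamic wavelet neural network; a simple linear trend stands in for it here purely for illustration, and the sensor mapping and coefficients are invented.

    ```python
    # Illustrative prognostic sketch: virtual sensor + trend extrapolation
    # to estimate remaining useful life (RUL). Not the paper's actual model.

    def virtual_sensor(vibration_rms, temperature):
        """Hypothetical mapping from measurable signals to a fault index."""
        return 0.8 * vibration_rms + 0.2 * (temperature / 100.0)

    def estimate_rul(fault_history, threshold):
        """Extrapolate the average fault-growth rate to the failure threshold."""
        rate = (fault_history[-1] - fault_history[0]) / (len(fault_history) - 1)
        if rate <= 0:
            return None  # no degradation trend; RUL undefined
        return (threshold - fault_history[-1]) / rate

    # Fault index computed from a short sequence of (vibration, temperature) samples.
    history = [virtual_sensor(v, t) for v, t in
               [(1.0, 40), (1.2, 42), (1.4, 45), (1.6, 47)]]
    print(estimate_rul(history, threshold=2.0))
    ```

    A production prognosticator would also carry uncertainty bounds on the RUL estimate, as the paper's discussion of uncertainty management suggests.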

  10. Virtual microscopy and digital pathology in training and education.

    PubMed

    Hamilton, Peter W; Wang, Yinhai; McCullough, Stephen J

    2012-04-01

    Traditionally, education and training in pathology has been delivered using textbooks, glass slides and conventional microscopy. Over the last two decades, the number of web-based pathology resources has expanded dramatically with centralized pathological resources being delivered to many students simultaneously. Recently, whole slide imaging technology allows glass slides to be scanned and viewed on a computer screen via dedicated software. This technology is referred to as virtual microscopy and has created enormous opportunities in pathological training and education. Students are able to learn key histopathological skills, e.g. to identify areas of diagnostic relevance from an entire slide, via a web-based computer environment. Students no longer need to be in the same room as the slides. New human-computer interfaces are also being developed using more natural touch technology to enhance the manipulation of digitized slides. Several major initiatives are also underway introducing online competency and diagnostic decision analysis using virtual microscopy and have important future roles in accreditation and recertification. Finally, researchers are investigating how pathological decision-making is achieved using virtual microscopy and modern eye-tracking devices. Virtual microscopy and digital pathology will continue to improve how pathology training and education is delivered.

  11. A Case-Based Study with Radiologists Performing Diagnosis Tasks in Virtual Reality.

    PubMed

    Venson, José Eduardo; Albiero Berni, Jean Carlo; Edmilson da Silva Maia, Carlos; Marques da Silva, Ana Maria; Cordeiro d'Ornellas, Marcos; Maciel, Anderson

    2017-01-01

    In radiology diagnosis, medical images are most often visualized slice by slice. At the same time, visualization based on 3D volumetric rendering of the data is considered useful and has an expanding field of application. In this work, we present a case-based study with 16 medical specialists to assess the diagnostic effectiveness of a virtual reality interface for fracture identification over 3D volumetric reconstructions. We developed a VR volume viewer compatible with both the Oculus Rift and handheld-based head-mounted displays (HMDs). We then performed user experiments to validate the approach in a diagnosis environment. In addition, we assessed the subjects' perception of the 3D reconstruction quality, ease of interaction and ergonomics, and also the users' opinions on how VR applications can be useful in healthcare. Among other results, we found a high level of effectiveness of the VR interface in identifying superficial fractures on head CTs.

  12. Rapid Technology Assessment via Unified Deployment of Global Optical and Virtual Diagnostics

    NASA Technical Reports Server (NTRS)

    Jordan, Jeffrey D.; Watkins, A. Neal; Fleming, Gary A.; Leighty, Bradley D.; Schwartz, Richard J.; Ingram, JoAnne L.; Grinstead, Keith D., Jr.; Oglesby, Donald M.; Tyler, Charles

    2003-01-01

    This paper discusses recent developments in rapid technology assessment resulting from an active collaboration between researchers at the Air Force Research Laboratory (AFRL) at Wright Patterson Air Force Base (WPAFB) and the NASA Langley Research Center (LaRC). This program targets the unified development and deployment of global measurement technologies coupled with a virtual diagnostic interface to enable the comparative evaluation of experimental and computational results. Continuing efforts focus on the development of seamless data translation methods to enable integration of data sets of disparate file format in a common platform. Results from a successful low-speed wind tunnel test at WPAFB in which global surface pressure distributions were acquired simultaneously with model deformation and geometry measurements are discussed and comparatively evaluated with numerical simulations. Intensity- and lifetime-based pressure-sensitive paint (PSP) and projection moire interferometry (PMI) results are presented within the context of rapid technology assessment to enable simulation-based R&D.

  13. Toward real-time virtual biopsy of oral lesions using confocal laser endomicroscopy interfaced with embedded computing.

    PubMed

    Thong, Patricia S P; Tandjung, Stephanus S; Movania, Muhammad Mobeen; Chiew, Wei-Ming; Olivo, Malini; Bhuvaneswari, Ramaswamy; Seah, Hock-Soon; Lin, Feng; Qian, Kemao; Soo, Khee-Chee

    2012-05-01

    Oral lesions are conventionally diagnosed using white light endoscopy and histopathology. This can pose a challenge because the lesions may be difficult to visualise under white light illumination. Confocal laser endomicroscopy can be used for confocal fluorescence imaging of surface and subsurface cellular and tissue structures. To move toward real-time "virtual" biopsy of oral lesions, we interfaced an embedded computing system to a confocal laser endomicroscope to achieve a prototype three-dimensional (3-D) fluorescence imaging system. A field-programmable gate array computing platform was programmed to enable synchronization of cross-sectional image grabbing and Z-depth scanning, automate the acquisition of confocal image stacks and perform volume rendering. Fluorescence imaging of the human and murine oral cavities was carried out using the fluorescent dyes fluorescein sodium and hypericin. Volume rendering of cellular and tissue structures from the oral cavity demonstrates the potential of the system for 3-D fluorescence visualization of the oral cavity in real time. We aim toward achieving a real-time virtual biopsy technique that can complement current diagnostic techniques and aid in targeted biopsy for better clinical outcomes.

  14. Applications of virtual reality technology in pathology.

    PubMed

    Grimes, G J; McClellan, S A; Goldman, J; Vaughn, G L; Conner, D A; Kujawski, E; McDonald, J; Winokur, T; Fleming, W

    1997-01-01

    TelePath(SM) is a telerobotic system utilizing virtual microscope concepts, based on high-quality still digital imaging and aimed at real-time support for surgery through remote diagnosis of frozen sections. Many hospitals and clinics have an application for the remote practice of pathology, particularly in the area of reading frozen sections in support of surgery, commonly called anatomic pathology. The goal is to project the expertise of the pathologist into the remote setting by giving the pathologist access to the microscope slides with an image quality and human interface comparable to what the pathologist would experience at a real rather than a virtual microscope. A working prototype of a virtual microscope has been defined and constructed which has the needed performance in both image quality and human interface for a pathologist to work remotely. This is accomplished through the use of telerobotics and an image quality which gives the virtual microscope the same diagnostic capabilities as a real microscope. The examination of frozen sections is performed in a two-dimensional world. The remote pathologist is in a virtual world with the same capabilities as a "real" microscope, but response times may be slower depending on the specific computing and telecommunication environments. The TelePath system has capabilities far beyond a normal biological microscope, such as the ability to create a low-power image of the entire sample using multiple images digitally matched together; the ability to digitally retrace a viewing trajectory; and the ability to archive images using CD-ROM and other mass storage devices.

  15. Usability Studies In Virtual And Traditional Computer Aided Design Environments For Spatial Awareness

    DTIC Science & Technology

    2017-08-08

    Usability Studies In Virtual And Traditional Computer Aided Design Environments For Spatial Awareness Dr. Syed Adeel Ahmed, Xavier University of...virtual environment with wand interfaces compared directly with a workstation non-stereoscopic traditional CAD interface with keyboard and mouse. In...navigate through a virtual environment. The wand interface provides a significantly improved means of interaction. This study quantitatively measures the

  16. Integrating Virtual Worlds with Tangible User Interfaces for Teaching Mathematics: A Pilot Study.

    PubMed

    Guerrero, Graciela; Ayala, Andrés; Mateu, Juan; Casades, Laura; Alamán, Xavier

    2016-10-25

    This article presents a pilot study of the use of two new tangible interfaces and virtual worlds for teaching geometry in a secondary school. The first tangible device allows the user to control a virtual object in six degrees of freedom. The second tangible device is used to modify virtual objects, changing attributes such as position, size, rotation and color. A pilot study on using these devices was carried out at the "Florida Secundaria" high school. A virtual world was built where students used the tangible interfaces to manipulate geometrical figures in order to learn different geometrical concepts. The pilot experiment results suggest that the use of tangible interfaces and virtual worlds allowed a more meaningful learning (concepts learnt were more durable).

  17. A Hybrid 2D/3D User Interface for Radiological Diagnosis.

    PubMed

    Mandalika, Veera Bhadra Harish; Chernoglazov, Alexander I; Billinghurst, Mark; Bartneck, Christoph; Hurrell, Michael A; Ruiter, Niels de; Butler, Anthony P H; Butler, Philip H

    2018-02-01

    This paper presents a novel 2D/3D desktop virtual reality hybrid user interface for radiology that focuses on improving 3D manipulation required in some diagnostic tasks. An evaluation of our system revealed that our hybrid interface is more efficient for novice users and more accurate for both novice and experienced users when compared to traditional 2D only interfaces. This is a significant finding because it indicates, as the techniques mature, that hybrid interfaces can provide significant benefit to image evaluation. Our hybrid system combines a zSpace stereoscopic display with 2D displays, and mouse and keyboard input. It allows the use of 2D and 3D components interchangeably, or simultaneously. The system was evaluated against a 2D only interface with a user study that involved performing a scoliosis diagnosis task. There were two user groups: medical students and radiology residents. We found improvements in completion time for medical students, and in accuracy for both groups. In particular, the accuracy of medical students improved to match that of the residents.

  18. Integrating Virtual Worlds with Tangible User Interfaces for Teaching Mathematics: A Pilot Study

    PubMed Central

    Guerrero, Graciela; Ayala, Andrés; Mateu, Juan; Casades, Laura; Alamán, Xavier

    2016-01-01

    This article presents a pilot study of the use of two new tangible interfaces and virtual worlds for teaching geometry in a secondary school. The first tangible device allows the user to control a virtual object in six degrees of freedom. The second tangible device is used to modify virtual objects, changing attributes such as position, size, rotation and color. A pilot study on using these devices was carried out at the “Florida Secundaria” high school. A virtual world was built where students used the tangible interfaces to manipulate geometrical figures in order to learn different geometrical concepts. The pilot experiment results suggest that the use of tangible interfaces and virtual worlds allowed a more meaningful learning (concepts learnt were more durable). PMID:27792132

  19. VIRTUAL FRAME BUFFER INTERFACE

    NASA Technical Reports Server (NTRS)

    Wolfe, T. L.

    1994-01-01

    Large image processing systems use multiple frame buffers with differing architectures and vendor supplied user interfaces. This variety of architectures and interfaces creates software development, maintenance, and portability problems for application programs. The Virtual Frame Buffer Interface program makes all frame buffers appear as a generic frame buffer with a specified set of characteristics, allowing programmers to write code which will run unmodified on all supported hardware. The Virtual Frame Buffer Interface converts generic commands to actual device commands. The virtual frame buffer consists of a definition of capabilities and FORTRAN subroutines that are called by application programs. The virtual frame buffer routines may be treated as subroutines, logical functions, or integer functions by the application program. Routines are included that allocate and manage hardware resources such as frame buffers, monitors, video switches, trackballs, tablets and joysticks; access image memory planes; and perform alphanumeric font or text generation. The subroutines for the various "real" frame buffers are in separate VAX/VMS shared libraries allowing modification, correction or enhancement of the virtual interface without affecting application programs. The Virtual Frame Buffer Interface program was developed in FORTRAN 77 for a DEC VAX 11/780 or a DEC VAX 11/750 under VMS 4.X. It supports ADAGE IK3000, DEANZA IP8500, Low Resolution RAMTEK 9460, and High Resolution RAMTEK 9460 Frame Buffers. It has a central memory requirement of approximately 150K. This program was developed in 1985.
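    The abstraction the record describes (application code calls a generic frame-buffer API, and per-device backends translate each call into native commands) is the classic device-driver pattern. A minimal Python sketch follows; the original is FORTRAN 77 with VAX/VMS shared libraries, and the method set and command tuples here are illustrative assumptions, with only the device name taken from the record.

    ```python
    # Sketch of the virtual-frame-buffer idea: a generic interface with
    # per-device backends. Applications never see hardware specifics.
    from abc import ABC, abstractmethod

    class FrameBuffer(ABC):
        @abstractmethod
        def write_pixel(self, x, y, value): ...

    class Ramtek9460(FrameBuffer):
        """Backend for one supported device (name from the record)."""
        def __init__(self):
            self.commands = []
        def write_pixel(self, x, y, value):
            # A real backend would emit the device's native command here.
            self.commands.append(("RAMTEK_PUT", x, y, value))

    def draw_line(fb: FrameBuffer, y, x0, x1, value):
        """Application code sees only the generic interface."""
        for x in range(x0, x1):
            fb.write_pixel(x, y, value)

    fb = Ramtek9460()
    draw_line(fb, 10, 0, 4, 255)
    print(len(fb.commands))  # → 4
    ```

    Swapping in a backend for a different device changes no application code, which is the portability property the original program provided.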

  20. Natural gesture interfaces

    NASA Astrophysics Data System (ADS)

    Starodubtsev, Illya

    2017-09-01

    The paper describes the implementation of a system for interacting with virtual objects based on gestures. It covers the common problems of interaction with virtual objects and the specific requirements for interfaces for virtual and augmented reality.

  1. Virtually-augmented interfaces for tactical aircraft.

    PubMed

    Haas, M W

    1995-05-01

    The term Fusion Interface is defined as a class of interface which integrally incorporates both virtual and non-virtual concepts and devices across the visual, auditory and haptic sensory modalities. A fusion interface is a multi-sensory virtually-augmented synthetic environment. A new facility has been developed within the Human Engineering Division of the Armstrong Laboratory dedicated to exploratory development of fusion-interface concepts. One of the virtual concepts to be investigated in the Fusion Interfaces for Tactical Environments facility (FITE) is the application of EEG and other physiological measures for virtual control of functions within the flight environment. FITE is a specialized flight simulator which allows efficient concept development through the use of rapid prototyping followed by direct experience of new fusion concepts. The FITE facility also supports evaluation of fusion concepts by operational fighter pilots in a high fidelity simulated air combat environment. The facility was utilized by a multi-disciplinary team composed of operational pilots, human-factors engineers, electronics engineers, computer scientists, and experimental psychologists to prototype and evaluate the first multi-sensory, virtually-augmented cockpit. The cockpit employed LCD-based head-down displays, a helmet-mounted display, three-dimensionally localized audio displays, and a haptic display. This paper will endeavor to describe the FITE facility architecture, some of the characteristics of the FITE virtual display and control devices, and the potential application of EEG and other physiological measures within the FITE facility.

  2. Virtual button interface

    DOEpatents

    Jones, Jake S.

    1999-01-01

    An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch.
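    The interaction loop the patent describes (gaze at a virtual button causes a perceptible change, and the command fires, optionally only after a confirming action such as a thumb switch) can be sketched per update frame. The class, field names, and frame-based structure are illustrative, not taken from the patent.

    ```python
    # Sketch of gaze-activated virtual buttons with optional confirmation.

    class VirtualButton:
        def __init__(self, name, command, require_confirm=False):
            self.name = name
            self.command = command
            self.require_confirm = require_confirm
            self.highlighted = False

        def update(self, gazed_at, confirm_pressed):
            """Return the command to issue this frame, or None."""
            self.highlighted = gazed_at  # perceptible change under gaze
            if not gazed_at:
                return None
            if self.require_confirm and not confirm_pressed:
                return None
            return self.command

    button = VirtualButton("save", command="SAVE_FILE", require_confirm=True)
    print(button.update(gazed_at=True, confirm_pressed=False))  # None: awaiting confirm
    print(button.update(gazed_at=True, confirm_pressed=True))   # SAVE_FILE
    ```

    Requiring the confirming action avoids the "Midas touch" problem of gaze interfaces, where every glance would otherwise trigger a command.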

  3. A haptic interface for virtual simulation of endoscopic surgery.

    PubMed

    Rosenberg, L B; Stredney, D

    1996-01-01

    Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.

  4. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    DTIC Science & Technology

    2017-08-08

    Usability Studies In Virtual And Traditional Computer Aided Design Environments For Fault Identification Dr. Syed Adeel Ahmed, Xavier University...virtual environment with wand interfaces compared directly with a workstation non-stereoscopic traditional CAD interface with keyboard and mouse. In...the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods

  5. Virtual button interface

    DOEpatents

    Jones, J.S.

    1999-01-12

    An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment are disclosed. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch. 4 figs.

  6. Future Cyborgs: Human-Machine Interface for Virtual Reality Applications

    DTIC Science & Technology

    2007-04-01

    Future Cyborgs: Human-Machine Interface for Virtual Reality Applications. Robert R. Powell, Major, USAF, April 2007 (Blue Horizons).

  7. A web-based platform for virtual screening.

    PubMed

    Watson, Paul; Verdonk, Marcel; Hartshorn, Michael J

    2003-09-01

    A fully integrated, web-based virtual screening platform has been developed to allow rapid virtual screening of large numbers of compounds. ORACLE is used to store information at all stages of the process. The system includes ATLAS, a large database of historical compounds from high-throughput screening (HTS) chemical suppliers, containing over 3.1 million unique compounds with their associated physicochemical properties (ClogP, MW, etc.). The database can be screened using a web-based interface to produce compound subsets for virtual screening or virtual library (VL) enumeration. In order to carry out the latter task within ORACLE, a reaction data cartridge has been developed. Virtual libraries can be enumerated rapidly using the web-based interface to the cartridge. The compound subsets can be seamlessly submitted for virtual screening experiments, and the results can be viewed via another web-based interface allowing ad hoc querying of the virtual screening data stored in ORACLE.
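    The subset-selection step (screening the stored compounds on their physicochemical properties before submitting them for virtual screening) amounts to a property filter, sketched below. The cutoff values, field names, and in-memory list are illustrative; the real system queries ORACLE.

    ```python
    # Illustrative sketch of producing a compound subset by filtering on
    # stored physicochemical properties (as the web interface does against
    # the ATLAS database). Records and cutoffs are invented examples.

    compounds = [
        {"id": "C1", "mw": 342.4, "clogp": 2.1},
        {"id": "C2", "mw": 612.8, "clogp": 5.9},
        {"id": "C3", "mw": 288.3, "clogp": 1.4},
    ]

    def screening_subset(db, max_mw=500.0, max_clogp=5.0):
        """Select compound IDs within the given property limits."""
        return [c["id"] for c in db
                if c["mw"] <= max_mw and c["clogp"] <= max_clogp]

    print(screening_subset(compounds))  # → ['C1', 'C3']
    ```

    In SQL terms this is a simple `WHERE mw <= :max_mw AND clogp <= :max_clogp` query over the compound table.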

  8. Multi-modal virtual environment research at Armstrong Laboratory

    NASA Technical Reports Server (NTRS)

    Eggleston, Robert G.

    1995-01-01

    One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.

  9. Virtual optical interfaces for the transportation industry

    NASA Astrophysics Data System (ADS)

    Hejmadi, Vic; Kress, Bernard

    2010-04-01

    We present a novel implementation of virtual optical interfaces for the transportation industry (automotive and avionics). This implementation combines two functionalities in a single device: projection of a virtual interface and sensing of the positions of the fingers on top of that interface. Both functionalities are produced by diffraction of laser light. The device we are developing packs both functions into a compact package with no optical elements to align, since all of them are pre-aligned on a single glass wafer through optical lithography. The package contains a CMOS sensor whose diffractive objective lens is optimized both for the projected interface color and for the IR finger-position sensor based on structured illumination. Two versions are proposed: one that senses the 2D position of the hand and one that senses the hand position in 3D.

  10. Distributed virtual environment for emergency medical training

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.; Garcia, Brian W.; Godsell-Stytz, Gayl M.

    1997-07-01

    In many professions where individuals must work in a team in a high-stress environment to accomplish a time-critical task, individual and team performance can benefit from joint training using distributed virtual environments (DVEs). One professional field that lacks but needs a high-fidelity team training environment is emergency medicine. Currently, emergency department (ED) medical personnel train by using words to create a mental picture of a situation for the physician and staff, who then cooperate to solve the problems portrayed by the word picture. The need in emergency medicine for realistic virtual team training is critical because ED staff typically encounter rarely occurring but life-threatening situations only once in their careers and because ED teams currently have no realistic environment in which to practice their team skills. The resulting lack of experience and teamwork makes diagnosis and treatment more difficult. Virtual environment based training has the potential to redress these shortfalls. The objective of our research is to develop a state-of-the-art virtual environment for emergency medicine team training. The virtual emergency room (VER) allows ED physicians and medical staff to realistically prepare for emergency medical situations by performing triage, diagnosis, and treatment on virtual patients within an environment that provides them with the tools they require and the team environment they need to realistically perform these three tasks. There are several issues that must be addressed before this vision is realized. The key issues deal with distribution of computations; the doctor and staff interface to the virtual patient and ED equipment; the accurate simulation of individual patient organs' response to injury, medication, and treatment; and accurate modeling of the symptoms and appearance of the patient while maintaining a real-time interaction capability. Our ongoing work addresses all of these issues. 
In this paper we report on our prototype VER system and its distributed system architecture for an emergency department distributed virtual environment for emergency medical staff training. The virtual environment enables emergency department physicians and staff to develop their diagnostic and treatment skills using the virtual tools they need to perform diagnostic and treatment tasks. Virtual human imagery and real-time virtual human response are used to create the virtual patient and present a scenario. Patient vital signs are available to the emergency department team as they manage the virtual case. The work reported here consists of the system architectures we developed for the distributed components of the virtual emergency room. The architectures we describe consist of the network-level architecture as well as the software architecture for each actor within the virtual emergency room. We describe the role of distributed interactive simulation and other enabling technologies within the virtual emergency room project.

  11. The Virtual Tablet: Virtual Reality as a Control System

    NASA Technical Reports Server (NTRS)

    Chronister, Andrew

    2016-01-01

    In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating issues with difficulty of visually determining the interface location and lack of tactile feedback discovered in the development of previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment. 
My contribution was to combine the pieces of hardware, write software to incorporate each piece of position or orientation data into a coherent description of the object's location in space, place the virtual analogue accordingly, and project the control interface onto it, resulting in a functioning object which has both a physical and a virtual presence. Additionally, the virtual environment was enhanced with two live video feeds from cameras mounted on the robotic device being used as an example target of the virtual interface. The working VT allows users to naturally interact with a control interface with little to no training and without the issues found in previous efforts.
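
Combining a marker-derived position with a sensor-derived orientation to place the virtual analogue amounts to a rigid-body transform. The sketch below reduces this to rotation about a single vertical axis for brevity; a full solution would use a quaternion or a 3x3 rotation matrix, and all names here are illustrative, not the project's code:

```python
import math

def pose_transform(local_pt, position, yaw_rad):
    """Map a point in the tablet's local frame into world space, given the
    marker-derived position and the sensor-derived yaw (rotation about the
    vertical y-axis). A real implementation would use full 3-D orientation."""
    x, y, z = local_pt
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    wx = position[0] + c * x - s * z
    wz = position[2] + s * x + c * z
    return (wx, position[1] + y, wz)

# A corner 0.1 m along the tablet's local x-axis, tablet at (1, 1, 2),
# rotated 90 degrees: the corner ends up offset along world z.
corner = pose_transform((0.1, 0.0, 0.0), (1.0, 1.0, 2.0), math.pi / 2)
```

Projecting the control interface onto the surrogate then means drawing the interface in this same local frame, so it follows the acrylic sheet wherever the user moves it.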

  12. Building intuitive 3D interfaces for virtual reality systems

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Seitel, Mathias; Mullick, Rakesh

    2007-03-01

    An exploration of techniques for developing intuitive and efficient user interfaces for virtual reality systems. The work seeks to understand which paradigms from the better-understood world of 2D user interfaces remain viable within 3D environments. To establish this, a new user interface was created that applied various understood principles of interface design. A user study was then performed in which it was compared with an earlier interface on a series of medical visualization tasks.

  13. The virtual windtunnel: Visualizing modern CFD datasets with a virtual environment

    NASA Technical Reports Server (NTRS)

    Bryson, Steve

    1993-01-01

    This paper describes work in progress on a virtual environment designed for the visualization of pre-computed fluid flows. The overall problems involved in the visualization of fluid flow are summarized, including computational, data management, and interface issues. Requirements for a flow visualization are summarized. Many aspects of the implementation of the virtual windtunnel were uniquely determined by these requirements. The user interface is described in detail.

  14. Comparing two types of navigational interfaces for Virtual Reality.

    PubMed

    Teixeira, Luís; Vilar, Elisângela; Duarte, Emília; Rebelo, Francisco; da Silva, Fernando Moreira

    2012-01-01

    Previous studies suggest significant differences between navigating virtual environments in a life-like walking manner (i.e., using treadmills or walk-in-place techniques) and virtual navigation (i.e., flying while really standing). The latter option, which usually involves hand-centric devices (e.g., joysticks), is the most common in Virtual Reality-based studies, mostly due to lower costs and smaller space and technology demands. Recently, however, new interaction devices originally conceived for videogames have become available, offering interesting potential for research. This study aimed to explore the potential of the Nintendo Wii Balance Board as a navigation interface in a Virtual Environment presented in an immersive Virtual Reality system. Comparing participants' performance while engaged in a simulated emergency egress makes it possible to determine the adequacy of such an alternative navigation interface on the basis of empirical results. Forty university students participated in this study. Results show that participants were more efficient when performing navigation tasks using the joystick than with the Balance Board. However, there were no significant differences in behavioral compliance with exit signs. Therefore, this study suggests that, at least for tasks similar to those studied, the Balance Board has good potential as a navigation interface for Virtual Reality systems.

  15. The Multimission Image Processing Laboratory's virtual frame buffer interface

    NASA Technical Reports Server (NTRS)

    Wolfe, T.

    1984-01-01

    Large image processing systems use multiple frame buffers with differing architectures and vendor-supplied interfaces. This variety of architectures and interfaces creates software development, maintenance and portability problems for application programs. Several machine-independent graphics standards such as ANSI Core and GKS are available, but none of them are adequate for image processing. Therefore, the Multimission Image Processing Laboratory project has implemented a programmer-level virtual frame buffer interface. This interface makes all frame buffers appear as a generic frame buffer with a specified set of characteristics. This document defines the virtual frame buffer interface and provides information such as FORTRAN subroutine definitions, frame buffer characteristics, sample programs, etc. It is intended to be used by application programmers and system programmers who are adding new frame buffers to a system.
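
A generic frame buffer that hides vendor differences behind one fixed set of operations is essentially an adapter interface. A minimal sketch in Python (the original interface was FORTRAN-callable); the class and method names are illustrative, not the MIPL API:

```python
from abc import ABC, abstractmethod

class FrameBuffer(ABC):
    """Generic frame buffer: applications program against this fixed set
    of operations, and each vendor device gets its own adapter subclass."""

    @abstractmethod
    def write_pixel(self, x, y, value): ...

    @abstractmethod
    def read_pixel(self, x, y): ...

class InMemoryFrameBuffer(FrameBuffer):
    """Stand-in for a vendor driver; a real adapter would call device I/O."""
    def __init__(self, width, height):
        self.pixels = [[0] * width for _ in range(height)]
    def write_pixel(self, x, y, value):
        self.pixels[y][x] = value
    def read_pixel(self, x, y):
        return self.pixels[y][x]

fb: FrameBuffer = InMemoryFrameBuffer(512, 512)
fb.write_pixel(10, 20, 255)
assert fb.read_pixel(10, 20) == 255
```

Adding a new frame buffer to the system then means writing one new adapter subclass, with no change to application programs.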

  16. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases.

    PubMed

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-04-15

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii with its nunchuk and balance board peripherals, head-mounted displays and joysticks allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of the visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes into virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brains, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.

  18. Analysis of slide exploration strategy of cytologists when reading digital slides

    NASA Astrophysics Data System (ADS)

    Pantanowitz, Liron; Parwani, Anil; Tseytlin, Eugene; Mello-Thoms, Claudia

    2012-02-01

    Cytology is the sub-domain of Pathology that deals mainly with the diagnosis of cellular changes caused by disease. Current clinical practice involves a cytotechnologist that manually screens glass slides containing fixed cytology material using a light microscope. Screened slides are then forwarded to a specialized pathologist, a cytopathologist, for microscopic review and final diagnostic interpretation. If no abnormalities are detected, the specimen is interpreted as "normal", otherwise the abnormalities are marked with a pen on the glass slide by the cytotechnologist and then are used to render a diagnosis. As Pathology is migrating towards a digital environment it is important to determine whether these crucial screening and diagnostic tasks can be performed as well using digital slides as the current practice with glass slides. The purpose of this work is to make this assessment, by using a set of digital slides depicting cytological materials of different disease processes in several organs, and then to analyze how different cytologists including cytotechnologists, cytopathologists and cytotechnology-trainees explored the digital slides. We will (1) collect visual search data from the cytologists as they navigate the digital slides, as well as record any electronic marks (annotations) made by the cytologists; (2) convert the dynamic visual search data into a static representation of the observers' exploration strategy using 'search maps'; and (3) determine slide coverage, per viewing magnification range, for each group. We have developed a virtual microscope to collect this data, and this interface allows for interactive navigation of the virtual slide (including panning and zooming), as well as annotation of reportable findings. Furthermore, all interactions with the interface are time stamped, which allows us to recreate the cytologists' search strategy.

  19. Virtual interface environment workstations

    NASA Technical Reports Server (NTRS)

    Fisher, S. S.; Wenzel, E. M.; Coler, C.; Mcgreevy, M. W.

    1988-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed at NASA's Ames Research Center for use as a multipurpose interface environment. This Virtual Interface Environment Workstation (VIEW) system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, research scenarios, and research directions are described.

  20. Human Machine Interfaces for Teleoperators and Virtual Environments

    NASA Technical Reports Server (NTRS)

    Durlach, Nathaniel I. (Compiler); Sheridan, Thomas B. (Compiler); Ellis, Stephen R. (Compiler)

    1991-01-01

    In Mar. 1990, a meeting organized around the general theme of teleoperation research into virtual environment display technology was conducted. This is a collection of conference-related fragments that will give a glimpse of the potential of the following fields and how they interplay: sensorimotor performance; human-machine interfaces; teleoperation; virtual environments; performance measurement and evaluation methods; and design principles and predictive models.

  1. Rapid prototyping 3D virtual world interfaces within a virtual factory environment

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.

  2. Applying mixed reality to simulate vulnerable populations for practicing clinical communication skills.

    PubMed

    Chuah, Joon Hao; Lok, Benjamin; Black, Erik

    2013-04-01

    Health sciences students often practice and are evaluated on interview and exam skills by working with standardized patients (people that role play having a disease or condition). However, standardized patients do not exist for certain vulnerable populations such as children and the intellectually disabled. As a result, students receive little to no exposure to vulnerable populations before becoming working professionals. To address this problem and thereby increase exposure to vulnerable populations, we propose using virtual humans to simulate members of vulnerable populations. We created a mixed reality pediatric patient that allowed students to practice pediatric developmental exams. Practicing several exams is necessary for students to understand how to properly interact with and correctly assess a variety of children. Practice also increases a student's confidence in performing the exam. Effective practice requires students to treat the virtual child realistically. Treating the child realistically might be affected by how the student and virtual child physically interact, so we created two object interaction interfaces - a natural interface and a mouse-based interface. We tested the complete mixed reality exam and also compared the two object interaction interfaces in a within-subjects user study with 22 participants. Our results showed that the participants accepted the virtual child as a child and treated it realistically. Participants also preferred the natural interface, but the interface did not affect how realistically participants treated the virtual child.

  3. SpectraPLOT, Visualization Package with a User-Friendly Graphical Interface

    NASA Astrophysics Data System (ADS)

    Sebald, James; Macfarlane, Joseph; Golovkin, Igor

    2017-10-01

    SPECT3D is a collisional-radiative spectral analysis package designed to compute detailed emission, absorption, or x-ray scattering spectra, filtered images, XRD signals, and other synthetic diagnostics. The spectra and images are computed for virtual detectors by post-processing the results of hydrodynamics simulations in 1D, 2D, and 3D geometries. SPECT3D can account for a variety of instrumental response effects so that direct comparisons between simulations and experimental measurements can be made. SpectraPLOT is a user-friendly graphical interface for viewing a wide variety of results from SPECT3D simulations, and applying various instrumental effects to the simulated images and spectra. We will present SpectraPLOT's ability to display a variety of data, including spectra, images, light curves, streaked spectra, space-resolved spectra, and drilldown plasma property plots, for an argon-doped capsule implosion experiment example. Future SpectraPLOT features and enhancements will also be discussed.

  4. The Input-Interface of Webcam Applied in 3D Virtual Reality Systems

    ERIC Educational Resources Information Center

    Sun, Huey-Min; Cheng, Wen-Lin

    2009-01-01

    Our research explores a virtual reality application based on Web camera (Webcam) input-interface. The interface can replace with the mouse to control direction intention of a user by the method of frame difference. We divide a frame into nine grids from Webcam and make use of the background registration to compute the moving object. In order to…
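
The frame-difference-over-a-grid scheme the abstract outlines can be sketched directly: difference two frames, sum the change per grid cell, and report the cells above a threshold. The function and threshold below are illustrative, not the authors' implementation:

```python
def grid_motion(prev, curr, rows=3, cols=3, threshold=10):
    """Frame-difference motion detection on a grid of cells: returns the
    (row, col) cells whose summed absolute pixel change exceeds threshold."""
    h, w = len(curr), len(curr[0])
    active = []
    for gy in range(rows):
        for gx in range(cols):
            total = 0
            for y in range(gy * h // rows, (gy + 1) * h // rows):
                for x in range(gx * w // cols, (gx + 1) * w // cols):
                    total += abs(curr[y][x] - prev[y][x])
            if total > threshold:
                active.append((gy, gx))
    return active

# 6x6 toy frames on a 2x2 grid: motion only in the top-left cell.
prev = [[0] * 6 for _ in range(6)]
curr = [[0] * 6 for _ in range(6)]
curr[0][0] = curr[1][1] = 200
print(grid_motion(prev, curr, rows=2, cols=2))  # prints [(0, 0)]
```

Mapping the active cell to a direction intention (e.g. top-middle cell means "up") then lets the webcam stand in for the mouse, as the abstract describes. Background registration, which the sketch omits, would maintain a reference frame robust to lighting changes.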

  5. Integrated Information Support System (IISS). Volume 8. User Interface Subsystem. Part 14. Virtual Terminal Unit Test Plan

    DTIC Science & Technology

    1990-09-30

    Dynamics Research Corporation: Jones, L.; Glandorf, F. (Final report.) The recoverable fragments describe the Virtual Terminal Interface, the callable interface to the Virtual Terminal, with specific software modules written for each type of real terminal supported; the remainder of the record is report-documentation form metadata and virtual-terminal escape-sequence test data.

  6. ViDI: Virtual Diagnostics Interface. Volume 2; Unified File Format and Web Services as Applied to Seamless Data Transfer

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A. (Technical Monitor); Schwartz, Richard J.

    2004-01-01

    The desire to revolutionize the aircraft design cycle from its currently lethargic pace to a fast turn-around operation enabling the optimization of non-traditional configurations is a critical challenge facing the aeronautics industry. In response, a large-scale effort is underway not only to advance the state of the art in wind tunnel testing, computational modeling, and information technology, but to unify these often disparate elements into a cohesive design resource. This paper will address Seamless Data Transfer, the critical central nervous system that will enable a wide variety of components to work together.

  7. The Adaptive Effects Of Virtual Interfaces: Vestibulo-Ocular Reflex and Simulator Sickness.

    DTIC Science & Technology

    1998-08-07

    The surviving fragments are glossary definitions: rearrangement: a pattern of stimulation differing from that existing as a result of normal interactions with the real world. Stimulus rearrangements can... is immersive and interactive. virtual interface: a system of transducers, signal processors, computer hardware and software that create an interactive medium through which: 1) information is transmitted to the senses in the form of two- and three-dimensional virtual images and 2) psychomotor...

  8. Open multi-agent control architecture to support virtual-reality-based man-machine interfaces

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel

    2001-10-01

    Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for the commanding and supervision of complex automation systems. The user-interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task deduction component and automatic action planning capabilities. In order to realize man-machine interfaces for complex applications, not only the Virtual Reality part has to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems controlled by Virtual Reality based man-machine interfaces. The architecture not only provides a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations which facilitate the user's task of commanding and supervising a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate information from sensors at different levels of abstraction in real time helps to make the realized automation system very responsive to real-world changes. In this paper, the architecture is described comprehensively, its main building blocks are discussed, and one realization, built on an open-source real-time operating system, is presented. The software design and the features of the architecture which make it generally applicable to the distributed control of automation agents in real-world applications are explained. Furthermore, its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is described as one example.

  9. Virtual Resting Pd/Pa From Coronary Angiography and Blood Flow Modelling: Diagnostic Performance Against Fractional Flow Reserve.

    PubMed

    Papafaklis, Michail I; Muramatsu, Takashi; Ishibashi, Yuki; Bourantas, Christos V; Fotiadis, Dimitrios I; Brilakis, Emmanouil S; Garcia-Garcia, Héctor M; Escaned, Javier; Serruys, Patrick W; Michalis, Lampros K

    2018-03-01

    Fractional flow reserve (FFR) has been established as a useful diagnostic tool. The distal coronary pressure to aortic pressure (Pd/Pa) ratio at rest is a simpler physiologic index but also requires the use of the pressure wire, whereas recently proposed virtual functional indices derived from coronary imaging require complex blood flow modelling and/or are time-consuming. Our aim was to test the diagnostic performance of virtual resting Pd/Pa using routine angiographic images and a simple flow model. Three-dimensional quantitative coronary angiography (3D-QCA) was performed in 139 vessels (120 patients) with intermediate lesions assessed by FFR. The resting Pd/Pa for each lesion was assessed by computational fluid dynamics. The discriminatory power of virtual resting Pd/Pa against FFR (reference: ≤0.80) was high (area under the receiver operator characteristic curve [AUC]: 90.5% [95% CI: 85.4-95.6%]). Diagnostic accuracy, sensitivity and specificity for the optimal virtual resting Pd/Pa cut-off (≤0.94) were 84.9%, 90.4% and 81.6%, respectively. Virtual resting Pd/Pa demonstrated superior performance (p<0.001) versus 3D-QCA %area stenosis (AUC: 77.5% [95% CI: 69.8-85.3%]). There was a good correlation between virtual resting Pd/Pa and FFR (r=0.69, p<0.001). Virtual resting Pd/Pa using routine angiographic data and a simple flow model provides fast functional assessment of coronary lesions without requiring the pressure-wire and hyperaemia induction. The high diagnostic performance of virtual resting Pd/Pa for predicting FFR shows promise for using this simple/fast virtual index in clinical practice. Copyright © 2017 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
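
Evaluating a virtual index against the FFR reference at the reported cutoffs (virtual resting Pd/Pa ≤ 0.94 predicting FFR ≤ 0.80) is a straightforward confusion-matrix computation. A sketch with made-up lesion pairs, not data from the study:

```python
def diagnostic_metrics(pairs, pdpa_cutoff=0.94, ffr_cutoff=0.80):
    """Classify lesions by virtual resting Pd/Pa against the FFR reference
    (positive = hemodynamically significant) and report accuracy,
    sensitivity, and specificity."""
    tp = tn = fp = fn = 0
    for pdpa, ffr in pairs:
        predicted = pdpa <= pdpa_cutoff
        actual = ffr <= ffr_cutoff
        if predicted and actual:
            tp += 1
        elif not predicted and not actual:
            tn += 1
        elif predicted:
            fp += 1
        else:
            fn += 1
    return {
        "accuracy": (tp + tn) / len(pairs),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Illustrative (Pd/Pa, FFR) pairs, not data from the study.
lesions = [(0.90, 0.75), (0.97, 0.88), (0.93, 0.78), (0.95, 0.82), (0.92, 0.85)]
m = diagnostic_metrics(lesions)
```

The study's reported 84.9% accuracy, 90.4% sensitivity and 81.6% specificity are exactly this kind of summary, computed over its 139 vessels.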

  10. Simulation of Physical Experiments in Immersive Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Wasfy, Tamer M.

    2001-01-01

    An object-oriented, event-driven immersive virtual environment is described for the creation of virtual labs (VLs) for simulating physical experiments. Discussion focuses on a number of aspects of the VLs, including interface devices, software objects, and various applications. The VLs interface with output devices, including immersive stereoscopic screen(s) and stereo speakers, and a variety of input devices, including body tracking (head and hands), haptic gloves, wand, joystick, mouse, microphone, and keyboard. The VL incorporates the following types of primitive software objects: interface objects, support objects, geometric entities, and finite elements. Each object encapsulates a set of properties, methods, and events that define its behavior, appearance, and functions. A container object allows grouping of several objects. Applications of the VLs include viewing the results of the physical experiment, viewing a computer simulation of the physical experiment, simulation of the experiment's procedure, computational steering, and remote control of the physical experiment. In addition, the VL can be used as a risk-free (safe) environment for training. The implementation of virtual structures testing machines, virtual wind tunnels, and a virtual acoustic testing facility is described.

  11. Combined virtual and real robotic test-bed for single operator control of multiple robots

    NASA Astrophysics Data System (ADS)

    Lee, Sam Y.-S.; Hunt, Shawn; Cao, Alex; Pandya, Abhilash

    2010-04-01

    Teams of heterogeneous robots with different dynamics or capabilities can perform a variety of tasks such as multipoint surveillance, cooperative transport and exploration in hazardous environments. In this study, we work with heterogeneous teams of semi-autonomous ground and aerial robots for contaminant localization. We developed a human interface system which links every real robot to its virtual counterpart. A novel virtual interface has been integrated with Augmented Reality: it can monitor the position and sensory information from the video feeds of ground and aerial robots in the 3D virtual environment, and improve user situational awareness. An operator can efficiently control the real robots using the Drag-to-Move method on their virtual counterparts. This enables an operator to control groups of heterogeneous robots in a collaborative way, allowing more contaminant sources to be pursued simultaneously. A further feature of the virtual interface system is guarded teleoperation, which can be used to prevent operators from accidentally driving multiple robots into walls and other objects. Moreover, image guidance and tracking are able to reduce operator workload.

  12. Virtual Reality: An Overview.

    ERIC Educational Resources Information Center

    Franchi, Jorge

    1994-01-01

    Highlights of this overview of virtual reality include optics; interface devices; virtual worlds; potential applications, including medicine and archaeology; problems, including costs; current research and development; future possibilities; and a listing of vendors and suppliers of virtual reality products. (Contains 11 references.) (LRW)

  13. A Typology of Ethnographic Scales for Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Boellstorff, Tom

    This chapter outlines a typology of genres of ethnographic research with regard to virtual worlds, informed by extensive research the author has completed both in Second Life and in Indonesia. It begins by identifying four confusions about virtual worlds: they are not games, they need not be graphical or even visual, they are not mass media, and they need not be defined in terms of escapist role-playing. A three-part typology of methods for ethnographic research in virtual worlds focuses on the relationship between research design and ethnographic scale. One class of methods for researching virtual worlds with regard to ethnographic scale explores interfaces between virtual worlds and the actual world, whereas a second examines interfaces between two or more virtual worlds. The third class involves studying a single virtual world in its own terms. Recognizing that all three approaches have merit for particular research purposes, ethnography of virtual worlds can be a vibrant field of research, contributing to central debates about human selfhood and sociality.

  14. [THE VIRTUAL CYTOLOGIC SLIDES FOR EXTERNAL EVALUATION OF QUALITY OF IMPLEMENTATION OF CYTOLOGIC ANALYSES IN CLINICAL DIAGNOSTIC LABORATORIES: POSSIBILITIES AND PERSPECTIVES].

    PubMed

    Djangirova, T V; Shabalova, I P; Pronichev, A N; Polyakov, E V

    2015-08-01

    The article considers the application of virtual-slide technology to the external quality control of clinical diagnostic laboratories. The advantages of virtual slides are demonstrated against the other technologies applied in external quality evaluation, i.e. slide plates and digital microphotography. The conditions for the preparation of virtual slides for the external quality evaluation of clinical diagnostic laboratories are presented, and the technology of their application is described. The success of the practical application of the considered technology in the Federal system of external quality evaluation is emphasized.

  15. CycloPs: generating virtual libraries of cyclized and constrained peptides including nonnatural amino acids.

    PubMed

    Duffy, Fergal J; Verniere, Mélanie; Devocelle, Marc; Bernard, Elise; Shields, Denis C; Chubb, Anthony J

    2011-04-25

    We introduce CycloPs, software for the generation of virtual libraries of constrained peptides including natural and nonnatural commercially available amino acids. The software is written in the cross-platform Python programming language, and its features include generating virtual libraries in one-dimensional SMILES and three-dimensional SDF formats, suitable for virtual screening. The stand-alone software is capable of filtering the virtual libraries using empirical measurements, including peptide synthesizability by standard peptide synthesis techniques, stability, and the druglike properties of the peptide. The software and accompanying Web interface are designed to enable the rapid and convenient generation of large, structurally diverse, synthesizable virtual libraries of constrained peptides for use in virtual screening experiments. The stand-alone software, and the Web interface for evaluating these empirical properties of a single peptide, are available at http://bioware.ucd.ie.

  16. Virtual gap element approach for the treatment of non-matching interface using three-dimensional solid elements

    NASA Astrophysics Data System (ADS)

    Song, Yeo-Ul; Youn, Sung-Kie; Park, K. C.

    2017-10-01

    A method for three-dimensional non-matching interface treatment with a virtual gap element is developed. When partitioned structures contain curved interfaces and have different brick meshes, the discretized models have gaps along the interfaces. As these gaps introduce unexpected errors, special treatments are required to handle them. In the present work, a virtual gap element is introduced to link the frame and surface domain nodes in the framework of the mortar method. Since the surface of a hexahedron element is quadrilateral, the gap element is pyramidal. The pyramidal gap element consists of four domain nodes and one frame node. A zero-strain condition in the gap element is utilized for the interpolation of frame nodes in terms of the domain nodes. This approach is taken to satisfy momentum and energy conservation. The present method is applicable not only to curved interfaces with gaps, but also to flat interfaces in three dimensions. Several numerical examples are given to demonstrate the effectiveness and accuracy of the proposed method.
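
    The interpolation step described above, expressing the frame node in terms of the four domain nodes of the pyramidal gap element, can be illustrated with a simplified sketch. It assumes the zero-strain condition reduces to standard bilinear shape-function weights on the quadrilateral face; the paper derives the actual weights from the element formulation, so this is illustrative only.

```python
import numpy as np

def bilinear_weights(xi, eta):
    """Standard bilinear shape functions on a quadrilateral face
    (corners at (+-1, +-1) in the parent element)."""
    return 0.25 * np.array([(1 - xi) * (1 - eta),
                            (1 + xi) * (1 - eta),
                            (1 + xi) * (1 + eta),
                            (1 - xi) * (1 + eta)])

def interpolate_frame_node(domain_disp, xi, eta):
    """Express the frame-node displacement as a weighted combination of
    the four domain-node displacements. `domain_disp` is a (4, 3) array
    of nodal displacement vectors; (xi, eta) locates the frame node's
    projection on the face."""
    w = bilinear_weights(xi, eta)
    return w @ domain_disp

# Sanity check: if all four domain nodes share one rigid translation,
# the frame node must reproduce it (the weights sum to one).
disp = np.array([[1.0, 0.0, 0.0]] * 4)
frame_disp = interpolate_frame_node(disp, 0.0, 0.0)
```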

  17. Fusion interfaces for tactical environments: An application of virtual reality technology

    NASA Technical Reports Server (NTRS)

    Haas, Michael W.

    1994-01-01

    The term Fusion Interface is defined as a class of interface which integrally incorporates both virtual and nonvirtual concepts and devices across the visual, auditory, and haptic sensory modalities. A fusion interface is a multisensory, virtually-augmented synthetic environment. A new facility has been developed within the Human Engineering Division of the Armstrong Laboratory dedicated to exploratory development of fusion interface concepts. This new facility, the Fusion Interfaces for Tactical Environments (FITE) Facility, is a specialized flight simulator enabling efficient concept development through rapid prototyping and direct experience of new fusion concepts. The FITE Facility also supports evaluation of fusion concepts by operational fighter pilots in an air combat environment. The facility is utilized by a multidisciplinary design team composed of human factors engineers, electronics engineers, computer scientists, experimental psychologists, and operational pilots. The FITE computational architecture is composed of twenty-five 80486-based microcomputers operating in real time. The microcomputers generate out-the-window visuals, in-cockpit and head-mounted visuals, localized auditory presentations, and haptic displays on the stick and rudder pedals, as well as executing weapons models, aerodynamic models, and threat models.

  18. Towards the virtual artery: a multiscale model for vascular physiology at the physics-chemistry-biology interface.

    PubMed

    Hoekstra, Alfons G; Alowayyed, Saad; Lorenz, Eric; Melnikova, Natalia; Mountrakis, Lampros; van Rooij, Britt; Svitenkov, Andrew; Závodszky, Gábor; Zun, Pavel

    2016-11-13

    This discussion paper introduces the concept of the Virtual Artery as a multiscale model for arterial physiology and pathologies at the physics-chemistry-biology (PCB) interface. The cellular level is identified as the mesoscopic level, and we argue that by coupling cell-based models with other relevant models on the macro- and microscale, a versatile model of arterial health and disease can be composed. We review the necessary ingredients, both models of arteries at many different scales and generic methods to compose multiscale models. Next, we discuss how these can be combined into the virtual artery. Finally, we argue that the concept of models at the PCB interface could, or perhaps should, become a powerful paradigm, not only as in our case for studying physiology, but also for many other systems that have such PCB interfaces. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Authors.

  19. Design of an efficient framework for fast prototyping of customized human-computer interfaces and virtual environments for rehabilitation.

    PubMed

    Avola, Danilo; Spezialetti, Matteo; Placidi, Giuseppe

    2013-06-01

    Rehabilitation is often required after stroke, surgery, or degenerative diseases. It has to be specific to each patient and can be easily calibrated if assisted by human-computer interfaces and virtual reality. Recognition and tracking of different human body landmarks represent the basic features for the design of the next generation of human-computer interfaces. The most advanced systems for capturing human gestures are focused on vision-based techniques which, on the one hand, may require compromises in real-time performance and spatial precision and, on the other hand, ensure a natural interaction experience. The integration of vision-based interfaces with thematic virtual environments encourages the development of novel applications and services for rehabilitation activities. The algorithmic processes involved in gesture recognition, as well as the characteristics of the virtual environments, can be developed with different levels of accuracy. This paper describes the architectural aspects of a framework supporting real-time vision-based gesture recognition and virtual environments for fast prototyping of customized exercises for rehabilitation purposes. The goal is to provide the therapist with a tool for fast implementation and modification of specific rehabilitation exercises for specific patients during functional recovery. Pilot examples of designed applications and a preliminary system evaluation are reported and discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  20. Virtual Reality: An Instructional Medium for Visual-Spatial Tasks.

    ERIC Educational Resources Information Center

    Regian, J. Wesley; And Others

    1992-01-01

    Describes an empirical exploration of the instructional potential of virtual reality as an interface for simulation-based training. Shows that subjects learned spatial-procedural and spatial-navigational skills in virtual reality. (SR)

  1. Avatars and virtual agents – relationship interfaces for the elderly

    PubMed Central

    2017-01-01

    In the Digital Era, the authors witness a change in the relationship between the patient and the caregiver or the Health Maintenance Organizations providing health services. Another change is the use of various technologies to increase the effectiveness and quality of health services across all primary and secondary users. These technologies range from telemedicine systems, decision-making tools and online self-service applications to virtual agents, all providing information and assistance. The common thread among these digital implementations is that they all require human-machine interfaces. These interfaces must be interactive, user-friendly and inviting, to create user involvement and incentives to cooperate. The challenge is to design interfaces which best fit the target users and enable smooth interaction, especially for elderly users. Avatars and virtual agents are one class of interface used for both home-care monitoring and companionship. They are also inherently multimodal in nature and allow an intimate relation between the elderly user and the avatar. This study discusses the need for and nature of these relationship models and the challenges of designing for the elderly. The study proposes key features for the design and evaluation of assistive applications using avatars and virtual agents for elderly users. PMID:28706725

  2. VEVI: A Virtual Reality Tool For Robotic Planetary Explorations

    NASA Technical Reports Server (NTRS)

    Piguet, Laurent; Fong, Terry; Hine, Butler; Hontalas, Phil; Nygren, Erik

    1994-01-01

    The Virtual Environment Vehicle Interface (VEVI), developed by the NASA Ames Research Center's Intelligent Mechanisms Group, is a modular operator interface for direct teleoperation and supervisory control of robotic vehicles. Virtual environments enable the efficient display and visualization of complex data. This characteristic allows operators to perceive and control complex systems in a natural fashion, utilizing the highly-evolved human sensory system. VEVI utilizes real-time, interactive, 3D graphics and position/orientation sensors to produce a range of interface modalities, from flat-panel (windowed or stereoscopic) screen displays to head-mounted, head-tracking stereo displays. The interface provides generic video control capability and has been used to control wheeled, legged, air-bearing, and underwater vehicles in a variety of different environments. VEVI was designed and implemented to be modular, distributed and easily operated over long-distance communication links, using a communication paradigm called SYNERGY.

  3. Design strategies and functionality of the Visual Interface for Virtual Interaction Development (VIVID) tool

    NASA Technical Reports Server (NTRS)

    Nguyen, Lac; Kenney, Patrick J.

    1993-01-01

    Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphics library (GL) based programs, most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command-line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool, an X-Windows-based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. Building on its existing interfaces and predefined suite of behaviors, future development plans for VIVID will also be described, including incorporation of dual-user virtual environment enhancements, tool expansion, and additional behaviors.

  4. Virtual reality in the operating room of the future.

    PubMed

    Müller, W; Grosskopf, S; Hildebrand, A; Malkewitz, R; Ziegler, R

    1997-01-01

    In cooperation with the Max-Delbrück-Centrum/Robert-Rössle-Klinik (MDC/RRK) in Berlin, the Fraunhofer Institute for Computer Graphics is currently designing and developing a scenario for the operating room of the future. The goal of this project is to integrate new analysis, visualization and interaction tools in order to optimize and refine tumor diagnostics and therapy in combination with laser technology and remote stereoscopic video transfer. Hence, a human 3-D reference model is reconstructed using CT, MR, and anatomical cryosection images from the National Library of Medicine's Visible Human Project. Applying segmentation algorithms and surface-polygonization methods, a 3-D representation is obtained. In addition, a "fly-through" of the virtual patient is realized using 3-D input devices (data glove, tracking system, 6-DOF mouse). In this way, the surgeon can experience truly new perspectives of the human anatomy. Moreover, using a virtual cutting plane, any cut of the CT volume can be interactively placed and visualized in real time. In conclusion, this project delivers visions for the application of effective visualization and VR systems. It shows that VR techniques, long applied in the automotive industry under the name virtual prototyping, can also be used to prototype an operating room. After evaluating the design and functionality of the virtual operating room, MDC plans to build real ORs in the near future. The use of VR techniques provides a more natural interface for the surgeon in the OR (e.g., controlling interactions by voice input). Besides preoperative planning, future work will focus on supporting the surgeon in performing surgical interventions. An optimal synthesis of real and synthetic data, and the inclusion of visual, aural, and tactile senses in virtual environments, can meet these requirements. This Augmented Reality could represent the environment for the surgeons of tomorrow.

  5. Virtual humans and formative assessment to train diagnostic skills in bulimia nervosa.

    PubMed

    Gutiérrez-Maldonado, José; Ferrer-Garcia, Marta; Pla, Joana; Andrés-Pueyo, Antonio

    2014-01-01

    Carrying out a diagnostic interview requires skills that need to be taught in a controlled environment. Virtual Reality (VR) environments are increasingly used in the training of professionals, as they offer the most realistic alternative while not requiring students to face situations for which they are as yet unprepared. The results of the training of diagnostic skills can also be generalized to any other situation in which effective communication skills play a major role. Our aim in this study has been to develop a procedure of formative assessment in order to increase the effectiveness of virtual learning simulation systems, and then to assess their efficacy.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasanah, Lilik, E-mail: lilikhasanah@upi.edu; Suhendi, Endi; Tayubi, Yuyu Rahmat

    In this work we discuss the impact of Si interface surface roughness on the tunneling current of the Si/Si{sub 1-x}Ge{sub x}/Si heterojunction bipolar transistor. The Si interface surface roughness can be inferred from the electrical characteristics through the transversal electron velocity, obtained as a fitting parameter. The results showed that surface roughness increases as the Ge content of the virtual substrate increases. This model can be used to investigate the effect of the Ge content of the virtual substrate on the interface surface condition through the current-voltage characteristic.

  7. Virtual Reality: Toward Fundamental Improvements in Simulation-Based Training.

    ERIC Educational Resources Information Center

    Thurman, Richard A.; Mattoon, Joseph S.

    1994-01-01

    Considers the role and effectiveness of virtual reality in simulation-based training. The theoretical and practical implications of verity, integration, and natural versus artificial interface are discussed; a three-dimensional classification scheme for virtual reality is described; and the relationship between virtual reality and other…

  8. 78 FR 54626 - Announcing Approval of Federal Information Processing Standard (FIPS) Publication 201-2, Personal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-05

    ... its virtual contact interface be made mandatory as soon as possible for the many beneficial features... messaging and the virtual contact interface in the Standard, some Federal departments and agencies have... Laboratory Programs. [FR Doc. 2013-21491 Filed 9-4-13; 8:45 am] BILLING CODE 3510-13-P ...

  9. Linking Audio and Visual Information while Navigating in a Virtual Reality Kiosk Display

    ERIC Educational Resources Information Center

    Sullivan, Briana; Ware, Colin; Plumlee, Matthew

    2006-01-01

    3D interactive virtual reality museum exhibits should be easy to use, entertaining, and informative. If the interface is intuitive, it will allow the user more time to learn the educational content of the exhibit. This research deals with interface issues concerning activating audio descriptions of images in such exhibits while the user is…

  10. [Clinical pathology on the verge of virtual microscopy].

    PubMed

    Tolonen, Teemu; Näpänkangas, Juha; Isola, Jorma

    2015-01-01

    For more than 100 years, examinations of pathology specimens have relied on the use of the light microscope. The technological progress of the last few years is enabling the digitizing of histologic specimen slides and the application of the virtual microscope in diagnostics. Virtual microscopy will facilitate consultations, and digital image analysis serves to enhance the level of diagnostics. Organizing and monitoring clinicopathological meetings will become easier. A digital archive of histologic specimens and the virtual microscopy network are expected to benefit training and research as well, particularly the Finnish biobank network which is currently being established.

  11. Virtual Reality: A Dream Come True or a Nightmare.

    ERIC Educational Resources Information Center

    Cornell, Richard; Bailey, Dan

    Virtual Reality (VR) is a new medium which allows total stimulation of one's senses through human/computer interfaces. VR has applications in training simulators, nano-science, medicine, entertainment, electronic technology, and manufacturing. This paper focuses on some current and potential problems of virtual reality and virtual environments…

  12. Assessment of Application Technology of Natural User Interfaces in the Creation of a Virtual Chemical Laboratory

    ERIC Educational Resources Information Center

    Jagodzinski, Piotr; Wolski, Robert

    2015-01-01

    Natural User Interfaces (NUI) are now widely used in electronic devices such as smartphones, tablets and gaming consoles. We have tried to apply this technology in the teaching of chemistry in middle school and high school. A virtual chemical laboratory was developed in which students can simulate the performance of laboratory activities similar…

  13. Human-machine interface for a VR-based medical imaging environment

    NASA Astrophysics Data System (ADS)

    Krapichler, Christian; Haubner, Michael; Loesch, Andreas; Lang, Manfred K.; Englmeier, Karl-Hans

    1997-05-01

    Modern 3D scanning techniques like magnetic resonance imaging (MRI) or computed tomography (CT) produce high-quality images of the human anatomy. Virtual environments open new ways to display and to analyze those tomograms. Compared with today's inspection of 2D image sequences, physicians are empowered to recognize spatial coherencies and examine pathological regions more easily, and diagnosis and therapy planning can be accelerated. For that purpose a powerful human-machine interface is required, which offers a variety of tools and features to enable both exploration and manipulation of the 3D data. Man-machine communication has to be intuitive and efficacious to avoid long familiarization times and to enhance acceptance of the interface. Hence, interaction capabilities in virtual worlds should be comparable to those in the real world to allow utilization of our natural experiences. In this paper the integration of hand gestures and visual focus, two important aspects of modern human-computer interaction, into a medical imaging environment is shown. With the presented human-machine interface, including virtual reality displaying and interaction techniques, radiologists can be supported in their work. Further, virtual environments can even facilitate communication between specialists from different fields or in educational and training applications.

  14. The Design and Evaluation of a Large-Scale Real-Walking Locomotion Interface

    PubMed Central

    Peck, Tabitha C.; Fuchs, Henry; Whitton, Mary C.

    2014-01-01

    Redirected Free Exploration with Distractors (RFED) is a large-scale real-walking locomotion interface developed to enable people to walk freely in virtual environments that are larger than the tracked space in their facility. This paper describes the RFED system in detail and reports on a user study that evaluated RFED by comparing it to walking-in-place and joystick interfaces. The RFED system is composed of two major components, redirection and distractors. This paper discusses design challenges, implementation details, and lessons learned during the development of two working RFED systems. The evaluation study examined the effect of the locomotion interface on users’ cognitive performance on navigation and wayfinding measures. The results suggest that participants using RFED were significantly better at navigating and wayfinding through virtual mazes than participants using walking-in-place and joystick interfaces. Participants traveled shorter distances, made fewer wrong turns, pointed to hidden targets more accurately and more quickly, and were able to place and label targets on maps more accurately, and more accurately estimate the virtual environment size. PMID:22184262

  15. Multi-Robot Interfaces and Operator Situational Awareness: Study of the Impact of Immersion and Prediction

    PubMed Central

    Peña-Tapia, Elena; Martín-Barrio, Andrés; Olivares-Méndez, Miguel A.

    2017-01-01

    Multi-robot missions are a challenge for operators in terms of workload and situational awareness. These operators have to receive data from the robots, extract information, understand the situation properly, make decisions, generate the adequate commands, and send them to the robots. The consequences of excessive workload and lack of awareness can vary from inefficiencies to accidents. This work focuses on the study of future operator interfaces of multi-robot systems, taking into account relevant issues such as multimodal interactions, immersive devices, predictive capabilities and adaptive displays. Specifically, four interfaces have been designed and developed: a conventional, a predictive conventional, a virtual reality and a predictive virtual reality interface. The four interfaces have been validated by the performance of twenty-four operators that supervised eight multi-robot missions of fire surveillance and extinguishing. The results of the workload and situational awareness tests show that virtual reality improves the situational awareness without increasing the workload of operators, whereas the effects of predictive components are not significant and depend on their implementation. PMID:28749407

  16. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

    Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation, instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools in the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with high inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime of the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
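
    A much-simplified sketch of the kind of aggregation the prototype diagnostic performs, block-averaging a fine field onto a user-defined coarse grid, is shown below. ICON's actual grid is icosahedral and the real tool runs online inside the model, so this regular-grid version is illustrative only.

```python
import numpy as np

def aggregate_to_coarse(field, factor):
    """Block-average a fine regular 2D grid onto a coarser regular grid,
    merging `factor` x `factor` fine cells into each coarse cell."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grids must nest"
    blocks = field.reshape(ny // factor, factor, nx // factor, factor)
    return blocks.mean(axis=(1, 3))  # average within each block

fine = np.arange(16.0).reshape(4, 4)   # toy 4x4 model field
coarse = aggregate_to_coarse(fine, 2)  # user-defined 2x2 coarse grid
```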

  17. User Interface Technology Transfer to NASA's Virtual Wind Tunnel System

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1998-01-01

    Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.

  18. Device Control Using Gestures Sensed from EMG

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.

    2003-01-01

    In this paper we present neuro-electric interfaces for virtual device control. The examples presented rely upon sampling electromyogram (EMG) data from a participant's forearm. This data is then fed into pattern recognition software that has been trained to distinguish gestures from a given gesture set. The pattern recognition software consists of hidden Markov models, which are used to recognize the gestures as they are being performed in real time. Two experiments were conducted to examine the feasibility of this interface technology. The first replicated a virtual joystick interface, and the second replicated a keyboard.
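
    The recognition scheme described above, scoring an incoming sample stream against one hidden Markov model per gesture and picking the best-scoring one, can be sketched with discrete toy observations. The gesture names, model sizes and probabilities below are invented; the real system worked on continuous EMG features.

```python
import numpy as np

def log_forward(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a discrete
    observation sequence under an HMM (pi: initial state probs,
    A: state transition matrix, B: per-state emission probs)."""
    alpha = pi * B[:, obs[0]]
    logp = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        logp += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return logp

def classify(obs, models):
    """Label the gesture whose HMM scores the sequence highest."""
    return max(models, key=lambda name: log_forward(obs, *models[name]))

# Two invented two-state gesture models over a binary feature symbol:
# "fist" mostly emits symbol 0, "point" mostly emits symbol 1.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
models = {
    "fist":  (pi, A, np.array([[0.9, 0.1], [0.8, 0.2]])),
    "point": (pi, A, np.array([[0.1, 0.9], [0.2, 0.8]])),
}
label = classify([0, 0, 1, 0, 0], models)
```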

  19. Manipulation of volumetric patient data in a distributed virtual reality environment.

    PubMed

    Dech, F; Ai, Z; Silverstein, J C

    2001-01-01

    Due to increases in network speed and bandwidth, distributed exploration of medical data in immersive Virtual Reality (VR) environments is becoming increasingly feasible. The volumetric display of radiological data in such environments presents a unique set of challenges. The sheer size and complexity of the datasets involved not only make them difficult to transmit to remote sites, but these datasets also require extensive user interaction in order to make them understandable to the investigator and manageable to the rendering hardware. A sophisticated VR user interface is required in order for the clinician to focus on the aspects of the data that will provide educational and/or diagnostic insight. We will describe a software system of data acquisition, data display, Tele-Immersion, and data manipulation that supports interactive, collaborative investigation of large radiological datasets. The hardware required in this strategy is still at the high-end of the graphics workstation market. Future software ports to Linux and NT, along with the rapid development of PC graphics cards, open the possibility for later work with Linux or NT PCs and PC clusters.

  20. Will Anything Useful Come Out of Virtual Reality? Examination of a Naval Application

    DTIC Science & Technology

    1993-05-01

    The term virtual reality can encompass varying meanings, but some generally accepted attributes of a virtual environment are that it is immersive...technology, but at present there are few practical applications which are utilizing the broad range of virtual reality technology. This paper will discuss an...Operability, operator functions, Virtual reality, Man-machine interface, Decision aids/decision making, Decision support. ASW.

  1. Virtual Frame Buffer Interface Program

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas L.

    1990-01-01

    Virtual Frame Buffer Interface program makes all frame buffers appear as generic frame buffer with specified set of characteristics, allowing programmers to write codes that run unmodified on all supported hardware. Converts generic commands to actual device commands. Consists of definition of capabilities and FORTRAN subroutines called by application programs. Developed in FORTRAN 77 for DEC VAX 11/780 or DEC VAX 11/750 computer under VMS 4.X.
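    The abstraction described — one generic frame buffer interface, with generic commands translated into actual device commands per backend — is, in modern terms, an adapter layer. A minimal sketch of the idea; the class and method names are illustrative, not taken from the original FORTRAN 77 library:

```python
class MemoryBackend:
    """Hypothetical device backend: stores pixels in a dict, standing in
    for a real frame buffer driver."""
    def __init__(self):
        self.pixels = {}

    def write(self, x, y, color):
        self.pixels[(x, y)] = color

class FrameBuffer:
    """Generic frame buffer: applications call this fixed interface, and
    the attached backend translates each call into device commands."""
    def __init__(self, backend):
        self.backend = backend

    def set_pixel(self, x, y, color):
        self.backend.write(x, y, color)

    def fill_rect(self, x0, y0, w, h, color):
        # Expressed in terms of the backend's primitive, so code written
        # against FrameBuffer runs unmodified on every supported device.
        for dy in range(h):
            for dx in range(w):
                self.backend.write(x0 + dx, y0 + dy, color)

fb = FrameBuffer(MemoryBackend())
fb.fill_rect(0, 0, 2, 2, color=7)
```

Swapping `MemoryBackend` for a driver-specific backend changes the device without touching application code, which is the portability property the original library provided.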

  2. Live Virtual Constructive (LVC): Interface Control Document (ICD) for the LVC Gateway. [Flight Test 3

    NASA Technical Reports Server (NTRS)

    Jovic, Srba

    2015-01-01

    This Interface Control Document (ICD) documents and tracks the necessary information required for the Live Virtual and Constructive (LVC) systems components as well as protocols for communicating with them in order to achieve all research objectives captured by the experiment requirements. The purpose of this ICD is to clearly communicate all inputs and outputs from the subsystem components.

  3. The Virtual Solar Observatory and the Heliophysics Meta-Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Gurman, Joseph B.

    2007-01-01

    The Virtual Solar Observatory (VSO) is now able to search for solar data ranging from the radio to gamma rays, obtained from space and ground-based observatories, from 26 sources at 12 data providers, and from 1915 to the present. The solar physics community can use a Web interface or an Application Programming Interface (API) that allows integrating VSO searches into other software, including other Web services. Over the next few years, this integration will be especially obvious as the NASA Heliophysics division sponsors the development of a heliophysics-wide virtual observatory (VO), based on existing VOs in heliospheric, magnetospheric, and ionospheric physics as well as the VSO. We examine some of the challenges and potential of such a "meta-VO."

  4. Virtual Sensor Test Instrumentation

    NASA Technical Reports Server (NTRS)

    Wang, Roy

    2011-01-01

    Virtual Sensor Test Instrumentation is based on the concept of smart sensor technology for testing with intelligence needed to perform self-diagnosis of health, and to participate in a hierarchy of health determination at sensor, process, and system levels. A virtual sensor test instrumentation consists of five elements: (1) a common sensor interface, (2) microprocessor, (3) wireless interface, (4) signal conditioning and ADC/DAC (analog-to-digital conversion/ digital-to-analog conversion), and (5) onboard EEPROM (electrically erasable programmable read-only memory) for metadata storage and executable software to create powerful, scalable, reconfigurable, and reliable embedded and distributed test instruments. In order to maximize efficient data conversion through the smart sensor node, plug-and-play functionality is required to interface with traditional sensors to enhance their identity and capabilities for data processing and communications. Virtual sensor test instrumentation can be accessible wirelessly via a Network Capable Application Processor (NCAP) or a Smart Transducer Interface Module (STIM) that may be managed under real-time rule engines for mission-critical applications. The transducer senses the physical quantity being measured and converts it into an electrical signal. The signal is fed to an A/D converter, and is ready for use by the processor to execute functional transformation based on the sensor characteristics stored in a Transducer Electronic Data Sheet (TEDS). Virtual sensor test instrumentation is built upon an open-system architecture with standardized protocol modules/stacks to interface with industry standards and commonly used software.
One major benefit for deploying the virtual sensor test instrumentation is the ability, through a plug-and-play common interface, to convert raw sensor data in either analog or digital form, to an IEEE 1451 standard-based smart sensor, which has instructions to program sensors for a wide variety of functions. The sensor data is processed in a distributed fashion across the network, providing a large pool of resources in real time to meet stringent latency requirements.
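    The TEDS-driven conversion step described above — raw ADC counts transformed to engineering units using calibration data stored with the sensor — can be sketched as follows. The field names and the linear calibration are illustrative only; a real IEEE 1451 TEDS is a structured binary format with many more fields:

```python
class TEDS:
    """Minimal stand-in for a Transducer Electronic Data Sheet: holds the
    calibration a processor needs to convert raw counts to engineering units."""
    def __init__(self, gain, offset, units):
        self.gain = gain      # engineering units per ADC count
        self.offset = offset  # value corresponding to zero counts
        self.units = units

def convert(raw_counts, teds):
    """Functional transformation of a raw ADC reading via the sensor's TEDS."""
    return raw_counts * teds.gain + teds.offset

# Hypothetical temperature channel: 0.01 degC per count, -40 degC at zero.
temp_teds = TEDS(gain=0.01, offset=-40.0, units="degC")
reading = convert(6500, temp_teds)  # about 25.0 degC
```

Because the calibration travels with the sensor, any node that reads the TEDS can perform the conversion, which is what makes the plug-and-play substitution of traditional sensors possible.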

  5. Transduction between worlds: using virtual and mixed reality for earth and planetary science

    NASA Astrophysics Data System (ADS)

    Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.

    2017-12-01

    Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.

  6. Making Information Overload Work: The Dragon Software System on a Virtual Reality Responsive Workbench

    DTIC Science & Technology

    1998-03-01

    Research Laboratory’s Virtual Reality Responsive Workbench (VRRWB) and Dragon software system which together address the problem of battle space...and describe the lessons which have been learned. Interactive graphics, workbench, battle space visualization, virtual reality, user interface.

  7. A kickball game for ankle rehabilitation by JAVA, JNI, and VRML

    NASA Astrophysics Data System (ADS)

    Choi, Hyungjeen; Ryu, Jeha; Lee, Chansu

    2004-03-01

    This paper presents the development of a virtual environment that can be applied to the ankle rehabilitation procedure. We developed a virtual football stadium to engage the patient, in which a two-degree-of-freedom (DOF) plate-shaped object is oriented to kick a ball falling from the sky, in accordance with data from the ankle's dorsiflexion/plantarflexion and inversion/eversion motion on the moving platform of the K-Platform. This Kickball Game is implemented in the Virtual Reality Modeling Language (VRML). To control virtual objects, data from the K-Platform are transmitted through a communication module implemented in C++. Java, the Java Native Interface (JNI) and a VRML plug-in are combined so as to interface the communication module with the VRML virtual environment. This game may be applied to the Active Range of Motion (AROM) exercise procedure, one of the ankle rehabilitation procedures.

  8. Data communications in a parallel active messaging interface of a parallel computer

    DOEpatents

    Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2014-09-02

    Eager send data communications in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI composed of data communications endpoints that specify a client, a context, and a task, including receiving an eager send data communications instruction with transfer data disposed in a send buffer characterized by a read/write send buffer memory address in a read/write virtual address space of the origin endpoint; determining for the send buffer a read-only send buffer memory address in a read-only virtual address space, the read-only virtual address space shared by both the origin endpoint and the target endpoint, with all frames of physical memory mapped to pages of virtual memory in the read-only virtual address space; and communicating by the origin endpoint to the target endpoint an eager send message header that includes the read-only send buffer memory address.

  9. Data communications in a parallel active messaging interface of a parallel computer

    DOEpatents

    Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2014-09-16

    Eager send data communications in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI composed of data communications endpoints that specify a client, a context, and a task, including receiving an eager send data communications instruction with transfer data disposed in a send buffer characterized by a read/write send buffer memory address in a read/write virtual address space of the origin endpoint; determining for the send buffer a read-only send buffer memory address in a read-only virtual address space, the read-only virtual address space shared by both the origin endpoint and the target endpoint, with all frames of physical memory mapped to pages of virtual memory in the read-only virtual address space; and communicating by the origin endpoint to the target endpoint an eager send message header that includes the read-only send buffer memory address.

  10. Active tactile exploration using a brain-machine-brain interface.

    PubMed

    O'Doherty, Joseph E; Lebedev, Mikhail A; Ifft, Peter J; Zhuang, Katie Z; Shokur, Solaiman; Bleuler, Hannes; Nicolelis, Miguel A L

    2011-10-05

    Brain-machine interfaces use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. It is hoped that brain-machine interfaces can be used to restore the normal sensorimotor functions of the limbs, but so far they have lacked tactile sensation. Here we report the operation of a brain-machine-brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and allows signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex. Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and distinguish one of three visually identical objects, using the virtual-reality arm to identify the unique artificial texture associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic or even virtual prostheses.

  11. Three-Dimensional User Interfaces for Immersive Virtual Reality

    NASA Technical Reports Server (NTRS)

    vanDam, Andries

    1997-01-01

    The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.

  12. Centrally managed unified shared virtual address space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkes, John

    Systems, apparatuses, and methods for managing a unified shared virtual address space. A host may execute system software and manage a plurality of nodes coupled to the host. The host may send work tasks to the nodes, and for each node, the host may externally manage the node's view of the system's virtual address space. Each node may have a central processing unit (CPU) style memory management unit (MMU) with an internal translation lookaside buffer (TLB). In one embodiment, the host may be coupled to a given node via an input/output memory management unit (IOMMU) interface, where the IOMMU frontend interface shares the TLB with the given node's MMU. In another embodiment, the host may control the given node's view of virtual address space via memory-mapped control registers.

  13. Assessment of Application Technology of Natural User Interfaces in the Creation of a Virtual Chemical Laboratory

    NASA Astrophysics Data System (ADS)

    Jagodziński, Piotr; Wolski, Robert

    2015-02-01

    Natural User Interfaces (NUI) are now widely used in electronic devices such as smartphones, tablets and gaming consoles. We have tried to apply this technology in the teaching of chemistry in middle school and high school. A virtual chemical laboratory was developed in which students can simulate the performance of laboratory activities similar to those that they perform in a real laboratory. A Kinect sensor, an example of NUI, was used for the detection and analysis of the student's hand movements. The studies conducted confirmed the effectiveness of the educational virtual laboratory. The extent to which the use of this teaching aid increased the students' progress in learning chemistry was examined. The results indicate that the use of NUI creates opportunities to both enhance and improve the quality of chemistry education. Working in a virtual laboratory using the Kinect interface results in greater emotional involvement and an increased sense of self-efficacy in laboratory work among students. As a consequence, students are getting higher marks and are more interested in the subject of chemistry.

  14. Computational Virtual Reality (VR) as a human-computer interface in the operation of telerobotic systems

    NASA Technical Reports Server (NTRS)

    Bejczy, Antal K.

    1995-01-01

    This presentation focuses on the application of computer graphics or 'virtual reality' (VR) techniques as a human-computer interface tool in the operation of telerobotic systems. VR techniques offer very valuable task realization aids for planning, previewing and predicting robotic actions, operator training, and for visual perception of non-visible events like contact forces in robotic tasks. The utility of computer graphics in telerobotic operation can be significantly enhanced by high-fidelity calibration of virtual reality images to actual TV camera images. This calibration will even permit the creation of artificial (synthetic) views of task scenes for which no TV camera views are available.

  15. Virtual workstations and telepresence interfaces: Design accommodations and prototypes for Space Station Freedom evolution

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1990-01-01

    An advanced human-system interface is being developed for evolutionary Space Station Freedom as part of the NASA Office of Space Station (OSS) Advanced Development Program. The human-system interface is based on body-pointed display and control devices. The project will identify and document the design accommodations ('hooks and scars') required to support virtual workstations and telepresence interfaces, and prototype interface systems will be built, evaluated, and refined. The project is a joint enterprise of Marquette University, Astronautics Corporation of America (ACA), and NASA's ARC. The project team is working with NASA's JSC and McDonnell Douglas Astronautics Company (the Work Package contractor) to ensure that the project is consistent with space station user requirements and program constraints. Documentation describing design accommodations and tradeoffs will be provided to OSS, JSC, and McDonnell Douglas, and prototype interface devices will be delivered to ARC and JSC. ACA intends to commercialize derivatives of the interface for use with computer systems developed for scientific visualization and system simulation.

  16. Cortical Spiking Network Interfaced with Virtual Musculoskeletal Arm and Robotic Arm.

    PubMed

    Dura-Bernal, Salvador; Zhou, Xianlian; Neymotin, Samuel A; Przekwas, Andrzej; Francis, Joseph T; Lytton, William W

    2015-01-01

    Embedding computational models in the physical world is a critical step towards constraining their behavior and building practical applications. Here we aim to drive a realistic musculoskeletal arm model using a biomimetic cortical spiking model, and make a robot arm reproduce the same trajectories in real time. Our cortical model consisted of a 3-layered cortex, composed of several hundred spiking model-neurons, which display physiologically realistic dynamics. We interconnected the cortical model to a two-joint musculoskeletal model of a human arm, with realistic anatomical and biomechanical properties. The virtual arm received muscle excitations from the neuronal model, and fed back proprioceptive information, forming a closed-loop system. The cortical model was trained using spike timing-dependent reinforcement learning to drive the virtual arm in a 2D reaching task. Limb position was used to simultaneously control a robot arm using an improved network interface. Virtual arm muscle activations responded to motoneuron firing rates, with virtual arm muscle lengths encoded via population coding in the proprioceptive population. After training, the virtual arm performed reaching movements which were smoother and more realistic than those obtained using a simplistic arm model. This system provided access both to spiking network properties and to arm biophysical properties, including muscle forces. The use of a musculoskeletal virtual arm and the improved control system allowed the robot arm to perform movements which were smoother than those reported in our previous paper using a simplistic arm. This work provides a novel approach consisting of bidirectionally connecting a cortical model to a realistic virtual arm, and using the system output to drive a robotic arm in real time.
Our techniques are applicable to the future development of brain neuroprosthetic control systems, and may enable enhanced brain-machine interfaces with the possibility for finer control of limb prosthetics.

  17. Human-computer interface glove using flexible piezoelectric sensors

    NASA Astrophysics Data System (ADS)

    Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min

    2017-05-01

    In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recording. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.
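    Once joint angles have been extracted from the sensor outputs, mapping them to one of the four demonstrated hand motions can be as simple as a nearest-template match. The template angles below are invented for illustration; the paper derives the measured angles from the piezoelectric sensor outputs:

```python
import math

# Hypothetical per-finger flexion templates (degrees) for the four motions.
TEMPLATES = {
    "fist":  [90, 90, 90, 90],
    "pinch": [80, 80, 10, 10],
    "touch": [10, 80, 10, 10],
    "grasp": [60, 60, 60, 60],
}

def classify_pose(angles):
    """Return the motion whose template is closest (Euclidean distance)
    to the measured joint-angle vector."""
    return min(TEMPLATES, key=lambda name: math.dist(angles, TEMPLATES[name]))

print(classify_pose([85, 88, 92, 90]))  # → fist
```

The classified pose can then drive the corresponding animation of the virtual hand during object manipulation.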

  18. Encountered-Type Haptic Interface for Representation of Shape and Rigidity of 3D Virtual Objects.

    PubMed

    Takizawa, Naoki; Yano, Hiroaki; Iwata, Hiroo; Oshiro, Yukio; Ohkohchi, Nobuhiro

    2017-01-01

    This paper describes the development of an encountered-type haptic interface that can generate the physical characteristics, such as shape and rigidity, of three-dimensional (3D) virtual objects using an array of newly developed non-expandable balloons. To alter the rigidity of each non-expandable balloon, the volume of air in it is controlled through a linear actuator and a pressure sensor based on Hooke's law. Furthermore, to change the volume of each balloon, its exposed surface area is controlled by using another linear actuator with a trumpet-shaped tube. A position control mechanism is constructed to display virtual objects using the balloons. The 3D position of each balloon is controlled using a flexible tube and a string. The performance of the system is tested and the results confirm the effectiveness of the proposed principle and interface.
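    The rigidity control described above can be sketched as a Hooke's-law pressure target plus a simple proportional volume correction. The control law and all constants here are assumptions made for illustration; the actual system uses the linear actuator and pressure sensor described in the paper:

```python
def desired_pressure(displacement, stiffness, area):
    """Hooke's-law target: a balloon compressed by `displacement` (m) should
    push back with F = k * x, i.e. a gauge pressure of F / area (Pa)."""
    return stiffness * displacement / area

def actuator_step(p_measured, p_target, gain=0.5):
    """One proportional control step: a positive return value means push
    air into the balloon to raise the pressure toward the target."""
    return gain * (p_target - p_measured)

# Commanded stiffness 200 N/m, fingertip indentation 1 cm, contact area 20 cm^2.
p_target = desired_pressure(displacement=0.01, stiffness=200.0, area=0.002)
correction = actuator_step(900.0, p_target)  # p_target is 1000 Pa here
```

Raising the commanded stiffness raises the pressure target for the same indentation, which is how the balloon is made to feel more rigid.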

  19. Direct Manipulation in Virtual Reality

    NASA Technical Reports Server (NTRS)

    Bryson, Steve

    2003-01-01

    Virtual Reality interfaces offer several advantages for scientific visualization such as the ability to perceive three-dimensional data structures in a natural way. The focus of this chapter is direct manipulation, the ability for a user in virtual reality to control objects in the virtual environment in a direct and natural way, much as objects are manipulated in the real world. Direct manipulation provides many advantages for the exploration of complex, multi-dimensional data sets, by allowing the investigator the ability to intuitively explore the data environment. Because direct manipulation is essentially a control interface, it is better suited for the exploration and analysis of a data set than for the publishing or communication of features found in that data set. Thus direct manipulation is most relevant to the analysis of complex data that fills a volume of three-dimensional space, such as a fluid flow data set. Direct manipulation allows the intuitive exploration of that data, which facilitates the discovery of data features that would be difficult to find using more conventional visualization methods. Using a direct manipulation interface in virtual reality, an investigator can, for example, move a data probe about in space, watching the results and getting a sense of how the data varies within its spatial volume.

  20. Intelligent virtual reality in the setting of fuzzy sets

    NASA Technical Reports Server (NTRS)

    Dockery, John; Littman, David

    1992-01-01

    The authors have previously introduced the concept of virtual reality worlds governed by artificial intelligence. Creation of an intelligent virtual reality was further proposed as a universal interface for the handicapped. This paper extends consideration of intelligent virtual reality to a context in which fuzzy set principles are explored as a major tool for implementing theory in the domain of applications to the disabled.

  1. PiCO QL: A software library for runtime interactive queries on program data

    NASA Astrophysics Data System (ADS)

    Fragkoulis, Marios; Spinellis, Diomidis; Louridas, Panos

    PiCO QL is an open-source C/C++ software library whose scientific scope is real-time interactive analysis of in-memory data through SQL queries. It exposes a relational view of a system's or application's data structures, which is queryable through SQL. While the application or system is executing, users can input queries through a web-based interface or issue web service requests. Queries execute on the live data structures through the respective relational views. PiCO QL is a good candidate for ad-hoc data analysis in applications and for diagnostics in systems settings. Applications of PiCO QL include the Linux kernel, the Valgrind instrumentation framework, a GIS application, a virtual real-time observatory of stellar objects, and a source code analyser.
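    The core idea — a relational view over in-memory program data, queried with SQL at runtime — can be illustrated in Python with an in-memory SQLite database. This is only a rough analogue: it copies the data into a table, whereas PiCO QL's C/C++ relational views query the live structures in place:

```python
import sqlite3

# An in-memory data structure, standing in for live program state.
processes = [
    {"pid": 1,  "name": "init",   "rss_kb": 1200},
    {"pid": 42, "name": "worker", "rss_kb": 8800},
    {"pid": 43, "name": "worker", "rss_kb": 9100},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE process (pid INTEGER, name TEXT, rss_kb INTEGER)")
conn.executemany("INSERT INTO process VALUES (:pid, :name, :rss_kb)", processes)

# An ad-hoc diagnostic query, the kind of runtime analysis the paper targets.
rows = conn.execute(
    "SELECT name, SUM(rss_kb) FROM process GROUP BY name ORDER BY name"
).fetchall()
print(rows)  # → [('init', 1200), ('worker', 17900)]
```

The appeal of the SQL surface is that new diagnostic questions need no recompilation: only the query changes, not the instrumented program.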

  2. Skills based evaluation of alternative input methods to command a semi-autonomous electric wheelchair.

    PubMed

    Rojas, Mario; Ponce, Pedro; Molina, Arturo

    2016-08-01

    This paper presents the evaluation, under standardized metrics, of alternative input methods to steer and maneuver a semi-autonomous electric wheelchair. The Human-Machine Interface (HMI), which includes a virtual joystick, head movements and speech recognition controls, was designed to facilitate mobility skills for severely disabled people. Thirteen tasks, common to all wheelchair users, were attempted five times each by controlling the wheelchair with the virtual joystick and with the hands-free interfaces, in different areas, by disabled and non-disabled people. Even though the prototype has an intelligent navigation control based on fuzzy logic and ultrasonic sensors, the evaluation was done without assistance. The scores showed that the head-movement control and the virtual joystick have similar capabilities, 92.3% and 100%, respectively. However, the 54.6% capacity score obtained for the speech control interface indicates the need for navigation assistance to accomplish some of the goals. Furthermore, the evaluation time indicates which skills require more user training with the interface, and which specifications should be improved to raise the total performance of the wheelchair.

  3. Three-dimensional virtual bronchoscopy using a tablet computer to guide real-time transbronchial needle aspiration.

    PubMed

    Fiorelli, Alfonso; Raucci, Antonio; Cascone, Roberto; Reginelli, Alfonso; Di Natale, Davide; Santoriello, Carlo; Capuozzo, Antonio; Grassi, Roberto; Serra, Nicola; Polverino, Mario; Santini, Mario

    2017-04-01

    We proposed a new virtual bronchoscopy tool to improve the accuracy of traditional transbronchial needle aspiration for mediastinal staging. Chest-computed tomographic images (1 mm thickness) were reconstructed with Osirix software to produce a virtual bronchoscopic simulation. The target adenopathy was identified by measuring its distance from the carina on multiplanar reconstruction images. The static images were uploaded in iMovie Software, which produced a virtual bronchoscopic movie from the images; the movie was then transferred to a tablet computer to provide real-time guidance during a biopsy. To test the validity of our tool, we retrospectively divided all consecutive patients undergoing transbronchial needle aspiration into two groups based on whether the biopsy was guided by virtual bronchoscopy (virtual bronchoscopy group) or not (traditional group). The intergroup diagnostic yields were statistically compared. Our analysis included 53 patients in the traditional and 53 in the virtual bronchoscopy group. The sensitivity, specificity, positive predictive value, negative predictive value and diagnostic accuracy for the traditional group were 66.6%, 100%, 100%, 10.53% and 67.92%, respectively, and for the virtual bronchoscopy group were 84.31%, 100%, 100%, 20% and 84.91%, respectively. The sensitivity (P = 0.011) and diagnostic accuracy (P = 0.011) of sampling the paratracheal station were better for the virtual bronchoscopy group than for the traditional group; no significant differences were found for the subcarinal lymph node. Our tool is simple, economical and available in all centres. It guided the needle insertion in real time, thereby improving the accuracy of traditional transbronchial needle aspiration, especially when target lesions are located in a difficult site like the paratracheal station. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
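    The yield measures quoted above all derive from a standard 2x2 confusion matrix; as a reference for how they are computed (the counts below are illustrative, not the study's raw data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV and accuracy from true/false
    positive and negative counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

m = diagnostic_metrics(tp=8, fp=0, tn=2, fn=4)
# With no false positives, specificity and PPV are both 1.0, as in both
# study groups; sensitivity and accuracy are pulled down by the false negatives.
```

This also explains why a needle aspiration study can report 100% specificity and PPV while sensitivity stays modest: a positive sample is always trusted, but missed nodes count against sensitivity.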

  4. Virtual reality in rhinology-a new dimension of clinical experience.

    PubMed

    Klapan, Ivica; Raos, Pero; Galeta, Tomislav; Kubat, Goranka

    2016-07-01

    There is often a need to more precisely identify the extent of pathology and the fine elements of intracranial anatomic features during the diagnostic process and during many operations in the nose, sinus, orbit, and skull base region. In two case reports, we describe the methods used in the diagnostic workup and surgical therapy in the nose and paranasal sinus region. Besides baseline x-ray, multislice computed tomography, and magnetic resonance imaging, operative field imaging was performed via a rapid prototyping model, virtual endoscopy, and 3-D imaging. Different head tissues were visualized in different colors, showing their anatomic interrelations and the extent of pathologic tissue within the operative field. This approach has not yet been used as a standard preoperative or intraoperative procedure in otorhinolaryngology. In this way, we tried to understand the new, visualized "world of anatomic relations within the patient's head" by creating an impression of perception (virtual perception) of the given position of all elements in a particular anatomic region of the head, which does not exist in the real world (virtual world). This approach was aimed at upgrading the diagnostic workup and surgical therapy by ensuring a faster, safer and, above all, simpler operative procedure. In conclusion, any ENT specialist can provide virtual reality support in implementing surgical procedures, with additional control of risks and within the limits of normal tissue, without additional trauma to the surrounding tissue in the anatomic region. At the same time, the virtual reality support provides an impression of the virtual world as the specialist navigates through it and manipulates virtual objects.

  5. Multimodal correlation and intraoperative matching of virtual models in neurosurgery

    NASA Technical Reports Server (NTRS)

    Ceresole, Enrico; Dalsasso, Michele; Rossi, Aldo

    1994-01-01

    The multimodal correlation between different diagnostic exams, the intraoperative calibration of pointing tools and the correlation of the patient's virtual models with the patient himself are examples, taken from the biomedical field, of a single problem: determining the relationship linking representations of the same object in different reference frames. Several methods have been developed to determine this relationship; among them, the surface matching method gives the patient minimum discomfort while keeping errors compatible with the required precision. The surface matching method has been successfully applied to the multimodal correlation of diagnostic exams such as CT, MR, PET and SPECT. Algorithms for automatic segmentation of diagnostic images have been developed to extract the reference surfaces from the diagnostic exams, whereas the surface of the patient's skull has been monitored, in our approach, by means of a laser sensor mounted on the end effector of an industrial robot. An integrated system for virtual planning and real-time execution of surgical procedures has been realized.

  6. Lessons Learned during the Development and Operation of Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Ohishi, M.; Shirasaki, Y.; Komiya, Y.; Mizumoto, Y.; Yasuda, N.; Tanaka, M.

    2010-12-01

    In the last few years several Virtual Observatory (VO) projects have moved from the research and development phase into the operations phase. The VO projects include AstroGrid (UK), the Virtual Astronomical Observatory (formerly the National Virtual Observatory, USA), EURO-VO (EU), the Japanese Virtual Observatory (Japan), and so on. This successful transition from development to operations is owed primarily to the concerted effort, conducted within the International Virtual Observatory Alliance (IVOA), to develop standard interfaces among the VO projects worldwide. The registry interface has been one of the most important keys to sharing observed data and catalog data among the VO projects and data centers (data providers). Data access protocols and languages (SIAP, SSAP, ADQL) and the common data format (VOTable) are other keys. Consequently, scientific papers making use of the VO have now been published. However, we faced several issues during the implementation process:

    1) At the initial stage of the registry implementation, some of the registry metadata were set incorrectly or were missing. IVOA members found that validation tools would be needed to check compliance before an interface is made public.
    2) Some data centers and/or data providers found it difficult to implement the various standardized interfaces (protocols) required to publish their data through the VO; a VO interface toolkit would make this much easier. In addition, the current VO standardization has not discussed in depth the quality assurance of published data, or how indexes of data quality could be provided. Such measures would be quite helpful for data users in judging data quality; this issue should be discussed not only within the IVOA but also with observatories and data providers.
    3) Past and current development in the VO projects has been driven from the technology side. However, since the ultimate purpose of the VOs is to accelerate astronomical insight from, e.g., huge amounts of data or multi-wavelength data, science-driven advertisement (including schools to train astronomers) is needed.
    4) Some data centers and data providers mentioned that they need to be credited. In the data-centric science era it will be crucial to explicitly credit the observatories, data centers and data providers.

    Some suggestions for these issues are described.

  12. Simplifying the exploration of volumetric images: development of a 3D user interface for the radiologist's workplace.

    PubMed

    Teistler, M; Breiman, R S; Lison, T; Bott, O J; Pretschner, D P; Aziz, A; Nowinski, W L

    2008-10-01

    Volumetric imaging (computed tomography and magnetic resonance imaging) provides increased diagnostic detail but is associated with the problem of navigating through large amounts of data. In an attempt to overcome this problem, a novel 3D navigation tool based on an alternative input device has been designed and developed. A 3D mouse allows for simultaneous definition of the position and orientation of orthogonal or oblique multiplanar reformatted images or slabs, which are presented within a virtual 3D scene together with the volume-rendered data set and additionally as 2D images. Slabs are visualized with maximum intensity projection, average intensity projection, or the standard volume rendering technique. A prototype based on PC technology has been implemented and tested by several radiologists. It has been shown to be easily understandable and usable after a very short learning phase. Our solution may help to fully exploit the diagnostic potential of volumetric imaging by allowing for a more efficient reading process compared to currently deployed solutions based on conventional mouse and keyboard.
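    The two slab-rendering modes named in the abstract reduce a stack of slices to a single image along the viewing axis. A minimal NumPy sketch (the toy volume and function name are illustrative, not from the paper):

```python
import numpy as np

def slab_projection(volume, z0, z1, mode="mip"):
    """Project the slab volume[z0:z1] along the slice axis.

    mode: "mip" = maximum intensity projection (voxelwise max),
          "aip" = average intensity projection (voxelwise mean).
    """
    slab = volume[z0:z1]
    if mode == "mip":
        return slab.max(axis=0)
    elif mode == "aip":
        return slab.mean(axis=0)
    raise ValueError("mode must be 'mip' or 'aip'")

# Toy 4x2x2 volume standing in for CT data
vol = np.arange(16, dtype=float).reshape(4, 2, 2)
mip = slab_projection(vol, 1, 3, "mip")   # voxelwise max over slices 1..2
aip = slab_projection(vol, 1, 3, "aip")   # voxelwise mean over slices 1..2
```

    MIP emphasizes bright structures (e.g. contrast-filled vessels) within the slab, while AIP behaves like a thick, smoothed slice.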

  13. Monitoring and Control Interface Based on Virtual Sensors

    PubMed Central

    Escobar, Ricardo F.; Adam-Medina, Manuel; García-Beltrán, Carlos D.; Olivares-Peregrino, Víctor H.; Juárez-Romero, David; Guerrero-Ramírez, Gerardo V.

    2014-01-01

    In this article, a toolbox based on a monitoring and control interface (MCI) is presented and applied to a heat exchanger. The MCI was programmed to perform sensor fault detection and isolation, and to provide fault tolerance, using virtual sensors. The virtual sensors were designed from model-based high-gain observers. To carry out the control task, several kinds of control laws were included in the monitoring and control interface: PID, MPC and a non-linear model-based control law. The MCI helps to keep the heat exchanger in operation even if an outlet temperature sensor fault occurs; in the case of an outlet temperature sensor failure, the MCI displays an alarm. The monitoring and control interface is used as a practical tool to help electronic engineering students apply heat transfer and control concepts to a double-pipe heat exchanger pilot plant. The method aims to teach the students through observation and manipulation of the main process variables and through interaction with the monitoring and control interface (MCI) developed in LabVIEW©. The MCI provides the electronic engineering students with knowledge of heat exchanger behavior, since the interface is provided with a thermodynamic model that approximates the temperatures and the physical properties of the fluid (density and heat capacity). An advantage of the interface is the easy manipulation of the actuator for automatic or manual operation. Another advantage of the monitoring and control interface is that all algorithms can be manipulated and modified by the users. PMID:25365462
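    The virtual-sensor idea can be sketched with a deliberately simplified scalar high-gain observer; this is a stand-in for the paper's model-based design, and the plant model, gain and temperature values below are illustrative assumptions, not the authors' equations:

```python
def simulate_virtual_sensor(a=0.5, L=20.0, dt=0.01, steps=2000):
    """Scalar high-gain observer acting as a virtual temperature sensor.

    Plant:    dx/dt  = a * (u - x)            (toy first-order outlet-temperature model)
    Observer: dxh/dt = a * (u - xh) + L * (y - xh)
    A gain L >> a drives the estimate xh toward the measurement y quickly;
    if the hardware sensor fails, the model term alone keeps producing an
    estimate that can replace the faulty reading.
    """
    u = 60.0            # inlet temperature setpoint (hypothetical, deg C)
    x, xh = 20.0, 0.0   # true state and observer estimate start far apart
    for _ in range(steps):
        y = x                                      # healthy sensor: y = x
        x += dt * a * (u - x)                      # plant (explicit Euler)
        xh += dt * (a * (u - xh) + L * (y - xh))   # observer update
    return x, xh
```

    The estimation error obeys e[k+1] = (1 - dt*(a + L)) * e[k], so with dt*(a + L) = 0.205 the estimate converges within a few dozen steps.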

  14. DEC Ada interface to Screen Management Guidelines (SMG)

    NASA Technical Reports Server (NTRS)

    Laomanachareon, Somsak; Lekkos, Anthony A.

    1986-01-01

    DEC's Screen Management Guidelines are the Run-Time Library procedures that perform terminal-independent screen management functions on a VT100-class terminal. These procedures assist users in designing, composing, and keeping track of complex images on a video screen. There are three fundamental elements in the screen management model: the pasteboard, the virtual display, and the virtual keyboard. The pasteboard is like a two-dimensional area on which a user places and manipulates screen displays. The virtual display is a rectangular part of the terminal screen to which a program writes data with procedure calls. The virtual keyboard is a logical structure for input operation associated with a physical keyboard. SMG can be called by all major VAX languages. Through Ada, predefined language Pragmas are used to interface with SMG. These features and elements of SMG are briefly discussed.

  15. The Virtual Solar Observatory and the Heliophysics Meta-Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Gurman, J. B.; Hourclé, J. A.; Bogart, R. S.; Tian, K.; Hill, F.; Suàrez-Sola, I.; Zarro, D. M.; Davey, A. R.; Martens, P. C.; Yoshimura, K.; Reardon, K. M.

    2006-12-01

    The Virtual Solar Observatory (VSO) has survived its infancy and provides metadata search and data identification for measurements from 45 instrument data sets held at 12 online archives, as well as flare and coronal mass ejection (CME) event lists. Like any toddler, the VSO is good at getting into anything and everything, and is now extending its grasp to more data sets, new missions, and new access methods using its application programming interface (API). We discuss and demonstrate recent changes, including developments for STEREO and SDO, and an IDL-callable interface for the VSO API. We urge the heliophysics community to help civilize this obstreperous youngster by providing input on ways to make the VSO even more useful for system science research in its role as part of the growing cluster of Heliophysics Virtual Observatories.

  16. Functional performance comparison between real and virtual tasks in older adults

    PubMed Central

    Bezerra, Ítalla Maria Pinheiro; Crocetta, Tânia Brusque; Massetti, Thais; da Silva, Talita Dias; Guarnieri, Regiani; Meira, Cassio de Miranda; Arab, Claudia; de Abreu, Luiz Carlos; de Araujo, Luciano Vieira; Monteiro, Carlos Bandeira de Mello

    2018-01-01

    Abstract Introduction: Ageing is usually accompanied by deterioration of physical abilities, such as muscular strength, sensory sensitivity, and functional capacity, making chronic diseases and the well-being of older adults new challenges for global public health. Objective: The purpose of this study was to evaluate whether a task practiced in a virtual environment could promote better performance and enable transfer to the same task in a real environment. Method: The study evaluated 65 older adults of both genders, aged 60 to 82 years (M = 69.6, SD = 6.3). A coincident timing task was applied to measure the perceptual-motor ability to perform a motor response. The participants were divided into two groups: one starting with a real interface and one starting with a virtual interface. Results: Subjects improved their performance during practice, although no improvement was observed for the real interface, as those participants were near maximum performance from the beginning of the task. However, there was no transfer of performance from the virtual to the real environment or vice versa. Conclusions: The virtual environment was shown to provide improvement of performance with a short-term motor learning protocol in a coincident timing task. This result suggests that the practice of tasks in a virtual environment is a promising tool for the assessment and training of healthy older adults, even though there was no transfer of performance to the real environment. Trial registration: ISRCTN02960165. Registered 8 November 2016. PMID:29369177

  17. Virtual interface environment

    NASA Technical Reports Server (NTRS)

    Fisher, Scott S.

    1986-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed for use as a multipurpose interface environment. The system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, application scenarios, and research directions are described.

  18. Human Motion Tracking and Glove-Based User Interfaces for Virtual Environments in ANVIL

    NASA Technical Reports Server (NTRS)

    Dumas, Joseph D., II

    2002-01-01

    The Army/NASA Virtual Innovations Laboratory (ANVIL) at Marshall Space Flight Center (MSFC) provides an environment where engineers and other personnel can investigate novel applications of computer simulation and Virtual Reality (VR) technologies. Among the many hardware and software resources in ANVIL are several high-performance Silicon Graphics computer systems and a number of commercial software packages, such as Division MockUp by Parametric Technology Corporation (PTC) and Jack by Unigraphics Solutions, Inc. These hardware and software platforms are used in conjunction with various VR peripheral I/O (input / output) devices, CAD (computer aided design) models, etc. to support the objectives of the MSFC Engineering Systems Department/Systems Engineering Support Group (ED42) by studying engineering designs, chiefly from the standpoint of human factors and ergonomics. One of the more time-consuming tasks facing ANVIL personnel involves the testing and evaluation of peripheral I/O devices and the integration of new devices with existing hardware and software platforms. Another important challenge is the development of innovative user interfaces to allow efficient, intuitive interaction between simulation users and the virtual environments they are investigating. As part of his Summer Faculty Fellowship, the author was tasked with verifying the operation of some recently acquired peripheral interface devices and developing new, easy-to-use interfaces that could be used with existing VR hardware and software to better support ANVIL projects.

  19. Brain-computer interface: changes in performance using virtual reality techniques.

    PubMed

    Ron-Angevin, Ricardo; Díaz-Estrella, Antonio

    2009-01-09

    The ability to control electroencephalographic (EEG) signals when different mental tasks are carried out would provide a method of communication for people with serious motor function problems. This system is known as a brain-computer interface (BCI). Due to the difficulty of controlling one's own EEG signals, a suitable training protocol is required to motivate subjects, as it is necessary to provide some type of visual feedback allowing subjects to see their progress. Conventional systems of feedback are based on simple visual presentations, such as a horizontal bar extension. However, virtual reality is a powerful tool with graphical possibilities to improve BCI-feedback presentation. The objective of the study is to explore the advantages of the use of feedback based on virtual reality techniques compared to conventional systems of feedback. Sixteen untrained subjects, divided into two groups, participated in the experiment. A group of subjects was trained using a BCI system, which uses conventional feedback (bar extension), and another group was trained using a BCI system, which submits subjects to a more familiar environment, such as controlling a car to avoid obstacles. The obtained results suggest that EEG behaviour can be modified via feedback presentation. Significant differences in classification error rates between both interfaces were obtained during the feedback period, confirming that an interface based on virtual reality techniques can improve the feedback control, specifically for untrained subjects.

  20. Reprint of: Client interfaces to the Virtual Observatory Registry

    NASA Astrophysics Data System (ADS)

    Demleitner, M.; Harrison, P.; Taylor, M.; Normand, J.

    2015-06-01

    The Virtual Observatory Registry is a distributed directory of information systems and other resources relevant to astronomy. To make it useful, facilities to query that directory must be provided to humans and machines alike. This article reviews the development and status of such facilities, also considering the lessons learnt from about a decade of experience with Registry interfaces. After a brief outline of the history of the standards development, it describes the use of Registry interfaces in some popular clients as well as dedicated UIs for interrogating the Registry. It continues with a thorough discussion of the design of the two most recent Registry interface standards, RegTAP on the one hand and a full-text-based interface on the other hand. The article finally lays out some of the less obvious conventions that emerged in the interaction between providers of registry records and Registry users as well as remaining challenges and current developments.
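    RegTAP exposes the Registry through TAP's synchronous ADQL query interface. The sketch below only constructs such a request; the relational-registry table and column names (rr.capability, rr.interface, access_url, standard_id) follow the RegTAP standard, but the endpoint URL is an assumption (GAVO's public RegTAP service) and nothing is sent over the network:

```python
from urllib.parse import urlencode

# A RegTAP query for all registered Simple Image Access services.
ADQL = """
SELECT ivoid, access_url
FROM rr.capability
  NATURAL JOIN rr.interface
WHERE standard_id = 'ivo://ivoa.net/std/sia'
"""

def tap_sync_request(base_url, adql):
    """Build (url, body) for a synchronous TAP 1.x query."""
    params = {"REQUEST": "doQuery", "LANG": "ADQL", "QUERY": adql.strip()}
    return base_url.rstrip("/") + "/sync", urlencode(params)

url, body = tap_sync_request("http://reg.g-vo.org/tap", ADQL)
# POSTing `body` to `url` would return a VOTable of matching services.
```

    In practice clients such as TOPCAT or pyvo wrap exactly this kind of request behind their registry-search facilities.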

  21. Client interfaces to the Virtual Observatory Registry

    NASA Astrophysics Data System (ADS)

    Demleitner, M.; Harrison, P.; Taylor, M.; Normand, J.

    2015-04-01

    The Virtual Observatory Registry is a distributed directory of information systems and other resources relevant to astronomy. To make it useful, facilities to query that directory must be provided to humans and machines alike. This article reviews the development and status of such facilities, also considering the lessons learnt from about a decade of experience with Registry interfaces. After a brief outline of the history of the standards development, it describes the use of Registry interfaces in some popular clients as well as dedicated UIs for interrogating the Registry. It continues with a thorough discussion of the design of the two most recent Registry interface standards, RegTAP on the one hand and a full-text-based interface on the other hand. The article finally lays out some of the less obvious conventions that emerged in the interaction between providers of registry records and Registry users as well as remaining challenges and current developments.

  22. Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms.

    PubMed

    Rutkowski, Tomasz M

    2016-01-01

    The paper reviews nine robotic and virtual reality (VR) brain-computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI-lab research group during its association with the University of Tsukuba, Japan. The nine novel approaches are discussed as applications of tactile and auditory BCI technologies to direct brain-robot and brain-virtual-reality-agent control interfaces. The BCI user's intentions are decoded from brainwaves in real time using non-invasive electroencephalography (EEG) and translated into thought-based control of a symbiotic robot or virtual reality agent. A communication protocol between the BCI output and the robot or the virtual environment is realized in a symbiotic communication scenario using the user datagram protocol (UDP), which constitutes an internet of things (IoT) control scenario. Results obtained from healthy users reproducing simple brain-robot and brain-virtual-agent control tasks in online experiments support the research goal of interacting with robotic devices and virtual reality agents using symbiotic thought-based BCI technologies. An offline BCI classification accuracy boosting method, using a previously proposed information-geometry-derived approach, is also discussed in order to further support the reviewed robotic and virtual reality thought-based control paradigms.
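    The UDP link between the BCI output and the robot or VR agent can be sketched with stdlib sockets. The command vocabulary, host and port below are hypothetical; the abstract only states that UDP datagrams carry the decoded intentions:

```python
import socket

def send_bci_command(command, host="127.0.0.1", port=9999):
    """Send one decoded BCI intention to a robot/VR agent as a UDP datagram."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("utf-8"), (host, port))

def receive_bci_command(port=9999, timeout=1.0):
    """Agent side: block until one intention datagram arrives, then decode it."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("127.0.0.1", port))
        sock.settimeout(timeout)
        data, _addr = sock.recvfrom(1024)
        return data.decode("utf-8")
```

    UDP suits this scenario because a lost or late command is better dropped than replayed; each datagram carries one self-contained intention.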

  23. Robotic and Virtual Reality BCIs Using Spatial Tactile and Auditory Oddball Paradigms

    PubMed Central

    Rutkowski, Tomasz M.

    2016-01-01

    The paper reviews nine robotic and virtual reality (VR) brain–computer interface (BCI) projects developed by the author, in collaboration with his graduate students, within the BCI–lab research group during its association with the University of Tsukuba, Japan. The nine novel approaches are discussed as applications of tactile and auditory BCI technologies to direct brain-robot and brain-virtual-reality-agent control interfaces. The BCI user's intentions are decoded from brainwaves in real time using non-invasive electroencephalography (EEG) and translated into thought-based control of a symbiotic robot or virtual reality agent. A communication protocol between the BCI output and the robot or the virtual environment is realized in a symbiotic communication scenario using the user datagram protocol (UDP), which constitutes an internet of things (IoT) control scenario. Results obtained from healthy users reproducing simple brain-robot and brain-virtual-agent control tasks in online experiments support the research goal of interacting with robotic devices and virtual reality agents using symbiotic thought-based BCI technologies. An offline BCI classification accuracy boosting method, using a previously proposed information-geometry-derived approach, is also discussed in order to further support the reviewed robotic and virtual reality thought-based control paradigms. PMID:27999538

  24. Human-scale interaction for virtual model displays: a clear case for real tools

    NASA Astrophysics Data System (ADS)

    Williams, George C.; McDowall, Ian E.; Bolas, Mark T.

    1998-04-01

    We describe a hand-held user interface for interacting with virtual environments displayed on a Virtual Model Display. The tool, constructed entirely of transparent materials, is see-through. We render a graphical counterpart of the tool on the display and map it one-to-one with the real tool. This feature, combined with a capability for touch-sensitive, discrete input, results in a useful spatial input device that is visually versatile. We discuss the tool's design and the interaction techniques it supports. Briefly, we look at the human factors issues and engineering challenges presented by this tool and, in general, by the class of hand-held user interfaces that are see-through.

  25. Multi-modal cockpit interface for improved airport surface operations

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J. (Inventor); Bailey, Randall E. (Inventor); Prinzel, III, Lawrence J. (Inventor); Kramer, Lynda J. (Inventor); Williams, Steven P. (Inventor)

    2010-01-01

    A system for multi-modal cockpit interface during surface operation of an aircraft comprises a head tracking device, a processing element, and a full-color head worn display. The processing element is configured to receive head position information from the head tracking device, to receive current location information of the aircraft, and to render a virtual airport scene corresponding to the head position information and the current aircraft location. The full-color head worn display is configured to receive the virtual airport scene from the processing element and to display the virtual airport scene. The current location information may be received from one of a global positioning system or an inertial navigation system.

  1. Toward a Virtual Solar Observatory: Starting Before the Petabytes Fall

    NASA Technical Reports Server (NTRS)

    Gurman, J. B.; Fisher, Richard R. (Technical Monitor)

    2002-01-01

    NASA is currently engaged in the study phase of a modest effort to establish a Virtual Solar Observatory (VSO). The VSO would serve ground- and space-based solar physics data sets from a distributed network of archives through a small number of interfaces to the scientific community. The basis of this approach, as of all planned virtual observatories, is the translation of metadata from the various sources via source-specific dictionaries so the user will not have to distinguish among keyword usages. A single Web interface should give access to all the distributed data. We present the current status of the VSO, its initial scope, and its relation to the European EGSO effort.

  2. Open access for ALICE analysis based on virtualization technology

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is one of the important levers for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment's lifetime, it is crucial that third-party users from the scientific community have access to the data and the associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE-specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  3. Efficacy of a Virtual Teaching Assistant in an Open Laboratory Environment for Electric Circuits

    ERIC Educational Resources Information Center

    Saleheen, Firdous; Wang, Zicong; Picone, Joseph; Butz, Brian P.; Won, Chang-Hee

    2018-01-01

    In order to provide an on-demand, open electrical engineering laboratory, we developed an innovative software-based Virtual Open Laboratory Teaching Assistant (VOLTA). This web-based virtual assistant provides laboratory instructions, equipment usage videos, circuit simulation assistance, and hardware implementation diagnostics. VOLTA allows…

  4. Rehabilitation of activities of daily living in virtual environments with intuitive user interface and force feedback.

    PubMed

    Chiang, Vico Chung-Lim; Lo, King-Hung; Choi, Kup-Sze

    2017-10-01

    To investigate the feasibility of using a virtual rehabilitation system with an intuitive user interface and force feedback to improve skills in activities of daily living (ADL). A virtual training system equipped with haptic devices was developed for the rehabilitation of three ADL tasks - door unlocking, water pouring and meat cutting. Twenty subjects with upper limb disabilities, supervised by two occupational therapists, received four-session training using the system. The task completion time and the amount of water poured into a virtual glass were recorded. Performance of the three tasks in reality was assessed before and after the virtual training. Feedback from the participants was collected with questionnaires after the study. The completion time of the virtual tasks decreased during the training (p < 0.01) while the percentage of water successfully poured increased (p = 0.051). The score on the Borg scale of perceived exertion was 1.05 (SD = 1.85; 95% CI = 0.18-1.92) and that of the task-specific feedback questionnaire was 31 (SD = 4.85; 95% CI = 28.66-33.34). The feedback of the therapists suggested a positive rehabilitation effect, and the participants had a positive perception of the system. The system can potentially be used as a tool to complement conventional rehabilitation approaches for ADL. Implications for rehabilitation: Rehabilitation of activities of daily living can be facilitated using computer-assisted approaches. Existing approaches focus on cognitive training rather than manual skills. A virtual training system with an intuitive user interface and force feedback was designed to improve the learning of these manual skills. The study shows the system could be used as a training tool to complement conventional rehabilitation approaches.

  5. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Stang, David B.; Roth, Don J.

    1989-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and associated programming and subroutines are described. An example program involving computer controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.

  6. Integrating UniTree with the data migration API

    NASA Technical Reports Server (NTRS)

    Schrodel, David G.

    1994-01-01

    The Data Migration Application Programming Interface (DMAPI) has the potential to allow developers of open systems Hierarchical Storage Management (HSM) products to virtualize native file systems without the requirement to make changes to the underlying operating system. This paper describes advantages of virtualizing native file systems in hierarchical storage management systems, the DMAPI at a high level, what the goals are for the interface, and the integration of the Convex UniTree+HSM with DMAPI along with some of the benefits derived in the resulting product.

  7. New developments in digital pathology: from telepathology to virtual pathology laboratory.

    PubMed

    Kayser, Klaus; Kayser, Gian; Radziszowski, Dominik; Oehmann, Alexander

    2004-01-01

    To analyse the present status and future development of computerized diagnostic pathology in terms of workflow-integrative telepathology and the virtual laboratory. Telepathology has left its childhood. The technical development of telepathology is mature, in contrast to that of virtual pathology. Two kinds of virtual pathology laboratories are emerging: a) those with distributed pathologists and distributed (>=1) laboratories associated with individual biopsy stations/surgical theatres, and b) distributed pathologists working in a centralized laboratory. Both are under technical development. Telepathology can be used for e-learning and e-training in pathology, as demonstrated by Digital Lung Pathology (www.pathology-online.org). A virtual pathology institution (mode a) accepts a complete case with the patient's history, clinical findings, and (pre-selected) images for first diagnosis. The diagnostic responsibility is that of a conventional institution. The internet serves as the platform for information transfer, and an open server such as iPATH (http://telepath.patho.unibas.ch) serves for coordination and performance of the diagnostic procedure. The size of images has to be limited, and the usual different magnifications have to be used. A group of pathologists is "on duty", or selects one member for a predefined duty period. The diagnostic statement of the pathologist(s) on duty is transmitted back to the sender with full responsibility. First experiences of a virtual pathology institution group working with the iPATH server (Dr. L. Banach, Dr. G. Haroske, Dr. I. Hurwitz, Dr. K. Kayser, Dr. K.D. Kunze, Dr. M. Oberholzer) in cooperation with a small hospital on the Solomon Islands are promising. A centralized virtual pathology institution (mode b) depends upon the digitalisation of complete slides and the transfer of large images to different pathologists working in one institution.
The technical performance of complete slide digitalisation is still under development and does not at present completely fulfil the requirements of a conventional pathology institution. VIRTUAL PATHOLOGY AND E-LEARNING: At present, e-learning systems are "stand-alone" solutions distributed on CD or via the internet. A characteristic example is the Digital Lung Pathology CD (www.pathology-online.org), which includes about 60 different rare and common lung diseases and internet access to scientific library systems (PubMed), distant measurement servers (EuroQuant), and electronic journals (Elec J Pathol Histol). A new and complete database based upon this CD will combine e-learning and e-teaching with the actual workflow in a virtual pathology institution (mode a). The technological problems are solved and do not depend upon technical constraints such as slide scanning systems. Telepathology serves as a promoter of a new landscape in diagnostic pathology, the so-called virtual pathology institution. Industrial and scientific efforts will probably allow an implementation of this technique within the next two years.

  8. HyFinBall: A Two-Handed, Hybrid 2D/3D Desktop VR Interface for Visualization

    DTIC Science & Technology

    2013-01-01

    Describes the user interface (hardware and software), the design space, and preliminary results of a formal user study, conducted in the context of a rich visual analytics interface containing coordinated views with 2D and 3D visualizations. Keywords: virtual reality, user interface, two-handed interface, hybrid user interface, multi-touch, gesture.

  9. Alexa, Can I Trust You?

    PubMed Central

    Chung, Hyunji; Iorga, Michaela; Voas, Jeffrey; Lee, Sangjin

    2017-01-01

    Security diagnostics expose vulnerabilities and privacy threats that exist in commercial Intelligent Virtual Assistants (IVA) – diagnostics offer the possibility of securer IVA ecosystems. PMID:29213147

  10. An intelligent control and virtual display system for evolutionary space station workstation design

    NASA Technical Reports Server (NTRS)

    Feng, Xin; Niederjohn, Russell J.; Mcgreevy, Michael W.

    1992-01-01

    Research and development of the Advanced Display and Computer Augmented Control System (ADCACS) for the space station Body-Ported Cupola Virtual Workstation (BP/VCWS) were pursued. The potential applications of body-ported virtual display and intelligent control technology for human-system interfacing in the space station environment were explored. The new system is designed to enable crew members to control and monitor a variety of space operations with greater flexibility and efficiency than existing fixed consoles. The technologies being studied include helmet-mounted virtual displays, voice and special command input devices, and microprocessor-based intelligent controllers. Several research topics, such as human factors, decision support expert systems, and wide-field-of-view color displays, are being addressed. The study showed the significant advantages of this uniquely integrated display and control system, and its feasibility for human-system interfacing applications in the space station command and control environment.

  11. Detection of Bone Marrow Edema in Nondisplaced Hip Fractures: Utility of a Virtual Noncalcium Dual-Energy CT Application.

    PubMed

    Kellock, Trenton T; Nicolaou, Savvas; Kim, Sandra S Y; Al-Busaidi, Sultan; Louis, Luck J; O'Connell, Tim W; Ouellette, Hugue A; McLaughlin, Patrick D

    2017-09-01

    Purpose To quantify the sensitivity and specificity of dual-energy computed tomographic (CT) virtual noncalcium images in the detection of nondisplaced hip fractures and to assess whether obtaining these images as a complement to bone reconstructions alters sensitivity, specificity, or diagnostic confidence. Materials and Methods The clinical research ethics board approved chart review, and the requirement to obtain informed consent was waived. The authors retrospectively identified 118 patients who presented to a level 1 trauma center emergency department and who underwent dual-energy CT for suspicion of a nondisplaced traumatic hip fracture. Clinical follow-up was the standard of reference. Three radiologists interpreted virtual noncalcium images for traumatic bone marrow edema. Bone reconstructions for the same cases were interpreted alone and then with virtual noncalcium images. Diagnostic confidence was rated on a scale of 1 to 10. McNemar, Fleiss κ, and Wilcoxon signed-rank tests were used for statistical analysis. Results Twenty-two patients had nondisplaced hip fractures and 96 did not have hip fractures. Sensitivity with virtual noncalcium images was 77% and 91% (17 and 20 of 22 patients), and specificity was 92%-99% (89-95 of 96 patients). Sensitivity increased by 4%-5% over that with bone reconstruction images alone for two of the three readers when both bone reconstruction and virtual noncalcium images were used. Specificity remained unchanged (99% and 100%). Diagnostic confidence in the exclusion of fracture was improved with combined bone reconstruction and virtual noncalcium images (median score: 10, 9, and 10 for readers 1, 2, and 3, respectively) compared with bone reconstruction images alone (median score: 9, 8, and 9). 
Conclusion When used as a supplement to standard bone reconstructions, dual-energy CT virtual noncalcium images increased sensitivity for the detection of nondisplaced traumatic hip fractures and improved diagnostic confidence in the exclusion of these fractures. © RSNA, 2017 Online supplemental material is available for this article. An earlier incorrect version of this article appeared online. This article was corrected on March 17, 2017.
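
    The reported per-reader figures follow directly from 2×2 contingency-table counts given in the abstract. A minimal sketch (the helper function is illustrative, not from the paper; the counts correspond to a reader who detected 20 of 22 fractures and correctly excluded 95 of 96 non-fractures):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

# 20 of 22 fractures detected; 95 of 96 non-fractures correctly excluded
sens, spec = sensitivity_specificity(tp=20, fn=2, tn=95, fp=1)
print(round(sens * 100), round(spec * 100))  # 91 99
```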

  12. [Virtual clinical diagnosis support system of degenerative stenosis of the lumbar spinal canal].

    PubMed

    Shevelev, I N; Konovalov, N A; Cherkashov, A M; Molodchenkov, A A; Sharamko, T G; Asiutin, D S; Nazarenko, A G

    2013-01-01

    The aim of the study was to develop a virtual clinical diagnostic support system for degenerative lumbar spinal stenosis based on a spine registry database. Diagnostic criteria were selected through symptom analysis of 298 patients with lumbar spinal stenosis. A group of patients with disc herniation was also analysed to assess the sensitivity and specificity of the developed diagnostic support system. The presented clinical diagnostic support system allows patients with degenerative lumbar spinal stenosis to be identified at the stage of the primary visit. System sensitivity and specificity are 90% and 71%, respectively. The "online" mode of the diagnostic system within the spine registry provides maximal availability for specialists, regardless of their location. The development of "medicine 2.0" tools, with centralized data collection by means of specialized registries, is a promising direction for further research.

  13. Virtual reality and brain computer interface in neurorehabilitation

    PubMed Central

    Dahdah, Marie; Driver, Simon; Parsons, Thomas D.; Richter, Kathleen M.

    2016-01-01

    The potential benefit of technology to enhance recovery after central nervous system injuries is an area of increasing interest and exploration. The primary emphasis to date has been motor recovery/augmentation and communication. This paper introduces two original studies to demonstrate how advanced technology may be integrated into subacute rehabilitation. The first study addresses the feasibility of brain computer interface with patients on an inpatient spinal cord injury unit. The second study explores the validity of two virtual environments with acquired brain injury as part of an intensive outpatient neurorehabilitation program. These preliminary studies support the feasibility of advanced technologies in the subacute stage of neurorehabilitation. These modalities were well tolerated by participants and could be incorporated into patients' inpatient and outpatient rehabilitation regimens without schedule disruptions. This paper expands the limited literature base regarding the use of advanced technologies in the early stages of recovery for neurorehabilitation populations and speaks favorably to the potential integration of brain computer interface and virtual reality technologies as part of a multidisciplinary treatment program. PMID:27034541

  14. Exploring the simulation requirements for virtual regional anesthesia training

    NASA Astrophysics Data System (ADS)

    Charissis, V.; Zimmer, C. R.; Sakellariou, S.; Chan, W.

    2010-01-01

    This paper presents an investigation of the simulation requirements for virtual regional anaesthesia training. To this end we have developed a prototype human-computer interface designed to facilitate Virtual Reality (VR) augmented educational tactics for regional anaesthesia training. The proposed interface system aims to complement nerve-blocking techniques. The system is designed to operate in a real-time 3D environment, presenting anatomical information and enabling the user to explore the spatial relation of different human parts without any physical constraints. Furthermore, the proposed system aims to help trainee anaesthetists build a mental, three-dimensional map of the anatomical elements and their relationship to the ultrasound imaging used for navigation of the anaesthetic needle. Opting for a sophisticated approach to interaction, the interface elements are based on simplified visual representations of real objects, and can be operated through haptic devices and surround auditory cues. This paper discusses the challenges involved in the HCI design, introduces the visual components of the interface, and presents a tentative plan of future work, which involves the development of realistic haptic feedback and various regional anaesthesia training scenarios.

  15. Enabling the mission through trans-atlantic remote mentored musculoskeletal ultrasound: case report of a portable hand-carried tele-ultrasound system for medical relief missions.

    PubMed

    Kirkpatrick, Andrew W; Blaivas, Michael; Sargsyan, Ashot E; McBeth, Paul B; Patel, Chirag; Xiao, Zhengwen; Pian, Linping; Panebianco, Nova; Hamilton, Douglas R; Ball, Chad G; Dulchavsky, Scott A

    2013-07-01

    Modern medical practice has become extremely dependent upon diagnostic imaging technologies to confirm the results of clinical examination and to guide the response to therapies. Of the various diagnostic imaging techniques, ultrasound is the most portable modality and one that is repeatable, dynamic, relatively cheap, and safe as long as the imaging provided is accurately interpreted. It is, however, the most user-dependent, a characteristic that has prompted the development of remote guidance techniques, wherein remote experts guide distant users through the use of information technologies. Medical mission work often brings specialist physicians to less developed locations, where they wish to provide the highest levels of care but are often bereft of the diagnostic imaging resources on which they depend. Furthermore, if these personnel become ill or injured, the care they receive may not be to the standard they left at home. We herein report the utilization of a compact hand-carried remote tele-ultrasound system that allowed real-time diagnosis and follow-up of an acutely torn adductor muscle by a team of ultrasonographers, surgeons, and physicians. The patient was one of the mission surgeons, who was guided to self-image. The virtual network of supporting experts was located across North America, whereas the patient was in Lome, Togo, West Africa. The system consisted of a hand-carried ultrasound, the output of which was digitized and streamed to the experts within standard voice-over-Internet-protocol software with an embedded simultaneous videocamera image of the ultrasonographer's hands using a customized graphical user interface. The practical concept of a virtual tele-ultrasound support network was illustrated through the clinical guidance of multiple physicians, including National Aeronautics and Space Administration Medical Operations remote guiders, Olympic team-associated surgeons, and ultrasound-focused emergentologists.

  16. Lessons learned from an Ada conversion project

    NASA Technical Reports Server (NTRS)

    Porter, Tim

    1988-01-01

    Background; SAVVAS architecture; software portability; history of Ada; isolation of non-portable code; simple terminal interface package; constraints of language features; and virtual interfaces are outlined. This presentation is represented by viewgraphs only.

  17. Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation

    PubMed Central

    2011-01-01

    This paper covers the use of depth sensors such as Microsoft Kinect and ASUS Xtion to provide a natural user interface (NUI) for controlling 3-D (three-dimensional) virtual globes such as Google Earth (including its Street View mode), Bing Maps 3D, and NASA World Wind. The paper introduces the Microsoft Kinect device, briefly describing how it works (the underlying technology by PrimeSense), as well as its market uptake and application potential beyond its original intended purpose as a home entertainment and video game controller. The different software drivers available for connecting the Kinect device to a PC (Personal Computer) are also covered, and their comparative pros and cons briefly discussed. We survey a number of approaches and application examples for controlling 3-D virtual globes using the Kinect sensor, then describe Kinoogle, a Kinect interface for natural interaction with Google Earth, developed by students at Texas A&M University. Readers interested in trying out the application on their own hardware can download a Zip archive (included with the manuscript as additional files 1, 2, &3) that contains a 'Kinnogle installation package for Windows PCs'. Finally, we discuss some usability aspects of Kinoogle and similar NUIs for controlling 3-D virtual globes (including possible future improvements), and propose a number of unique, practical 'use scenarios' where such NUIs could prove useful in navigating a 3-D virtual globe, compared to conventional mouse/3-D mouse and keyboard-based interfaces. PMID:21791054

  18. Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation.

    PubMed

    Boulos, Maged N Kamel; Blanchard, Bryan J; Walker, Cory; Montero, Julio; Tripathy, Aalap; Gutierrez-Osuna, Ricardo

    2011-07-26

    This paper covers the use of depth sensors such as Microsoft Kinect and ASUS Xtion to provide a natural user interface (NUI) for controlling 3-D (three-dimensional) virtual globes such as Google Earth (including its Street View mode), Bing Maps 3D, and NASA World Wind. The paper introduces the Microsoft Kinect device, briefly describing how it works (the underlying technology by PrimeSense), as well as its market uptake and application potential beyond its original intended purpose as a home entertainment and video game controller. The different software drivers available for connecting the Kinect device to a PC (Personal Computer) are also covered, and their comparative pros and cons briefly discussed. We survey a number of approaches and application examples for controlling 3-D virtual globes using the Kinect sensor, then describe Kinoogle, a Kinect interface for natural interaction with Google Earth, developed by students at Texas A&M University. Readers interested in trying out the application on their own hardware can download a Zip archive (included with the manuscript as additional files 1, 2, &3) that contains a 'Kinnogle installation package for Windows PCs'. Finally, we discuss some usability aspects of Kinoogle and similar NUIs for controlling 3-D virtual globes (including possible future improvements), and propose a number of unique, practical 'use scenarios' where such NUIs could prove useful in navigating a 3-D virtual globe, compared to conventional mouse/3-D mouse and keyboard-based interfaces.

  19. Evaluation of navigation interfaces in virtual environments

    NASA Astrophysics Data System (ADS)

    Mestre, Daniel R.

    2014-02-01

    When users are immersed in cave-like virtual reality systems, navigational interfaces have to be used when the size of the virtual environment becomes larger than the physical extent of the cave floor. However, using navigation interfaces, physically static users experience self-motion (visually-induced vection). As a consequence, sensorial incoherence between vision (indicating self-motion) and other proprioceptive inputs (indicating immobility) can make them feel dizzy and disoriented. We tested, in two experimental studies, different locomotion interfaces. The objective was twofold: testing spatial learning and cybersickness. In a first experiment, using first-person navigation with a Flystick®, we tested the effect of sensorial aids, a spatialized sound or guiding arrows on the ground, attracting the user toward the goal of the navigation task. Results revealed that sensorial aids tended to negatively impact spatial learning. Moreover, subjects reported significant levels of cybersickness. In a second experiment, we tested whether such negative effects could be due to poorly controlled rotational motion during simulated self-motion. Subjects used a gamepad, in which rotational and translational displacements were independently controlled by two joysticks. Furthermore, we tested first- versus third-person navigation. No significant difference was observed between these two conditions. Overall, cybersickness tended to be lower, as compared to experiment 1, but the difference was not significant. Future research should evaluate further the hypothesis of the role of passively perceived optical flow in cybersickness, by manipulating the virtual environment's structure. It also seems that video-gaming experience might be involved in the user's sensitivity to cybersickness.

  20. Assessing and training standing balance in older adults: a novel approach using the 'Nintendo Wii' Balance Board.

    PubMed

    Young, William; Ferguson, Stuart; Brault, Sébastien; Craig, Cathy

    2011-02-01

    Older adults, deemed to be at a high risk of falling, are often unable to participate in dynamic exercises due to physical constraints and/or a fear of falling. Using the Nintendo 'Wii Balance Board' (WBB) (Nintendo, Kyoto, Japan), we have developed an interface that allows a user to accurately calculate a participant's centre of pressure (COP) and incorporate it into a virtual environment to create bespoke diagnostic or training programmes that exploit real-time visual feedback of current COP position. This platform allows researchers to design, control and validate tasks that both train and test balance function. This technology provides a safe, adaptable and low-cost balance training/testing solution for older adults, particularly those at high-risk of falling. Copyright © 2010 Elsevier B.V. All rights reserved.
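
    The WBB exposes four corner load-cell readings, from which a COP estimate can be computed with the standard weighted-average formula. A minimal sketch, assuming board dimensions of roughly 433 × 228 mm; the function and sensor names are illustrative, not from the paper:

```python
def centre_of_pressure(tl, tr, bl, br, length_mm=433.0, width_mm=228.0):
    """Estimate COP (x, y) in mm from the four corner load-cell readings.

    x is positive toward the right-hand sensors, y positive toward the
    top sensors; (0, 0) is the centre of the board.
    """
    total = tl + tr + bl + br
    if total <= 0:
        raise ValueError("no load on the board")
    x = (length_mm / 2.0) * ((tr + br) - (tl + bl)) / total
    y = (width_mm / 2.0) * ((tl + tr) - (bl + br)) / total
    return x, y

# Even load on all four sensors -> COP at the centre of the board
print(centre_of_pressure(10, 10, 10, 10))  # (0.0, 0.0)
```

    Streaming this estimate into the virtual environment at the board's sampling rate is what enables the real-time visual feedback of COP position described above.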

  1. PLIF Study of Mars Science Laboratory Capsule Reaction Control System Jets

    NASA Technical Reports Server (NTRS)

    Johansen, C. T.; Danehy, P. M.; Ashcraft, S. W.; Bathel, B. F.; Inman, J. A.; Jones, S. B.

    2011-01-01

    Nitric-oxide planar laser-induced fluorescence (NO PLIF) was used to visualize the flow in the wake of a Mars Science Lab (MSL) entry capsule with activated reaction control system (RCS) jets in NASA Langley Research Center s 31-Inch Mach 10 Air Tunnel facility. Images were processed using the Virtual Diagnostics Interface (ViDI) method, which brings out the three-dimensional nature of the flow visualization data while showing the relative location of the data with respect to the model. Comparison of wind-on and wind-off results illustrates the effect that the hypersonic crossflow has on the trajectory and structure of individual RCS jets. The visualization and comparison of both single and multiple activated RCS jets indicate low levels of jet-jet interaction. Quantitative streamwise velocity was also obtained via NO PLIF molecular tagging velocimetry (MTV).

  2. PLIF Study of Mars Science Laboratory Capsule Reaction Control System Jets

    NASA Technical Reports Server (NTRS)

    Johansen, C. T.; Danehy, P. M.; Ashcraft, S. W.; Bathel, B. F.; Inman, J. A.; Jones, S. B.

    2011-01-01

    Nitric-oxide planar laser-induced fluorescence (NO PLIF) was used to visualize the flow in the wake of a Mars Science Lab (MSL) entry capsule with activated reaction control system (RCS) jets in NASA Langley Research Center's 31-Inch Mach 10 Air Tunnel facility. Images were processed using the Virtual Diagnostics Interface (ViDI) method, which brings out the three-dimensional nature of the flow visualization data while showing the relative location of the data with respect to the model. Comparison of wind-on and wind-off results illustrates the effect that the hypersonic crossflow has on the trajectory and structure of individual RCS jets. The visualization and comparison of both single and multiple activated RCS jets indicate low levels of jet-jet interaction. Quantitative streamwise velocity was also obtained via NO PLIF molecular tagging velocimetry (MTV).

  3. Cortical Spiking Network Interfaced with Virtual Musculoskeletal Arm and Robotic Arm

    PubMed Central

    Dura-Bernal, Salvador; Zhou, Xianlian; Neymotin, Samuel A.; Przekwas, Andrzej; Francis, Joseph T.; Lytton, William W.

    2015-01-01

    Embedding computational models in the physical world is a critical step towards constraining their behavior and building practical applications. Here we aim to drive a realistic musculoskeletal arm model using a biomimetic cortical spiking model, and make a robot arm reproduce the same trajectories in real time. Our cortical model consisted of a 3-layered cortex, composed of several hundred spiking model-neurons, which display physiologically realistic dynamics. We interconnected the cortical model to a two-joint musculoskeletal model of a human arm, with realistic anatomical and biomechanical properties. The virtual arm received muscle excitations from the neuronal model, and fed back proprioceptive information, forming a closed-loop system. The cortical model was trained using spike timing-dependent reinforcement learning to drive the virtual arm in a 2D reaching task. Limb position was used to simultaneously control a robot arm using an improved network interface. Virtual arm muscle activations responded to motoneuron firing rates, with virtual arm muscle lengths encoded via population coding in the proprioceptive population. After training, the virtual arm performed reaching movements which were smoother and more realistic than those obtained using a simplistic arm model. This system provided access to both spiking network properties and to arm biophysical properties, including muscle forces. The use of a musculoskeletal virtual arm and the improved control system allowed the robot arm to perform movements which were smoother than those reported in our previous paper using a simplistic arm. This work provides a novel approach consisting of bidirectionally connecting a cortical model to a realistic virtual arm, and using the system output to drive a robotic arm in real time. 
Our techniques are applicable to the future development of brain neuroprosthetic control systems, and may enable enhanced brain-machine interfaces with the possibility for finer control of limb prosthetics. PMID:26635598

  4. Naval Applications of Virtual Reality,

    DTIC Science & Technology

    1993-01-01

    Published in Virtual Reality Special Report, pp. 67-72. Subject terms: man-machine interface, virtual reality, decision support. By Mark Gembicki and David Rousseau.

  5. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2011-03-01

    Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to the developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks, pervasive wireless communication; knowledge based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies, and their role in the future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The Virtual product creation environment will significantly enhance the productivity and will stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.

  6. Ownership and Agency of an Independent Supernumerary Hand Induced by an Imitation Brain-Computer Interface.

    PubMed

    Bashford, Luke; Mehring, Carsten

    2016-01-01

    To study body ownership and control, illusions that elicit these feelings in non-body objects are widely used. Classically introduced with the Rubber Hand Illusion, these illusions have been replicated more recently in virtual reality and by using brain-computer interfaces. Traditionally these illusions investigate the replacement of a body part by an artificial counterpart, however as brain-computer interface research develops it offers us the possibility to explore the case where non-body objects are controlled in addition to movements of our own limbs. Therefore we propose a new illusion designed to test the feeling of ownership and control of an independent supernumerary hand. Subjects are under the impression they control a virtual reality hand via a brain-computer interface, but in reality there is no causal connection between brain activity and virtual hand movement but correct movements are observed with 80% probability. These imitation brain-computer interface trials are interspersed with movements in both the subjects' real hands, which are in view throughout the experiment. We show that subjects develop strong feelings of ownership and control over the third hand, despite only receiving visual feedback with no causal link to the actual brain signals. Our illusion is crucially different from previously reported studies as we demonstrate independent ownership and control of the third hand without loss of ownership in the real hands.

  7. Virtual Reality: Real Promises and False Expectations.

    ERIC Educational Resources Information Center

    Homan, Willem J.

    1994-01-01

    Examines virtual reality (VR), and discusses the dilemma of defining VR, the limitations of the current technology, and the implications of VR for education. Highlights include a VR experience; human factors and the interface; and altered reality versus VR. (Author/AEF)

  8. Development of a multimodal transportation educational virtual appliance (MTEVA) to study congestion during extreme tropical events.

    DOT National Transportation Integrated Search

    2011-11-28

    In this study, a prototype Multimodal Transportation Educational Virtual Appliance (MTEVA) is developed to assist in transportation and cyberinfrastructure undergraduate education. This initial version of the MTEVA provides a graphical user interface...

  9. Virtual reality aided visualization of fluid flow simulations with application in medical education and diagnostics.

    PubMed

    Djukic, Tijana; Mandic, Vesna; Filipovic, Nenad

    2013-12-01

    Medical education, training and preoperative diagnostics can be drastically improved with advanced technologies, such as virtual reality. The method proposed in this paper enables medical doctors and students to visualize and manipulate three-dimensional models created from CT or MRI scans, and also to analyze the results of fluid flow simulations. Simulation of fluid flow using the finite element method is performed, in order to compute the shear stress on the artery walls. The simulation of motion through the artery is also enabled. The virtual reality system proposed here could shorten the length of training programs and make the education process more effective. © 2013 Published by Elsevier Ltd.
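
    For a Newtonian fluid such as the blood models typically used in these simulations, the wall shear stress computed from the flow solution reduces to τ = μ·∂u/∂y evaluated at the wall. A minimal sketch of a first-order near-wall approximation (the values and the helper are illustrative, not from the paper):

```python
def wall_shear_stress(mu, u_near_wall, dy):
    """Newtonian wall shear stress tau = mu * du/dy, approximated by a
    first-order finite difference using the velocity at the first
    off-wall node (no-slip condition: u = 0 at the wall itself)."""
    return mu * u_near_wall / dy

# Blood-like viscosity (3.5e-3 Pa.s), 0.1 m/s at 0.2 mm from the wall
tau = wall_shear_stress(3.5e-3, 0.1, 0.2e-3)
print(tau)  # ~1.75 Pa
```

    In a finite element solution the gradient would instead be taken from the element shape-function derivatives, but the quantity visualized on the artery wall is the same.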

  10. Shared virtual environments for telerehabilitation.

    PubMed

    Popescu, George V; Burdea, Grigore; Boian, Rares

    2002-01-01

    Current VR telerehabilitation systems use offline remote monitoring from the clinic and patient-therapist videoconferencing. Such "store and forward" and video-based systems cannot implement medical services involving direct patient-therapist interaction. Real-time telerehabilitation applications (including remote therapy) can be developed using a shared Virtual Environment (VE) architecture. We developed a two-user shared VE for hand telerehabilitation. Each site has a telerehabilitation workstation with a videocamera and a Rutgers Master II (RMII) force feedback glove. Each user can control a virtual hand and interact haptically with virtual objects. Simulated physical interactions between therapist and patient are implemented using hand force feedback. The therapist's graphic interface contains several virtual panels, which allow control over the rehabilitation process. These controls start a videoconferencing session, collect patient data, or apply therapy. Several experimental telerehabilitation scenarios were successfully tested on a LAN. A Web-based approach to "real-time" patient telemonitoring--the monitoring portal for hand telerehabilitation--was also developed. The therapist interface is implemented as a Java3D applet that monitors patient hand movement. The monitoring portal gives real-time performance on off-the-shelf desktop workstations.

  11. Diagnostic pathology in 2012: development of digital pathology in an open access journal

    PubMed Central

    2013-01-01

    Herein we describe and interpret the digital world of diagnostic surgical pathology, taking the leading open access journal in pathology, Diagnostic Pathology, as an example. Virtual slide: http://www.diagnosticpathology.diagnomx.eu/vs/1944221953867351 PMID:23305209

  12. Research and realization of signal simulation on virtual instrument

    NASA Astrophysics Data System (ADS)

    Zhao, Qi; He, Wenting; Guan, Xiumei

    2010-02-01

    In engineering projects, an arbitrary waveform generator controlled through a software interface is needed for simulation and test. This article discusses a program that uses the SCPI (Standard Commands for Programmable Instruments) protocol and the VISA (Virtual Instrument System Architecture) library to control an Agilent signal generator (Agilent N5182A) by instrument communication over the LAN interface. The program can generate several signal types, including CW (continuous wave), AM (amplitude modulation), FM (frequency modulation), ΦM (phase modulation), and sweep. As a result, the program has good operability and portability.
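
    The control flow described amounts to sending SCPI command strings over a VISA session. A minimal sketch that only formats the command strings (the helper functions are hypothetical, and the command mnemonics follow the common SCPI pattern, so they should be confirmed against the N5182A programming guide; in practice each string would be written to the instrument through a VISA LAN resource such as a pyvisa TCPIP INSTR session):

```python
def scpi_cw(freq_hz, power_dbm):
    """SCPI command sequence for a CW output at the given frequency/level."""
    return [f"FREQ {freq_hz} Hz", f"POW {power_dbm} dBm", "OUTP ON"]

def scpi_am(depth_pct, rate_hz):
    """SCPI command sequence enabling AM from the internal modulation source."""
    return [f"AM:DEPT {depth_pct}", f"AM:INT:FREQ {rate_hz} Hz", "AM:STAT ON"]

# 1 GHz carrier at -10 dBm
for cmd in scpi_cw(1e9, -10):
    print(cmd)
```

    Keeping command construction separate from the VISA transport, as sketched here, is one way to obtain the portability the abstract claims: the same strings can be sent over LAN, GPIB, or USB sessions.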

  13. The Next Wave: Humans, Computers, and Redefining Reality

    NASA Technical Reports Server (NTRS)

    Little, William

    2018-01-01

    The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.

  14. Hearing in True 3-D

    NASA Technical Reports Server (NTRS)

    2004-01-01

    In 1984, researchers from Ames Research Center came together to develop advanced human interfaces for NASA's teleoperations that would come to be known as "virtual reality." The basis of the work was the theory that if the sensory interfaces met a certain threshold and sufficiently supported each other, the operator would feel present in the remote/synthetic environment, rather than in their physical location. Twenty years later, this prolific research continues to pay dividends to society in the form of cutting-edge virtual reality products, such as an interactive audio simulation system.

  15. Diagnostic apparatus and method for use in the alignment of one or more laser means onto a fiber optics interface

    DOEpatents

    Johnson, Steve A.; Shannon, Robert R.

    1987-01-01

    Diagnostic apparatus for use in determining the proper alignment of a plurality of laser beams onto a fiber optics interface is disclosed. The apparatus includes a lens assembly which serves two functions, first to focus a plurality of laser beams onto the fiber optics interface, and secondly to reflect and image the interface using scattered light to a monitor means. The monitor means permits indirect observation of the alignment or focusing of the laser beams onto the fiber optics interface.

  16. Diagnostic apparatus and method for use in the alignment of one or more laser means onto a fiber optics interface

    DOEpatents

    Johnson, S.A.; Shannon, R.R.

    1985-01-18

    Diagnostic apparatus for use in determining the proper alignment of a plurality of laser beams onto a fiber optics interface is disclosed. The apparatus includes a lens assembly which serves two functions, first to focus a plurality of laser beams onto the fiber optics interface, and secondly to reflect and image the interface using scattered light to a monitor means. The monitor means permits indirect observation of the alignment or focusing of the laser beams onto the fiber optics interface.

  17. Digital approach to planning computer-guided surgery and immediate provisionalization in a partially edentulous patient.

    PubMed

    Arunyanak, Sirikarn P; Harris, Bryan T; Grant, Gerald T; Morton, Dean; Lin, Wei-Shao

    2016-07-01

    This report describes a digital approach for computer-guided surgery and immediate provisionalization in a partially edentulous patient. With diagnostic data obtained from cone-beam computed tomography and intraoral digital diagnostic scans, a digital pathway of virtual diagnostic waxing, a virtual prosthetically driven surgical plan, a computer-aided design and computer-aided manufacturing (CAD/CAM) surgical template, and implant-supported screw-retained interim restorations were realized with various open-architecture CAD/CAM systems. The optional CAD/CAM diagnostic casts with planned implant placement were also additively manufactured to facilitate preoperative inspection of the surgical template and customization of the CAD/CAM-fabricated interim restorations. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  18. Virtual displays for 360-degree video

    NASA Astrophysics Data System (ADS)

    Gilbert, Stephen; Boonsuk, Wutthigrai; Kelly, Jonathan W.

    2012-03-01

    In this paper we describe a novel approach for comparing users' spatial cognition when using different depictions of 360-degree video on a traditional 2D display. By using virtual cameras within a game engine and texture mapping of these camera feeds to an arbitrary shape, we were able to offer users a 360-degree interface composed of four 90-degree views, two 180-degree views, or one 360-degree view of the same interactive environment. An example experiment is described using these interfaces. This technique for creating alternative displays of wide-angle video facilitates the exploration of how compressed or fish-eye distortions affect spatial perception of the environment and can benefit the creation of interfaces for surveillance and remote system teleoperation.
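
    The division of the panorama into equal-width views can be expressed as simple yaw spans per virtual camera. A minimal sketch, with an illustrative function name and degree convention not taken from the paper:

```python
# Sketch: yaw spans for dividing a 360-degree panorama among n virtual cameras,
# e.g. four 90-degree views, two 180-degree views, or one 360-degree view.

def view_spans(n_views):
    """Return (start_deg, end_deg) horizontal spans for n equal-width cameras."""
    width = 360.0 / n_views
    return [(i * width, (i + 1) * width) for i in range(n_views)]
```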

  19. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia

    PubMed Central

    Segkouli, Sofia; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI), such as usability and utility, through a large number of analytic, usability-oriented approaches as cognitive models, in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules, embodied in cognitive architectures, that can detect abnormal cognitive decline across new synthetic task environments. Accessibility evaluation of graphical user interfaces (GUIs) likewise requires considerable effort to enhance the accessibility of ICT products for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multiple tasks, and monitored the performance of infotainment-related tasks. This provides more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interface design, supported by increased task complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable interface evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282

  20. Virtual environment architecture for rapid application development

    NASA Technical Reports Server (NTRS)

    Grinstein, Georges G.; Southard, David A.; Lee, J. P.

    1993-01-01

    We describe the MITRE Virtual Environment Architecture (VEA), a product of nearly two years of investigations and prototypes of virtual environment technology. This paper discusses the requirements for rapid prototyping, and an architecture we are developing to support virtual environment construction. VEA supports rapid application development by providing a variety of pre-built modules that can be reconfigured for each application session. The modules supply interfaces for several types of interactive I/O devices, in addition to large-screen or head-mounted displays.

  1. BacNet and Analog/Digital Interfaces of the Building Controls Virtual Testbed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nouidui, Thierry Stephane; Wetter, Michael; Li, Zhengwei

    2011-11-01

    This paper gives an overview of recent developments in the Building Controls Virtual Test Bed (BCVTB), a framework for co-simulation and hardware-in-the-loop. First, a general overview of the BCVTB is presented. Second, we describe the BACnet interface, a link which has been implemented to couple BACnet devices to the BCVTB. We present a case study where the interface was used to couple a whole building simulation program to a building control system to assess in real-time the performance of a real building. Third, we present the ADInterfaceMCC, an analog/digital interface that allows a USB-based analog/digital converter to be linked to the BCVTB. In a case study, we show how the link was used to couple the analog/digital converter to a building simulation model for local loop control.
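
    The coupling described in this record, a building model and a controller exchanging values at fixed synchronization points, can be sketched as a lock-step loop. This is a schematic illustration only: the real BCVTB is built on Ptolemy II actors, and the function and signal names below are assumptions.

```python
# Schematic co-simulation loop: a building model and a controller exchange
# values at fixed synchronization time steps, as in hardware-in-the-loop setups.
# Illustrative only; not the BCVTB API.

def cosimulate(model_step, control_step, dt, t_end, y0, u0):
    """Advance model and controller in lock-step, exchanging outputs each dt."""
    t, y, u = 0.0, y0, u0
    trace = []
    while t < t_end:
        y = model_step(y, u, dt)   # building simulation advances one step
        u = control_step(y)        # controller reacts to the new measurement
        t += dt
        trace.append((t, y, u))
    return trace
```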

  2. Evaluation of the Next-Gen Exercise Software Interface in the NEEMO Analog

    NASA Technical Reports Server (NTRS)

    Hanson, Andrea; Kalogera, Kent; Sandor, Aniko; Hardy, Marc; Frank, Andrew; English, Kirk; Williams, Thomas; Perera, Jeevan; Amonette, William

    2017-01-01

    NSBRI (National Space Biomedical Research Institute) funded research grant to develop the 'NextGen' exercise software for the NEEMO (NASA Extreme Environment Mission Operations) analog. Develop a software architecture to integrate instructional, motivational and socialization techniques into a common portal to enhance exercise countermeasures in remote environments. Increase user efficiency and satisfaction, and institute commonality across multiple exercise systems. Utilized GUI (Graphical User Interface) design principles focused on intuitive ease of use to minimize training time and realize early user efficiency. Project requirement to test the software in an analog environment. Top Level Project Aims: 1) Improve the usability of crew interface software to exercise CMS (Crew Management System) through common app-like interfaces. 2) Introduce virtual instructional motion training. 3) Use virtual environment to provide remote socialization with family and friends, improve exercise technique, adherence, motivation and ultimately performance outcomes.

  3. The electronic-commerce-oriented virtual merchandise model

    NASA Astrophysics Data System (ADS)

    Fang, Xiaocui; Lu, Dongming

    2004-03-01

    Electronic commerce has become the trend in commercial activity. Provided with a virtual reality interface, electronic commerce gains better expressive capacity and means of interaction. However, in most applications of virtual reality technology in e-commerce, the 3D model is only an appearance description of the merchandise, carrying almost no commerce or interaction information. This results in a disjunction between the virtual model and the commerce information. We therefore present the Electronic Commerce oriented Virtual Merchandise Model (ECVMM), which combines commerce information, interaction information, and figure information of virtual merchandise in a single model. With this abundant information, ECVMM provides better support for information acquisition and communication in electronic commerce.

  4. TeleMed: An example of a new system developed with object technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forslund, D.; Phillips, R.; Tomlinson, B.

    1996-12-01

    Los Alamos National Laboratory has developed a virtual patient record system called TeleMed which is based on a distributed national radiographic and patient record repository located throughout the country. Without leaving their offices, participating doctors can view clinical drug and radiographic data via a sophisticated multimedia interface. For example, a doctor can match a patient's radiographic information with the data in the repository, review treatment history and success, and then determine the best treatment. Furthermore, the features of TeleMed that make it attractive to clinicians and diagnosticians make it valuable for teaching and presentation as well. Thus, a resident can use TeleMed for self-training in diagnostic techniques and a physician can use it to explain to a patient the course of their illness. In fact, the data can be viewed simultaneously by users at two or more distant locations for consultation with specialists in different fields. This capability is of enormous value to a wide spectrum of healthcare providers. It is made possible by the integration of multimedia information using commercial CORBA technology linking object-enabled databases with client interfaces using a three-tiered architecture.

  5. What CORBA can do: An example of a new system developed with object technology: TeleMed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forslund, D.; Phillips, R.; Tomlinson, B.

    1996-05-01

    The TeleMed application grew out of a relationship with physicians at the National Jewish Center for Immunology and Respiratory Medicine (NJC) in Denver. These physicians are experts in pulmonary diseases and radiology, helping patients combat effects of TB and other lung diseases. To make the knowledge and experience at NJC available to a wider audience, LANL has developed a virtual patient record system called TeleMed which is based on a distributed national radiographic and patient record repository located throughout the country. Without leaving their offices, participating doctors can view clinical drug and radiographic data via a sophisticated multimedia interface. TeleMed is also valuable for teaching and presentation. Thus a resident can use TeleMed for self-training in diagnostic techniques and a physician can use it to explain to a patient the course of their illness. Data can be viewed simultaneously by users at two or more distant locations for consultation with specialists in different fields. This capability is made possible by the integration of multimedia information using commercial CORBA technology linking object-enabled databases with client interfaces using a three-tiered architecture.

  6. Comparative assessment of two interfaces for delivering a multimedia medical course in the French-speaking Virtual Medical University (UMVF).

    PubMed

    Brunetaud, Jean Marc; Leroy, Nicolas; Pelayo, Sylvia; Wascat, Caroline; Renard, Jean Marie; Prin, Lionel; Beuscart-Zéphir, Marie Catherine

    2003-01-01

    The UMVF aims to help medical students during their normal curriculum via the facilities provided by Internet-based techniques. This paper describes a comparative assessment of two interfaces delivering a multimedia course: a conventional web server (WS) and an integrated e-learning platform in the form of a Virtual Campus (VC). Eleven students were arbitrarily divided into two groups. We used a qualitative method for comparing their acceptance of the online course provided by the two different interfaces. The two groups were globally satisfied. However, a decrease in satisfaction was noted at the end of the experiment in the VC group. This may be explained by the more complex Graphical User Interface (GUI) of the VC and some constraints that do not exist with the WS. Current e-learning platforms are probably not optimised for working conditions where presential and virtual activities are mixed. We think that a new type of "light" platform should be developed for these specific working conditions. Students in both groups also expressed reservations about the multimedia environment. They may change their opinion if they become more accustomed to the multimedia environment and if their teachers make more adequate use of the multimedia techniques.

  7. Haptic interfaces: Hardware, software and human performance

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandayam A.

    1995-01-01

    Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most of the virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available in the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further developments of haptic interfaces are critical. In this paper, the status and research needs in human haptics, technology development and interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single point of contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described and the associated issues in haptic rendering are discussed.

  8. High Resolution Integrated Hohlraum-Capsule Simulations for Virtual NIF Ignition Campaign

    NASA Astrophysics Data System (ADS)

    Jones, O. S.; Marinak, M. M.; Cerjan, C. J.; Clark, D. S.; Edwards, M. J.; Haan, S. W.; Langer, S. H.; Salmonson, J. D.

    2009-11-01

    We have undertaken a virtual campaign to assess the viability of the sequence of NIF experiments planned for 2010 that will experimentally tune the shock timing, symmetry, and ablator thickness of a cryogenic ignition capsule prior to the first ignition attempt. The virtual campaign consists of two teams. The ``red team'' creates realistic simulated diagnostic data for a given experiment from the output of a detailed radiation hydrodynamics calculation that has physics models that have been altered in a way that is consistent with probable physics uncertainties. The ``blue team'' executes a series of virtual experiments and interprets the simulated diagnostic data from those virtual experiments. To support this effort we have developed a capability to do very high spatial resolution integrated hohlraum-capsule simulations using the Hydra code. Surface perturbations for all ablator layer surfaces and the DT ice layer are calculated explicitly through mode 30. The effects of the fill tube, cracks in the ice layer, and defects in the ablator are included in models extracted from higher resolution calculations. Very high wave number mix is included through a mix model. We will show results from these calculations in the context of the ongoing virtual campaign.

  9. An Augmented Virtuality Display for Improving UAV Usability

    DTIC Science & Technology

    2005-01-01

    cockpit. For a more universally-understood metaphor, we have turned to virtual environments of the type represented in video games. Many of the...people who have the need to fly UAVs (such as military personnel) have experience with playing video games. They are skilled in navigating virtual...Another aspect of tailoring the interface to those with video game experience is to use familiar controls. Microsoft has developed a popular and

  10. Advanced Technology for Portable Personal Visualization

    DTIC Science & Technology

    1993-01-01

    have no cable to drag." We submitted a short article describing the ceiling tracker and the requirements demanded of trackers in see-through systems...Newspaper/Magazine Articles: "Virtual Reality: It's All in the Mind," Atlanta Constitution, 29 September 1992 "Virtual Reality: Exploring the Future...basic scientific investigation of the human haptic system or to serve as haptic interfaces for virtual environments and teleoperation. 2. Research

  11. Virtual microscopy: an evaluation of its validity and diagnostic performance in routine histologic diagnosis of skin tumors.

    PubMed

    Nielsen, Patricia Switten; Lindebjerg, Jan; Rasmussen, Jan; Starklint, Henrik; Waldstrøm, Marianne; Nielsen, Bjarne

    2010-12-01

    Digitization of histologic slides is associated with many advantages, and its use in routine diagnosis holds great promise. Nevertheless, few articles evaluate virtual microscopy in routine settings. This study is an evaluation of the validity and diagnostic performance of virtual microscopy in routine histologic diagnosis of skin tumors. Our aim is to investigate whether conventional microscopy of skin tumors can be replaced by virtual microscopy. Ninety-six skin tumors and skin-tumor-like changes were consecutively gathered over a 1-week period. Specimens were routinely processed, and digital slides were captured on Mirax Scan (Carl Zeiss MicroImaging, Göttingen, Germany). Four pathologists evaluated the 96 virtual slides and the associated 96 conventional slides twice with intermediate time intervals of at least 3 weeks. Virtual slides that caused difficulties were reevaluated to identify possible reasons for this. The accuracy was 89.2% for virtual microscopy and 92.7% for conventional microscopy. All κ coefficients expressed very good intra- and interobserver agreement. The sensitivities were 85.7% (78.0%-91.0%) and 92.0% (85.5%-95.7%) for virtual and conventional microscopy, respectively. The difference between the sensitivities was 6.3% (0.8%-12.6%). The subsequent reevaluation showed that virtual slides were as useful as conventional slides when rendering a diagnosis. Differences seen are presumed to be due to the pathologists' lack of experience using the virtual microscope. We conclude that it is feasible to make histologic diagnosis on the skin tumor types represented in this study using virtual microscopy after pathologists have completed a period of training. Larger studies should be conducted to verify whether virtual microscopy can replace conventional microscopy in routine practice. Copyright © 2010 Elsevier Inc. All rights reserved.
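
    The validity measures this record reports (accuracy, sensitivity) follow the standard confusion-matrix definitions. A minimal sketch with illustrative counts, not the study's data:

```python
# Sketch: the validity measures used above, computed from confusion-matrix
# counts. The example counts are illustrative, not the study's data.

def accuracy(tp, tn, fp, fn):
    """Fraction of all cases classified correctly."""
    return (tp + tn) / (tp + tn + fp + fn)

def sensitivity(tp, fn):
    """Fraction of truly positive cases detected (recall)."""
    return tp / (tp + fn)
```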

  12. SAFARI: An Environment for Creating Tutoring Systems in Industrial Training.

    ERIC Educational Resources Information Center

    Gecsei, J.; Frasson, C.

    Safari is a cooperative project involving four Quebec universities, two industrial partners (Virtual Prototypes, Inc., providing the VAPS software package, and Novasys, Inc., a consulting firm specializing in artificial intelligence and training), and government. VAPS (Virtual Applications Prototyping System) is a commercial interface-building and…

  13. Virtual reality in surgical training.

    PubMed

    Lange, T; Indelicato, D J; Rosen, J M

    2000-01-01

    Virtual reality in surgery and, more specifically, in surgical training, faces a number of challenges in the future. These challenges are building realistic models of the human body, creating interface tools to view, hear, touch, feel, and manipulate these human body models, and integrating virtual reality systems into medical education and treatment. A final system would encompass simulators specifically for surgery, performance machines, telemedicine, and telesurgery. Each of these areas will need significant improvement for virtual reality to impact medicine successfully in the next century. This article gives an overview of, and the challenges faced by, current systems in the fast-changing field of virtual reality technology, and provides a set of specific milestones for a truly realistic virtual human body.

  14. Virtual reality applications to automated rendezvous and capture

    NASA Technical Reports Server (NTRS)

    Hale, Joseph; Oneil, Daniel

    1991-01-01

    Virtual Reality (VR) is a rapidly developing Human/Computer Interface (HCI) technology. The evolution of high-speed graphics processors and the development of specialized anthropomorphic user interface devices, which more fully involve the human senses, have enabled VR technology. Recently, the maturity of this technology has reached a level where it can be used as a tool in a variety of applications. This paper provides an overview of VR technology and VR activities at Marshall Space Flight Center (MSFC), discusses applications of VR to Automated Rendezvous and Capture (AR&C), and identifies areas of VR technology that require further development.

  15. Nature and origins of virtual environments - A bibliographical essay

    NASA Technical Reports Server (NTRS)

    Ellis, S. R.

    1991-01-01

    Virtual environments presented via head-mounted, computer-driven displays provide a new media for communication. They may be analyzed by considering: (1) what may be meant by an environment; (2) what is meant by the process of virtualization; and (3) some aspects of human performance that constrain environmental design. Their origins are traced from previous work in vehicle simulation and multimedia research. Pointers are provided to key technical references, in the dispersed, archival literature, that are relevant to the development and evaluation of virtual-environment interface systems.

  16. NEDE: an open-source scripting suite for developing experiments in 3D virtual environments.

    PubMed

    Jangraw, David C; Johri, Ansh; Gribetz, Meron; Sajda, Paul

    2014-09-30

    As neuroscientists endeavor to understand the brain's response to ecologically valid scenarios, many are leaving behind hyper-controlled paradigms in favor of more realistic ones. This movement has made the use of 3D rendering software an increasingly compelling option. However, mastering such software and scripting rigorous experiments requires a daunting amount of time and effort. To reduce these startup costs and make virtual environment studies more accessible to researchers, we demonstrate a naturalistic experimental design environment (NEDE) that allows experimenters to present realistic virtual stimuli while still providing tight control over the subject's experience. NEDE is a suite of open-source scripts built on the widely used Unity3D game development software, giving experimenters access to powerful rendering tools while interfacing with eye tracking and EEG, randomizing stimuli, and providing custom task prompts. Researchers using NEDE can present a dynamic 3D virtual environment in which randomized stimulus objects can be placed, allowing subjects to explore in search of these objects. NEDE interfaces with a research-grade eye tracker in real-time to maintain precise timing records and sync with EEG or other recording modalities. Python offers an alternative for experienced programmers who feel comfortable mastering and integrating the various toolboxes available. NEDE combines many of these capabilities with an easy-to-use interface and, through Unity's extensive user base, a much more substantial body of assets and tutorials. Our flexible, open-source experimental design system lowers the barrier to entry for neuroscientists interested in developing experiments in realistic virtual environments. Copyright © 2014 Elsevier B.V. All rights reserved.
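
    The randomized-stimulus design NEDE supports amounts to a reproducible assignment of stimulus objects to candidate locations. A schematic sketch under stated assumptions: the function and object names are illustrative, and NEDE itself scripts this inside Unity3D rather than in plain Python.

```python
# Sketch: reproducible randomized stimulus placement, the kind of trial setup
# NEDE scripts automate inside Unity3D. Names and layout here are illustrative.
import random

def place_stimuli(object_ids, positions, seed):
    """Assign each stimulus object a distinct candidate position, reproducibly."""
    rng = random.Random(seed)                      # seeded for exact replay
    chosen = rng.sample(positions, k=len(object_ids))
    return dict(zip(object_ids, chosen))
```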

  17. Brain-computer interface users speak up: the Virtual Users' Forum at the 2013 International Brain-Computer Interface Meeting.

    PubMed

    Peters, Betts; Bieker, Gregory; Heckman, Susan M; Huggins, Jane E; Wolf, Catherine; Zeitlin, Debra; Fried-Oken, Melanie

    2015-03-01

    More than 300 researchers gathered at the 2013 International Brain-Computer Interface (BCI) Meeting to discuss current practice and future goals for BCI research and development. The authors organized the Virtual Users' Forum at the meeting to provide the BCI community with feedback from users. We report on the Virtual Users' Forum, including initial results from ongoing research being conducted by 2 BCI groups. Online surveys and in-person interviews were used to solicit feedback from people with disabilities who are expert and novice BCI users. For the Virtual Users' Forum, their responses were organized into 4 major themes: current (non-BCI) communication methods, experiences with BCI research, challenges of current BCIs, and future BCI developments. Two authors with severe disabilities gave presentations during the Virtual Users' Forum, and their comments are integrated with the other results. While participants' hopes for BCIs of the future remain high, their comments about available systems mirror those made by consumers about conventional assistive technology. They reflect concerns about reliability (eg, typing accuracy/speed), utility (eg, applications and the desire for real-time interactions), ease of use (eg, portability and system setup), and support (eg, technical support and caregiver training). People with disabilities, as target users of BCI systems, can provide valuable feedback and input on the development of BCI as an assistive technology. To this end, participatory action research should be considered as a valuable methodology for future BCI research. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  18. Design and Development of a Virtual Facility Tour Using iPIX(TM) Technology

    NASA Technical Reports Server (NTRS)

    Farley, Douglas L.

    2002-01-01

    This document demonstrates how the iPIX virtual tour software, in conjunction with a web-based interface, creates a unique and valuable system that gives users an efficient virtual capability to tour facilities while acquiring the necessary technical content. A user's guide to the Mechanics and Durability Branch's virtual tour is presented. The guide provides the user with instructions on operating both scripted and unscripted tours, as well as a discussion of the tours of Buildings 1148, 1205, and 1256 at NASA Langley Research Center. Furthermore, an in-depth discussion is presented on how to develop a virtual tour using the iPIX software interface with conventional HTML and JavaScript. The main aspects discussed are the network and computing issues associated with using this capability. A discussion of how to take the iPIX pictures, manipulate them, and bond them together to form hemispherical images is also presented. Linking of images with additional multimedia content is discussed. Finally, a method to integrate the iPIX software with conventional HTML and JavaScript to facilitate linking with multimedia is presented.

  19. Can walking motions improve visually induced rotational self-motion illusions in virtual reality?

    PubMed

    Riecke, Bernhard E; Freiberg, Jacob B; Grechkin, Timofey Y

    2015-02-04

    Illusions of self-motion (vection) can provide compelling sensations of moving through virtual environments without the need for complex motion simulators or large tracked physical walking spaces. Here we explore the interaction between biomechanical cues (stepping along a rotating circular treadmill) and visual cues (viewing simulated self-rotation) for providing stationary users a compelling sensation of rotational self-motion (circular vection). When tested individually, biomechanical and visual cues were similarly effective in eliciting self-motion illusions. However, in combination they yielded significantly more intense self-motion illusions. These findings provide the first compelling evidence that walking motions can be used to significantly enhance visually induced rotational self-motion perception in virtual environments (and vice versa) without having to provide for physical self-motion or motion platforms. This is noteworthy, as linear treadmills have been found to actually impair visually induced translational self-motion perception (Ash, Palmisano, Apthorp, & Allison, 2013). Given the predominant focus on linear walking interfaces for virtual-reality locomotion, our findings suggest that investigating circular and curvilinear walking interfaces offers a promising direction for future research and development and can help to enhance self-motion illusions, presence and immersion in virtual-reality systems. © 2015 ARVO.

  20. Virtual reality applied to teletesting

    NASA Astrophysics Data System (ADS)

    van den Berg, Thomas J.; Smeenk, Roland J. M.; Mazy, Alain; Jacques, Patrick; Arguello, Luis; Mills, Simon

    2003-05-01

    The activity "Virtual Reality applied to Teletesting" is related to a wider European Space Agency (ESA) initiative of cost reduction, in particular the reduction of test costs. Costs of space-related projects have to be reduced both in test centre operating costs and in customer company costs. This can be accomplished by increasing the automation and remote testing ("teletesting") capabilities of the test centre. The main problems related to teletesting are a lack of situational awareness and the separation of control over the test environment. The objective of the activity is to evaluate the use of distributed computing and Virtual Reality technology to support the teletesting of a payload under vacuum conditions, and to provide a unified man-machine interface for the monitoring and control of payload, vacuum chamber, and robotics equipment. The activity includes the development and testing of a "Virtual Reality Teletesting System" (VRTS). The VRTS is deployed at one of the ESA certified test centres to perform an evaluation and test campaign using a real payload. The VRTS is entirely written in the Java programming language, using the J2EE application model. The graphical user interface runs as an applet in a Web browser, enabling easy access from virtually any place.

  1. An Online Virtual Laboratory of Electricity

    ERIC Educational Resources Information Center

    Gómez Tejedor, J. A.; Moltó Martínez, G.; Barros Vidaurre, C.

    2008-01-01

    In this article, we describe a Java-based virtual laboratory, accessible via the Internet by means of a Web browser. This remote laboratory enables the students to build both direct and alternating current circuits. The program includes a graphical user interface which resembles the connection board, and also the electrical components and tools…

  2. Technology-Enhanced Learning and Community with Market Appeal.

    ERIC Educational Resources Information Center

    Young, Brian Alexander

    2000-01-01

    Describes the University of Dayton's Personalized Virtual Room. This Web interface to a virtual space that looks and feels like a campus residence was designed to encourage communication and connectivity among first-year students before they arrive on campus. Discusses the initiative's goals and successes, student reaction, and lessons learned.…

  3. A high performance two degree-of-freedom kinesthetic interface

    NASA Technical Reports Server (NTRS)

    Adelstein, Bernard D.; Rosen, Michael J.

    1991-01-01

    This summary focuses on the kinesthetic interface of a virtual environment system that was developed at the Newman Laboratory for Biomechanics and Human Rehabilitation at M.I.T. for the study of manual control in both motorically impaired and able-bodied individuals.

  4. Spatial issues in user interface design from a graphic design perspective

    NASA Technical Reports Server (NTRS)

    Marcus, Aaron

    1989-01-01

    The user interface of a computer system is a visual display that provides information about the status of operations on data within the computer and control options to the user that enable adjustments to these operations. From the very beginning of computer technology the user interface was a spatial display, although its spatial features were not necessarily complex or explicitly recognized by the users. All text and nonverbal signs appeared in a virtual space generally thought of as a single flat plane of symbols. Current technology of high performance workstations permits any element of the display to appear as dynamic, multicolor, 3-D signs in a virtual 3-D space. The complexity of appearance and the user's interaction with the display provide significant challenges to the graphic designer of current and future user interfaces. In particular, spatial depiction provides many opportunities for effective communication of objects, structures, processes, navigation, selection, and manipulation. Issues are presented that are relevant to the graphic designer seeking to optimize the user interface's spatial attributes for effective visual communication.

  5. Fragment-Based Docking: Development of the CHARMMing Web User Interface as a Platform for Computer-Aided Drug Design

    PubMed Central

    2015-01-01

    Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing’s capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of “re-dockings” with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing’s docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening. PMID:25151852

  6. Fragment-based docking: development of the CHARMMing Web user interface as a platform for computer-aided drug design.

    PubMed

    Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee

    2014-09-22

    Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.

  7. Role of virtual bronchoscopy in children with a vegetable foreign body in the tracheobronchial tree.

    PubMed

    Behera, G; Tripathy, N; Maru, Y K; Mundra, R K; Gupta, Y; Lodha, M

    2014-12-01

    Multidetector computed tomography virtual bronchoscopy is a non-invasive diagnostic tool which provides a three-dimensional view of the tracheobronchial airway. This study aimed to evaluate the usefulness of virtual bronchoscopy in cases of vegetable foreign body aspiration in children. The medical records of patients with a history of foreign body aspiration from August 2006 to August 2010 were reviewed. Data were collected regarding their clinical presentation and chest X-ray, virtual bronchoscopy and rigid bronchoscopy findings. Cases of metallic and other non-vegetable foreign bodies were excluded from the analysis. Patients with multidetector computed tomography virtual bronchoscopy showing features of vegetable foreign body were included in the analysis. For each patient, virtual bronchoscopy findings were reviewed and compared with those of rigid bronchoscopy. A total of 60 patients, all children ranging from 1 month to 8 years of age, were included. The mean age at presentation was 2.01 years. Rigid bronchoscopy confirmed the results of multidetector computed tomography virtual bronchoscopy (i.e. presence of foreign body, site of lodgement, and size and shape) in 59 patients. In the remaining case, a vegetable foreign body identified by virtual bronchoscopy was revealed by rigid bronchoscopy to be a thick mucus plug. Thus, the positive predictive value of virtual bronchoscopy was 98.3 per cent. Multidetector computed tomography virtual bronchoscopy is a sensitive and specific diagnostic tool for identifying radiolucent vegetable foreign bodies in the tracheobronchial tree. It can also provide a useful pre-operative road map for rigid bronchoscopy. Patients suspected of having an airway foreign body or chronic unexplained respiratory symptoms should undergo multidetector computed tomography virtual bronchoscopy to rule out a vegetable foreign body in the tracheobronchial tree and avoid general anaesthesia and invasive rigid bronchoscopy.
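    The 98.3 per cent figure reported above follows directly from the confirmation counts: 59 of the 60 cases read as positive on virtual bronchoscopy were confirmed by rigid bronchoscopy, with one false positive (the mucus plug). A minimal sketch of the calculation:

```python
def positive_predictive_value(true_positives: int, false_positives: int) -> float:
    """PPV = TP / (TP + FP), expressed as a percentage."""
    return 100.0 * true_positives / (true_positives + false_positives)

# 59 confirmed vegetable foreign bodies, 1 mucus plug misread as a foreign body
ppv = positive_predictive_value(59, 1)
print(f"{ppv:.1f} per cent")  # 98.3 per cent
```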

  8. Operative and diagnostic hysteroscopy: A novel learning model combining new animal models and virtual reality simulation.

    PubMed

    Bassil, Alfred; Rubod, Chrystèle; Borghesi, Yves; Kerbage, Yohan; Schreiber, Elie Servan; Azaïs, Henri; Garabedian, Charles

    2017-04-01

    Hysteroscopy is one of the most common gynaecological procedures. Training for diagnostic and operative hysteroscopy can be achieved through numerous previously described models, such as animal models or virtual reality simulation. We present our novel combined model associating virtual reality with bovine uteruses and bladders. End-of-year residents in obstetrics and gynaecology attended a full-day workshop. The workshop was divided into theoretical courses from senior surgeons and hands-on training in operative hysteroscopy and virtual reality Essure® procedures using the EssureSim™ and Pelvicsim™ simulators with multiple scenarios. Theoretical and operative knowledge was evaluated before and after the workshop, and Grade Point Averages (GPAs) were calculated and compared using a Student's t-test. GPAs were significantly higher after the workshop was completed. The biggest difference was observed in operative knowledge (0.28 GPA before the workshop versus 0.55 after, p < 0.05). All 25 residents who completed the workshop applauded the realism and efficiency of this type of training. The force feedback allowed by the bovine uteruses gives the residents the possibility to manage the thickness of resection as in real surgery. Furthermore, the two-horned bovine uteruses made it possible to reproduce septum resection in conditions close to human surgery. CONCLUSION: Teaching operative and diagnostic hysteroscopy is essential. Managing this training through a full-day workshop using a combined animal model and virtual reality simulation is an efficient model not described before. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. From telepathology to virtual pathology institution: the new world of digital pathology.

    PubMed

    Kayser, K; Kayser, G; Radziszowski, D; Oehmann, A

    Telepathology has left its childhood. Its technical development is mature, and its use for primary (frozen section) and secondary (expert consultation) diagnosis has expanded to a great extent. This is in contrast to the virtual pathology laboratory, which is still subject to technical constraints. Like telepathology, which can also be used for e-learning and e-training in pathology, as demonstrated by Digital Lung Pathology (Klaus.Kayser@charite.de), at least two kinds of virtual pathology laboratories will be implemented in the near future: a) those with distributed pathologists and distributed (> or = 1) laboratories associated with individual biopsy stations/surgical theatres, and b) those with distributed pathologists (usually situated in one institution) and a centralized laboratory, which digitizes complete histological slides. Both scenarios are under intensive technical investigation. A virtual pathology institution (mode a) accepts a complete case with the patient's history, clinical findings, and (pre-selected) images for first diagnosis. The diagnostic responsibility is that of a conventional institution. The Internet serves as the platform for information transfer, and an open server such as iPATH (http://telepath.patho.unibas.ch) for coordination and performance of the diagnostic procedure. The size and number of transferred images have to be limited, and usually different magnifications have to be used. The sender needs experience in image sampling techniques in order to select the images carrying the most significant information. A group of pathologists is "on duty", or selects one member for a predefined duty period. The diagnostic statement of the pathologist(s) on duty is returned to the sender with full responsibility. The first experiences of a virtual pathology institution group using the iPATH server with a small hospital of the Solomon Islands are promising.
A centralized virtual pathology institution (mode b) depends upon the digitization of complete slides and the transfer of large images to different pathologists working in one institution. The technical performance of complete slide digitization is still under development. Virtual pathology can be combined with e-learning and e-training, which will serve as a powerful pathology system integrated into daily work. At present, e-learning systems are "stand-alone" solutions distributed on CD or via the Internet. A characteristic example is the Digital Lung Pathology CD, which includes about 60 different rare and common lung diseases with some features of electronic communication. These features include access to scientific library systems (PubMed), distant measurement servers (EuroQuant), automated immunohistochemistry measurements, and electronic journals (Elec J Pathol Histol, www.pathology-online.org). It combines e-learning and e-training with some acoustic support. A new and complete database based upon this CD will combine e-learning and e-teaching with the actual workflow of a virtual pathology institution (mode a). The technological problems are solved and do not depend upon technical constraints such as slide scanning systems. At present, telepathology serves as a promoter of a completely new landscape in diagnostic pathology, the so-called virtual pathology institution. Industrial and scientific efforts will probably allow an implementation of this technique within the next two years, with exciting diagnostic and scientific perspectives.

  10. Virtual Reality and Online Databases: Will "Look and Feel" Literally Mean "Look" and "Feel"? [and]"Online" Interviews Dr. Thomas A. Furness III, Virtual Reality Pioneer.

    ERIC Educational Resources Information Center

    Miller, Carmen

    1992-01-01

    The first of two articles discusses virtual reality (VR) and online databases; the second one reports on an interview with Thomas A. Furness III, who defines VR and explains work at the Human Interface Technology Laboratory (HIT). Sidebars contain a glossary of VR terms and a conversation with Toni Emerson, the HIT lab's librarian. (LRW)

  11. Implementing virtual reality interfaces for the geosciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, W.; Jacobsen, J.; Austin, A.

    1996-06-01

    For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum, running the gamut from command-line interfaces to completely immersive environments. At one end, the user operates the keyboard to enter three- or six-dimensional parameters. At the other, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g. attached to a head-mounted display. In the former case, rich, extensible and often complex languages are a vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is the obstacle: typing is cumbersome, error-prone and typically slow. In the latter case, the user can interact with these parameters by means of highly developed motor skills. Two specific geoscience application areas are highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.

  12. Dynamic Extension of a Virtualized Cluster by using Cloud Resources

    NASA Astrophysics Data System (ADS)

    Oberst, Oliver; Hauth, Thomas; Kernert, David; Riedel, Stephan; Quast, Günter

    2012-12-01

    The specific requirements concerning the software environment within the HEP community constrain the choice of resource providers for the outsourcing of computing infrastructure. The use of virtualization in HPC clusters and in the context of cloud resources is therefore a subject of recent developments in scientific computing. The dynamic virtualization of worker nodes in common batch systems provided by ViBatch serves each user with a dynamically virtualized subset of worker nodes on a local cluster. Now it can be transparently extended by the use of common open source cloud interfaces like OpenNebula or Eucalyptus, launching a subset of the virtual worker nodes within the cloud. This paper demonstrates how a dynamically virtualized computing cluster is combined with cloud resources by attaching remotely started virtual worker nodes to the local batch system.

  13. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive system capable of placing the trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user's "sense of touch" within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.

  14. The Human Interface Technology Laboratory.

    ERIC Educational Resources Information Center

    Washington Univ., Seattle. Washington Technology Center.

    This booklet contains information about the Human Interface Technology Laboratory (HITL), which was established by the Washington Technology Center at the University of Washington to transform virtual world concepts and research into practical, economically viable technology products. The booklet is divided into seven sections: (1) a brief…

  15. Creating widely accessible spatial interfaces: mobile VR for managing persistent pain.

    PubMed

    Schroeder, David; Korsakov, Fedor; Jolton, Joseph; Keefe, Francis J; Haley, Alex; Keefe, Daniel F

    2013-01-01

    Using widely accessible VR technologies, researchers have implemented a series of multimodal spatial interfaces and virtual environments. The results demonstrate the degree to which we can now use low-cost (for example, mobile-phone based) VR environments to create rich virtual experiences involving motion sensing, physiological inputs, stereoscopic imagery, sound, and haptic feedback. Adapting spatial interfaces to these new platforms can open up exciting application areas for VR. In this case, the application area was in-home VR therapy for patients suffering from persistent pain (for example, arthritis and cancer pain). For such therapy to be successful, a rich spatial interface and rich visual aesthetic are particularly important. So, an interdisciplinary team with expertise in technology, design, meditation, and the psychology of pain collaborated to iteratively develop and evaluate several prototype systems. The video at http://youtu.be/mMPE7itReds demonstrates how the sine wave fitting responds to walking motions, for a walking-in-place application.

  16. Virtual Observatory Interfaces to the Chandra Data Archive

    NASA Astrophysics Data System (ADS)

    Tibbetts, M.; Harbo, P.; Van Stone, D.; Zografou, P.

    2014-05-01

    The Chandra Data Archive (CDA) plays a central role in the operation of the Chandra X-ray Center (CXC) by providing access to Chandra data. Proprietary interfaces have been the backbone of the CDA throughout the Chandra mission. While these interfaces continue to provide the depth and breadth of mission specific access Chandra users expect, the CXC has been adding Virtual Observatory (VO) interfaces to the Chandra proposal catalog and observation catalog. VO interfaces provide standards-based access to Chandra data through simple positional queries or more complex queries using the Astronomical Data Query Language. Recent development at the CDA has generalized our existing VO services to create a suite of services that can be configured to provide VO interfaces to any dataset. This approach uses a thin web service layer for the individual VO interfaces, a middle-tier query component which is shared among the VO interfaces for parsing, scheduling, and executing queries, and existing web services for file and data access. The CXC VO services provide Simple Cone Search (SCS), Simple Image Access (SIA), and Table Access Protocol (TAP) implementations for both the Chandra proposal and observation catalogs within the existing archive architecture. Our work with the Chandra proposal and observation catalogs, as well as additional datasets beyond the CDA, illustrates how we can provide configurable VO services to extend core archive functionality.
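    Of the protocols named above, Simple Cone Search is the simplest: a positional query is just an HTTP GET with RA, DEC, and SR (search radius) parameters in decimal degrees. A minimal sketch of building such a request, using a hypothetical endpoint URL (the real CXC service addresses are not given in the record):

```python
from urllib.parse import urlencode

# Hypothetical SCS endpoint for illustration only; actual service URLs differ.
BASE_URL = "https://cda.example.org/scs"

def cone_search_url(ra_deg: float, dec_deg: float, radius_deg: float) -> str:
    """Build a Simple Cone Search query URL (RA/DEC/SR in decimal degrees)."""
    params = urlencode({"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg})
    return f"{BASE_URL}?{params}"

# Example: search a 0.1-degree cone around the Crab Nebula's position
print(cone_search_url(83.633, 22.014, 0.1))
# https://cda.example.org/scs?RA=83.633&DEC=22.014&SR=0.1
```

    A compliant service answers such a request with a VOTable of matching records, which is what lets generic VO clients query the Chandra catalogs without mission-specific code.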

  17. VirGO: A Visual Browser for the ESO Science Archive Facility

    NASA Astrophysics Data System (ADS)

    Hatziminaoglou, Evanthia; Chéreau, Fabien

    2009-03-01

    VirGO is the next generation Visual Browser for the ESO Science Archive Facility (SAF) developed in the Virtual Observatory Project Office. VirGO enables astronomers to discover and select data easily from millions of observations in a visual and intuitive way. It allows real-time access and the graphical display of a large number of observations by showing instrumental footprints and image previews, as well as their selection and filtering for subsequent download from the ESO SAF web interface. It also permits the loading of external FITS files or VOTables, as well as the superposition of Digitized Sky Survey images to be used as background. All data interfaces are based on Virtual Observatory (VO) standards that allow access to images and spectra from external data centres, and interaction with the ESO SAF web interface or any other VO applications.

  18. How Effective Is a Virtual Consultation Process in Facilitating Multidisciplinary Decision-Making for Malignant Epidural Spinal Cord Compression?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fitzpatrick, David; St Luke's Hospital, Dublin; Grabarz, Daniel

    Purpose: The purpose of this study was to assess the accuracy of a virtual consultation (VC) process in determining treatment strategy for patients with malignant epidural spinal cord compression (MESCC). Methods and Materials: A prospective clinical database was maintained for patients with MESCC. A virtual consultation process (involving exchange of key predetermined clinical information and diagnostic imaging) facilitated rapid decision-making between oncologists and spinal surgeons. Diagnostic imaging was reviewed retrospectively (by R.R.) for surgical opinions in all patients. The primary outcome was the accuracy of the virtual consultation opinion in predicting the final treatment recommendation. Results: After excluding 20 patients who were referred directly to the spinal surgeon, 125 patients were eligible for virtual consultation. Of the 46 patients who had a VC, surgery was recommended in 28 patients and actually given to 23. A retrospective review revealed that 5/79 patients who did not have a VC would have been considered surgical candidates. The overall accuracy of the virtual consultation process was estimated at 92%. Conclusion: The VC process for MESCC patients provides a reliable means of arriving at a multidisciplinary opinion while minimizing patient transfer. This can potentially shorten treatment decision time and enhance clinical outcomes.

  19. CT Arthrography and Virtual Arthroscopy in the Diagnosis of the Anterior Cruciate Ligament and Meniscal Abnormalities of the Knee Joint

    PubMed Central

    Lee, Whal; Kim, Ho Sung; Kim, Seok Jung; Kim, Hyung Ho; Chung, Jin Wook; Kang, Heung Sik; Choi, Ja-Young

    2004-01-01

    Objective To determine the diagnostic accuracy of CT arthrography and virtual arthroscopy in the diagnosis of anterior cruciate ligament and meniscus pathology. Materials and Methods Thirty-eight consecutive patients who underwent CT arthrography and arthroscopy of the knee were included in this study. The ages of the patients ranged from 19 to 52 years and all of the patients were male. Sagittal, coronal, transverse and oblique coronal multiplanar reconstruction (MPR) images were reformatted from CT arthrography. Virtual arthroscopy was performed from 6 standard views using a volume rendering technique. Three radiologists analyzed the MPR images and two orthopedic surgeons analyzed the virtual arthroscopic images. Results The sensitivity and specificity of CT arthrography for the diagnosis of anterior cruciate ligament abnormalities were 87.5%-100% and 93.3%-96.7%, respectively, and those for meniscus abnormalities were 91.7%-100% and 98.1%, respectively. The sensitivity and specificity of virtual arthroscopy for the diagnosis of anterior cruciate ligament abnormalities were 87.5% and 83.3%-90%, respectively, and those for meniscus abnormalities were 83.3%-87.5% and 96.1%-98.1%, respectively. Conclusion CT arthrography and virtual arthroscopy showed good diagnostic accuracy for anterior cruciate ligament and meniscal abnormalities. PMID:15064559
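    The sensitivity and specificity ranges above are the standard ratios computed per reader against the arthroscopy reference standard. A minimal sketch of the definitions; the counts used here are illustrative assumptions, since the abstract reports only the resulting percentages, not the per-reader contingency tables:

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true abnormalities correctly detected: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of normal joints correctly cleared: TN / (TN + FP)."""
    return tn / (tn + fp)

# Illustrative counts only (not taken from the study):
# 7 of 8 true ACL tears detected, 28 of 30 normal ligaments correctly cleared
print(f"sensitivity = {sensitivity(7, 1):.1%}")   # sensitivity = 87.5%
print(f"specificity = {specificity(28, 2):.1%}")  # specificity = 93.3%
```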

  20. Virtual prototyping and testing of in-vehicle interfaces.

    PubMed

    Bullinger, Hans-Jörg; Dangelmaier, Manfred

    2003-01-15

    Electronic innovations that are slowly but surely changing the very nature of driving need to be tested before being introduced to the market. To meet this need a system for integrated virtual prototyping and testing has been developed. Functional virtual prototypes of various traffic systems, such as driver assistance, driver information, and multimedia systems can now be easily tested in a driving simulator by a rapid prototyping approach. The system has been applied in recent R&D projects.

  1. Guidelines for developing distributed virtual environment applications

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    1998-08-01

    We have conducted a variety of projects that served to investigate the limits of virtual environments and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a project for observing, analyzing, and understanding the activity in a military distributed virtual environment, a project to develop a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for use in supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework and a project for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we have developed rules of thumb and insights into assembling DVE applications and the environment itself. In this paper, we open with a brief review of the applications that were the source for our insights and then present the lessons learned as a result of these projects. The lessons we have learned fall primarily into five areas. These areas are requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.

  2. Seamless 3D interaction for virtual tables, projection planes, and CAVEs

    NASA Astrophysics Data System (ADS)

    Encarnacao, L. M.; Bimber, Oliver; Schmalstieg, Dieter; Barton, Robert J., III

    2000-08-01

    The Virtual Table presents stereoscopic graphics to a user in a workbench-like setting. This device shares with other large-screen display technologies (such as data walls and surround-screen projection systems) the lack of human-centered unencumbered user interfaces and 3D interaction technologies. Such shortcomings present severe limitations to the application of virtual reality (VR) technology to time-critical applications as well as employment scenarios that involve heterogeneous groups of end-users without high levels of computer familiarity and expertise. Traditionally such employment scenarios are common in planning-related application areas such as mission rehearsal and command and control. For these applications, a high degree of flexibility with respect to the system requirements (display and I/O devices), as well as the ability to seamlessly and intuitively switch between different interaction modalities and techniques, is sought. Conventional VR techniques may be insufficient to meet this challenge. This paper presents novel approaches for human-centered interfaces to Virtual Environments focusing on the Virtual Table visual input device. It introduces new paradigms for 3D interaction in virtual environments (VE) for a variety of application areas based on pen-and-clipboard, mirror-in-hand, and magic-lens metaphors, and introduces new concepts for combining VR and augmented reality (AR) techniques. It finally describes approaches toward hybrid and distributed multi-user interaction environments and concludes by hypothesizing on possible use cases for defense applications.

  3. Affective Interaction with a Virtual Character Through an fNIRS Brain-Computer Interface.

    PubMed

    Aranyi, Gabor; Pecune, Florian; Charles, Fred; Pelachaud, Catherine; Cavazza, Marc

    2016-01-01

    Affective brain-computer interfaces (BCI) harness neuroscience knowledge to develop affective interaction from first principles. In this article, we explore affective engagement with a virtual agent through neurofeedback (NF). We report an experiment in which subjects engage with a virtual agent by expressing positive attitudes towards her under a NF paradigm. As affective input we use the asymmetric activity in the dorsolateral prefrontal cortex (DL-PFC), which has previously been found to be related to the high-level affective-motivational dimension of approach/avoidance. The magnitude of left-asymmetric DL-PFC activity, measured using functional near infrared spectroscopy (fNIRS) and treated as a proxy for approach, is mapped onto a control mechanism for the virtual agent's facial expressions, in which action units (AUs) are activated through a neural network. We carried out an experiment with 18 subjects, which demonstrated that subjects are able to successfully engage with the virtual agent by controlling their mental disposition through NF, and that they perceived the agent's responses as realistic and consistent with their projected mental disposition. This interaction paradigm is particularly relevant in the case of affective BCI, as it facilitates the volitional activation of specific areas normally not under conscious control. Overall, our contribution reconciles a model of affect derived from brain metabolic data with an ecologically valid, yet computationally controllable, virtual affective communication environment.

  4. Automated detection of heuristics and biases among pathologists in a computer-based system.

    PubMed

    Crowley, Rebecca S; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-08-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to diagnostic errors. The authors conducted the study using a computer-based system to view and diagnose virtual slide cases. The software recorded participant responses throughout the diagnostic process, and automatically classified participant actions based on definitions of eight common heuristics and/or biases. The authors measured frequency of heuristic use and bias across three levels of training. Biases studied were detected at varying frequencies, with availability and search satisficing observed most frequently. There were few significant differences by level of training. For representativeness and anchoring, the heuristic was used appropriately as often or more often than it was used in biased judgment. Approximately half of the diagnostic errors were associated with one or more biases. We conclude that heuristic use and biases were observed among physicians at all levels of training using the virtual slide system, although their frequencies varied. The system can be employed to detect heuristic use and to test methods for decreasing diagnostic errors resulting from cognitive biases.

  5. Safety in numbers 3: Authenticity, Building knowledge & skills and Competency development & assessment: the ABC of safe medication dosage calculation problem-solving pedagogy.

    PubMed

    Weeks, Keith W; Meriel Hutton, B; Coben, Diana; Clochesy, John M; Pontin, David

    2013-03-01

    When designing learning and assessment environments it is essential to articulate the underpinning education philosophy, theory, model and learning style support mechanisms that inform their structure and content. We elaborate on original PhD research that articulates the design rationale of authentic medication dosage calculation problem-solving (MDC-PS) learning and diagnostic assessment environments. These environments embody the principles of authenticity, building knowledge and skills and competency assessment and are designed to support development of competence and bridging of the theory-practice gap. Authentic learning and diagnostic assessment environments capture the features and expert practices that are located in real world practice cultures and recreate them in authentic virtual clinical environments. We explore how this provides students with a safe virtual authentic environment to actively experience, practice and undertake MDC-PS learning and assessment activities. We argue that this is integral to the construction and diagnostic assessment of schemata validity (mental constructions and frameworks that are an individual's internal representation of their world), bridging of the theory-practice gap and cognitive and functional competence development. We illustrate these principles through the underpinning pedagogical design of two online virtual authentic learning and diagnostic assessment environments (safeMedicate and eDose™). Copyright © 2012. Published by Elsevier Ltd.

  6. The system of technical diagnostics of the industrial safety information network

    NASA Astrophysics Data System (ADS)

    Repp, P. V.

    2017-01-01

    This research is devoted to problems of safety of the industrial information network. Basic sub-networks ensuring reliable operation of the elements of the industrial Automatic Process Control System were identified. The core tasks of technical diagnostics of industrial information safety were presented. The structure of the technical diagnostics system of information safety was proposed. It includes two parts: a generator of cyber-attacks and a virtual model of the enterprise information network. The virtual model was obtained by scanning a real enterprise network. A new classification of cyber-attacks was proposed. This classification enables one to design an efficient generator of cyber-attack sets for testing the virtual model of the industrial information network. Monte Carlo methods (with Sobol LPτ sequences) and Markov chains were considered as the basis for the cyber-attack generation algorithm. The proposed system also includes a diagnostic analyzer performing expert functions. The stability factor (Kstab) was selected as an integrative quantitative indicator of network reliability. This factor is determined by the weight of the sets of cyber-attacks that identify vulnerabilities of the network. The weight depends on the frequency and complexity of the cyber-attacks, the degree of damage, and the complexity of remediation. The proposed Kstab is an effective integral quantitative measure of information network reliability.
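
    The record does not give Kstab in closed form; a minimal sketch of one plausible weighting scheme (the weight formula, field names, and aggregation rule below are assumptions, not the authors' method) might look like this:

```python
from dataclasses import dataclass

@dataclass
class AttackSet:
    """One set of cyber-attacks used to probe the virtual network model."""
    frequency: float    # relative frequency of this attack class
    complexity: float   # difficulty of mounting the attack, 0..1
    damage: float       # degree of damage if successful, 0..1
    remediation: float  # complexity of remediation, 0..1
    succeeded: bool     # did the attack set expose a vulnerability?

def weight(a: AttackSet) -> float:
    # Hypothetical weighting: frequent, damaging, hard-to-fix attacks count more;
    # very complex attacks count less.
    return a.frequency * (a.damage + a.remediation) / (1.0 + a.complexity)

def stability_factor(attacks: list) -> float:
    """K_stab in [0, 1]: 1.0 means no weighted attack exposed a vulnerability."""
    total = sum(weight(a) for a in attacks)
    if total == 0:
        return 1.0
    exposed = sum(weight(a) for a in attacks if a.succeeded)
    return 1.0 - exposed / total
```

    Under this sketch, a network where half of the attack weight finds vulnerabilities scores Kstab = 0.5, and a network resisting every generated set scores 1.0.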

  7. Virtual Reality: An Experiential Tool for Clinical Psychology

    ERIC Educational Resources Information Center

    Riva, Giuseppe

    2009-01-01

    Several Virtual Reality (VR) applications for the understanding, assessment and treatment of mental health problems have been developed in the last 15 years. Typically, in VR the patient learns to manipulate problematic situations related to his/her problem. In fact, VR can be described as an advanced form of human-computer interface that is able…

  8. The Best of All Worlds: Immersive Interfaces for Art Education in Virtual and Real World Teaching and Learning Environments

    ERIC Educational Resources Information Center

    Grenfell, Janette

    2013-01-01

    Selected ubiquitous technologies encourage collaborative participation between higher education students and educators within a virtual socially networked e-learning landscape. Multiple modes of teaching and learning, ranging from real world experiences, to text and digital images accessed within the Deakin studies online learning management…

  9. WebIntera-Classroom: An Interaction-Aware Virtual Learning Environment for Augmenting Learning Interactions

    ERIC Educational Resources Information Center

    Chen, Jingjing; Xu, Jianliang; Tang, Tao; Chen, Rongchao

    2017-01-01

    Interaction is critical for successful teaching and learning in a virtual learning environment (VLE). This paper presents a web-based interaction-aware VLE--WebIntera-classroom--which aims to augment learning interactions by increasing the learner-to-content and learner-to-instructor interactions. We design a ubiquitous interactive interface that…

  10. Virtual Classroom for Business Planning Formulation.

    ERIC Educational Resources Information Center

    Osorio, J.; Rubio-Royo, E.; Ocon, A.

    One of the most promising possibilities of the World Wide Web resides in its potential to support distance education. In 1996, the University of Las Palmas de Gran Canaria developed the "INNOVA Project" in order to promote Web-based training and learning. As a result, the Virtual Classroom Interface (IVA) was created. Several software…

  11. Active Learning Environments with Robotic Tangibles: Children's Physical and Virtual Spatial Programming Experiences

    ERIC Educational Resources Information Center

    Burleson, Winslow S.; Harlow, Danielle B.; Nilsen, Katherine J.; Perlin, Ken; Freed, Natalie; Jensen, Camilla Nørgaard; Lahey, Byron; Lu, Patrick; Muldner, Kasia

    2018-01-01

    As computational thinking becomes increasingly important for children to learn, we must develop interfaces that leverage the ways that young children learn to provide opportunities for them to develop these skills. Active Learning Environments with Robotic Tangibles (ALERT) and Robopad, an analogous on-screen virtual spatial programming…

  12. Robot Teleoperation and Perception Assistance with a Virtual Holographic Display

    NASA Technical Reports Server (NTRS)

    Goddard, Charles O.

    2012-01-01

    Teleoperation of robots in space from Earth has historically been difficult. Speed-of-light delays make direct joystick-type control infeasible, so it is desirable to command a robot in a very high-level fashion. However, in order to provide such an interface, knowledge of what objects are in the robot's environment and how they can be interacted with is required. In addition, many tasks that would be desirable to perform are highly spatial, requiring some form of six-degree-of-freedom input. These two issues can be combined, allowing the user to assist the robot's perception by identifying the locations of objects in the scene. The zSpace system, a virtual holographic environment, provides a virtual three-dimensional space superimposed over real space and a stylus whose position and rotation are tracked inside it. Using this system, a possible interface for this sort of robot control is proposed.

  13. The expert surgical assistant. An intelligent virtual environment with multimodal input.

    PubMed

    Billinghurst, M; Savage, J; Oppenheimer, P; Edmond, C

    1996-01-01

    Virtual Reality has made computer interfaces more intuitive but not more intelligent. This paper shows how an expert system can be coupled with multimodal input in a virtual environment to provide an intelligent simulation tool or surgical assistant. This is accomplished in three steps. First, voice and gestural input is interpreted and represented in a common semantic form. Second, a rule-based expert system is used to infer context and user actions from this semantic representation. Finally, the inferred user actions are matched against steps in a surgical procedure to monitor the user's progress and provide automatic feedback. In addition, the system can respond immediately to multimodal commands for navigational assistance and/or identification of critical anatomical structures. To show how these methods are used we present a prototype sinus surgery interface. The approach described here may easily be extended to a wide variety of medical and non-medical training applications by making simple changes to the expert system database and virtual environment models. Successful implementation of an expert system in both simulated and real surgery has enormous potential for the surgeon both in training and clinical practice.

  14. Evaluating the Usability of Pinchigator, a system for Navigating Virtual Worlds using Pinch Gloves

    NASA Technical Reports Server (NTRS)

    Hamilton, George S.; Brookman, Stephen; Dumas, Joseph D. II; Tilghman, Neal

    2003-01-01

    Appropriate design of two-dimensional user interfaces (2D U/I) utilizing the well-known WIMP (Window, Icon, Menu, Pointing device) environment for computer software is well studied, and guidance can be found in several standards. Three-dimensional U/I design is not nearly as mature as 2D U/I, and standards bodies have not reached consensus on what makes a usable interface. This is especially true when the tools for interacting with the virtual environment may include stereo viewing, real-time trackers and pinch gloves instead of just a mouse and keyboard. Over the last several years the authors have created a 3D U/I system dubbed Pinchigator for navigating virtual worlds, based on the dVise dV/Mockup visualization software, Fakespace Pinch Gloves and Polhemus trackers. The current work is to test the usability of the system on several virtual worlds, suggest improvements to increase Pinchigator's usability, and then to generalize about what was learned and how those lessons might be applied to improve other 3D U/I systems.

  15. Strategies for combining physics videos and virtual laboratories in the training of physics teachers

    NASA Astrophysics Data System (ADS)

    Dickman, Adriana; Vertchenko, Lev; Martins, Maria Inés

    2007-03-01

    Among the multimedia resources used in physics education, the most prominent are virtual laboratories and videos. On one hand, computer simulations and applets have very attractive graphic interfaces, showing a remarkable amount of detail and movement. On the other hand, videos offer the possibility of displaying high-quality images, and are becoming more feasible with the increasing availability of digital resources. We believe it is important to discuss, throughout the teacher training program, both the functionality of information and communication technology (ICT) in physics education and the varied applications of these resources. In our work we suggest introducing ICT resources in a sequence that integrates these important tools into the teacher training program, as opposed to the traditional approach, in which virtual laboratories and videos are introduced separately. In this perspective, when we introduce and utilize virtual laboratory techniques we also provide for their use in videos, taking advantage of graphic interfaces. Thus the students in our program learn to use instructional software in the production of videos for classroom use.

  16. Closed-loop dialog model of face-to-face communication with a photo-real virtual human

    NASA Astrophysics Data System (ADS)

    Kiss, Bernadette; Benedek, Balázs; Szijárto, Gábor; Takács, Barnabás

    2004-01-01

    We describe an advanced Human Computer Interaction (HCI) model that employs photo-realistic virtual humans to provide digital media users with information, learning services and entertainment in a highly personalized and adaptive manner. The system can be used as a computer interface or as a tool to deliver content to end-users. We model the interaction process between the user and the system as part of a closed-loop dialog taking place between the participants. This dialog exploits the most important characteristics of a face-to-face communication process, including the use of non-verbal gestures and meta-communication signals to control the flow of information. Our solution is based on a Virtual Human Interface (VHI) technology that was specifically designed to create emotional engagement between the virtual agent and the user, thus increasing the efficiency of learning and/or absorbing any information broadcast through this device. The paper reviews the basic building blocks and technologies needed to create such a system and discusses its advantages over other existing methods.

  17. Human-computer interface

    DOEpatents

    Anderson, Thomas G.

    2004-12-21

    The present invention provides a method of human-computer interfacing. Force feedback allows intuitive navigation and control near a boundary between regions in a computer-represented space. For example, the method allows a user to interact with a virtual craft, then push through the windshield of the craft to interact with the virtual world surrounding the craft. As another example, the method allows a user to feel transitions between different control domains of a computer representation of a space. The method can provide for force feedback that increases as a user's locus of interaction moves near a boundary, then perceptibly changes (e.g., abruptly drops or changes direction) when the boundary is traversed.
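
    The force profile this patent describes, resistance growing as the user's locus of interaction nears a boundary and then dropping abruptly once it is crossed, can be sketched as a one-dimensional function. The linear ramp and the parameter names below are illustrative assumptions, not the patented method:

```python
def boundary_force(distance: float, max_force: float = 1.0,
                   ramp_radius: float = 0.1) -> float:
    """Resistive force as the interaction point approaches a boundary.

    distance > 0: still inside the current region, approaching the boundary;
    distance <= 0: the boundary has been traversed.
    """
    if distance <= 0:
        return 0.0  # abrupt drop after traversal signals the transition
    if distance >= ramp_radius:
        return 0.0  # far from the boundary: no resistance
    # Linear ramp: force grows as the user gets closer to the boundary
    return max_force * (1.0 - distance / ramp_radius)
```

    Halfway into the ramp the user feels half the maximum force; the instant the boundary is crossed (e.g. pushing through the virtual windshield) the force vanishes, producing the perceptible change the patent describes.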

  18. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    PubMed Central

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-01-01

    In this article, we present an approach that uses two force-sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape using touch, vision and hearing, and also to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface, retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680

  19. Force sensitive handles and capacitive touch sensor for driving a flexible haptic-based immersive system.

    PubMed

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-10-09

    In this article, we present an approach that uses two force-sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape using touch, vision and hearing, and also to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface, retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape.

  20. A convertor and user interface to import CAD files into worldtoolkit virtual reality systems

    NASA Technical Reports Server (NTRS)

    Wang, Peter Hor-Ching

    1996-01-01

    Virtual Reality (VR) is a rapidly developing human-to-computer interface technology. VR can be considered a three-dimensional, computer-generated Virtual World (VW) which can sense particular aspects of a user's behavior, allow the user to manipulate objects interactively, and render the VW in real time accordingly. The user is totally immersed in the virtual world and feels a sense of being transported into that VW. NASA/MSFC Computer Application Virtual Environments (CAVE) has been developing space-related VR applications since 1990. The VR systems in the CAVE lab are based on the VPL RB2 system, which consists of a VPL RB2 control tower, an LX eyephone, an Isotrak polhemus sensor, two Fastrak polhemus sensors, a Flock of Birds sensor, and two VPL DG2 DataGloves. A dynamics animator called Body Electric from VPL is used as the control system to interface with all the input/output devices and to provide network communications as well as a VR programming environment. RB2 Swivel 3D is used as the modelling program to construct the VWs. A severe limitation of the VPL VR system is the use of RB2 Swivel 3D, which restricts files to a maximum of 1020 objects and lacks advanced graphics texture mapping. The other limitation is that the VPL VR system is a turn-key system which does not provide the flexibility for the user to add new sensors or a C language interface. Recently, the NASA/MSFC CAVE lab has provided VR systems built on Sense8 WorldToolKit (WTK), which is a C library for creating VR development environments. WTK provides device drivers for most of the sensors and eyephones available on the VR market. WTK accepts several CAD file formats, such as the Sense8 Neutral File Format, the AutoCAD DXF and 3D Studio file formats, the Wavefront OBJ file format, the VideoScape GEO file format, and the Intergraph EMS and CATIA stereolithography (STL) file formats. 
WTK functions are object-oriented in their naming convention, are grouped into classes, and provide an easy C language interface. Using a CAD or modelling program to build a VW for WTK VR applications, we typically construct the stationary universe with all the geometric objects except the dynamic objects, and create each dynamic object in an individual file.

  1. Development and application of virtual reality for man/systems integration

    NASA Technical Reports Server (NTRS)

    Brown, Marcus

    1991-01-01

    While the graphical presentation of computer models signified a quantum leap over presentations limited to text and numbers, it still has the problem of presenting an interface barrier between the human user and the computer model. The user must learn a command language in order to orient themselves in the model. For example, to move left from the current viewpoint of the model, they might be required to type 'LEFT' at a keyboard. This command is fairly intuitive, but if the viewpoint moves far enough that there are no visual cues overlapping with the first view, the user does not know if the viewpoint has moved inches, feet, or miles to the left, or perhaps remained in the same position but rotated to the left. Until the user becomes quite familiar with the interface language of the computer model presentation, they will be prone to losing their bearings frequently. Even a highly skilled user will occasionally get lost in the model. A new approach to presenting this type of information is to directly interpret the user's body motions as the input language for determining what view to present. When the user's head turns 45 degrees to the left, the viewpoint should be rotated 45 degrees to the left. Since the head moves through several intermediate angles between the original view and the final one, several intermediate views should be presented, providing the user with a sense of continuity between the original view and the final one. Since the hands are the primary way a human physically interacts with their environment, the system should monitor the movements of the user's hands and alter objects in the virtual model in a way consistent with the way an actual object would move when manipulated using the same hand movements. Since this approach to the man-computer interface closely models the same type of interface that humans have with the physical world, this type of interface is often called virtual reality, and the model is referred to as a virtual world. 
The task of this summer fellowship was to set up a virtual reality system at MSFC and begin applying it to some of the questions which concern scientists and engineers involved in space flight. A brief discussion of this work is presented.

  2. Efficient operating system level virtualization techniques for cloud resources

    NASA Astrophysics Data System (ADS)

    Ansu, R.; Samiksha; Anju, S.; Singh, K. John

    2017-11-01

    Cloud computing is an advancing technology which provides services at the infrastructure, platform and software levels. Virtualization and utility computing are the keys to cloud computing. The number of cloud users is increasing day by day, so it is essential to make resources available on demand to satisfy user requirements. The technique by which resources, namely storage, processing power, memory and network or I/O, are abstracted is known as virtualization. Various virtualization techniques are available for executing operating systems: full system virtualization and paravirtualization. In full virtualization, the whole hardware architecture is duplicated virtually; no modifications are required in the guest OS, as the OS deals with the VM hypervisor directly. In paravirtualization, the guest OS must be modified to run in parallel with other operating systems, and for the guest OS to access the hardware, the host OS must provide a Virtual Machine Interface. OS virtualization has many advantages, such as transparent application migration, server consolidation, online OS maintenance, and improved security. This paper outlines both virtualization techniques and discusses the issues in OS-level virtualization.

  3. Robotics virtual rail system and method

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID; Walton, Miles C [Idaho Falls, ID

    2011-07-05

    A virtual track or rail system and method is described for execution by a robot. A user, through a user interface, generates a desired path comprised of at least one segment representative of the virtual track for the robot. Start and end points are assigned to the desired path and velocities are also associated with each of the at least one segment of the desired path. A waypoint file is generated including positions along the virtual track representing the desired path with the positions beginning from the start point to the end point including the velocities of each of the at least one segment. The waypoint file is sent to the robot for traversing along the virtual track.
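
    The waypoint-file generation step can be sketched in outline. The segment representation, step size, and tuple layout below are illustrative assumptions, not the patented file format:

```python
import math

def waypoints(segments, step=0.5):
    """Generate (x, y, velocity) waypoints along a virtual rail.

    segments: list of ((x0, y0), (x1, y1), velocity) tuples, traversed in
    order from the path's start point to its end point. Positions are
    interpolated roughly every `step` distance units along each segment,
    each carrying that segment's associated velocity.
    """
    points = []
    for (x0, y0), (x1, y1), v in segments:
        length = math.hypot(x1 - x0, y1 - y0)
        n = max(1, int(length / step))  # number of sub-intervals
        for i in range(n + 1):
            t = i / n
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0), v))
    return points
```

    The resulting list plays the role of the waypoint file: it starts at the path's start point, ends at its end point, and tags every position with the velocity of the segment it lies on, ready to be sent to the robot for traversal.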

  4. The advantages of advanced computer-assisted diagnostics and three-dimensional preoperative planning on implant position in orbital reconstruction.

    PubMed

    Jansen, Jesper; Schreurs, Ruud; Dubois, Leander; Maal, Thomas J J; Gooris, Peter J J; Becking, Alfred G

    2018-04-01

    Advanced three-dimensional (3D) diagnostics and preoperative planning are the first steps in computer-assisted surgery (CAS). They are an integral part of the workflow, and allow the surgeon to adequately assess the fracture and to perform virtual surgery to find the optimal implant position. The goal of this study was to evaluate the accuracy and predictability of 3D diagnostics and preoperative virtual planning without intraoperative navigation in orbital reconstruction. In 10 cadaveric heads, 19 complex orbital fractures were created. First, all fractures were reconstructed without preoperative planning (control group) and at a later stage the reconstructions were repeated with the help of preoperative planning. Preformed titanium mesh plates were used for the reconstructions by two experienced oral and maxillofacial surgeons. The preoperative virtual planning was easily accessible for the surgeon during the reconstruction. Computed tomographic scans were obtained before and after creation of the orbital fractures and postoperatively. Using a paired t-test, implant positioning accuracy (translation and rotations) of both groups were evaluated by comparing the planned implant position with the position of the implant on the postoperative scan. Implant position improved significantly (P < 0.05) for translation, yaw and roll in the group with preoperative planning (Table 1). Pitch did not improve significantly (P = 0.78). The use of 3D diagnostics and preoperative planning without navigation in complex orbital wall fractures has a positive effect on implant position. This is due to a better assessment of the fracture, the possibility of virtual surgery and because the planning can be used as a virtual guide intraoperatively. The surgeon has more control in positioning the implant in relation to the rim and other bony landmarks. Copyright © 2018 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  5. Virtual surgery in a (tele-)radiology framework.

    PubMed

    Glombitza, G; Evers, H; Hassfeld, S; Engelmann, U; Meinzer, H P

    1999-09-01

    This paper presents telemedicine as an extension of a teleradiology framework through tools for virtual surgery. To classify the described methods and applications, the research field of virtual reality (VR) is broadly reviewed. Differences with respect to technical equipment, methodological requirements and areas of application are pointed out. Desktop VR, augmented reality, and virtual reality are differentiated and discussed in some typical contexts of diagnostic support, surgical planning, therapeutic procedures, simulation and training. Visualization techniques are compared as a prerequisite for virtual reality and assigned to distinct levels of immersion. The advantage of a hybrid visualization kernel is emphasized with respect to the desktop VR applications that are subsequently shown. Moreover, software design aspects are considered by outlining functional openness in the architecture of the host system. Here, a teleradiology workstation was extended by dedicated tools for surgical planning through a plug-in mechanism. Examples of recent areas of application are introduced such as liver tumor resection planning, diagnostic support in heart surgery, and craniofacial surgery planning. In the future, surgical planning systems will become more important. They will benefit from improvements in image acquisition and communication, new image processing approaches, and techniques for data presentation. This will facilitate preoperative planning and intraoperative applications.

  6. Intelligent Motion and Interaction Within Virtual Environments

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R. (Editor); Slater, Mel (Editor); Alexander, Thomas (Editor)

    2007-01-01

    What makes virtual actors and objects in virtual environments seem real? How can the illusion of their reality be supported? What sorts of training or user-interface applications benefit from realistic user-environment interactions? These are some of the central questions that designers of virtual environments face. To be sure, simulation realism is not necessarily the major, or even a required, goal of a virtual environment intended to communicate specific information. But for some applications in entertainment, marketing, or aspects of vehicle simulation training, realism is essential. The following chapters will examine how a sense of truly interacting with dynamic, intelligent agents may arise in users of virtual environments. These chapters are based on presentations at the London conference on Intelligent Motion and Interaction within Virtual Environments, which was held at University College London, U.K., 15-17 September 2003.

  7. Application of digital diagnostic impression, virtual planning, and computer-guided implant surgery for a CAD/CAM-fabricated, implant-supported fixed dental prosthesis: a clinical report.

    PubMed

    Stapleton, Brandon M; Lin, Wei-Shao; Ntounis, Athanasios; Harris, Bryan T; Morton, Dean

    2014-09-01

    This clinical report demonstrated the use of an implant-supported fixed dental prosthesis fabricated with a contemporary digital approach. The digital diagnostic data acquisition was completed with a digital diagnostic impression with an intraoral scanner and cone-beam computed tomography with a prefabricated universal radiographic template to design a virtual prosthetically driven implant surgical plan. A surgical template fabricated with computer-aided design and computer-aided manufacturing (CAD/CAM) was used to perform computer-guided implant surgery. The definitive digital data were then used to design the definitive CAD/CAM-fabricated fixed dental prosthesis. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  8. Virtual reality hardware and graphic display options for brain-machine interfaces

    PubMed Central

    Marathe, Amar R.; Carey, Holle L.; Taylor, Dawn M.

    2009-01-01

    Virtual reality hardware and graphic displays are reviewed here as a development environment for brain-machine interfaces (BMIs). Two desktop stereoscopic monitors and one 2D monitor were compared in a visual depth discrimination task and in a 3D target-matching task where able-bodied individuals used actual hand movements to match a virtual hand to different target hands. Three graphic representations of the hand were compared: a plain sphere, a sphere attached to the fingertip of a realistic hand and arm, and a stylized pacman-like hand. Several subjects had great difficulty using either stereo monitor for depth perception when perspective size cues were removed. A mismatch in stereo and size cues generated inappropriate depth illusions. This phenomenon has implications for choosing target and virtual hand sizes in BMI experiments. Target matching accuracy was about as good with the 2D monitor as with either 3D monitor. However, users achieved this accuracy by exploring the boundaries of the hand in the target with carefully controlled movements. This method of determining relative depth may not be possible in BMI experiments if movement control is more limited. Intuitive depth cues, such as including a virtual arm, can significantly improve depth perception accuracy with or without stereo viewing. PMID:18006069

  9. STS-134 crew in Virtual Reality Lab during their MSS/EVAA SUPT2 Team training

    NASA Image and Video Library

    2010-08-27

    JSC2010-E-121049 (27 Aug. 2010) --- NASA astronaut Andrew Feustel (foreground), STS-134 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration

  10. STS-133 crew during MSS/EVAA TEAM training in Virtual Reality Lab

    NASA Image and Video Library

    2010-10-01

    JSC2010-E-170878 (1 Oct. 2010) --- NASA astronaut Michael Barratt, STS-133 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration

  11. STS-134 crew in Virtual Reality Lab during their MSS/EVAA SUPT2 Team training

    NASA Image and Video Library

    2010-08-27

    JSC2010-E-121056 (27 Aug. 2010) --- NASA astronaut Gregory H. Johnson, STS-134 pilot, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration

  12. STS-133 crew during MSS/EVAA TEAM training in Virtual Reality Lab

    NASA Image and Video Library

    2010-10-01

    JSC2010-E-170888 (1 Oct. 2010) --- NASA astronaut Nicole Stott, STS-133 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration

  13. STS-133 crew during MSS/EVAA TEAM training in Virtual Reality Lab

    NASA Image and Video Library

    2010-10-01

    JSC2010-E-170882 (1 Oct. 2010) --- NASA astronaut Nicole Stott, STS-133 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration

  14. Recommended Practices for Interactive Video Portability

    DTIC Science & Technology

    1990-10-01

Commands are passed via an ASCII or binary application interface to the Virtual Device Interface (VDI) Management Software. VDI Management, in turn, executes the commands by calling appropriate low-level services and passes responses back to the application via the application interface.

  15. Building the Joint Battlespace Infosphere. Volume 2: Interactive Information Technologies

    DTIC Science & Technology

    1999-12-17

G. A. Vouros, "A Knowledge-Based Methodology for Supporting Multilingual and User-Tailored Interfaces," Interacting With Computers, Vol. 9 (1998). ... The project is to develop a two-handed user interface to the stereoscopic field analyzer, an interactive 3-D scientific visualization system. ... See http://www.hitl.washington.edu/research/vrd/; R. Baumann and R. Clavel, "Haptic Interface for Virtual Reality Based ..."

  16. Virtual Environments in Scientific Visualization

    NASA Technical Reports Server (NTRS)

    Bryson, Steve; Lisinski, T. A. (Technical Monitor)

    1994-01-01

Virtual environment technology is a new way of approaching the interface between computers and humans. Emphasizing display and user control that conform to the user's natural ways of perceiving and thinking about space, virtual environment technologies enhance the ability to perceive and interact with computer-generated graphic information. This enhancement potentially has a major effect on the field of scientific visualization. Current examples of this technology include the Virtual Windtunnel being developed at NASA Ames Research Center. Other major institutions such as the National Center for Supercomputing Applications and SRI International are also exploring this technology. This talk will describe several implementations of virtual environments for use in scientific visualization. Examples include the visualization of unsteady fluid flows (the virtual windtunnel), the visualization of geodesics in curved spacetime, surface manipulation, and examples developed at various laboratories.

  17. The effects of virtual experience on attitudes toward real brands.

    PubMed

    Dobrowolski, Pawel; Pochwatko, Grzegorz; Skorko, Maciek; Bielecki, Maksymilian

    2014-02-01

Although the commercial availability and implementation of virtual reality interfaces have seen rapid growth in recent years, little research has been conducted on the potential for virtual reality to affect consumer behavior. One unaddressed issue is how our real-world attitudes are affected when we have a virtual experience with the target of those attitudes. This study compared participant (N=60) attitudes toward car brands before and after a virtual test drive of those cars was provided. Results indicated that attitudes toward test brands changed after experience with virtual representations of those brands. Furthermore, manipulation of the quality of this experience (in this case, modification of driving difficulty) was reflected in the direction of attitude change. We discuss these results in the context of the associative-propositional evaluation model.

  18. An Introduction to 3-D Sound

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Null, Cynthia H. (Technical Monitor)

    1997-01-01

This talk will give an overview of the basic technologies related to the creation of virtual acoustic images, and the potential of including spatial auditory displays in human-machine interfaces. Research into the perceptual error inherent in both natural and virtual spatial hearing is reviewed, since the formation of improved technologies is tied to psychoacoustic research. This includes a discussion of Head Related Transfer Function (HRTF) measurement techniques (the HRTF provides important perceptual cues within a virtual acoustic display). Many commercial applications of virtual acoustics have so far focused on games and entertainment; in this review, other types of applications are examined, including aeronautic safety, voice communications, virtual reality, and room acoustic simulation. In particular, it is argued that realistic simulation is optimized within a virtual acoustic display when head motion and reverberation cues are included within a perceptual model.

  19. New Tools to Search for Data in the European Space Agency's Planetary Science Archive

    NASA Astrophysics Data System (ADS)

    Grotheer, E.; Macfarlane, A. J.; Rios, C.; Arviset, C.; Heather, D.; Fraga, D.; Vallejo, F.; De Marchi, G.; Barbarisi, I.; Saiz, J.; Barthelemy, M.; Docasal, R.; Martinez, S.; Besse, S.; Lim, T.

    2016-12-01

    The European Space Agency's (ESA) Planetary Science Archive (PSA), which can be accessed at http://archives.esac.esa.int/psa, provides public access to the archived data of Europe's missions to our neighboring planets. These datasets are compliant with the Planetary Data System (PDS) standards. Recently, a new interface has been released, which includes upgrades to make PDS4 data available from newer missions such as ExoMars and BepiColombo. Additionally, the PSA development team has been working to ensure that the legacy PDS3 data will be more easily accessible via the new interface as well. In addition to a new querying interface, the new PSA also allows access via the EPN-TAP and PDAP protocols. This makes the PSA data sets compatible with other archive-related tools and projects, such as the Virtual European Solar and Planetary Access (VESPA) project for creating a virtual observatory.

  20. RoboCup-Rescue: an international cooperative research project of robotics and AI for the disaster mitigation problem

    NASA Astrophysics Data System (ADS)

    Tadokoro, Satoshi; Kitano, Hiroaki; Takahashi, Tomoichi; Noda, Itsuki; Matsubara, Hitoshi; Shinjoh, Atsushi; Koto, Tetsuo; Takeuchi, Ikuo; Takahashi, Hironao; Matsuno, Fumitoshi; Hatayama, Mitsunori; Nobe, Jun; Shimada, Susumu

    2000-07-01

This paper introduces the RoboCup-Rescue Simulation Project, a contribution to the disaster mitigation, search and rescue problem. A comprehensive urban disaster simulator is constructed on distributed computers. Heterogeneous intelligent agents such as fire fighters, victims and volunteers conduct search and rescue activities in this virtual disaster world. A real-world interface integrates the various sensor systems and infrastructure controllers of real cities with the virtual world. Real-time simulation is synchronized with actual disasters, computing the complex relationships between various damage factors and agent behaviors. A mission-critical man-machine interface provides portability and robustness for disaster mitigation centers, along with augmented-reality interfaces for rescue in real disasters. It also provides a virtual-reality training function for the public. This diverse spectrum of RoboCup-Rescue activities contributes to the creation of a safer social system.

  1. Ambient Intelligence in Multimedia and Virtual Reality Environments for Rehabilitation

    NASA Astrophysics Data System (ADS)

    Benko, Attila; Cecilia, Sik Lanyi

    This chapter presents a general overview of the use of multimedia and virtual reality in rehabilitation and in assistive and preventive healthcare. It deals with AI-based multimedia and virtual reality applications intended for use by medical doctors, nurses, special-education teachers and other interested persons, and describes how multimedia and virtual reality can assist their work, including how these technologies can help patients' everyday lives and rehabilitation. In the second part of the chapter we present the Virtual Therapy Room (VTR), an application for aphasic patients created for practicing communication and expressing emotions in a group-therapy setting. The VTR shows a room containing a virtual therapist and four virtual patients (avatars). The avatars use their knowledge base to answer the user's questions, providing an AI environment for rehabilitation. The user of the VTR is the aphasic patient, who has to solve the exercises. The picture relevant to the current task appears on a virtual blackboard, and the patient answers the virtual therapist's questions about pictures describing an activity or an object at different levels of difficulty. The patient can ask an avatar for the answer; if the avatar knows it, the avatar's emotion changes from sad to happy. Each avatar expresses its emotions in different dimensions: its behavior, facial expression, voice tone and responses all change. The emotion system can be described as a deterministic finite automaton whose states are emotion-states and whose transition function is derived from the input-response reactions of an avatar. Natural language processing techniques were also implemented to establish high-quality human-computer interface windows for each avatar, through which aphasic patients can interact with the avatars. 
At the end of the chapter we outline possible directions for future research.
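The avatar emotion system described above can be sketched as a small deterministic finite automaton. The state names, input events, and transitions below are illustrative assumptions for demonstration, not taken from the chapter:

```python
# Minimal DFA sketch of an avatar emotion system: states are emotions, and the
# transition function maps (current emotion, input event) to the next emotion.
# All names here are hypothetical.

EMOTIONS = {"neutral", "happy", "sad"}

TRANSITIONS = {
    ("neutral", "knows_answer"): "happy",
    ("neutral", "unknown_answer"): "sad",
    ("sad", "knows_answer"): "happy",      # avatar knows the answer -> happy
    ("happy", "unknown_answer"): "sad",
}

def step(state, event):
    """Advance the emotion state; unmatched inputs leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "sad"
state = step(state, "knows_answer")  # -> "happy"
```

In a full system each state would also drive the avatar's face mimic, voice tone, and response selection, as the abstract describes.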

  2. Extending the Body to Virtual Tools Using a Robotic Surgical Interface: Evidence from the Crossmodal Congruency Task

    PubMed Central

    Sengül, Ali; van Elk, Michiel; Rognini, Giulio; Aspell, Jane Elizabeth; Bleuler, Hannes; Blanke, Olaf

    2012-01-01

    The effects of real-world tool use on body or space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use results in a facilitated integration of multisensory information in peripersonal space, i.e. the space directly surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in the field of surgical robotics, in which a surgeon may use bimanual haptic interfaces to control a surgery robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgery robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which subjects responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of the crossmodal congruency effects, comparable to changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered strongly with tactile stimuli that were connected with the hand via the tool, reflecting a remapping of peripersonal space. Such remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when the tools were held passively (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools using techniques from haptics and virtual reality. 
We discuss our data with respect to learning and human factors in the field of surgical robotics and discuss the use of new technologies in the field of cognitive neuroscience. PMID:23227142

  3. Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task.

    PubMed

    Sengül, Ali; van Elk, Michiel; Rognini, Giulio; Aspell, Jane Elizabeth; Bleuler, Hannes; Blanke, Olaf

    2012-01-01

    The effects of real-world tool use on body or space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use results in a facilitated integration of multisensory information in peripersonal space, i.e. the space directly surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in the field of surgical robotics, in which a surgeon may use bimanual haptic interfaces to control a surgery robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgery robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which subjects responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of the crossmodal congruency effects, comparable to changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered strongly with tactile stimuli that were connected with the hand via the tool, reflecting a remapping of peripersonal space. Such remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when the tools were held passively (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools using techniques from haptics and virtual reality. 
We discuss our data with respect to learning and human factors in the field of surgical robotics and discuss the use of new technologies in the field of cognitive neuroscience.

  4. Learning Intercultural Communication Skills with Virtual Humans: Feedback and Fidelity

    ERIC Educational Resources Information Center

    Lane, H. Chad; Hays, Matthew Jensen; Core, Mark G.; Auerbach, Daniel

    2013-01-01

    In the context of practicing intercultural communication skills, we investigated the role of fidelity in a game-based, virtual learning environment as well as the role of feedback delivered by an intelligent tutoring system. In 2 experiments, we compared variations on the game interface, use of the tutoring system, and the form of the feedback.…

  5. A Head in Virtual Reality: Development of A Dynamic Head and Neck Model

    ERIC Educational Resources Information Center

    Nguyen, Ngan; Wilson, Timothy D.

    2009-01-01

    Advances in computer and interface technologies have made it possible to create three-dimensional (3D) computerized models of anatomical structures for visualization, manipulation, and interaction in a virtual 3D environment. In the past few decades, a multitude of digital models have been developed to facilitate complex spatial learning of the…

  6. Designing Empathy: The Role of a "Control Room" in an E-Learning Environment

    ERIC Educational Resources Information Center

    Gentes, Annie; Cambone, Marie

    2013-01-01

    Purpose: The purpose of this paper is to focus on the challenge of designing an interface for a virtual class, where being represented together contributes to the learning process. It explores the possibility of virtual empathy. Design/methodology/approach: The challenges are: How can this feeling of empathy be recreated through a delicate staging…

  7. From Antarctica to space: Use of telepresence and virtual reality in control of remote vehicles

    NASA Technical Reports Server (NTRS)

    Stoker, Carol; Hine, Butler P., III; Sims, Michael; Rasmussen, Daryl; Hontalas, Phil; Fong, Terrence W.; Steele, Jay; Barch, Don; Andersen, Dale; Miles, Eric

    1994-01-01

    In the Fall of 1993, NASA Ames deployed a modified Phantom S2 Remotely-Operated underwater Vehicle (ROV) into an ice-covered sea environment near McMurdo Science Station, Antarctica. This deployment was part of the Antarctic Space Analog Program, a joint program between NASA and the National Science Foundation to demonstrate technologies relevant to space exploration in a realistic field setting in the Antarctic. The goal of the mission was to operationally test the use of telepresence and virtual reality technology in the operator interface to a remote vehicle while performing a benthic ecology study. The vehicle was operated both locally, from above a dive hole in the ice through which it was launched, and remotely over a satellite communications link from a control room at NASA's Ames Research Center. Local control of the vehicle was accomplished using the standard Phantom control box containing joysticks and switches, with the operator viewing stereo video camera images on a stereo display monitor. Remote control of the vehicle over the satellite link was accomplished using the Virtual Environment Vehicle Interface (VEVI) control software developed at NASA Ames. The remote operator interface included either a stereo display monitor similar to that used locally or a stereo head-mounted, head-tracked display. The compressed video signal from the vehicle was transmitted to NASA Ames over a 768 Kbps satellite channel. Another channel was used to provide a bi-directional Internet link to the vehicle control computer through which the command and telemetry signals traveled, along with a bi-directional telephone service. In addition to the live stereo video from the satellite link, the operator could view a computer-generated graphic representation of the underwater terrain, modeled from the vehicle's sensors. 
The virtual environment contained an animated graphic model of the vehicle which reflected the state of the actual vehicle, along with ancillary information such as the vehicle track, science markers, and locations of video snapshots. The actual vehicle was driven either from within the virtual environment or through a telepresence interface. All vehicle functions could be controlled remotely over the satellite link.

  8. Script-theory virtual case: A novel tool for education and research.

    PubMed

    Hayward, Jake; Cheung, Amandy; Velji, Alkarim; Altarejos, Jenny; Gill, Peter; Scarfe, Andrew; Lewis, Melanie

    2016-11-01

    Context/Setting: The script theory of diagnostic reasoning proposes that clinicians evaluate cases in the context of an "illness script," iteratively testing internal hypotheses against new information until eventually reaching a diagnosis. We present a novel tool for teaching diagnostic reasoning to undergraduate medical students based on an adaptation of script theory. We developed a virtual patient case that used clinically authentic audio and video, interactive three-dimensional (3D) body images, and a simulated electronic medical record. Next, we used interactive slide bars to record respondents' likelihood estimates of diagnostic possibilities at various stages of the case. Responses were dynamically compared to data from expert clinicians and peers. Comparative frequency distributions were presented to the learner, and final diagnostic likelihood estimates were analyzed. Detailed student feedback was collected. Over two academic years, 322 students participated. Student diagnostic likelihood estimates were similar from year to year but consistently differed from expert clinician estimates. Student feedback was overwhelmingly positive: students found the case novel, innovative, clinically authentic, and a valuable learning experience. We demonstrate the successful implementation of a novel approach to teaching diagnostic reasoning. Future study may delineate the reasoning processes associated with differences between novice and expert responses.
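The comparison step described above, matching a learner's slide-bar likelihood estimates against stored expert responses, can be sketched as follows. The function, diagnosis names, and numbers are invented for illustration and are not drawn from the study:

```python
# Hypothetical sketch: compare a student's diagnostic likelihood estimates
# (0..1 per diagnosis) against the mean of a panel of expert estimates.

def compare_to_experts(student, expert_panel):
    """Return, per diagnosis, the student's estimate minus the expert mean."""
    diffs = {}
    for dx, estimate in student.items():
        expert_mean = sum(e[dx] for e in expert_panel) / len(expert_panel)
        diffs[dx] = round(estimate - expert_mean, 3)
    return diffs

student = {"appendicitis": 0.7, "gastroenteritis": 0.2}
experts = [
    {"appendicitis": 0.5, "gastroenteritis": 0.3},
    {"appendicitis": 0.6, "gastroenteritis": 0.2},
]
print(compare_to_experts(student, experts))
# {'appendicitis': 0.15, 'gastroenteritis': -0.05}
```

A positive difference means the student rated a diagnosis as more likely than the expert panel did at that stage of the case.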

  9. Virtual hand: a 3D tactile interface to virtual environments

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.; Borrel, Paul

    2008-02-01

    We introduce a novel system that allows users to experience the sensation of touch in a computer graphics environment. In this system, the user places his/her hand on an array of pins, which is moved about space on a 6 degree-of-freedom robot arm. The surface of the pins defines a surface in the virtual world. This "virtual hand" can move about the virtual world. When the virtual hand encounters an object in the virtual world, the heights of the pins are adjusted so that they represent the object's shape, surface, and texture. A control system integrates pin and robot arm motions to transmit information about objects in the computer graphics world to the user. It also allows the user to edit, change and move the virtual objects, shapes and textures. This system provides a general framework for touching, manipulating, and modifying objects in a 3-D computer graphics environment, which may be useful in a wide range of applications, including computer games, computer aided design systems, and immersive virtual worlds.
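The core idea, setting each pin's height by sampling the virtual object's surface under the hand's current position, can be sketched as below. The grid size, pin pitch, and surface function are assumptions for demonstration, not the authors' implementation:

```python
# Illustrative sketch of a tactile pin array: when the virtual hand is at
# (hand_x, hand_y), each pin's height is set by sampling a virtual surface
# directly beneath it. All dimensions and the surface shape are hypothetical.
import math

GRID = 4           # pins per side of the square array
PIN_PITCH = 0.01   # meters between adjacent pins
MAX_HEIGHT = 0.005 # maximum pin extension in meters

def surface_height(x, y):
    """Hypothetical virtual-object surface: a smooth bump centered at the origin."""
    return MAX_HEIGHT * math.exp(-(x**2 + y**2) / 0.0004)

def pin_heights(hand_x, hand_y):
    """Sample the surface under each pin for the hand's current position."""
    heights = []
    for i in range(GRID):
        for j in range(GRID):
            px = hand_x + (i - (GRID - 1) / 2) * PIN_PITCH
            py = hand_y + (j - (GRID - 1) / 2) * PIN_PITCH
            heights.append(surface_height(px, py))
    return heights
```

In the system described above, a control loop would additionally coordinate these pin heights with the 6-degree-of-freedom robot-arm motion carrying the array.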

  10. Virtual Guidance Ultrasound: A Tool to Obtain Diagnostic Ultrasound for Remote Environments

    NASA Technical Reports Server (NTRS)

    Caine, Timothy L.; Martin, David S.; Matz, Timothy; Lee, Stuart M. C.; Stenger, Michael B.; Platts, Steven H.

    2012-01-01

    Astronauts currently acquire ultrasound images on the International Space Station with the assistance of real-time remote guidance from an ultrasound expert in Mission Control. Remote guidance will not be feasible when significant communication delays exist during exploration missions beyond low-Earth orbit. For example, there may be as much as a 20-minute delay in communications between the Earth and Mars. Virtual guidance, a pre-recorded audio-visual tutorial viewed in real time, is a viable modality for minimally trained scanners to obtain diagnostically adequate images of clinically relevant anatomical structures in an autonomous manner. METHODS: Inexperienced ultrasound operators were recruited to perform carotid artery (n = 10) and ophthalmic (n = 9) ultrasound examinations using virtual guidance as their only instructional tool. In the carotid group, each untrained operator acquired two-dimensional, pulsed, and color Doppler images of the carotid artery. In the ophthalmic group, operators acquired representative images of the anterior chamber of the eye, retina, optic nerve, and nerve sheath. Ultrasound image quality was evaluated by independent imaging experts. RESULTS: Eight of the 10 carotid studies were judged to be diagnostically adequate. With one exception, the quality of all the ophthalmic images was adequate to excellent. CONCLUSION: Diagnostically adequate carotid and ophthalmic ultrasound examinations can be obtained by untrained operators with instruction only from an audio/video tutorial viewed in real time while scanning. This form of quick-response guidance, which can be developed for other ultrasound examinations, represents an opportunity to acquire important medical and scientific information for NASA flight surgeons and researchers when trained medical personnel are not present. 
Further, virtual guidance will allow untrained personnel to autonomously obtain important medical information in remote locations on Earth where communication is difficult or absent.

  11. An Integrated FDD System for HVAC&R Based on Virtual Sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Woohyun

    According to the U.S. Department of Energy, space heating, ventilation and air conditioning systems account for 40% of residential primary energy use and for 30% of primary energy use in commercial buildings. A study released by the Energy Information Administration indicated that packaged air conditioners are widely used, appearing in 46% of all commercial buildings in the U.S., and that the annual cooling energy consumption related to packaged air conditioners is about 160 trillion Btu. Therefore, an automated FDD system that can automatically detect and diagnose faults and evaluate fault impacts has the potential to improve energy efficiency while reducing service costs and comfort complaints. The primary bottleneck to diagnostic implementation in the field is the high initial cost of additional sensors. To address this limitation, virtual sensors combining low-cost measurements with simple models are developed to estimate quantities that would be expensive or difficult to measure directly. The use of virtual sensors can reduce costs compared to the use of real sensors and provide additional information for economic assessment. A virtual sensor can be embedded in a permanently installed control or monitoring system, and continuous monitoring potentially leads to early detection of faults. The virtual sensors of individual equipment components can be integrated to estimate overall diagnostic information using the output of each virtual sensor.
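The virtual-sensor idea, estimating a hard-to-measure quantity from cheap measurements plus a simple model, can be illustrated with a toy example. The sensor, the quadratic map, and its coefficients below are invented for demonstration and are not from the report:

```python
# Toy "virtual sensor": infer compressor power (expensive to meter directly)
# from two cheap pressure measurements via an assumed fitted compressor map.

def virtual_compressor_power(p_suction, p_discharge, coeffs=(0.2, 1.5, 0.05)):
    """Estimate compressor power (kW) from the pressure ratio.

    coeffs are illustrative regression coefficients; in practice they would be
    fitted to manufacturer rating data for the specific unit.
    """
    a, b, c = coeffs
    ratio = p_discharge / p_suction
    return a + b * ratio + c * ratio**2

# Example: suction 1.0 bar, discharge 3.0 bar -> ratio 3.0
power_estimate = virtual_compressor_power(1.0, 3.0)
```

A continuously computed estimate like this can then feed an FDD rule, e.g. flagging a fault when the estimate drifts from a separately metered baseline.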

  12. Surgery applications of virtual reality

    NASA Technical Reports Server (NTRS)

    Rosen, Joseph

    1994-01-01

    Virtual reality is a computer-generated technology which allows information to be displayed in a simulated, but lifelike, environment. In this simulated 'world', users can move and interact as if they were actually a part of that world. This new technology will be useful in many different fields, including the field of surgery. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and perform surgical procedures (telesurgery), and predict the outcomes of surgery. The authors of this paper describe the basic components of a virtual reality surgical system. These components include: the virtual world, the virtual tools, the anatomical model, the software platform, the host computer, the interface, and the head-coupled display. They also review the progress towards using virtual reality for surgical training, planning, telesurgery, and predicting outcomes. Finally, the authors present a training system being developed for the practice of new procedures in abdominal surgery.

  13. myChEMBL: a virtual machine implementation of open data and cheminformatics tools.

    PubMed

    Ochoa, Rodrigo; Davies, Mark; Papadatos, George; Atkinson, Francis; Overington, John P

    2014-01-15

    myChEMBL is a completely open platform, which combines public domain bioactivity data with open source database and cheminformatics technologies. myChEMBL consists of a Linux (Ubuntu) Virtual Machine featuring a PostgreSQL schema with the latest version of the ChEMBL database, as well as the latest RDKit cheminformatics libraries. In addition, a self-contained web interface is available, which can be modified and improved according to user specifications. The VM is available at: ftp://ftp.ebi.ac.uk/pub/databases/chembl/VM/myChEMBL/current. The web interface and web services code is available at: https://github.com/rochoa85/myChEMBL.

  14. Experiments and Analysis on a Computer Interface to an Information-Retrieval Network.

    ERIC Educational Resources Information Center

    Marcus, Richard S.; Reintjes, J. Francis

    A primary goal of this project was to develop an interface that would provide direct access for inexperienced users to existing online bibliographic information retrieval networks. The experiment tested the concept of a virtual-system mode of access to a network of heterogeneous interactive retrieval systems and databases. An experimental…

  15. The Design of Motivational Agents and Avatars

    ERIC Educational Resources Information Center

    Baylor, Amy L.

    2011-01-01

    While the addition of an anthropomorphic interface agent to a learning system generally has little direct impact on learning, it potentially has a huge impact on learner motivation. As such agents become increasingly ubiquitous on the Internet, in virtual worlds, and as interfaces for learning and gaming systems, it is important to design them to…

  16. User Acceptance of a Haptic Interface for Learning Anatomy

    ERIC Educational Resources Information Center

    Yeom, Soonja; Choi-Lundberg, Derek; Fluck, Andrew; Sale, Arthur

    2013-01-01

    Visualizing the structure and relationships in three dimensions (3D) of organs is a challenge for students of anatomy. To provide an alternative way of learning anatomy engaging multiple senses, we are developing a force-feedback (haptic) interface for manipulation of 3D virtual organs, using design research methodology, with iterations of system…

  17. Self-Observation Model Employing an Instinctive Interface for Classroom Active Learning

    ERIC Educational Resources Information Center

    Chen, Gwo-Dong; Nurkhamid; Wang, Chin-Yeh; Yang, Shu-Han; Chao, Po-Yao

    2014-01-01

    In a classroom, obtaining active, whole-focused, and engaging learning results from a design is often difficult. In this study, we propose a self-observation model that employs an instinctive interface for classroom active learning. Students can communicate with virtual avatars in the vertical screen and can react naturally according to the…

  18. Telepresence: A "Real" Component in a Model to Make Human-Computer Interface Factors Meaningful in the Virtual Learning Environment

    ERIC Educational Resources Information Center

    Selverian, Melissa E. Markaridian; Lombard, Matthew

    2009-01-01

    A thorough review of the research relating to Human-Computer Interface (HCI) form and content factors in the education, communication and computer science disciplines reveals strong associations of meaningful perceptual "illusions" with enhanced learning and satisfaction in the evolving classroom. Specifically, associations emerge…

  19. A Web Service and Interface for Remote Electronic Device Characterization

    ERIC Educational Resources Information Center

    Dutta, S.; Prakash, S.; Estrada, D.; Pop, E.

    2011-01-01

    A lightweight Web Service and a Web site interface have been developed which enable remote measurements of electronic devices as a "virtual laboratory" for undergraduate engineering classes. Using standard browsers (such as Internet Explorer, Firefox, or even Safari on an iPhone) without additional plugins, remote users can control a Keithley…

  20. Development of a new virtual diagnostic for V3FIT

    NASA Astrophysics Data System (ADS)

    Trevisan, G. L.; Cianciosa, M. R.; Terranova, D.; Hanson, J. D.

    2014-12-01

    The determination of plasma equilibria from diagnostic information is a fundamental issue. V3FIT is a fully three-dimensional reconstruction code capable of solving the inverse problem using both magnetic and kinetic measurements. It uses VMEC as its core equilibrium solver and supports both free- and fixed-boundary reconstruction approaches. In fixed-boundary mode VMEC does not use explicit information about the currents in external coils, even though these currents have important effects on the shape of the safety-factor profile. Indeed, the edge safety factor sets the reversal position in RFP plasmas, which in turn determines the position of the m = 0 island chain and the edge transport properties. To exploit this information, a new virtual diagnostic has been developed that, through Ampère's law, relates the external current passing through the center of the torus to the circulation of the toroidal magnetic field on the outermost flux surface. Reconstructions that use the new diagnostic are found to better reproduce the experimental data relating to edge physics.
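    The relation underlying the new diagnostic is Ampère's law applied to a toroidally closed loop lying on the outermost flux surface, which links the net external current through the center of the torus:

```latex
\oint_{\mathcal{C}} \mathbf{B} \cdot d\mathbf{l}
  = \oint_{\mathcal{C}} B_{\phi}\, dl
  = \mu_0 \, I_{\mathrm{ext}}
```

    Here $\mathcal{C}$ is a curve closing the toroidal way around on the outermost flux surface, $B_{\phi}$ is the toroidal magnetic field, and $I_{\mathrm{ext}}$ is the net external (coil) current linked by the loop.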

  1. iHand: an interactive bare-hand-based augmented reality interface on commercial mobile phones

    NASA Astrophysics Data System (ADS)

    Choi, Junyeong; Park, Jungsik; Park, Hanhoon; Park, Jong-Il

    2013-02-01

    The performance of mobile phones has rapidly improved, and they are emerging as a powerful platform. In many vision-based applications, human hands play a key role in natural interaction. However, relatively little attention has been paid to the interaction between human hands and the mobile phone. Thus, we propose a vision- and hand-gesture-based interface in which the user holds a mobile phone in one hand and views the other hand's palm through the built-in camera. Virtual contents are faithfully rendered on the user's palm through palm pose estimation, and interaction through hand and finger movements is achieved by means of hand-shape recognition. Since the proposed interface is based on hand gestures familiar to humans and does not require any additional sensors or markers, the user can freely interact with virtual contents anytime and anywhere without any training. We demonstrate that the proposed interface works at over 15 fps on a commercial mobile phone with a 1.2-GHz dual-core processor and 1 GB of RAM.

  2. A Simple and Customizable Web Interface to the Virtual Solar Observatory

    NASA Astrophysics Data System (ADS)

    Hughitt, V. Keith; Hourcle, J.; Suarez-Sola, I.; Davey, A.

    2010-05-01

    As the variety and number of solar data sources continue to increase at a rapid rate, providing methods to search through these sources becomes increasingly important. By taking advantage of the power of modern JavaScript libraries, a new version of the Virtual Solar Observatory's web interface aims to provide a significantly faster and simpler way to explore the multitude of available data repositories. Querying asynchronously not only eliminates bottlenecks caused by slow or unresponsive data providers, but also allows results to be displayed as soon as they are returned. Implicit pagination and post-query filtering enable users to work with large result sets, while a more modular and customizable UI provides a mechanism for adjusting both the look-and-feel and the behavior of the VSO web interface. Finally, the new web interface features a custom widget system capable of displaying additional tools and information alongside the standard VSO search form. Interested users can also write their own widgets and submit them for future incorporation into VSO.
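    The display-as-returned querying strategy described above can be sketched in Python with asyncio (the VSO interface itself is JavaScript, but the pattern is the same). The provider names and latencies below are illustrative stand-ins, not real VSO endpoints:

```python
import asyncio

# Hypothetical stand-ins for data providers; a real client would issue
# HTTP queries here. Each provider has a different (simulated) latency.
async def query_provider(name: str, delay: float) -> tuple[str, int]:
    await asyncio.sleep(delay)  # simulate the network round-trip
    return name, 42             # (provider name, number of records found)

async def search_all() -> list[tuple[str, int]]:
    tasks = [
        asyncio.create_task(query_provider("SDAC", 0.02)),
        asyncio.create_task(query_provider("NSO", 0.01)),
        asyncio.create_task(query_provider("slow-archive", 0.05)),
    ]
    results = []
    # as_completed yields each result as soon as its provider responds,
    # so one slow archive never blocks display of the others.
    for fut in asyncio.as_completed(tasks):
        results.append(await fut)
    return results

if __name__ == "__main__":
    for provider, n in asyncio.run(search_all()):
        print(f"{provider}: {n} records")
```

    Because results are consumed in completion order, the fastest provider appears first and the slowest last, which is the bottleneck-elimination behavior the abstract describes.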

  3. The Use of Virtual Reality Simulation to Improve Technical Skill in the Undergraduate Medical Imaging Student

    ERIC Educational Resources Information Center

    Gunn, Therese; Jones, Lee; Bridge, Pete; Rowntree, Pam; Nissen, Lisa

    2018-01-01

    In recent years, simulation has increasingly underpinned the acquisition of pre-clinical skills by undergraduate medical imaging (diagnostic radiography) students. This project aimed to evaluate the impact of an innovative virtual reality (VR) learning environment on the development of technical proficiency by students. The study assessed the…

  4. The Design of Wayfinding Affordance and Its Influence on Task Performance and Perceptual Experience in Desktop Virtual Environments

    ERIC Educational Resources Information Center

    Choi, Gil Ok

    2008-01-01

    For the past few years, virtual environments (VEs) have gained broad attention from both scholarly and practitioner communities. However, in spite of intense and widespread efforts, most VE-related research has focused on the technical aspects of applications, and the necessary theoretical framework to assess the quality of interfaces and designs…

  5. STS-105 Crew Training in VR Lab

    NASA Image and Video Library

    2001-03-15

    JSC2001-00751 (15 March 2001) --- Astronaut Scott J. Horowitz, STS-105 mission commander, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Discovery. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with International Space Station (ISS) elements.

  6. Photographic coverage of STS-112 during EVA 3 in VR Lab.

    NASA Image and Video Library

    2002-08-21

    JSC2002-E-34622 (21 August 2002) --- Astronaut David A. Wolf, STS-112 mission specialist, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Atlantis. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with ISS elements.

  7. STS-115 Virtual Lab Training

    NASA Image and Video Library

    2005-06-07

    JSC2005-E-21191 (7 June 2005) --- Astronaut Steven G. MacLean, STS-115 mission specialist representing the Canadian Space Agency, uses the virtual reality lab at the Johnson Space Center to train for his duties aboard the space shuttle. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  8. STS-105 Crew Training in VR Lab

    NASA Image and Video Library

    2001-03-15

    JSC2001-00758 (15 March 2001) --- Astronaut Frederick W. Sturckow, STS-105 pilot, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties aboard the Space Shuttle Discovery. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with International Space Station (ISS) elements.

  9. STS-115 Virtual Lab Training

    NASA Image and Video Library

    2005-06-07

    JSC2005-E-21192 (7 June 2005) --- Astronauts Christopher J. Ferguson (left), STS-115 pilot, and Daniel C. Burbank, mission specialist, use the virtual reality lab at the Johnson Space Center to train for their duties aboard the space shuttle. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  10. Real-Life Migrants on the MUVE: Stories of Virtual Transitions

    ERIC Educational Resources Information Center

    Perkins, Ross A.; Arreguin, Cathy

    2007-01-01

    The communication and collaborative interface known as a multi-user virtual environment (MUVE), has existed since as early as the late 1970s. MUVEs refer to programs that have an animated character ("avatar") controlled by a user within a wider environment that can be explored--or built--at will. Second Life, a MUVE created by San Francisco-based…

  11. Virtual Teleoperation for Unmanned Aerial Vehicles

    DTIC Science & Technology

    2012-01-24

    Gilbert, S., "Wayfinder: Evaluating Multitouch Interaction in Supervisory Control of Unmanned Vehicles," Proceedings of ASME 2nd World Conference on… An interactive virtual reality environment that fuses available information into a coherent picture that can be viewed from multiple perspectives and scales… for multimodal interaction • Generally abstracted controller hardware and graphical interfaces facilitating deployment on a variety of VR platforms

  12. A Closed-loop Brain Computer Interface to a Virtual Reality Avatar: Gait Adaptation to Visual Kinematic Perturbations

    PubMed Central

    Luu, Trieu Phat; He, Yongtian; Brown, Samuel; Nakagome, Sho; Contreras-Vidal, Jose L.

    2016-01-01

    The control of human bipedal locomotion is of great interest to the field of lower-body brain computer interfaces (BCIs) for rehabilitation of gait. While the feasibility of a closed-loop BCI system for the control of a lower body exoskeleton has been recently shown, multi-day closed-loop neural decoding of human gait in a virtual reality (BCI-VR) environment has yet to be demonstrated. In this study, we propose a real-time closed-loop BCI that decodes lower limb joint angles from scalp electroencephalography (EEG) during treadmill walking to control the walking movements of a virtual avatar. Moreover, virtual kinematic perturbations resulting in asymmetric walking gait patterns of the avatar were also introduced to investigate gait adaptation using the closed-loop BCI-VR system over a period of eight days. Our results demonstrate the feasibility of using a closed-loop BCI to learn to control a walking avatar under normal and altered visuomotor perturbations, which involved cortical adaptations. These findings have implications for the development of BCI-VR systems for gait rehabilitation after stroke and for understanding cortical plasticity induced by a closed-loop BCI system. PMID:27713915

  13. An optical brain computer interface for environmental control.

    PubMed

    Ayaz, Hasan; Shewokis, Patricia A; Bunce, Scott; Onaral, Banu

    2011-01-01

    A brain computer interface (BCI) is a system that translates neurophysiological signals detected from the brain to supply input to a computer or to control a device. Volitional control of neural activity and its real-time detection through neuroimaging modalities are key constituents of BCI systems. The purpose of this study was to develop and test a new BCI design that utilizes intention-related cognitive activity within the dorsolateral prefrontal cortex using functional near infrared (fNIR) spectroscopy. fNIR is a noninvasive, safe, portable and affordable optical technique for monitoring hemodynamic changes in the brain's cerebral cortex. Because of its portability and ease of use, fNIR is amenable to deployment in ecologically valid natural working environments. We integrated a control paradigm in a computerized 3D virtual environment to augment interactivity. Ten healthy participants volunteered for a two-day study in which they navigated a virtual environment with keyboard inputs, but were required to use the fNIR-BCI for interaction with virtual objects. Results showed that participants consistently utilized the fNIR-BCI with an overall success rate of 84% and volitionally increased their cerebral oxygenation level to trigger actions within the virtual environment.
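    At its core, volitionally raising cerebral oxygenation to trigger an action amounts to a threshold detector on a smoothed signal. A minimal sketch of that control loop follows; the window size, threshold value, and function names are illustrative assumptions, not the authors' implementation:

```python
def moving_average(samples, window=5):
    """Smooth the raw oxygenation signal over a sliding window."""
    if len(samples) < window:
        return sum(samples) / len(samples)
    return sum(samples[-window:]) / window

def bci_trigger(oxygenation_series, baseline, threshold=0.2, window=5):
    """Return the sample index at which the smoothed signal first exceeds
    baseline + threshold (i.e. when the virtual-environment action would
    fire), or None if it never does."""
    history = []
    for i, sample in enumerate(oxygenation_series):
        history.append(sample)
        if moving_average(history, window) > baseline + threshold:
            return i
    return None
```

    Smoothing before thresholding is the key design choice: it suppresses momentary fluctuations so that only a sustained, volitional rise in the signal fires the action.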

  14. Virtual guidance as a tool to obtain diagnostic ultrasound for spaceflight and remote environments.

    PubMed

    Martin, David S; Caine, Timothy L; Matz, Timothy; Lee, Stuart M C; Stenger, Michael B; Sargsyan, Ashot E; Platts, Steven H

    2012-10-01

    With missions planned to travel greater distances from Earth at ranges that make real-time two-way communication impractical, astronauts will be required to perform autonomous medical diagnostic procedures during future exploration missions. Virtual guidance is a form of just-in-time training developed to allow novice ultrasound operators to acquire diagnostically-adequate images of clinically relevant anatomical structures using a prerecorded audio/visual tutorial viewed in real-time. Individuals without previous experience in ultrasound were recruited to perform carotid artery (N = 10) and ophthalmic (N = 9) ultrasound examinations using virtual guidance as their only training tool. In the carotid group, each untrained operator acquired two-dimensional, pulsed and color Doppler of the carotid artery. In the ophthalmic group, operators acquired representative images of the anterior chamber of the eye, retina, optic nerve, and nerve sheath. Ultrasound image quality was evaluated by independent imaging experts. Of the studies, 8 of the 10 carotid and 17 of 18 of the ophthalmic images (2 images collected per study) were judged to be diagnostically adequate. The quality of all but one of the ophthalmic images ranged from adequate to excellent. Diagnostically-adequate carotid and ophthalmic ultrasound examinations can be obtained by previously untrained operators with assistance from only an audio/video tutorial viewed in real time while scanning. This form of just-in-time training, which can be applied to other examinations, represents an opportunity to acquire important information for NASA flight surgeons and researchers when trained medical personnel are not available or when remote guidance is impractical.

  15. Virtual reality for intelligent and interactive operating, training, and visualization systems

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Juergen; Schluse, Michael

    2000-10-01

    Virtual reality methods allow a new and intuitive way of communication between man and machine. The basic idea of virtual reality (VR) is the generation of artificial, computer-simulated worlds which the user can not only look at but also actively interact with using a data glove and data helmet. The main emphasis for the use of such techniques at the IRF is the development of a new generation of operator interfaces for the control of robots and other automation components, and of intelligent training systems for complex tasks. The basic idea of the methods developed at the IRF for the realization of Projective Virtual Reality is to let users work in the virtual world as they would act in reality. The user's actions are recognized by the virtual reality system and, by means of new and intelligent control software, projected onto automation components such as robots, which then perform the actions necessary to execute the user's task in reality. In this operation mode the user no longer has to be a robot expert to generate tasks for robots or to program them, because the intelligent control software recognizes the user's intention and automatically generates the commands for nearly every automation component. Virtual reality methods are thus ideally suited for universal man-machine interfaces for the control and supervision of a large class of automation components, and for interactive training and visualization systems. The virtual reality system of the IRF, COSIMIR/VR, forms the basis for several projects, starting with the control of space automation systems in the projects CIROS, VITAL and GETEX, continuing with the realization of a comprehensive development tool for the International Space Station and, last but not least, the realistic simulation of fire extinguishing, forest machines and excavators, which will be presented in the final paper in addition to the key ideas of this virtual reality system.

  16. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of the CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of the time evolution of the means in any specified geographical region, (3) the calculation of the correlation between two variables, (4) the calculation of the difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine, while user interaction with the system will remain the same through web-browser interfaces. The summer school will serve as a valuable testbed for the tool development, preparing CMDA to serve its target community: the Earth-science modeling and model-analysis community.
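    Analysis capability (1), seasonal means of a physical variable, can be sketched in a few lines of Python. The season grouping (DJF/MAM/JJA/SON) is the standard climatological convention, but the function and data layout below are assumptions for illustration, not CMDA's actual code:

```python
# Group a monthly time series into the four standard climatological seasons
# and compute the mean of each group.
SEASONS = {
    "DJF": (12, 1, 2),   # boreal winter
    "MAM": (3, 4, 5),    # spring
    "JJA": (6, 7, 8),    # summer
    "SON": (9, 10, 11),  # autumn
}

def seasonal_means(monthly):
    """monthly: list of (month_number, value) pairs.
    Returns a dict mapping season name to the mean of its values."""
    out = {}
    for season, months in SEASONS.items():
        vals = [v for m, v in monthly if m in months]
        out[season] = sum(vals) / len(vals) if vals else None
    return out
```

    For example, feeding in one value per calendar month gives one mean per season; a gridded implementation would apply the same grouping per grid cell, typically with area weighting.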

  17. Enhancing Navigation Skills through Audio Gaming.

    PubMed

    Sánchez, Jaime; Sáenz, Mauricio; Pascual-Leone, Alvaro; Merabet, Lotfi

    2010-01-01

    We present the design, development and initial cognitive evaluation of an Audio-based Environment Simulator (AbES). This software allows a blind user to navigate through a virtual representation of a real space for the purposes of training orientation and mobility skills. Our findings indicate that users feel satisfied and self-confident when interacting with the audio-based interface, and the embedded sounds allow them to correctly orient themselves and navigate within the virtual world. Furthermore, users are able to transfer spatial information acquired through virtual interactions into real world navigation and problem solving tasks.

  18. STS-134 crew in Virtual Reality Lab during their MSS/EVAA SUPT2 Team training

    NASA Image and Video Library

    2010-08-27

    JSC2010-E-121045 (27 Aug. 2010) --- NASA astronaut Andrew Feustel (right), STS-134 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. David Homan assisted Feustel. Photo credit: NASA or National Aeronautics and Space Administration

  19. Current Status of VO Compliant Data Service in Japanese Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Shirasaki, Y.; Komiya, Y.; Ohishi, M.; Mizumoto, Y.; Ishihara, Y.; Tsutsumi, J.; Hiyama, T.; Nakamoto, H.; Sakamoto, M.

    2012-09-01

    In recent years, standards for building a Virtual Observatory (VO) data service have been established through the efforts of the International Virtual Observatory Alliance (IVOA). We applied these newly established standards (SSAP, TAP) to our VO service toolkit, which was developed to implement the earlier VO standards SIAP and the (now deprecated) SkyNode. The toolkit can be easily installed and provides a GUI for constructing and managing a VO service. In this paper, we describe the architecture of our toolkit and how it is used to start hosting a VO service.

  20. Enhancing Navigation Skills through Audio Gaming

    PubMed Central

    Sánchez, Jaime; Sáenz, Mauricio; Pascual-Leone, Alvaro; Merabet, Lotfi

    2014-01-01

    We present the design, development and initial cognitive evaluation of an Audio-based Environment Simulator (AbES). This software allows a blind user to navigate through a virtual representation of a real space for the purposes of training orientation and mobility skills. Our findings indicate that users feel satisfied and self-confident when interacting with the audio-based interface, and the embedded sounds allow them to correctly orient themselves and navigate within the virtual world. Furthermore, users are able to transfer spatial information acquired through virtual interactions into real world navigation and problem solving tasks. PMID:25505796

  1. Virtual reality haptic dissection.

    PubMed

    Erolin, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-12-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist, and investigate cross-discipline collaborations in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills, before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively as well as qualitatively.

  2. Three-dimensional virtual acoustic displays

    NASA Technical Reports Server (NTRS)

    Wenzel, Elizabeth M.

    1991-01-01

    The development of an alternative medium for displaying information in complex human-machine interfaces is described. The 3-D virtual acoustic display is a means for accurately transferring information to a human operator using the auditory modality; it combines directional and semantic characteristics to form naturalistic representations of dynamic objects and events in remotely sensed or simulated environments. Although the technology can stand alone, it is envisioned as a component of a larger multisensory environment and will no doubt find its greatest utility in that context. The general philosophy in the design of the display has been that the development of advanced computer interfaces should be driven first by an understanding of human perceptual requirements, and only later by technological capabilities or constraints. In expanding on this view, current and potential uses of virtual acoustic displays are addressed, such displays are characterized, recent approaches to their implementation and application are reviewed, the research project at NASA-Ames is described in detail, and finally some critical research issues for the future are outlined.

  3. Learning in a Virtual Environment Using Haptic Systems for Movement Re-Education: Can This Medium Be Used for Remodeling Other Behaviors and Actions?

    PubMed Central

    Merians, Alma S; Fluet, Gerard G; Qiu, Qinyin; Lafond, Ian; Adamovich, Sergei V

    2011-01-01

    Robotic systems that are interfaced with virtual reality gaming and task simulations are increasingly being developed to provide repetitive intensive practice to promote increased compliance and facilitate better outcomes in rehabilitation post-stroke. A major development in the use of virtual environments (VEs) has been to incorporate tactile information and interaction forces into what was previously an essentially visual experience. Robots of varying complexity are being interfaced with more traditional virtual presentations to provide haptic feedback that enriches the sensory experience and adds physical task parameters. This provides forces that produce biomechanical and neuromuscular interactions with the VE that approximate real-world movement more accurately than visual-only VEs, simulating the weight and force found in upper extremity tasks. The purpose of this article is to present an overview of several systems that are commercially available for ambulation training and for training movement of the upper extremity. We will also report on the system that we have developed (NJIT-RAVR system) that incorporates motivating and challenging haptic feedback effects into VE simulations to facilitate motor recovery of the upper extremity post-stroke. The NJIT-RAVR system trains both the upper arm and the hand. The robotic arm acts as an interface between the participants and the VEs, enabling multiplanar movements against gravity in a three-dimensional workspace. The ultimate question is whether this medium can provide a motivating, challenging, gaming experience with dramatically decreased physical difficulty levels, which would allow for participation by an obese person and facilitate greater adherence to exercise regimes. PMID:21527097

  4. Generating Contextual Descriptions of Virtual Reality (VR) Spaces

    NASA Astrophysics Data System (ADS)

    Olson, D. M.; Zaman, C. H.; Sutherland, A.

    2017-12-01

    Virtual reality holds great potential for science communication, education, and research. However, interfaces for manipulating data and environments in virtual worlds are limited and idiosyncratic. Furthermore, speech and vision are the primary modalities by which humans collect information about the world, but the linking of visual and natural language domains is a relatively new pursuit in computer vision. Machine learning techniques have been shown to be effective at image and speech classification, as well as at describing images with language (Karpathy 2016), but have not yet been used to describe potential actions. We propose a technique for creating a library of possible context-specific actions associated with 3D objects in immersive virtual worlds based on a novel dataset generated natively in virtual reality containing speech, image, gaze, and acceleration data. We will discuss the design and execution of a user study in virtual reality that enabled the collection and the development of this dataset. We will also discuss the development of a hybrid machine learning algorithm linking vision data with environmental affordances in natural language. Our findings demonstrate that it is possible to develop a model which can generate interpretable verbal descriptions of possible actions associated with recognized 3D objects within immersive VR environments. This suggests promising applications for more intuitive user interfaces through voice interaction within 3D environments. It also demonstrates the potential to apply vast bodies of embodied and semantic knowledge to enrich user interaction within VR environments. This technology would allow for applications such as expert knowledge annotation of 3D environments, complex verbal data querying and object manipulation in virtual spaces, and computer-generated, dynamic 3D object affordances and functionality during simulations.

  5. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    PubMed

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  6. The photoelectric effect and study of the diffraction of light: Two new experiments in UNILabs virtual and remote laboratories network

    NASA Astrophysics Data System (ADS)

    Pedro Sánchez, Juan; Sáenz, Jacobo; de la Torre, Luis; Carreras, Carmen; Yuste, Manuel; Heradio, Rubén; Dormido, Sebastián

    2016-05-01

    This work describes two experiments: "study of the diffraction of light: Fraunhofer approximation" and "the photoelectric effect". Each includes a virtual, simulated version of the experiment as well as a real one that can be operated remotely. These virtual and remote labs (built using Easy Java(script) Simulations) are integrated in UNILabs, a network of online interactive laboratories based on the free learning management system Moodle. In this web environment, students can find not only the virtual and remote labs but also manuals covering the related theory, a description of the user interface of each application, and so on.

  7. Compcon '95 Paper

    Science.gov Websites

    …the user has the ability to view various parts of a frog from many different angles… the Virtual Dissection Kit has been accessed by 50,000 different sites, in over 50 different… a document that is a translation of the user interface into a different language (the Virtual Frog…

  8. STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus

    NASA Image and Video Library

    2007-08-09

    JSC2007-E-41539 (9 Aug. 2007) --- Astronaut Pamela A. Melroy, STS-120 commander, uses the virtual reality lab at Johnson Space Center to train for her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  9. STS-111 Training in VR lab with Expedition IV and V Crewmembers

    NASA Image and Video Library

    2001-10-18

    JSC2001-E-39090 (18 October 2001) --- Cosmonaut Valeri G. Korzun, Expedition Five mission commander representing Rosaviakosmos, uses the virtual reality lab at the Johnson Space Center (JSC) to train for his duties on the International Space Station (ISS). This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with ISS elements.

  10. STS-EVA Mass Ops training of the STS-117 EVA crewmembers

    NASA Image and Video Library

    2006-11-01

    JSC2006-E-47612 (1 Nov. 2006) --- Astronaut Steven R. Swanson, STS-117 mission specialist, uses the virtual reality lab at Johnson Space Center to train for his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  11. STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus

    NASA Image and Video Library

    2007-08-09

    JSC2007-E-41532 (9 Aug. 2007) --- Astronaut Stephanie D. Wilson, STS-120 mission specialist, uses the virtual reality lab at Johnson Space Center to train for her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  12. STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus

    NASA Image and Video Library

    2007-08-09

    JSC2007-E-41531 (9 Aug. 2007) --- Astronaut Pamela A. Melroy, STS-120 commander, uses the virtual reality lab at Johnson Space Center to train for her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  13. On the Usability and Likeability of Virtual Reality Games for Education: The Case of VR-ENGAGE

    ERIC Educational Resources Information Center

    Virvou, Maria; Katsionis, George

    2008-01-01

    Educational software games aim at increasing the students' motivation and engagement while they learn. However, if software games are targeted to school classrooms they have to be usable and likeable by all students. Usability of virtual reality games may be a problem because these games tend to have complex user interfaces so that they are more…

  14. Second Life for Distance Language Learning: A Framework for Native/Non-Native Speaker Interactions in a Virtual World

    ERIC Educational Resources Information Center

    Tusing, Jennifer; Berge, Zane L.

    2010-01-01

    This paper examines a number of theoretical principles governing second language teaching and learning and the ways in which these principles are being applied in 3D virtual worlds such as Second Life. Also examined are the benefits to language learning afforded by the Second Life interface, including access, the availability of native speakers of…

  15. Multi-degree of freedom joystick for virtual reality simulation.

    PubMed

    Head, M J; Nelson, C A; Siu, K C

    2013-11-01

    A modular control interface and simulated virtual reality environment were designed and created in order to determine how the kinematic architecture of a control interface affects minimally invasive surgery training. A user is able to selectively determine the kinematic configuration of an input device (number, type and location of degrees of freedom) for a specific surgical simulation through the use of modular joints and constraint components. Furthermore, passive locking was designed and implemented through the use of inflated latex tubing around rotational joints in order to allow a user to step away from a simulation without unwanted tool motion. It is believed that these features will facilitate improved simulation of a variety of surgical procedures and, thus, improve surgical skills training.

  16. Development of Virtual Resource Based IoT Proxy for Bridging Heterogeneous Web Services in IoT Networks.

    PubMed

    Jin, Wenquan; Kim, DoHyeun

    2018-05-26

    The Internet of Things comprises heterogeneous devices, applications, and platforms that use multiple communication technologies to connect to the Internet for providing seamless services ubiquitously. With the requirement of developing Internet of Things products, many protocols, program libraries, frameworks, and standard specifications have been proposed. Therefore, providing a consistent interface to access services from those environments is difficult. Moreover, bridging existing web services to sensor and actuator networks is also important for providing Internet of Things services in various industry domains. In this paper, an Internet of Things proxy is proposed that is based on virtual resources to bridge heterogeneous web services from the Internet to the Internet of Things network. The proxy enables clients to have transparent access to Internet of Things devices and web services in the network. The proxy comprises a server and a client that forward messages between different communication environments using the virtual resources, with the server facing the message sender and the client facing the message receiver. We design the proxy for the Open Connectivity Foundation network, where the virtual resources are discovered by clients as Open Connectivity Foundation resources. The virtual resources represent resources that expose services provided on the Internet by web service providers. Although the services are provided by web service providers from the Internet, the client can access them using the consistent communication protocol of the Open Connectivity Foundation network. For discovering the resources to access services, the client also uses the consistent discovery interface to discover Open Connectivity Foundation devices and virtual resources.
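
    The virtual-resource bridging described above can be sketched roughly as follows; the class names, resource URIs, and handler shape here are invented for illustration and are not taken from the paper:

```python
# Sketch: the proxy registers one virtual resource per external web service,
# so an OCF-style client discovers and accesses everything through one
# consistent interface while the proxy forwards requests to the real backend.
# All names and URIs are hypothetical.

class VirtualResource:
    def __init__(self, uri, backend_url, fetch):
        self.uri = uri                  # what the OCF client discovers
        self.backend_url = backend_url  # where the real web service lives
        self._fetch = fetch             # transport used to reach the backend

    def handle_get(self):
        # forward the consistent-protocol request to the heterogeneous backend
        return self._fetch(self.backend_url)

class IoTProxy:
    def __init__(self):
        self._resources = {}

    def register(self, resource):
        self._resources[resource.uri] = resource

    def discover(self):
        # the client sees only uniform resource URIs
        return sorted(self._resources)

    def get(self, uri):
        return self._resources[uri].handle_get()

fake_weather_service = lambda url: {"temp_c": 21}   # stand-in for an HTTP GET
proxy = IoTProxy()
proxy.register(VirtualResource("/oic/weather", "http://example.com/api",
                               fake_weather_service))
print(proxy.discover())           # ['/oic/weather']
print(proxy.get("/oic/weather"))  # {'temp_c': 21}
```

    The point of the indirection is that the client never sees the backend URL or its protocol, only the uniform discovery and access interface.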

  17. Three-dimensional user interfaces for scientific visualization

    NASA Technical Reports Server (NTRS)

    VanDam, Andries (Principal Investigator)

    1996-01-01

    The focus of this grant was to experiment with novel user interfaces for scientific visualization applications using both desktop and virtual reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past three years, and subsumes all prior reports.

  18. A video, text, and speech-driven realistic 3-d virtual head for human-machine interface.

    PubMed

    Yu, Jun; Wang, Zeng-Fu

    2015-05-01

    A multiple inputs-driven realistic facial animation system based on 3-D virtual head for human-machine interface is proposed. The system can be driven independently by video, text, and speech, thus can interact with humans through diverse interfaces. The combination of parameterized model and muscular model is used to obtain a tradeoff between computational efficiency and high realism of 3-D facial animation. The online appearance model is used to track 3-D facial motion from video in the framework of particle filtering, and multiple measurements, i.e., pixel color value of input image and Gabor wavelet coefficient of illumination ratio image, are infused to reduce the influence of lighting and person dependence for the construction of online appearance model. The tri-phone model is used to reduce the computational consumption of visual co-articulation in speech synchronized viseme synthesis without sacrificing any performance. The objective and subjective experiments show that the system is suitable for human-machine interaction.

  19. The Virtual Solar Observatory: What Are We Up To Now?

    NASA Technical Reports Server (NTRS)

    Gurman, J. B.; Hill, F.; Suarez-Sola, F.; Bogart, R.; Amezcua, A.; Martens, P.; Hourcle, J.; Hughitt, K.; Davey, A.

    2012-01-01

    In the nearly ten years of a functional Virtual Solar Observatory (VSO), http://virtualsolar.org/ we have made it possible to query and access sixty-seven distinct solar data products and several event lists from nine spacecraft and fifteen observatories or observing networks. We have used existing VSO technology, and developed new software, for a distributed network of sites caching and serving SDO HMI and/or AIA data. We have also developed an application programming interface (API) that has enabled VSO search and data access capabilities in IDL, Python, and Java. We also have quite a bit of work yet to do, including completion of the implementation of access to SDO EVE data, and access to some nineteen other data sets from space- and ground-based observatories. In addition, we have been developing a new graphical user interface that will enable the saving of user interface and search preferences. We solicit community input in prioritizing our task list, and in adding to it.

  20. Evaluation of an Intelligent Tutoring System in Pathology: Effects of External Representation on Performance Gains, Metacognition, and Acceptance

    PubMed Central

    Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Tseytlin, Eugene; Roh, Ellen; Jukic, Drazen

    2007-01-01

    Objective Determine effects of computer-based tutoring on diagnostic performance gains, meta-cognition, and acceptance using two different problem representations. Describe impact of tutoring on spectrum of diagnostic skills required for task performance. Identify key features of student-tutor interaction contributing to learning gains. Design Prospective, between-subjects study, controlled for participant level of training. Resident physicians in two academic pathology programs spent four hours using one of two interfaces which differed mainly in external problem representation. The case-focused representation provided an open-learning environment in which students were free to explore evidence-hypothesis relationships within a case, but could not visualize the entire diagnostic space. The knowledge-focused representation provided an interactive representation of the entire diagnostic space, which more tightly constrained student actions. Measurements Metrics included results of pretest, post-test and retention-test for multiple choice and case diagnosis tests, ratios of performance to student reported certainty, results of participant survey, learning curves, and interaction behaviors during tutoring. Results Students had highly significant learning gains after one tutoring session. Learning was retained at one week. There were no differences between the two interfaces in learning gains on post-test or retention test. Only students in the knowledge-focused interface exhibited significant metacognitive gains from pretest to post-test and pretest to retention test. Students rated the knowledge-focused interface significantly higher than the case-focused interface. Conclusions Cognitive tutoring is associated with improved diagnostic performance in a complex medical domain. The effect is retained at one-week post-training. 
Knowledge-focused external problem representation shows an advantage over case-focused representation for metacognitive effects and user acceptance. PMID:17213494

  1. Evaluation of an intelligent tutoring system in pathology: effects of external representation on performance gains, metacognition, and acceptance.

    PubMed

    Crowley, Rebecca S; Legowski, Elizabeth; Medvedeva, Olga; Tseytlin, Eugene; Roh, Ellen; Jukic, Drazen

    2007-01-01

    Determine effects of computer-based tutoring on diagnostic performance gains, meta-cognition, and acceptance using two different problem representations. Describe impact of tutoring on spectrum of diagnostic skills required for task performance. Identify key features of student-tutor interaction contributing to learning gains. Prospective, between-subjects study, controlled for participant level of training. Resident physicians in two academic pathology programs spent four hours using one of two interfaces which differed mainly in external problem representation. The case-focused representation provided an open-learning environment in which students were free to explore evidence-hypothesis relationships within a case, but could not visualize the entire diagnostic space. The knowledge-focused representation provided an interactive representation of the entire diagnostic space, which more tightly constrained student actions. Metrics included results of pretest, post-test and retention-test for multiple choice and case diagnosis tests, ratios of performance to student reported certainty, results of participant survey, learning curves, and interaction behaviors during tutoring. Students had highly significant learning gains after one tutoring session. Learning was retained at one week. There were no differences between the two interfaces in learning gains on post-test or retention test. Only students in the knowledge-focused interface exhibited significant metacognitive gains from pretest to post-test and pretest to retention test. Students rated the knowledge-focused interface significantly higher than the case-focused interface. Cognitive tutoring is associated with improved diagnostic performance in a complex medical domain. The effect is retained at one-week post-training. Knowledge-focused external problem representation shows an advantage over case-focused representation for metacognitive effects and user acceptance.

  2. Hierarchical emotion calculation model for virtual human modelling - biomed 2010.

    PubMed

    Zhao, Yue; Wright, David

    2010-01-01

    This paper introduces a new emotion generation method for virtual human modelling. The method includes a novel hierarchical emotion structure, a group of emotion calculation equations and a simple heuristic decision-making mechanism, which enables virtual humans to perform emotionally in real time according to their internal and external factors. The emotion calculation equations used in this research were derived from psychological emotion measurements. Virtual humans can use the information in virtual memory together with the emotion calculation equations to generate their own numerical emotion states within the hierarchical emotion structure. Those emotion states are important internal references for virtual humans to adopt appropriate behaviours, and also key cues for their decision making. A simple heuristics theory is introduced and integrated into the decision-making process in order to make virtual humans' decision making more like that of a real human. A data interface connecting the emotion calculation and the decision-making structure has also been designed and simulated to test the method in the Virtools environment.
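
    A toy sketch of the pipeline the abstract describes: numerical emotion states updated from stimuli, feeding a "first acceptable option" heuristic for behaviour selection. The emotion names, blending weight, and threshold are invented for illustration; the paper's actual equations derive from psychological emotion measurements and are not reproduced here:

```python
# Hypothetical emotion-update and heuristic-decision sketch (not the paper's
# equations): blend current intensities with stimulus intensities, then pick
# the first behaviour whose triggering emotion exceeds a threshold.

def update_emotions(state, stimuli, weight=0.5):
    # blend current emotion intensities with incoming stimulus intensities
    return {e: (1 - weight) * state.get(e, 0.0) + weight * stimuli.get(e, 0.0)
            for e in set(state) | set(stimuli)}

def choose_behaviour(state, options, threshold=0.4):
    # simple heuristic: take the first behaviour whose triggering emotion is
    # strong enough, rather than scoring every alternative exhaustively
    for emotion, behaviour in options:
        if state.get(emotion, 0.0) > threshold:
            return behaviour
    return "idle"

state = {"joy": 0.2, "fear": 0.1}
state = update_emotions(state, {"fear": 0.9})        # a threatening event
print(round(state["fear"], 2))                        # 0.5
print(choose_behaviour(state, [("fear", "flee"), ("joy", "smile")]))  # flee
```

    The heuristic short-circuit is what makes the decision "more like a real human" in spirit: it satisfices instead of optimizing.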

  3. Virtual coach technology for supporting self-care.

    PubMed

    Ding, Dan; Liu, Hsin-Yi; Cooper, Rosemarie; Cooper, Rory A; Smailagic, Asim; Siewiorek, Dan

    2010-02-01

    "Virtual Coach" refers to a coaching program or device that aims to guide users through tasks for the purpose of prompting positive behavior or assisting with learning new skills. This article reviews virtual coach interventions with the purpose of guiding rehabilitation professionals to comprehend more effectively the essential components of such interventions, the underlying technologies and their integration, and example applications. A design space of virtual coach interventions, including self-monitoring, context awareness, interface modality, and coaching strategies, was identified and discussed to address when, how, and what coaching messages to deliver in an automated and intelligent way. Example applications that address various health-related issues are also provided to illustrate how a virtual coach intervention is developed and evaluated. Finally, the article provides some insight into addressing key challenges and opportunities in designing and implementing virtual coach interventions. It is expected that more virtual coach interventions will be developed in the field of rehabilitation to support self-care and prevent secondary conditions in individuals with disabilities.

  4. Food Microbiology--Design and Testing of a Virtual Laboratory Exercise

    ERIC Educational Resources Information Center

    Flint, Steve; Stewart, Terry

    2010-01-01

    A web-based virtual laboratory exercise in identifying an unknown microorganism was designed for use with a cohort of 3rd-year university food-technology students. They were presented with a food-contamination case, and then walked through a number of diagnostic steps to identify the microorganism. At each step, the students were asked to select 1…

  5. [Virtual bronchoscopy in the child using multi-slice CT: initial clinical experiences].

    PubMed

    Kirchner, J; Laufer, U; Jendreck, M; Kickuth, R; Schilling, E M; Liermann, D

    2000-01-01

    Virtual bronchoscopy of the pediatric patient has been reported to be more difficult because of artifacts due to breathing or motion. We demonstrate the benefit of the accelerated examination based on multislice spiral CT (MSCT) in the pediatric patient, which has not been reported so far. MSCT (tube voltage 120 kV, tube current 110 mA, 4 x 1 mm slice thickness, 500 ms rotation time, pitch 6) was performed on a CT scanner of the latest generation (Volume Zoom, Siemens Corp., Forchheim, Germany). In total, we examined 11 patients (median age 48 months, range 2-122 months) suspected of having tracheoesophageal fistula (n = 2), tracheobronchial narrowing due to intrinsic or extrinsic factors (n = 8), or injury of the bronchial system (n = 1). In all patients we obtained sufficient data for 3D reconstruction while avoiding general anesthesia. 6/11 examinations were described as being without pathological findings. A definite diagnosis was obtained in 10 patients. Virtual bronchoscopy could avoid other invasive diagnostic examinations in 8/11 patients (73%). Helical CT provides 3D reconstruction and virtual bronchoscopy in the newborn as well as the infant. It avoids additional diagnostic bronchoscopy in a high percentage of all cases.

  6. Shear wave velocity measurements for differential diagnosis of solid breast masses: a comparison between virtual touch quantification and virtual touch IQ.

    PubMed

    Tozaki, Mitsuhiro; Saito, Masahiro; Benson, John; Fan, Liexiang; Isobe, Sachiko

    2013-12-01

    This study compared the diagnostic performance of two shear wave speed measurement techniques in 81 patients with 83 solid breast lesions. Virtual Touch Quantification, which provides single-point shear wave speed measurement capability (SP-SWS), was compared with Virtual Touch IQ, a new 2-D shear wave imaging technique with multi-point shear wave speed measurement capability (2D-SWS). With SP-SWS, shear wave velocity was measured within the lesion ("internal" value) and the marginal areas ("marginal" value). With 2D-SWS, the highest velocity was measured. The marginal values obtained with the SP-SWS and 2D-SWS methods were significantly higher for malignant lesions and benign lesions, respectively (p < 0.0001). Sensitivity, specificity and accuracy were 86% (36/42), 90% (37/41) and 88% (73/83), respectively, for SP-SWS, and 88% (37/42), 93% (38/41) and 90% (75/83), respectively, for 2D-SWS. It is concluded that 2D-SWS is a useful diagnostic tool for differentiating malignant from benign solid breast masses. Copyright © 2013 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
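
    The reported performance figures follow directly from the raw counts given in the abstract; a small sketch (in Python, which the paper itself does not use) recomputes them from the confusion counts:

```python
# Recompute sensitivity, specificity, and accuracy for the two shear wave
# techniques from the counts reported in the abstract (42 malignant and 41
# benign lesions, 83 total).

def diagnostic_metrics(tp, fn, tn, fp):
    """Return (sensitivity, specificity, accuracy) from confusion counts."""
    sensitivity = tp / (tp + fn)            # detected malignant / all malignant
    specificity = tn / (tn + fp)            # correctly ruled-out / all benign
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# SP-SWS: 36/42 malignant detected, 37/41 benign correctly classified
sp = diagnostic_metrics(tp=36, fn=6, tn=37, fp=4)
# 2D-SWS: 37/42 and 38/41
td = diagnostic_metrics(tp=37, fn=5, tn=38, fp=3)

print([f"{v:.0%}" for v in sp])  # ['86%', '90%', '88%']
print([f"{v:.0%}" for v in td])  # ['88%', '93%', '90%']
```

    The 2-D technique's edge comes entirely from one extra true positive and one extra true negative in this sample.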

  7. Are Mash-Ups the Future for Online Learning Platforms? Psychology A-Level Students' Judgements about VLE and MUPPLE Interfaces

    ERIC Educational Resources Information Center

    Jarvis, Matt; Gauntlett, Lizzie; Collins, Hayley

    2011-01-01

    Virtual Learning Environments (VLEs) have become ubiquitous in colleges and universities but have failed to consistently improve learning (Machin, 2007). An alternative interface can be provided in the form of a mashed-up personal learning environment (MUPPLE). The aim of this study was to investigate student perceptions of its desirability and…

  8. Augmented Reality, Virtual Reality and Their Effect on Learning Style in the Creative Design Process

    ERIC Educational Resources Information Center

    Chandrasekera, Tilanka; Yoon, So-Yeon

    2018-01-01

    Research has shown that user characteristics such as preference for using an interface can result in effective use of the interface. Research has also suggested that there is a relationship between learner preference and creativity. This study uses the VARK learning styles inventory to assess students learning style then explores how this learning…

  9. Architecture, Design, and Development of an HTML/JavaScript Web-Based Group Support System.

    ERIC Educational Resources Information Center

    Romano, Nicholas C., Jr.; Nunamaker, Jay F., Jr.; Briggs, Robert O.; Vogel, Douglas R.

    1998-01-01

    Examines the need for virtual workspaces and describes the architecture, design, and development of GroupSystems for the World Wide Web (GSWeb), an HTML/JavaScript Web-based Group Support System (GSS). GSWeb, an application interface similar to a Graphical User Interface (GUI), is currently used by teams around the world and relies on user…

  10. Image Security

    DTIC Science & Technology

    1999-01-01

    These papers illustrate topics such as the development of virtual environment applications, different uses of VRML in information system…interfaces, an examination of research in virtual reality environment interfaces, and five approaches to supporting changes in virtual environments…we get false negatives that contribute to the probability of false rejection P(rej). Taking these error probabilities into account, we define a

  11. DXRaySMCS: a user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation.

    PubMed

    Bahreyni Toossi, M T; Moradi, H; Zare, H

    2008-01-01

    In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78. Generally, there is a good agreement between the two sets of spectra. No statistically significant differences have been observed between IPEM78 reported spectra and the simulated spectra generated in this study.
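
    The three-step workflow the abstract describes for DXRaySMCS can be sketched as follows. The template text, file names, executable invocation, and tally format below are hypothetical placeholders, not the actual DXRaySMCS internals or a real MCNP deck:

```python
# Sketch of the wrapper pipeline: (i) rewrite the MCNP input deck for the
# chosen technique factors, (ii) launch the transport code, (iii) reduce the
# output tally to relative photon number per energy bin. All formats are
# illustrative; a real MCNP-4C deck and output are far more involved.

import subprocess

TEMPLATE = """c diagnostic x-ray tube, {kvp} kVp, {angle} deg target angle
c ... geometry, source, and tally cards would go here ...
"""

def write_input(path, kvp, angle):
    # (i) fill the template with the requested tube settings
    with open(path, "w") as f:
        f.write(TEMPLATE.format(kvp=kvp, angle=angle))

def run_mcnp(inp, out):
    # (ii) launch the transport code (command name is illustrative)
    subprocess.run(["mcnp4c", "i=" + inp, "o=" + out], check=True)

def relative_spectrum(tally_lines):
    # (iii) normalise tallied photon counts ("energy count" pairs) so each
    # bin holds the relative photon number
    bins = [(float(e), float(n)) for e, n in
            (line.split() for line in tally_lines)]
    total = sum(n for _, n in bins)
    return [(e, n / total) for e, n in bins]

spectrum = relative_spectrum(["20.0 10", "40.0 60", "60.0 30"])
print(spectrum)  # [(20.0, 0.1), (40.0, 0.6), (60.0, 0.3)]
```

    Step (iii) is the only numerically interesting part; steps (i) and (ii) are the "user-friendly interface" conveniences the article emphasizes.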

  12. The Modeling of Virtual Environment Distance Education

    NASA Astrophysics Data System (ADS)

    Xueqin, Chang

    This research presents a virtual environment that integrates, in a virtual mockup, the services available on a university campus, allowing students and teachers in different physical locations to communicate. Advantages of this system include remote access to a variety of services and educational tools, and the representation of real structures and landscapes in an interactive 3D model that aids the localization of services and preserves the administrative organization of the university. To this end, the system implements access control for users and an interface that allows the use of existing educational equipment and resources not originally designed for distance education.

  13. Virtual reality haptic human dissection.

    PubMed

    Needham, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-01-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist and investigate the cross-discipline collaborations required in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills before they gain experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively and qualitatively.

  14. Multiprog virtual laboratory applied to PLC programming learning

    NASA Astrophysics Data System (ADS)

    Shyr, Wen-Jye

    2010-10-01

    This study develops a Multiprog virtual laboratory for mechatronics education designed to teach how to programme a programmable logic controller (PLC). The study was carried out with 34 students in the Department of Industry Education and Technology at National Changhua University of Education in Taiwan. In total, 17 students were assigned to each group, experimental and control. Two laboratory exercises were designed to provide students with experience in PLC programming. The results show that the experiments, supported by the Multiprog virtual laboratory's user-friendly control interfaces, generated positive, meaningful results in regard to students' knowledge and understanding of the material.

  15. Virtual and physical toys: open-ended features for non-formal learning.

    PubMed

    Petersson, Eva; Brooks, Anthony

    2006-04-01

    This paper examines the integrated toy--both physical and virtual--as an essential resource for collaborative learning. This learning incorporates rehabilitation, training, and education. The data derive from two different cases. Pedagogical issues related to non-formal learning and open-ended features of design are discussed. Findings suggest that social, material, and expressive affordances constitute a base for an alternative interface to encourage children's play and learning.

  16. Considerations for Adaptive Tutoring Within Serious Games: Authoring Cognitive Models and game Interfaces

    DTIC Science & Technology

    2011-06-01

    …character skills correspond to real-world player skills (transfer). In games such as World of Warcraft, "grinding" behaviors are popular (boring…Reflecting on a recent emphasis on self-directed learning using game-based simulations and virtual worlds, the authors considered key challenges in…transforming serious games and virtual worlds into adaptive training tools. This article reflects specifically on the challenges and potential of cognitive

  17. "MSN Was the next Big Thing after Beanie Babies": Children's Virtual Experiences as an Interface to Their Identities and Their Everyday Lives

    ERIC Educational Resources Information Center

    Thomas, Angela

    2006-01-01

    In this article the author explores the seamlessness between children's online and offline worlds. For children, there is no dichotomy of online and offline, or virtual and real; the digital is so much intertwined into their lives and psyche that the one is entirely enmeshed with the other. Despite early research pointing to the differences that…

  18. The Impact of User-Input Devices on Virtual Desktop Trainers

    DTIC Science & Technology

    2010-09-01

    …playing the game more enjoyable. Some of these changes include the design of controllers, the controller interface, and ergonomic changes made to…within-subjects experimental design to evaluate young active duty Soldiers' ability to move and shoot in a virtual environment using different input…sufficient gaming proficiency, resulting in more time dedicated to training military skills. We employed a within-subjects experimental design to

  19. Development of and feedback on a fully automated virtual reality system for online training in weight management skills.

    PubMed

    Thomas, J Graham; Spitalnick, Josh S; Hadley, Wendy; Bond, Dale S; Wing, Rena R

    2015-01-01

    Virtual reality (VR) technology can provide a safe environment for observing, learning, and practicing use of behavioral weight management skills, which could be particularly useful in enhancing minimal contact online weight management programs. The Experience Success (ES) project developed a system for creating and deploying VR scenarios for online weight management skills training. Virtual environments populated with virtual actors allow users to experiment with implementing behavioral skills via a PC-based point and click interface. A culturally sensitive virtual coach guides the experience, including planning for real-world skill use. Thirty-seven overweight/obese women provided feedback on a test scenario focused on social eating situations. They reported that the scenario gave them greater skills, confidence, and commitment for controlling eating in social situations. © 2014 Diabetes Technology Society.

  20. Development of and Feedback on a Fully Automated Virtual Reality System for Online Training in Weight Management Skills

    PubMed Central

    Spitalnick, Josh S.; Hadley, Wendy; Bond, Dale S.; Wing, Rena R.

    2014-01-01

    Virtual reality (VR) technology can provide a safe environment for observing, learning, and practicing use of behavioral weight management skills, which could be particularly useful in enhancing minimal contact online weight management programs. The Experience Success (ES) project developed a system for creating and deploying VR scenarios for online weight management skills training. Virtual environments populated with virtual actors allow users to experiment with implementing behavioral skills via a PC-based point and click interface. A culturally sensitive virtual coach guides the experience, including planning for real-world skill use. Thirty-seven overweight/obese women provided feedback on a test scenario focused on social eating situations. They reported that the scenario gave them greater skills, confidence, and commitment for controlling eating in social situations. PMID:25367014

  1. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    PubMed Central

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  2. Navigation of a virtual exercise environment with Microsoft Kinect by people post-stroke or with cerebral palsy.

    PubMed

    Pool, Sean M; Hoyle, John M; Malone, Laurie A; Cooper, Lloyd; Bickel, C Scott; McGwin, Gerald; Rimmer, James H; Eberhardt, Alan W

    2016-04-08

    One approach to encourage and facilitate exercise is through interaction with virtual environments. The present study assessed the utility of Microsoft Kinect as an interface for choosing between multiple routes within a virtual environment through body gestures and voice commands. The approach was successfully tested on 12 individuals post-stroke and 15 individuals with cerebral palsy (CP). Participants rated their perception of difficulty in completing each gesture using a 5-point Likert scale questionnaire. The "most viable" gestures were defined as those with average success rates of 90% or higher and perception of difficulty ranging between easy and very easy. For those with CP, hand raises, hand extensions, and head nod gestures were found most viable. For those post-stroke, the most viable gestures were torso twists, head nods, as well as hand raises and hand extensions using the less impaired hand. Voice commands containing two syllables were viable (>85% successful) for those post-stroke; however, participants with CP were unable to complete any voice commands with a high success rate. This study demonstrated that Kinect may be useful for persons with mobility impairments to interface with virtual exercise environments, but the effectiveness of the various gestures depends upon the disability of the user.
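The study's "most viable" criterion can be expressed as a simple filter. A minimal sketch, assuming a mapping of the 5-point Likert scale in which 1 = very easy and 2 = easy; the gesture names and numbers below are invented, not the study's data:

```python
# Hypothetical illustration of the "most viable" gesture criterion:
# average success rate >= 90% and mean perceived difficulty between
# easy and very easy (assumed here as Likert <= 2 on the 5-point scale).

def most_viable(gestures, success_threshold=0.90, difficulty_threshold=2.0):
    """Return names of gestures meeting both viability criteria."""
    return [name for name, (success, difficulty) in gestures.items()
            if success >= success_threshold and difficulty <= difficulty_threshold]

# Illustrative numbers only (not the study's measurements).
observed = {
    "hand_raise":    (0.95, 1.4),
    "torso_twist":   (0.97, 1.8),
    "head_nod":      (0.92, 1.6),
    "voice_command": (0.60, 3.2),
}
print(most_viable(observed))  # → ['hand_raise', 'torso_twist', 'head_nod']
```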

  3. Virtual Instrumentation for a Fiber-Optics-Based Artificial Nerve

    NASA Technical Reports Server (NTRS)

    Lyons, Donald R.; Kyaw, Thet Mon; Griffin, DeVon (Technical Monitor)

    2001-01-01

    A LabVIEW-based computer interface for fiber-optic artificial nerves has been devised as a Masters thesis project. This project uses the outputs of wavelength-division-multiplexed (WDM) optical fiber sensors (artificial nerves), which produce dense optical data for physical measurements. A potential advantage of using optical fiber sensors for sensory function restoration is that well-defined WDM-modulated signals can be transmitted to and from the sensing region, allowing networked units to replace low-level nerve functions for persons desiring "intelligent artificial limbs." Various fiber-optic sensors can be designed with high sensitivity and the ability to interface with a wide range of devices, including miniature shielded electrical conversion units. Our Virtual Instrument (VI) interface software was developed in LabVIEW (Laboratory Virtual Instrument Engineering Workbench). The virtual instrument is configured to arrange and encode the data to develop an intelligent response in the form of encoded, digitized signal outputs. The architectural layout of the artificial nervous system is such that each touch stimulus from a different fiber-optic nerve point corresponds to a grating with a distinct resonant wavelength and physical location along the optical fiber. Thus, when an automated, tunable diode laser scans the wavelength spectrum of the artificial nerve, it triggers responses encoded with different touch stimuli by way of wavelength shifts in the reflected Bragg resonances. The reflected light is detected, and the resulting analog signal is fed into an ADC1 board and a DAQ card. Finally, the software is written such that the experimenter can set the response range during data acquisition.
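The wavelength-encoding scheme described above, in which each nerve point owns a distinct Bragg resonance and a touch stimulus shifts that resonance, can be sketched as follows. This is a minimal illustration; the wavelengths, strain sensitivity, and sensing-point names are assumptions, not values from the thesis:

```python
# Hypothetical sketch of decoding touch stimuli from WDM Bragg-grating
# reflections: each sensing point has a nominal resonant wavelength, and
# the measured shift (scaled by an assumed sensitivity) gives the
# stimulus intensity at that point. All numbers are illustrative.

NOMINAL_NM = {"fingertip": 1530.0, "palm": 1540.0, "wrist": 1550.0}  # nm
SENSITIVITY_NM_PER_UNIT = 0.01  # assumed shift per unit of touch pressure

def decode(reflected_nm, tolerance_nm=2.0):
    """Map each reflected peak to the nearest nominal grating and
    convert its wavelength shift into a stimulus value."""
    stimuli = {}
    for peak in reflected_nm:
        point, nominal = min(NOMINAL_NM.items(), key=lambda kv: abs(kv[1] - peak))
        if abs(peak - nominal) <= tolerance_nm:
            stimuli[point] = (peak - nominal) / SENSITIVITY_NM_PER_UNIT
    return stimuli

# One shifted peak (fingertip touched) and one unshifted peak (palm at rest).
print(decode([1530.05, 1540.00]))
```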

  4. Learning in a virtual environment using haptic systems for movement re-education: can this medium be used for remodeling other behaviors and actions?

    PubMed

    Merians, Alma S; Fluet, Gerard G; Qiu, Qinyin; Lafond, Ian; Adamovich, Sergei V

    2011-03-01

    Robotic systems that are interfaced with virtual reality gaming and task simulations are increasingly being developed to provide repetitive intensive practice to promote increased compliance and facilitate better outcomes in rehabilitation post-stroke. A major development in the use of virtual environments (VEs) has been to incorporate tactile information and interaction forces into what was previously an essentially visual experience. Robots of varying complexity are being interfaced with more traditional virtual presentations to provide haptic feedback that enriches the sensory experience and adds physical task parameters. This provides forces that produce biomechanical and neuromuscular interactions with the VE that approximate real-world movement more accurately than visual-only VEs, simulating the weight and force found in upper extremity tasks. The purpose of this article is to present an overview of several systems that are commercially available for ambulation training and for training movement of the upper extremity. We will also report on the system that we have developed (NJIT-RAVR system) that incorporates motivating and challenging haptic feedback effects into VE simulations to facilitate motor recovery of the upper extremity post-stroke. The NJIT-RAVR system trains both the upper arm and the hand. The robotic arm acts as an interface between the participants and the VEs, enabling multiplanar movements against gravity in a three-dimensional workspace. The ultimate question is whether this medium can provide a motivating, challenging, gaming experience with dramatically decreased physical difficulty levels, which would allow for participation by an obese person and facilitate greater adherence to exercise regimes. © 2011 Diabetes Technology Society.

  5. Integrated Artificial Intelligence Approaches for Disease Diagnostics.

    PubMed

    Vashistha, Rajat; Chhabra, Deepak; Shukla, Pratyoosh

    2018-06-01

    Mechanocomputational techniques in conjunction with artificial intelligence (AI) are revolutionizing the interpretation of crucial information in medical data, converting it into optimized and organized information for diagnostics. This is made possible by advances in AI-based technologies for computer-aided diagnostics, virtual assistants, robotic surgery, augmented reality and genome editing. Such techniques serve as products for diagnosing emerging microbial and non-microbial diseases. This article presents a combined approach that draws on these techniques and provides therapeutic solutions for their use in disease diagnostics.

  6. Exploring virtual reality technology and the Oculus Rift for the examination of digital pathology slides.

    PubMed

    Farahani, Navid; Post, Robert; Duboy, Jon; Ahmed, Ishtiaque; Kolowitz, Brian J; Krinchai, Teppituk; Monaco, Sara E; Fine, Jeffrey L; Hartman, Douglas J; Pantanowitz, Liron

    2016-01-01

    Digital slides obtained from whole slide imaging (WSI) platforms are typically viewed in two dimensions using desktop personal computer monitors or, more recently, on mobile devices. To the best of our knowledge, no studies have examined viewing digital pathology slides in a virtual reality (VR) environment. VR technology enables users to be artificially immersed in and interact with a computer-simulated world. Oculus Rift is among the world's first consumer-targeted VR headsets, intended primarily for enhanced gaming. Our aim was to explore the use of the Oculus Rift for examining digital pathology slides in a VR environment. An Oculus Rift Development Kit 2 (DK2) was connected to a 64-bit computer running Virtual Desktop software. Glass slides from twenty randomly selected lymph node cases (ten with benign and ten with malignant diagnoses) were digitized using a WSI scanner. Three pathologists reviewed these digital slides on a 27-inch 5K display and with the Oculus Rift after a 2-week washout period. Recorded endpoints included concordance of final diagnoses and time required to examine slides. The pathologists also rated their ease of navigation, image quality, and diagnostic confidence for both modalities. There was 90% diagnostic concordance when reviewing WSI using a 5K display and Oculus Rift. The time required to examine digital pathology slides on the 5K display averaged 39 s (range 10-120 s), compared to 62 s with the Oculus Rift (range 15-270 s). All pathologists confirmed that digital pathology slides were easily viewable in a VR environment. The ratings for image quality and diagnostic confidence were higher when using the 5K display. Using the Oculus Rift DK2 to view and navigate pathology whole slide images in a virtual environment is feasible for diagnostic purposes. However, image resolution using the Oculus Rift device was limited. Interactive VR technologies such as the Oculus Rift are novel tools that may be of use in digital pathology.

  7. Utilization of Virtual Server Technology in Mission Operations

    NASA Technical Reports Server (NTRS)

    Felton, Larry; Lankford, Kimberly; Pitts, R. Lee; Pruitt, Robert W.

    2010-01-01

    Virtualization provides the opportunity to continue to do "more with less"---more computing power with fewer physical boxes, thus reducing the overall hardware footprint, power and cooling requirements, software licenses, and their associated costs. This paper explores the tremendous advantages and any disadvantages of virtualization in all of the environments associated with software and systems development to operations flow. It includes the use and benefits of the Intelligent Platform Management Interface (IPMI) specification, and identifies lessons learned concerning hardware and network configurations. Using the Huntsville Operations Support Center (HOSC) at NASA Marshall Space Flight Center as an example, we demonstrate that deploying virtualized servers as a means of managing computing resources is applicable and beneficial to many areas of application, up to and including flight operations.

  8. Virtualization in the Operations Environments

    NASA Technical Reports Server (NTRS)

    Pitts, Lee; Lankford, Kim; Felton, Larry; Pruitt, Robert

    2010-01-01

    Virtualization provides the opportunity to continue to do "more with less"---more computing power with fewer physical boxes, thus reducing the overall hardware footprint, power and cooling requirements, software licenses, and their associated costs. This paper explores the tremendous advantages and any disadvantages of virtualization in all of the environments associated with software and systems development to operations flow. It includes the use and benefits of the Intelligent Platform Management Interface (IPMI) specification, and identifies lessons learned concerning hardware and network configurations. Using the Huntsville Operations Support Center (HOSC) at NASA Marshall Space Flight Center as an example, we demonstrate that deploying virtualized servers as a means of managing computing resources is applicable and beneficial to many areas of application, up to and including flight operations.

  9. 40 CFR 85.2231 - On-board diagnostic test equipment requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 19 2012-07-01 2012-07-01 false On-board diagnostic test equipment... Warranty Short Tests § 85.2231 On-board diagnostic test equipment requirements. (a) The test system interface to the vehicle shall include a plug that conforms to SAE J1962 “Diagnostic Connector.” The...

  10. 40 CFR 85.2231 - On-board diagnostic test equipment requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 18 2011-07-01 2011-07-01 false On-board diagnostic test equipment... Warranty Short Tests § 85.2231 On-board diagnostic test equipment requirements. (a) The test system interface to the vehicle shall include a plug that conforms to SAE J1962 “Diagnostic Connector.” The...

  11. 40 CFR 85.2231 - On-board diagnostic test equipment requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 19 2013-07-01 2013-07-01 false On-board diagnostic test equipment... Warranty Short Tests § 85.2231 On-board diagnostic test equipment requirements. (a) The test system interface to the vehicle shall include a plug that conforms to SAE J1962 “Diagnostic Connector.” The...

  12. Testing of the Crew Exploration Vehicle in NASA Langley's Unitary Plan Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Borg, Stephen E.; Watkins, Anthony N.; Cole, Daniel R.; Schwartz, Richard J.

    2007-01-01

    As part of a strategic, multi-facility test program, subscale testing of NASA's Crew Exploration Vehicle was conducted in both legs of NASA Langley's Unitary Plan Wind Tunnel. The objectives of these tests were to generate aerodynamic and surface pressure data over a range of supersonic Mach numbers and reentry angles of attack for experimental and computational validation and aerodynamic database development. To provide initial information on boundary layer transition at supersonic test conditions, transition studies were conducted using temperature sensitive paint and infrared thermography optical techniques. To support implementation of these optical diagnostics in the Unitary Wind Tunnel, the experiment was first modeled using the Virtual Diagnostics Interface software. For reentry orientations of 140 to 170 degrees (heat shield forward), windward surface flow was entirely laminar for freestream unit Reynolds numbers equal to or less than 3 million per foot. Optical techniques showed qualitative evidence of forced transition on the windward heat shield with application of both distributed grit and discrete trip dots. Longitudinal static force and moment data showed the largest differences with Mach number and angle of attack variations. Differences associated with Reynolds number variation and/or laminar versus turbulent flow on the heat shield were very small. Static surface pressure data supported the aforementioned trends with Mach number, Reynolds number, and angle of attack.

  13. Virtual sensors for robust on-line monitoring (OLM) and Diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tipireddy, Ramakrishna; Lerchen, Megan E.; Ramuhalli, Pradeep

    Unscheduled shutdown of nuclear power facilities for recalibration and replacement of faulty sensors can be expensive and disruptive to grid management. In this work, we present virtual (software) sensors that can replace a faulty physical sensor for a short duration, thus allowing recalibration to be safely deferred to a later time. The virtual sensor model uses a Gaussian process model to process input data from redundant and other nearby sensors. Predicted data include uncertainty bounds covering spatial association uncertainty as well as measurement noise and error. Using data from an instrumented cooling water flow loop testbed, the virtual sensor model has predicted correct sensor measurements and the associated error corresponding to a faulty sensor.
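A minimal sketch of the underlying idea, predicting a faulty sensor's reading with an uncertainty bound from correlated neighboring sensors via Gaussian-process regression; the kernel, noise level, and data here are assumptions, not details from the report:

```python
import numpy as np

# Hypothetical Gaussian-process virtual sensor: learn the mapping from two
# healthy neighbouring sensors to a target sensor using historical data,
# then predict the target reading (with uncertainty) once the physical
# sensor is faulty. Kernel, noise level, and data are all assumptions.

def rbf(A, B, length=2.0, variance=25.0):
    """Squared-exponential covariance between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length**2)

rng = np.random.default_rng(0)
X_hist = rng.uniform(20.0, 30.0, size=(50, 2))               # neighbour readings
y_hist = 0.5 * X_hist.sum(axis=1) + rng.normal(0, 0.05, 50)  # target sensor

noise = 0.05 ** 2
y_mean = y_hist.mean()
K = rbf(X_hist, X_hist) + noise * np.eye(len(X_hist))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_hist - y_mean))

x_now = np.array([[25.0, 26.0]])                 # current neighbour readings
k_star = rbf(X_hist, x_now)
mean = (k_star.T @ alpha).item() + y_mean        # virtual sensor reading
v = np.linalg.solve(L, k_star)
var = (rbf(x_now, x_now) - v.T @ v).item() + noise   # predictive variance
print(f"virtual reading: {mean:.2f} +/- {np.sqrt(var):.3f}")
```

The predicted mean stands in for the faulty sensor, and the predictive standard deviation provides the uncertainty bound mentioned in the abstract.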

  14. Interactive voxel graphics in virtual reality

    NASA Astrophysics Data System (ADS)

    Brody, Bill; Chappell, Glenn G.; Hartman, Chris

    2002-06-01

    Interactive voxel graphics in virtual reality poses significant research challenges in terms of interface, file I/O, and real-time algorithms. Voxel graphics is not so new, as it is the focus of a good deal of scientific visualization. Interactive voxel creation and manipulation is a more innovative concept. Scientists are understandably reluctant to manipulate data. They collect or model data. A scientific analogy to interactive graphics is the generation of initial conditions for some model. It is used as a method to test those models. We, however, are in the business of creating new data in the form of graphical imagery. In our endeavor, science is a tool and not an end. Nevertheless, there is a whole class of interactions and associated data generation scenarios that are natural to our way of working and that are also appropriate to scientific inquiry. Annotation by sketching or painting to point to and distinguish interesting and important information is very significant for science as well as art. Annotation in 3D is difficult without a good 3D interface. Interactive graphics in virtual reality is an appropriate approach to this problem.

  15. A Web-based cost-effective training tool with possible application to brain injury rehabilitation.

    PubMed

    Wang, Peijun; Kreutzer, Ina Anna; Bjärnemo, Robert; Davies, Roy C

    2004-06-01

    Virtual reality (VR) has provoked enormous interest in the medical community. In particular, VR offers therapists new approaches for improving rehabilitation effects. However, most of these VR assistant tools are not very portable, extensible or economical. Due to the vast amount of 3D data, they are not suitable for Internet transfer. Furthermore, in order to run these VR systems smoothly, special hardware devices are needed. As a result, existing VR assistant tools tend to be available in hospitals but not in patients' homes. To overcome these disadvantages, as a case study, this paper proposes a Web-based Virtual Ticket Machine, called WBVTM, using VRML [VRML Consortium, The Virtual Reality Modeling Language: International Standard ISO/IEC DIS 14772-1, 1997, available at ], Java and EAI (External Authoring Interface) [Silicon Graphics, Inc., The External Authoring Interface (EAI), available at ], to help people with acquired brain injury (ABI) to relearn basic living skills at home at a low cost. As these technologies are open standard and feature usability on the Internet, WBVTM achieves the goals of portability, easy accessibility and cost-effectiveness.

  16. Human Machine Interfaces for Teleoperators and Virtual Environments Conference

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In a teleoperator system the human operator senses, moves within, and operates upon a remote or hazardous environment by means of a slave mechanism (a mechanism often referred to as a teleoperator). In a virtual environment system the interactive human machine interface is retained but the slave mechanism and its environment are replaced by a computer simulation. Video is replaced by computer graphics. The auditory and force sensations imparted to the human operator are similarly computer generated. In contrast to a teleoperator system, where the purpose is to extend the operator's sensorimotor system in a manner that facilitates exploration and manipulation of the physical environment, in a virtual environment system, the purpose is to train, inform, alter, or study the human operator to modify the state of the computer and the information environment. A major application in which the human operator is the target is that of flight simulation. Although flight simulators have been around for more than a decade, they had little impact outside aviation presumably because the application was so specialized and so expensive.

  17. Interrater Reliability of the Power Mobility Road Test in the Virtual Reality-Based Simulator-2.

    PubMed

    Kamaraj, Deepan C; Dicianno, Brad E; Mahajan, Harshal P; Buhari, Alhaji M; Cooper, Rory A

    2016-07-01

    To assess interrater reliability of the Power Mobility Road Test (PMRT) when administered through the Virtual Reality-based SIMulator-version 2 (VRSIM-2). Within-subjects repeated-measures design. Participants interacted with VRSIM-2 through 2 display options (desktop monitor vs immersive virtual reality screens) using 2 control interfaces (roller system vs conventional movement-sensing joystick), providing 4 different driving scenarios (driving conditions 1-4). Participants performed 3 virtual driving sessions for each of the 2 display screens and 1 session through a real-world driving course (driving condition 5). The virtual PMRT was conducted in a simulated indoor office space, and an equivalent course was charted in an open space for the real-world assessment. After every change in driving condition, participants completed a self-reported workload assessment questionnaire, the Task Load Index, developed by the National Aeronautics and Space Administration. A convenience sample of electric-powered wheelchair (EPW) athletes (N=21) recruited at the 31st National Veterans Wheelchair Games. Not applicable. Total composite PMRT score. The PMRT had high interrater reliability (intraclass correlation coefficient [ICC]>.75) between the 2 raters in all 5 driving conditions. Post hoc analyses revealed that the reliability analyses had >80% power to detect high ICCs in driving conditions 1 and 4. The PMRT has high interrater reliability in conditions 1 and 4 and could be used to assess EPW driving performance virtually in VRSIM-2. However, further psychometric assessment is necessary to assess the feasibility of administering the PMRT using the different interfaces of VRSIM-2. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
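Interrater reliability of the kind reported here is typically quantified with an intraclass correlation coefficient. A minimal sketch of ICC(2,1) (two-way random effects, single rater, absolute agreement), assuming that form applies; the scores below are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical ICC(2,1) computed from a subjects-by-raters score matrix
# via the two-way ANOVA decomposition (Shrout-Fleiss form assumed).

def icc2_1(X):
    n, k = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)
    col_means = X.mean(axis=0)
    MSR = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between subjects
    MSC = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between raters
    SSE = ((X - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    MSE = SSE / ((n - 1) * (k - 1))                        # residual
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

# 5 participants x 2 raters (illustrative PMRT-style composite scores).
scores = np.array([[85, 88], [72, 70], [90, 93], [65, 68], [78, 80]],
                  dtype=float)
print(round(icc2_1(scores), 3))  # → 0.968
```

An ICC above .75 on such a matrix would be read as high interrater reliability, as in the abstract.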

  18. Remote laboratories for optical metrology: from the lab to the cloud

    NASA Astrophysics Data System (ADS)

    Osten, W.; Wilke, M.; Pedrini, G.

    2012-10-01

    The idea of remote and virtual metrology was first reported in 2000, with a conceptual illustration based on comparative digital holography, aimed at comparing two nominally identical but physically different objects, e.g., master and sample, in industrial inspection processes. However, the concept of remote and virtual metrology extends far beyond this. For example, it allows not only the transmission of static holograms over the Internet, but also communication with, and eventually control of, the physical set-up of a remote metrology system. Furthermore, the metrology system can be modeled in a 3D virtual reality environment using CAD or similar technology, providing a more intuitive interface to the physical setup within the virtual world. An engineer or scientist who wants to access the remote real-world system can log on to the virtual system, move and manipulate the setup through an avatar, and take the desired measurements. The real metrology system responds to the interaction between the avatar and its 3D virtual representation. The measurement data are stored and interpreted automatically for appropriate display within the virtual world, providing the necessary feedback to the experimenter. Such a system opens up many novel opportunities in industrial inspection, such as remote master-sample comparison and the virtual assembly of parts fabricated at different places. Moreover, a multitude of new techniques can be envisaged: modern ways of documenting experiments, efficient methods for metadata storage, remote reviewing of experimental results, augmenting publications with real experiments by providing remote access to the metadata and the experimental setup via the Internet, the presentation of complex experiments in classrooms and lecture halls, the sharing of expensive and complex infrastructure within international collaborations, new ways to remotely test, maintain and service new devices, and many more. The paper describes the idea of remote laboratories and illustrates the potential of the approach with selected examples, with special attention to optical metrology.

  19. Virtual endoscopy using spherical QuickTime-VR panorama views.

    PubMed

    Tiede, Ulf; von Sternberg-Gospos, Norman; Steiner, Paul; Höhne, Karl Heinz

    2002-01-01

    Virtual endoscopy requires some precomputation of the data (segmentation, path finding) before the diagnostic process can take place. We propose a method that precomputes multinode spherical panorama movies using QuickTime VR. This technique allows almost the same navigation and visualization capabilities as a real endoscopic procedure while significantly reducing the required interaction input, and the movie serves as a document of the procedure.

  20. Fast virtual functional assessment of intermediate coronary lesions using routine angiographic data and blood flow simulation in humans: comparison with pressure wire - fractional flow reserve.

    PubMed

    Papafaklis, Michail I; Muramatsu, Takashi; Ishibashi, Yuki; Lakkas, Lampros S; Nakatani, Shimpei; Bourantas, Christos V; Ligthart, Jurgen; Onuma, Yoshinobu; Echavarria-Pinto, Mauro; Tsirka, Georgia; Kotsia, Anna; Nikas, Dimitrios N; Mogabgab, Owen; van Geuns, Robert-Jan; Naka, Katerina K; Fotiadis, Dimitrios I; Brilakis, Emmanouil S; Garcia-Garcia, Héctor M; Escaned, Javier; Zijlstra, Felix; Michalis, Lampros K; Serruys, Patrick W

    2014-09-01

    To develop a simplified approach for virtual functional assessment of coronary stenosis from routine angiographic data and test it against fractional flow reserve using a pressure wire (wire-FFR). Three-dimensional quantitative coronary angiography (3D-QCA) was performed in 139 vessels (120 patients) with intermediate lesions assessed by wire-FFR (reference standard: ≤0.80). The 3D-QCA models were processed with computational fluid dynamics (CFD) to calculate the lesion-specific pressure gradient (ΔP) and construct the ΔP-flow curve, from which the virtual functional assessment index (vFAI) was derived. The discriminatory power of vFAI for ischaemia-producing lesions was high (area under the receiver operating characteristic curve [AUC]: 92% [95% CI: 86-96%]). Diagnostic accuracy, sensitivity and specificity for the optimal vFAI cut-point (≤0.82) were 88%, 90% and 86%, respectively. Virtual-FAI demonstrated superior discrimination against 3D-QCA-derived % area stenosis (AUC: 78% [95% CI: 70-84%]; p<0.0001 compared to vFAI). There was a close correlation (r=0.78, p<0.0001) and agreement of vFAI compared to wire-FFR (mean difference: -0.0039±0.085, p=0.59). We developed a fast and simple CFD-powered virtual haemodynamic assessment model using only routine angiography and without requiring any invasive physiology measurements/hyperaemia induction. Virtual-FAI showed a high diagnostic performance and incremental value over QCA for predicting wire-FFR; this "less invasive" approach could have important implications for patient management and cost.
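The vFAI construction, fitting a pressure-drop versus flow relation from CFD results and summarizing it over a flow range, can be sketched under simplifying assumptions: a quadratic ΔP = fv·Q + fs·Q² fitted from two CFD operating points, an assumed aortic pressure of 100 mmHg, and an assumed 0-4 mL/s flow range. All numbers are invented and this is not the authors' exact formulation:

```python
# Hypothetical sketch of a vFAI-style index: given the lesion pressure
# gradient at two CFD-simulated flows, fit dP(Q) = fv*Q + fs*Q**2 and
# average the distal/aortic pressure ratio over an assumed flow range.

P_AORTIC = 100.0   # mmHg, assumed aortic pressure
Q_MAX = 4.0        # mL/s, assumed upper end of the flow range

def fit_dp_curve(q1, dp1, q2, dp2):
    """Solve fv*q + fs*q**2 = dp at the two CFD operating points."""
    det = q1 * q2**2 - q2 * q1**2
    fv = (dp1 * q2**2 - dp2 * q1**2) / det   # viscous (linear) term
    fs = (q1 * dp2 - q2 * dp1) / det         # separation (quadratic) term
    return fv, fs

def vfai(fv, fs, n=4000):
    """Mean Pd/Pa over [0, Q_MAX], by midpoint-rule integration."""
    total = 0.0
    for i in range(n):
        q = (i + 0.5) * Q_MAX / n
        dp = fv * q + fs * q * q
        total += (P_AORTIC - dp) / P_AORTIC
    return total / n

fv, fs = fit_dp_curve(1.0, 3.0, 4.0, 24.0)   # invented CFD points
print(round(vfai(fv, fs), 3))  # → 0.907
```

A value near 1 indicates a haemodynamically insignificant lesion; the study's reported cut-point for ischaemia was ≤0.82.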

  1. Virtual gaming simulation of a mental health assessment: A usability study.

    PubMed

    Verkuyl, Margaret; Romaniuk, Daria; Mastrilli, Paula

    2018-05-18

    Providing safe and realistic virtual simulations could be an effective way to facilitate the transition from the classroom to clinical practice. As nursing programs begin to include virtual simulations as a learning strategy, it is critical to first assess the technology for ease of use and usefulness. A virtual gaming simulation was developed, and a usability study was conducted to assess its ease of use and usefulness for students and faculty. The Technology Acceptance Model provided the framework for the study, which included expert review and testing by nursing faculty and nursing students. This study highlighted the importance of assessing ease of use and usefulness in a virtual gaming simulation and provided feedback for the development of an effective virtual gaming simulation. The study participants said the virtual gaming simulation was engaging, realistic and similar to a clinical experience. Participants found the game easy to use and useful. Testing provided the development team with ideas to improve the user interface. The usability methodology provided here is a replicable approach to testing virtual experiences before a research study or before implementing them into a curriculum. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. A virtual work space for both hands manipulation with coherency between kinesthetic and visual sensation

    NASA Technical Reports Server (NTRS)

    Ishii, Masahiro; Sukanya, P.; Sato, Makoto

    1994-01-01

    This paper describes the construction of a virtual work space for tasks performed by two handed manipulation. We intend to provide a virtual environment that encourages users to accomplish tasks as they usually act in a real environment. Our approach uses a three dimensional spatial interface device that allows the user to handle virtual objects by hand and be able to feel some physical properties such as contact, weight, etc. We investigated suitable conditions for constructing our virtual work space by simulating some basic assembly work, a face and fit task. We then selected the conditions under which the subjects felt most comfortable in performing this task and set up our virtual work space. Finally, we verified the possibility of performing more complex tasks in this virtual work space by providing simple virtual models and then let the subjects create new models by assembling these components. The subjects can naturally perform assembly operations and accomplish the task. Our evaluation shows that this virtual work space has the potential to be used for performing tasks that require two-handed manipulation or cooperation between both hands in a natural manner.

  3. STS-134 crew and Expedition 24/25 crew member Shannon Walker

    NASA Image and Video Library

    2010-03-25

    JSC2010-E-043667 (25 March 2010) --- NASA astronaut Mark Kelly, STS-134 commander, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  4. STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus

    NASA Image and Video Library

    2007-08-09

    JSC2007-E-41540 (9 Aug. 2007) --- Astronauts Pamela A. Melroy, STS-120 commander, and European Space Agency's (ESA) Paolo Nespoli, mission specialist, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  5. STS-126 crew during preflight VR LAB MSS EVA2 training

    NASA Image and Video Library

    2008-04-14

    JSC2008-E-033771 (14 April 2008) --- Astronaut Eric A. Boe, STS-126 pilot, uses the virtual reality lab in the Space Vehicle Mockup Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  6. STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus

    NASA Image and Video Library

    2007-08-09

    JSC2007-E-41541 (9 Aug. 2007) --- Astronauts Stephanie Wilson, STS-120 mission specialist, and Dan Tani, Expedition 16 flight engineer, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  7. The power of pezonomics

    NASA Technical Reports Server (NTRS)

    Orr, Joel N.

    1995-01-01

    This reflection of human-computer interface and its requirements as virtual technology is advanced, proposes a new term: 'Pezonomics'. The term replaces the term ergonomics ('the law of work') with a definition pointing to 'the law of play.' The necessity of this term, the author reasons, comes from the need to 'capture the essence of play and calibrate our computer systems to its cadences.' Pezonomics will ensure that artificial environments, in particular virtual reality, are user friendly.

  8. Virtual Wingman: Harnessing the Future Unstructured Information Environment to Achieve Mission Success

    DTIC Science & Technology

    2010-12-01

    ...systems that do not communicate.” Data format standards are an oft-tried interoperability approach to homogenize interfaces between functional, physical...instance—and not the collection sources used to create the warning. Unfortunately, the intelligence community (IC) has yet to widely decouple

  9. The Optokinetic Cervical Reflex (OKCR) in Pilots of High-Performance Aircraft.

    DTIC Science & Technology

    1997-04-01

    Coupled System virtual reality - the attempt to create a realistic, three-dimensional environment or synthetic immersive environment in which the user ...factors interface between the pilot and the flight environment. The final section is a case study of head- and helmet-mounted displays (HMD) and the impact...themselves as actually moving (flying) through a virtual environment. However, in the studies of Held, et al. (1975) and Young, et al. (1975) the

  10. Weintek interfaces for controlling the position of a robotic arm

    NASA Astrophysics Data System (ADS)

    Barz, C.; Ilia, M.; Ilut, T.; Pop-Vadean, A.; Pop, P. P.; Dragan, F.

    2016-08-01

    The paper presents the use of Weintek panels to control the position of a robotic arm, operated step by step on its three motor axes. The PLC control interface is designed with a Weintek touch screen: the HMI Weintek eMT3070a serves as the user interface in the PLC command process. This HMI controls the local PLC, entering the coordinates on the X, Y and Z axes. The setup also allows development in a virtual environment for e-learning and for monitoring the robotic arm's actions.
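
    The step-by-step, three-axis command flow described above can be sketched as a small validation layer between an HMI and a PLC. The axis names, travel limits, and step size below are illustrative assumptions, not values from the Weintek/PLC setup in the record.

```python
# Hypothetical sketch of an HMI-to-PLC command layer for a 3-axis arm.
# Axis limits (mm) and the step size are assumed for illustration.

AXIS_LIMITS = {"X": (0, 300), "Y": (0, 200), "Z": (0, 150)}

def clamp_target(axis: str, value: float) -> float:
    """Clamp a requested coordinate to the axis's travel range."""
    lo, hi = AXIS_LIMITS[axis]
    return max(lo, min(hi, value))

def step_plan(current: dict, target: dict, step: float = 5.0) -> list:
    """Return per-axis (axis, direction, n_steps) commands moving the arm
    from its current position toward the clamped target."""
    plan = []
    for axis in ("X", "Y", "Z"):
        goal = clamp_target(axis, target[axis])
        delta = goal - current[axis]
        n_steps = int(abs(delta) // step)
        direction = 1 if delta >= 0 else -1
        plan.append((axis, direction, n_steps))
    return plan
```

    Clamping before planning keeps an out-of-range touch-screen entry from driving an axis past its limit.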

  11. A Standard-Compliant Virtual Meeting System with Active Video Object Tracking

    NASA Astrophysics Data System (ADS)

    Lin, Chia-Wen; Chang, Yao-Jen; Wang, Chih-Ming; Chen, Yung-Chang; Sun, Ming-Ting

    2002-12-01

    This paper presents an H.323 standard-compliant virtual video conferencing system. The proposed system not only serves as a multipoint control unit (MCU) for multipoint connection but also provides a gateway function between H.323 LAN (local-area network) and H.324 WAN (wide-area network) users. The system provides user-friendly object compositing and manipulation features including 2D video object scaling, repositioning, rotation, and dynamic bit allocation in a 3D virtual environment. A reliable and accurate scheme based on background image mosaics is proposed for real-time extraction and tracking of foreground video objects from video captured with an active camera. Chroma-key insertion is used to facilitate video object extraction and manipulation. We have implemented a prototype of the virtual conference system with an integrated graphical user interface to demonstrate the feasibility of the proposed methods.
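
    The chroma-key insertion step mentioned above amounts to classifying pixels as background when they fall near a key color. A minimal sketch, with an assumed green key and per-channel tolerance (the paper's actual thresholds are not given):

```python
def chroma_key_mask(pixels, key=(0, 255, 0), tol=60):
    """Return a boolean mask over RGB pixels: True where a pixel is close
    enough to the key color (background, to be dropped), False for
    foreground. `tol` is an assumed per-channel tolerance."""
    mask = []
    for (r, g, b) in pixels:
        close = (abs(r - key[0]) <= tol and
                 abs(g - key[1]) <= tol and
                 abs(b - key[2]) <= tol)
        mask.append(close)
    return mask
```

    Production systems usually key in a chroma-separated color space rather than raw RGB, but the classification idea is the same.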

  12. Predicting Innovation Acceptance by Simulation in Virtual Environments (Theoretical Foundations)

    NASA Astrophysics Data System (ADS)

    León, Noel; Duran, Roberto; Aguayo, Humberto; Flores, Myrna

    This paper extends the current development of a methodology for Computer Aided Innovation. It begins with a presentation of concepts related to the perceived capabilities of virtual environments in the Innovation Cycle. The main premise establishes that it is possible to predict the acceptance of a new product in a specific market, by releasing an early prototype in a virtual scenario to quantify its general reception and to receive early feedback from potential customers. The paper continues to focus this research on a synergistic extension of techniques that have their origins in optimization and innovation disciplines. TRIZ (Theory of Inventive Problem Solving), extends the generation of variants with Evolutionary Algorithms (EA) and finally to present the designer and the intended customer, creative and innovative alternatives. All of this developed on a virtual software interface (Virtual World). The work continues with a general description of the project as a step forward to improve the overall strategy.

  13. Ambient clumsiness in virtual environments

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia; Behar, Katherine

    2010-01-01

    A fundamental pursuit of Virtual Reality is the experience of a seamless connection between the user's body and actions within the simulation. Virtual worlds often mediate the relationship between the physical and virtual body through creating an idealized representation of the self in an idealized space. This paper argues that the very ubiquity of the medium of virtual environments, such as the massively popular Second Life, has now made them mundane, and that idealized representations are no longer appropriate. In our artwork we introduce the attribute of clumsiness to Second Life by creating and distributing scripts that cause users' avatars to exhibit unpredictable stumbling, tripping, and momentary poor coordination, thus subtly and unexpectedly intervening with, rather than amplifying, a user's intent. These behaviors are publicly distributed, and manifest only occasionally - rather than intentional, conscious actions, they are involuntary and ambient. We suggest that the physical human body is itself an imperfect interface, and that the continued blurring of distinctions between the physical body and virtual representations calls for the introduction of these mundane, clumsy elements.
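
    The occasional, involuntary character of the stumbling scripts can be sketched as a low-probability per-step trigger; the 2% rate below is an illustrative assumption, not a figure from the artwork:

```python
import random

def ambient_stumble(step_count, probability=0.02, seed=None):
    """Return the avatar steps at which an involuntary stumble fires.
    Each step independently triggers with the given probability, so the
    behavior is ambient and unpredictable rather than user-initiated."""
    rng = random.Random(seed)
    return [i for i in range(step_count) if rng.random() < probability]
```

    Seeding is only for reproducible testing; in use, an unseeded generator keeps the stumbles unanticipated.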

  14. Suitability of virtual prototypes to support human factors/ergonomics evaluation during the design.

    PubMed

    Aromaa, Susanna; Väänänen, Kaisa

    2016-09-01

    In recent years, the use of virtual prototyping has increased in product development processes, especially in the assessment of complex systems targeted at end-users. The purpose of this study was to evaluate the suitability of virtual prototyping to support human factors/ergonomics (HFE) evaluation during the design phase. Two different virtual prototypes were used: augmented reality (AR) and virtual environment (VE) prototypes of a maintenance platform of a rock crushing machine. Nineteen designers and other stakeholders were asked to assess the suitability of the prototypes for HFE evaluation. Results indicate that the system model characteristics and user interface affect the experienced suitability. The VE system was rated as more suitable than the AR system for supporting the assessment of visibility, reach, and the use of tools. The findings of this study can be used as guidance for implementing virtual prototypes in the product development process. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Virtual Reality Simulation of the International Space Welding Experiment

    NASA Technical Reports Server (NTRS)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) It involves 3-dimensional computer graphics; (2) It includes real-time feedback and response to user actions; and (3) It must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, and we have therefore implemented a VR trainer for the International Space Welding Experiment. My role in the development of the ISWE trainer consisted of the following: (1) created texture-mapped models of the ISWE's rotating sample drum, technology block, tool stowage assembly, sliding foot restraint, and control panel; (2) developed C code for control panel button selection and rotation of the sample drum; (3) In collaboration with Tim Clark (Antares Virtual Reality Systems), developed a serial interface box for the PC and the SGI Indigo so that external control devices, similar to ones actually used on the ISWE, could be used to control virtual objects in the ISWE simulation; (4) In collaboration with Peter Wang (SFFP) and Mark Blasingame (Boeing), established the interference characteristics of the VIM 1000 head-mounted-display and tested software filters to correct the problem; (5) In collaboration with Peter Wang and Mark Blasingame, established software and procedures for interfacing the VPL DataGlove and the Polhemus 6DOF position sensors to the SGI Indigo serial ports. The majority of the ISWE modeling effort was conducted on a PC-based VR Workstation, described below.

  16. Accelerating Virtual High-Throughput Ligand Docking: current technology and case study on a petascale supercomputer.

    PubMed

    Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome

    2014-04-25

    In this paper we review the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one million compounds on the Jaguar Cray XK6 supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of predocking file preparation and postdocking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
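
    A task-parallel MPI screen hinges on dividing the compound library across ranks. A minimal sketch of static round-robin partitioning (the actual Autodock4-MPI scheduler may balance load differently):

```python
def partition(compounds, n_ranks):
    """Round-robin assignment of docking jobs to MPI ranks.
    A static load-balancing sketch: rank r receives every n_ranks-th
    compound, so bucket sizes differ by at most one."""
    buckets = [[] for _ in range(n_ranks)]
    for i, compound in enumerate(compounds):
        buckets[i % n_ranks].append(compound)
    return buckets
```

    In an actual MPI run, each rank would compute its own bucket from its rank id rather than materializing all buckets on one node.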

  17. Development of a Virtual Reality Simulator for Natural Orifice Translumenal Endoscopic Surgery (NOTES) Cholecystectomy Procedure.

    PubMed

    Ahn, Woojin; Dargar, Saurabh; Halic, Tansel; Lee, Jason; Li, Baichun; Pan, Junjun; Sankaranarayanan, Ganesh; Roberts, Kurt; De, Suvranu

    2014-01-01

    We present the first virtual-reality-based simulator for Natural Orifice Translumenal Endoscopic Surgery (NOTES), called the Virtual Translumenal Endoscopic Surgery Trainer (VTEST™). VTEST™ aims to simulate the hybrid NOTES cholecystectomy procedure using a rigid scope inserted through the vaginal port. The hardware interface is designed for accurate motion tracking of the scope and laparoscopic instruments to reproduce the procedure's unique hand-eye coordination. The haptic-enabled multimodal interactive simulation includes exposing Calot's triangle and detaching the gall bladder while performing electrosurgery. The developed VTEST™ was demonstrated and validated at NOSCAR 2013.

  18. ChemScreener: A Distributed Computing Tool for Scaffold based Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Vyas, Renu

    2015-01-01

    In this work we present ChemScreener, a Java-based application to perform virtual library generation combined with virtual screening in a platform-independent distributed computing environment. ChemScreener comprises a scaffold identifier, a distinct scaffold extractor, an interactive virtual library generator as well as a virtual screening module for subsequently selecting putative bioactive molecules. The virtual libraries are annotated with chemophore-, pharmacophore- and toxicophore-based information for compound prioritization. The hits selected can then be further processed using QSAR, docking and other in silico approaches which can all be interfaced within the ChemScreener framework. As a sample application, in this work scaffold selectivity, diversity, connectivity and promiscuity towards six important therapeutic classes have been studied. In order to illustrate the computational power of the application, 55 scaffolds extracted from 161 anti-psychotic compounds were enumerated to produce a virtual library comprising 118 million compounds (17 GB) and annotated with chemophore, pharmacophore and toxicophore based features in a single step which would be non-trivial to perform with many standard software tools today on libraries of this size.
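
    The combinatorial blow-up described above (55 scaffolds enumerated to 118 million compounds) comes from multiplying substituent choices at each attachment point. A toy sketch with strings standing in for chemical structures (real enumeration operates on molecular graphs):

```python
from itertools import product

def enumerate_library(scaffolds, substituents, sites):
    """Enumerate virtual compounds as (scaffold, substituent-tuple) pairs.
    Library size is len(scaffolds) * len(substituents) ** sites, which is
    why scaffold-based enumeration explodes so quickly."""
    library = []
    for scaffold in scaffolds:
        for combo in product(substituents, repeat=sites):
            library.append((scaffold, combo))
    return library
```

    Even this toy version makes the scaling obvious: doubling the substituent set multiplies the library by 2**sites.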

  19. A game based virtual campus tour

    NASA Astrophysics Data System (ADS)

    Razia Sulthana, A.; Arokiaraj Jovith, A.; Saveetha, D.; Jaithunbi, A. K.

    2018-04-01

    The aim of the application is to create a virtual reality game whose purpose is to showcase the facilities of SRM University in an entertaining manner. The virtual prototype of the institution is deployed in a game engine, which lets students look over the infrastructure while reducing resource utilization; time and money are the resources in question. The virtual campus application assists the end user even from a remote location: the virtual world simulates the exact location, and with the help of a VR headset the user is virtually transported to the university. The application is dynamic, so the user can move in any direction. The VR headset provides an interface for gyro input, which is used to start and stop movement. Virtual Campus is size efficient, occupies minimal space, and scales across mobile devices. This user-friendly gaming application lets end users worldwide explore the campus while having fun.
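
    The gyro-driven start/stop control described above can be sketched as a threshold toggle over a stream of gyro readings. The threshold value is an illustrative assumption, and a real controller would also debounce the signal rather than toggling on every sample above it:

```python
def motion_states(readings, threshold=1.5):
    """Fold a stream of gyro magnitudes into a moving/stopped state per
    sample: each reading above the threshold toggles movement. A sketch
    of the start/stop gesture, without debouncing."""
    moving, states = False, []
    for g in readings:
        if g > threshold:
            moving = not moving
        states.append(moving)
    return states
```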

  20. Development of a Locomotion Interface for Portable Virtual Environment Systems Using an Inertial/Magnetic Sensor-Based System and a Ranging Measurement System

    DTIC Science & Technology

    2014-03-01


  1. Development of real-time motion capture system for 3D on-line games linked with virtual character

    NASA Astrophysics Data System (ADS)

    Kim, Jong Hyeong; Ryu, Young Kee; Cho, Hyung Suck

    2004-10-01

    Motion tracking is becoming an essential part of entertainment, medicine, sports, education and industry with the development of 3-D virtual reality. Virtual human characters in digital animation and game applications have been controlled through interface devices such as mice, joysticks and MIDI sliders, but those devices cannot make a virtual human character move smoothly and naturally. Furthermore, high-end commercial human motion capture systems are expensive and complicated. In this paper, we propose a practical and fast motion capture system consisting of optical sensors, and link the captured data to a 3-D game character in real time. The prototype setup was successfully applied to a boxing game, which requires very fast movement of the human character.
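
    Driving a character smoothly from optical sensor data typically requires filtering marker jitter before retargeting. A minimal sketch using exponential smoothing (the paper does not specify its filter; the gain below is assumed):

```python
def smooth_markers(samples, alpha=0.3):
    """Exponentially smooth a marker's 3-D positions to reduce optical
    jitter: each output blends the previous estimate with the new sample
    by the gain alpha (higher alpha tracks faster, smooths less)."""
    if not samples:
        return []
    out = [samples[0]]
    for (x, y, z) in samples[1:]:
        px, py, pz = out[-1]
        out.append((px + alpha * (x - px),
                    py + alpha * (y - py),
                    pz + alpha * (z - pz)))
    return out
```

    For a fast-motion application like the boxing game, the gain trades latency against jitter, so it would be tuned per use case.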

  2. Design of virtual SCADA simulation system for pressurized water reactor

    NASA Astrophysics Data System (ADS)

    Wijaksono, Umar; Abdullah, Ade Gafar; Hakim, Dadang Lukman

    2016-02-01

    A virtual SCADA system is a software-based human-machine interface that can visualize the process of a plant. This paper describes the results of a virtual SCADA system design that aims to illustrate the operating principles of a pressurized water reactor nuclear power plant. The simulation uses technical data from Unit 3 of the Olkiluoto Nuclear Power Plant in Finland. The system was developed using Wonderware InTouch and is equipped with manuals for each component, animation links, alarm systems, real-time and historical trending, and a security system. The results show that, in general, the system clearly demonstrates the principles of energy flow and energy conversion in pressurized water reactors. This virtual SCADA simulation can be used as instructional media for learning the principles of the pressurized water reactor.
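
    The alarm system of such a SCADA simulator can be sketched as limit checks on live tag values. Tag names and limits below are illustrative assumptions, not Olkiluoto 3 data:

```python
def check_alarms(readings, limits):
    """Compare live tag readings against (low, high) limits and return
    the active alarms as (tag, kind, value) tuples, in the spirit of an
    HMI alarm summary page."""
    alarms = []
    for tag, value in readings.items():
        lo, hi = limits[tag]
        if value < lo:
            alarms.append((tag, "LOW", value))
        elif value > hi:
            alarms.append((tag, "HIGH", value))
    return alarms
```

    A production alarm server would add priorities, deadbands, and acknowledgment state on top of this basic limit check.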

  3. Diagnosis of major cancer resection specimens with virtual slides: impact of a novel digital pathology workstation.

    PubMed

    Randell, Rebecca; Ruddle, Roy A; Thomas, Rhys G; Mello-Thoms, Claudia; Treanor, Darren

    2014-10-01

    Digital pathology promises a number of efficiency benefits in surgical pathology, yet the longer time required to review a virtual slide than a glass slide currently represents a significant barrier to the routine use of digital pathology. We aimed to create a novel workstation that enables pathologists to view a case as quickly as on the conventional microscope. The Leeds Virtual Microscope (LVM) was evaluated using a mixed factorial experimental design. Twelve consultant pathologists took part, each viewing one long cancer case (12-25 slides) on the LVM and one on a conventional microscope. Total time taken and diagnostic confidence were similar for the microscope and LVM, as was the mean slide viewing time. On the LVM, participants spent a significantly greater proportion of the total task time viewing slides and revisited slides more often. The unique design of the LVM, enabling real-time rendering of virtual slides while providing users with a quick and intuitive way to navigate within and between slides, makes use of digital pathology in routine practice a realistic possibility. With further practice with the system, diagnostic efficiency on the LVM is likely to increase further. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Virtual simulation as a learning method in interventional radiology.

    PubMed

    Avramov, Predrag; Avramov, Milena; Juković, Mirela; Kadić, Vuk; Till, Viktor

    2013-01-01

    Radiology is the fastest growing discipline of medicine, thanks to the implementation of new technologies and the very rapid development of diagnostic imaging procedures in the last few decades. On the other hand, the development of these procedures has pushed aside the traditional gaining of experience by working on real patients, and the need for alternative ways of learning interventional radiology procedures has emerged. A new method of virtual approach has been added as an excellent alternative to the currently known methods of training on physical models and animals. Virtual reality represents a computer-generated reconstruction of an anatomical environment with tactile interactions, and it enables operators not only to learn from their own mistakes without compromising patient safety, but also to enhance their knowledge and experience. Studies published so far on the validity of endovascular simulators have shown a certain improvement in operators' technical skills and a reduction in the time needed for procedures, but it remains a question whether these skills are transferable to real patients in the angio room. With further improvement of the technology, the shortcomings of the virtual approach to learning interventional procedures will become less significant, and this approach is likely to become the only method of learning in the near future.

  5. VirGO: A Visual Browser for the ESO Science Archive Facility

    NASA Astrophysics Data System (ADS)

    Chéreau, Fabien

    2012-04-01

    VirGO is the next generation Visual Browser for the ESO Science Archive Facility developed by the Virtual Observatory (VO) Systems Department. It is a plug-in for the popular open source software Stellarium adding capabilities for browsing professional astronomical data. VirGO gives astronomers the possibility to easily discover and select data from millions of observations in a new visual and intuitive way. Its main feature is to perform real-time access and graphical display of a large number of observations by showing instrumental footprints and image previews, and to allow their selection and filtering for subsequent download from the ESO SAF web interface. It also allows the loading of external FITS files or VOTables, the superimposition of Digitized Sky Survey (DSS) background images, and the visualization of the sky in a `real life' mode as seen from the main ESO sites. All data interfaces are based on Virtual Observatory standards which allow access to images and spectra from external data centers, and interaction with the ESO SAF web interface or any other VO applications supporting the PLASTIC messaging system.

  6. A virtual university Web system for a medical school.

    PubMed

    Séka, L P; Duvauferrier, R; Fresnel, A; Le Beux, P

    1998-01-01

    This paper describes a Virtual Medical University Web server. The project started in 1994 with the development of the French Radiology Server. The main objective of our Virtual Medical University is to offer not only initial training (for students) but also continuing professional education (for practitioners). Our system is based on electronic textbooks, clinical cases (around 4000) and a medical knowledge base called A.D.M. ("Aide au Diagnostic Medical"). We have indexed all electronic textbooks and clinical cases according to the ADM base in order to facilitate navigation in the system. The knowledge base is supported by a relational database management system. The Virtual Medical University, available on the Web, is presently undergoing external evaluation.

  7. Screening-Engineered Field-Effect Solar Cells

    DTIC Science & Technology

    2012-01-01

    virtually any semiconductor, including the promising but hard-to-dope metal oxides, sulfides, and phosphides.3 Prototype SFPV devices have been...MIS interface. Unfortunately, MIS cells, though sporting impressive efficiencies,4−6 typically have short operating lifetimes due to surface state...instability at the MIS interface.7 Methods aimed at direct field-effect “doping” of semiconductors, in which the voltage is externally applied to a gate

  8. AAL Platform with a “De Facto” Standard Communication Interface (TICO): Training in Home Control in Special Education

    PubMed Central

    Guillomía San Bartolomé, Miguel A.; Artigas Maestre, José Ignacio; Sánchez Agustín, Ana

    2017-01-01

    Framed within a long-term cooperation between university and special education teachers, training in alternative communication skills and home control was realized using the “TICO” interface, a communication panel editor extensively used in special education schools. From a technological point of view, we follow AAL technology trends by integrating a successful interface into a heterogeneous-services AAL platform, focusing on a functional view. Educationally, this very flexible interface aligns with communication training: it allows dynamic adjustment of complexity, builds on an accessible mindset and on virtual elements already significant in use, offers specific interaction feedback, adapts to evolving needs and capacities, and improves the personal autonomy and self-confidence of children at school and at home. TICO home control was installed during the last school year in the library of a special education school to study adaptations and training strategies that enhance the autonomy opportunities of its pupils. The methodology involved a case study with structured and semi-structured observations. Five children considered unable to use commercial home control systems were trained, with good results in enabling them to use an open home control system. Moreover, the AAL platform proved efficient in training children in precursor cognitive steps such as virtual representation and cause-effect interaction. PMID:29023383

  9. Virtual Character Animation Based on Affordable Motion Capture and Reconfigurable Tangible Interfaces.

    PubMed

    Lamberti, Fabrizio; Paravati, Gianluca; Gatteschi, Valentina; Cannavo, Alberto; Montuschi, Paolo

    2018-05-01

    Software for computer animation is generally characterized by a steep learning curve, due to the entanglement of the sophisticated techniques and interaction methods required to control 3D geometries. This paper proposes a tool designed to support computer animation production processes by leveraging the affordances offered by articulated tangible user interfaces and motion capture retargeting solutions. To this aim, orientations of an instrumented prop are recorded together with the animator's motion in 3D space and used to quickly pose characters in the virtual environment. High-level functionalities of the animation software are made accessible via a speech interface, letting the user control the animation pipeline through voice commands while focusing on his or her hand and body motion. The proposed solution exploits both off-the-shelf hardware components (like the Lego Mindstorms EV3 bricks and the Microsoft Kinect, used for building the tangible device and tracking the animator's skeleton) and free open-source software (like the Blender animation tool), thus representing an interesting solution also for beginners approaching the world of digital animation for the first time. Experimental results in different usage scenarios show the benefits offered by the designed interaction strategy compared with a mouse-and-keyboard interface, for both expert and non-expert users.
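
    Retargeting the prop's orientation onto a character joint requires at least subtracting a calibration pose and wrapping angles into a canonical range. A simplified Euler-angle sketch (the paper's actual retargeting pipeline is more involved and would typically use quaternions):

```python
def retarget_orientation(prop_euler, rest_offset):
    """Map a tangible prop's (roll, pitch, yaw) in degrees onto a joint
    by subtracting the calibration pose and wrapping each angle into
    [-180, 180). A toy stand-in for a motion-retargeting step."""
    def wrap(angle):
        return (angle + 180.0) % 360.0 - 180.0
    return tuple(wrap(p - r) for p, r in zip(prop_euler, rest_offset))
```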

  10. AAL Platform with a "De Facto" Standard Communication Interface (TICO): Training in Home Control in Special Education.

    PubMed

    Guillomía San Bartolomé, Miguel A; Falcó Boudet, Jorge L; Artigas Maestre, José Ignacio; Sánchez Agustín, Ana

    2017-10-12

    Framed within a long-term cooperation between university and special education teachers, training in alternative communication skills and home control was realized using the "TICO" interface, a communication panel editor extensively used in special education schools. From a technological point of view, we follow AAL technology trends by integrating a successful interface into a heterogeneous-services AAL platform, focusing on a functional view. Educationally, this very flexible interface aligns with communication training: it allows dynamic adjustment of complexity, builds on an accessible mindset and on virtual elements already significant in use, offers specific interaction feedback, adapts to evolving needs and capacities, and improves the personal autonomy and self-confidence of children at school and at home. TICO home control was installed during the last school year in the library of a special education school to study adaptations and training strategies that enhance the autonomy opportunities of its pupils. The methodology involved a case study with structured and semi-structured observations. Five children considered unable to use commercial home control systems were trained, with good results in enabling them to use an open home control system. Moreover, the AAL platform proved efficient in training children in precursor cognitive steps such as virtual representation and cause-effect interaction.

  11. Environment Study of AGNs at z = 0.3 to 3.0 Using the Japanese Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Shirasaki, Y.; Ohishi, M.; Mizumoto, Y.; Takata, T.; Tanaka, M.; Yasuda, N.

    2010-12-01

    We present a science use case of the Virtual Observatory, carried out to examine the environments of AGNs up to a redshift of 3.0. We used the Japanese Virtual Observatory (JVO) to obtain Subaru Suprime-Cam images around known AGNs. According to the hierarchical galaxy formation model, AGNs are expected to be found in environments of higher galaxy density than those of typical galaxies. Current observations, however, indicate that AGNs do not reside in particularly high density environments. We investigated ˜1000 AGNs, a sample about ten times larger than in other studies covering redshifts larger than 0.6, and found a significant excess of galaxies around AGNs at redshifts of 0.3 to 1.8. Had this work been done in the classical manner, with raw data retrieved from the archive through a form-based web interface in an interactive way and reduced on a low performance computer, it might have taken several years to finish. Since the Virtual Observatory system is accessible through a standard interface, it is easy to query and retrieve data automatically. We constructed a pipeline for retrieving the data and calculating the galaxy number density around a given coordinate. This procedure was executed in parallel on ˜10 quad core PCs and took only one day to obtain the final result. Our result implies that the Virtual Observatory can be a powerful tool for astronomical research based on large amounts of data.
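
    The pipeline step that computes galaxy number density around a coordinate can be sketched with a flat-sky angular-distance cut; this is an assumed simplification for illustration, not the JVO pipeline's actual algorithm:

```python
import math

def number_density(center, galaxies, radius_deg):
    """Count galaxies within an angular radius of (ra, dec), in degrees,
    and divide by the cap area. Uses a small-angle, flat-sky
    approximation with a cos(dec) correction on RA, adequate for radii
    of a few arcminutes away from the poles."""
    ra0, dec0 = center
    count = 0
    for ra, dec in galaxies:
        dra = (ra - ra0) * math.cos(math.radians(dec0))
        ddec = dec - dec0
        if math.hypot(dra, ddec) <= radius_deg:
            count += 1
    area = math.pi * radius_deg ** 2  # square degrees
    return count / area  # galaxies per square degree
```

    Running this over ˜1000 AGN coordinates is embarrassingly parallel, which matches the record's note about distributing the work across ˜10 PCs.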

  12. eF-seek: prediction of the functional sites of proteins by searching for similar electrostatic potential and molecular surface shape.

    PubMed

    Kinoshita, Kengo; Murakami, Yoichi; Nakamura, Haruki

    2007-07-01

    We have developed a method to predict ligand-binding sites in a new protein structure by searching for similar binding sites in the Protein Data Bank (PDB). The similarities are measured according to the shapes of the molecular surfaces and their electrostatic potentials. A new web server, eF-seek, provides an interface to our search method. It simply requires a coordinate file in the PDB format, and generates a prediction result as a virtual complex structure, with the putative ligands in a PDB format file as the output. In addition, the predicted interacting interface is displayed to facilitate the examination of the virtual complex structure on our own applet viewer with the web browser (URL: http://eF-site.hgc.jp/eF-seek).

  13. STS-132 crew during their MSS/SIMP EVA3 OPS 4 training

    NASA Image and Video Library

    2010-01-28

    JSC2010-E-014952 (28 Jan. 2010) --- NASA astronauts Michael Good (seated) and Garrett Reisman, both STS-132 mission specialists, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  14. STS-109 Crew Training in VR Lab, Building 9

    NASA Image and Video Library

    2001-08-08

    JSC2001-E-24452 (8 August 2001) --- Astronauts John M. Grunsfeld (left), STS-109 payload commander, and Nancy J. Currie, mission specialist, use the virtual reality lab at the Johnson Space Center (JSC) to train for some of their duties aboard the Space Shuttle Columbia. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team to perform its duties during the fourth Hubble Space Telescope (HST) servicing mission.

  15. STS-134 crew and Expedition 24/25 crew member Shannon Walker

    NASA Image and Video Library

    2010-03-25

    JSC2010-E-043666 (25 March 2010) --- NASA astronauts Mark Kelly (background), STS-134 commander; and Andrew Feustel, mission specialist, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  16. STS-134 crew and Expedition 24/25 crew member Shannon Walker

    NASA Image and Video Library

    2010-03-25

    JSC2010-E-043668 (25 March 2010) --- NASA astronauts Mark Kelly (background), STS-134 commander; and Andrew Feustel, mission specialist, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  17. STS-111 Training in VR lab with Expedition IV and V Crewmembers

    NASA Image and Video Library

    2001-10-18

    JSC2001-E-39082 (18 October 2001) --- Cosmonaut Valeri G. Korzun (left), Expedition Five mission commander, and astronaut Carl E. Walz, Expedition Four flight engineer, use the virtual reality lab at the Johnson Space Center (JSC) to train for their duties on the International Space Station (ISS). This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with ISS elements. Korzun represents Rosaviakosmos.

  18. STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus

    NASA Image and Video Library

    2007-08-09

    JSC2007-E-41533 (9 Aug. 2007) --- Astronauts Stephanie Wilson (left), STS-120 mission specialist; Sandra Magnus, Expedition 17 flight engineer; and Dan Tani, Expedition 16 flight engineer, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  19. Interface science of virtual GaN substrates on Si(111) via Sc2O3/Y2O3 buffers: Experiment and theory

    NASA Astrophysics Data System (ADS)

    Tarnawska, L.; Dabrowski, J.; Grzela, T.; Lehmann, M.; Niermann, T.; Paszkiewicz, R.; Storck, P.; Schroeder, T.

    2013-06-01

    The final film quality of GaN on foreign substrates is known to crucially depend on the initial GaN interface and nucleation characteristics. To shed light on these characteristics of recently pioneered virtual, hexagonal GaN(0001) substrates on Si(111) via step graded Sc2O3(111)/Y2O3(111) buffers, a complex GaN(0001)/Sc2O3(111) interface structure model and the initial nucleation scenario are derived from a combined experimental (reflection high energy electron diffraction and X-ray photoelectron spectroscopy) and theoretical ab initio study. It is shown that the GaN/Sc2O3 interface chemistry is determined by a N-Ga-O-Sc atomic arrangement leading to N-polar GaN films. However, the atomic GaN(0001)/Sc2O3(111) interface configuration is complex and local perturbations might be at the origin of Ga-polar inversion domains in the mainly N-polar GaN films. The initial growth of GaN on Sc2O3 is characterized by an ultrathin N-Ga-O-Sc wetting layer which carries tensile strain and relaxes with increasing thickness. Further GaN deposition results in the formation of 3D islands which fully relax before island coalescence occurs. The implications of the GaN/Sc2O3 interface configuration, the 3D nucleation growth mode, and the coalescence process of misaligned islands are discussed with respect to the defect characteristics (inversion domains, cubic inclusions, threading dislocations) of the final GaN layer.

  20. Virtual screening using combinatorial cyclic peptide libraries reveals protein interfaces readily targetable by cyclic peptides.

    PubMed

    Duffy, Fergal J; O'Donovan, Darragh; Devocelle, Marc; Moran, Niamh; O'Connell, David J; Shields, Denis C

    2015-03-23

    Protein-protein and protein-peptide interactions are responsible for the vast majority of biological functions in vivo, but targeting these interactions with small molecules has historically been difficult. What is required are efficient combined computational and experimental screening methods to choose among a number of potential protein interfaces worthy of targeting lead macrocyclic compounds for further investigation. To achieve this, we have generated combinatorial 3D virtual libraries of short disulfide-bonded peptides and compared them to pharmacophore models of important protein-protein and protein-peptide structures, including short linear motifs (SLiMs), protein-binding peptides, and turn structures at protein-protein interfaces, built from 3D models available in the Protein Data Bank. We prepared a total of 372 reference pharmacophores, which were matched against 108,659 multiconformer cyclic peptides. After normalization to exclude nonspecific cyclic peptides, the top hits notably are enriched for mimetics of turn structures, including a turn at the interaction surface of human α thrombin, and also feature several protein-binding peptides. The top cyclic peptide hits also cover the critical "hot spot" interaction sites predicted from the interaction crystal structure. We have validated our method by testing cyclic peptides predicted to inhibit thrombin, a key protein in the blood coagulation pathway of important therapeutic interest, identifying a cyclic peptide inhibitor with lead-like activity. We conclude that protein interfaces most readily targetable by cyclic peptides and related macrocyclic drugs may be identified computationally among a set of candidate interfaces, accelerating the choice of interfaces against which lead compounds may be screened.
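    The normalization step the abstract mentions, excluding nonspecific cyclic peptides before ranking, can be sketched as follows. This is an illustrative Python sketch, not the authors' exact scoring scheme: the function name, score formula, and the two example peptides are assumptions introduced here to show the idea that promiscuous matchers are down-weighted.

    ```python
    # Illustrative sketch of normalizing virtual-screening hits so that
    # cyclic peptides matching many pharmacophores indiscriminately
    # (nonspecific binders) are ranked below target-specific ones.
    # All names and numbers below are hypothetical.

    def normalized_score(hits_for_target, total_hits_all_targets):
        # A promiscuous peptide (large denominator) scores low even if
        # it matches the target pharmacophore many times.
        if total_hits_all_targets == 0:
            return 0.0
        return hits_for_target / total_hits_all_targets

    # (on-target matches, matches across the whole pharmacophore set)
    library = {
        "cyclic_pep_A": (9, 12),   # specific: most matches are on-target
        "cyclic_pep_B": (9, 300),  # nonspecific: matches almost everything
    }

    ranking = sorted(library,
                     key=lambda p: normalized_score(*library[p]),
                     reverse=True)
    ```

    Under this toy scoring, peptide A outranks peptide B despite an identical raw on-target hit count, which is the effect the normalization is meant to achieve.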

  1. Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life

    NASA Astrophysics Data System (ADS)

    Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia

    2011-03-01

    Users of immersive virtual reality environments have reported a wide variety of side and after effects including the confusion of characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and real can be turned around to explore the possibilities for immersion with minimal technological support in virtual world group training simulations. This paper will describe observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and Veterans Administration San Diego Healthcare System (VASDHS) to develop trainings for nurses, doctors and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, moments of the confusion of boundaries between real and virtual, we can better understand methods for creating immersion. I will use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will examine moments of training to learn the software interface, moments within the drill and interviews after the drill.

  2. Virtual Satellite

    NASA Technical Reports Server (NTRS)

    Hammrs, Stephan R.

    2008-01-01

    Virtual Satellite (VirtualSat) is a computer program that creates an environment that facilitates the development, verification, and validation of flight software for a single spacecraft or for multiple spacecraft flying in formation. In this environment, enhanced functionality and autonomy of navigation, guidance, and control systems of a spacecraft are provided by a virtual satellite, that is, a computational model that simulates the dynamic behavior of the spacecraft. Within this environment, it is possible to execute any associated software, the development of which could benefit from knowledge of, and possible interaction (typically, exchange of data) with, the virtual satellite. Examples of associated software include programs for simulating spacecraft power and thermal-management systems. This environment is independent of the flight hardware that will eventually host the flight software, making it possible to develop the software simultaneously with, or even before, delivery of the hardware. Optionally, by use of interfaces included in VirtualSat, real hardware can be used in place of its simulated counterpart. The flight software, coded in the C or C++ programming language, is compilable and loadable into VirtualSat without any special modifications. Thus, VirtualSat can serve as a relatively inexpensive software test-bed for development, testing, integration, and post-launch maintenance of spacecraft flight software.
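    The key design idea, flight logic written against an interface that can be backed by either a simulated model or real hardware, can be sketched in miniature. This is a hedged illustration only: the class and method names below are invented for the example and do not reflect VirtualSat's actual API.

    ```python
    # Sketch of the simulate-or-hardware interface pattern the abstract
    # describes: the control routine neither knows nor cares whether the
    # object behind the interface is a dynamics model or real hardware.
    # All names and numbers are hypothetical.

    class SimulatedSatellite:
        """Stand-in dynamics model, playing the role of the virtual satellite."""

        def __init__(self, altitude_km=400.0):
            self.altitude_km = altitude_km

        def read_altitude(self):
            # A hardware-backed implementation would query telemetry here.
            return self.altitude_km

        def fire_thruster(self, delta_km):
            # Simplified dynamics: the burn changes altitude directly.
            self.altitude_km += delta_km

    def station_keeping(sat, target_km=400.0, tolerance_km=1.0):
        # Flight-software routine, identical for simulated or real 'sat'.
        drift = target_km - sat.read_altitude()
        if abs(drift) > tolerance_km:
            sat.fire_thruster(drift)
        return sat.read_altitude()

    sat = SimulatedSatellite(altitude_km=395.0)
    final_altitude = station_keeping(sat)
    ```

    Swapping `SimulatedSatellite` for a hardware-backed class with the same methods would leave `station_keeping` untouched, which is the property that lets flight software be developed before hardware delivery.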

  3. Human responses to augmented virtual scaffolding models.

    PubMed

    Hsiao, Hongwei; Simeonov, Peter; Dotson, Brian; Ammons, Douglas; Kau, Tsui-Ying; Chiou, Sharon

    2005-08-15

    This study investigated the effect of adding real planks, in virtual scaffolding models of elevation, on human performance in a surround-screen virtual reality (SSVR) system. Twenty-four construction workers and 24 inexperienced controls performed walking tasks on real and virtual planks at three virtual heights (0, 6 m, 12 m) and two scaffolding-platform-width conditions (30, 60 cm). Gait patterns, walking instability measurements and cardiovascular reactivity were assessed. The results showed differences in human responses to real vs. virtual planks in walking patterns, instability score and heart-rate inter-beat intervals; it appeared that adding real planks in the SSVR virtual scaffolding model enhanced the quality of SSVR as a human-environment interface research tool. In addition, there were significant differences in performance between construction workers and the control group. The inexperienced participants were more unstable as compared to construction workers. Both groups increased their stride length with repetitions of the task, indicating a possibly confidence- or habit-related learning effect. The practical implications of this study are in the adoption of augmented virtual models of elevated construction environments for injury prevention research, and the development of programmes for balance-control training to reduce the risk of falls at elevation before workers enter a construction job.

  4. Simulating Humans as Integral Parts of Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Bruins, Anthony C.; Rice, Robert; Nguyen, Lac; Nguyen, Heidi; Saito, Tim; Russell, Elaine

    2006-01-01

    The Collaborative-Virtual Environment Simulation Tool (C-VEST) software was developed for use in a NASA project entitled "3-D Interactive Digital Virtual Human." The project is oriented toward the use of a comprehensive suite of advanced software tools in computational simulations for the purposes of human-centered design of spacecraft missions and of the spacecraft, space suits, and other equipment to be used on the missions. The C-VEST software affords an unprecedented suite of capabilities for three-dimensional virtual-environment simulations with plug-in interfaces for physiological data, haptic interfaces, plug-and-play software, realtime control, and/or playback control. Mathematical models of the mechanics of the human body and of the aforementioned equipment are implemented in software and integrated to simulate forces exerted on and by astronauts as they work. The computational results can then support the iterative processes of design, building, and testing in applied systems engineering and integration. The results of the simulations provide guidance for devising measures to counteract effects of microgravity on the human body and for the rapid development of virtual (that is, simulated) prototypes of advanced space suits, cockpits, and robots to enhance the productivity, comfort, and safety of astronauts. The unique ability to implement human-in-the-loop immersion also makes the C-VEST software potentially valuable for use in commercial and academic settings beyond the original space-mission setting.

  5. VERDEX: A virtual environment demonstrator for remote driving applications

    NASA Technical Reports Server (NTRS)

    Stone, Robert J.

    1991-01-01

    One of the key areas of the National Advanced Robotics Centre's enabling technologies research program is that of the human system interface, phase 1 of which started in July 1989 and is currently addressing the potential of virtual environments to permit intuitive and natural interactions between a human operator and a remote robotic vehicle. The aim of the first 12 months of this program (to September, 1990) is to develop a virtual human-interface demonstrator for use later as a test bed for human factors experimentation. This presentation will describe the current state of development of the test bed, and will outline some human factors issues and problems for more general discussion. In brief, the virtual telepresence system for remote driving has been designed to take the following form. The human operator will be provided with a helmet-mounted stereo display assembly, facilities for speech recognition and synthesis (using the Marconi Macrospeak system), and a VPL DataGlove Model 2 unit. The vehicle to be used for the purposes of remote driving is a Cybermotion Navmaster K2A system, which will be equipped with a stereo camera and microphone pair, mounted on a motorized high-speed pan-and-tilt head incorporating a closed-loop laser ranging sensor for camera convergence control (currently under contractual development). It will be possible to relay information to and from the vehicle and sensory system via an umbilical or RF link. The aim is to develop an interactive audio-visual display system capable of presenting combined stereo TV pictures and virtual graphics windows, the latter featuring control representations appropriate for vehicle driving and interaction using a graphical 'hand,' slaved to the flex and tracking sensors of the DataGlove and an additional helmet-mounted Polhemus IsoTrack sensor. 
Developments planned for the virtual environment test bed include transfer of operator control between remote driving and remote manipulation, dexterous end effector integration, virtual force and tactile sensing (also the focus of a current ARRL contract, initially employing a 14-pneumatic bladder glove attachment), and sensor-driven world modeling for total virtual environment generation and operator-assistance in remote scene interrogation.

  6. Network and user interface for PAT DOME virtual motion environment system

    NASA Technical Reports Server (NTRS)

    Worthington, J. W.; Duncan, K. M.; Crosier, W. G.

    1993-01-01

    The Device for Orientation and Motion Environments Preflight Adaptation Trainer (DOME PAT) provides astronauts a virtual microgravity sensory environment designed to help alleviate the symptoms of space motion sickness (SMS). The system consists of four microcomputers networked to provide real time control, and an image generator (IG) driving a wide angle video display inside a dome structure. The spherical display demands distortion correction. The system is currently being modified with a new graphical user interface (GUI) and a new Silicon Graphics IG. This paper will concentrate on the new GUI and the networking scheme. The new GUI eliminates proprietary graphics hardware and software, and instead makes use of standard and low cost PC video (CGA) and off the shelf software (Microsoft's Quick C). Mouse selection for user input is supported. The new Silicon Graphics IG requires an Ethernet interface. The microcomputer known as the Real Time Controller (RTC), which has overall control of the system and is written in Ada, was modified to use the free public domain NCSA Telnet software for Ethernet communications with the Silicon Graphics IG. The RTC also maintains the original ARCNET communications through Novell Netware IPX with the rest of the system. The Telnet TCP/IP protocol was first used for real-time communication, but because of buffering problems the Telnet datagram (UDP) protocol needed to be implemented. Since the Telnet modules are written in C, the Ada pragma 'Interface' was used to interface with the network calls.
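    The TCP-to-UDP switch described above, giving up stream buffering so a slow receiver cannot stall real-time updates, can be illustrated with a minimal datagram sender. This is a hedged modern sketch in Python (the original system used C modules called from Ada); the host, port, and message format are assumptions invented for the example.

    ```python
    # Minimal sketch of the datagram (UDP) approach the abstract describes:
    # each real-time state update is one self-contained datagram, so a lost
    # or late packet is simply superseded by the next frame instead of
    # backing up a buffered TCP stream. Address and format are hypothetical.
    import socket

    def send_state(sock, addr, frame_id, payload):
        # One update per datagram; no connection, no retransmit queue.
        msg = f"{frame_id}:{payload}".encode("ascii")
        sock.sendto(msg, addr)
        return msg

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    msg = send_state(sock, ("127.0.0.1", 5005), 42, "pan=1.5,tilt=-0.3")
    sock.close()
    ```

    The trade-off is the one the paper ran into in reverse: TCP guarantees delivery but lets kernel buffers introduce latency, while UDP keeps latency bounded at the cost of occasional dropped frames, which is acceptable when each frame fully replaces the last.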

  7. Creation of a 3-dimensional virtual dental patient for computer-guided surgery and CAD-CAM interim complete removable and fixed dental prostheses: A clinical report.

    PubMed

    Harris, Bryan T; Montero, Daniel; Grant, Gerald T; Morton, Dean; Llop, Daniel R; Lin, Wei-Shao

    2017-02-01

    This clinical report proposes a digital workflow using 2-dimensional (2D) digital photographs, a 3D extraoral facial scan, and cone beam computed tomography (CBCT) volumetric data to create a 3D virtual patient with craniofacial hard tissue, remaining dentition (including surrounding intraoral soft tissue), and the realistic appearance of facial soft tissue at an exaggerated smile under static conditions. The 3D virtual patient was used to assist the virtual diagnostic tooth arrangement process, providing the patient with a pleasing preoperative virtual smile design that harmonized with facial features. The 3D virtual patient was also used to gain the patient's pretreatment approval (as a communication tool), design a prosthetically driven surgical plan for computer-guided implant surgery, and fabricate the computer-aided design and computer-aided manufacturing (CAD-CAM) interim prostheses. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  8. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.

  9. [The virtual university in medicine. Context, concepts, specifications, users' manual].

    PubMed

    Duvauferrier, R; Séka, L P; Rolland, Y; Rambeau, M; Le Beux, P; Morcet, N

    1998-09-01

    The widespread use of Web servers, with the emergence of interactive functions and the possibility of credit card payment via the Internet, together with the requirement for continuing education and the subsequent need for a computer to link into the health care network, has prompted the development of a virtual university scheme on the Internet. The Virtual University of Radiology is not only a computer-assisted teaching tool with a set of attractive features, but also a powerful engine allowing the organization, distribution and control of medical knowledge available on the Web server. The scheme provides access to general information, a secretary's office for enrollment and the Virtual University itself, with its library, image database, a forum for subspecialties and clinical case reports, an evaluation module and various guides and help tools for diagnosis, prescription and indexing. Currently the Virtual University of Radiology offers diagnostic imaging, but can also be used by other specialties and for general practice.

  10. Fab MOR03268 triggers absorption shift of a diagnostic dye via packaging in a solvent-shielded Fab dimer interface.

    PubMed

    Hillig, Roman C; Urlinger, Stefanie; Fanghänel, Jörg; Brocks, Bodo; Haenel, Cornelia; Stark, Yvonne; Sülzle, Detlev; Svergun, Dmitri I; Baesler, Siegfried; Malawski, Guido; Moosmayer, Dieter; Menrad, Andreas; Schirner, Michael; Licha, Kai

    2008-03-14

    Molecular interactions between near-IR fluorescent probes and specific antibodies may be exploited to generate novel smart probes for diagnostic imaging. Using a new phage display technology, we developed such antibody Fab fragments with subnanomolar binding affinity for tetrasulfocyanine, a near-IR in vivo imaging agent. Unexpectedly, some Fabs induced redshifts of the dye absorption peak of up to 44 nm. This is the largest shift reported for a biological system so far. Crystal structure determination and absorption spectroscopy in the crystal in combination with microcalorimetry and small-angle X-ray scattering in solution revealed that the redshift is triggered by formation of a Fab dimer, with tetrasulfocyanine being buried in a fully closed protein cavity within the dimer interface. The derived principle of shifting the absorption peak of a symmetric dye via packaging within a Fab dimer interface may be transferred to other diagnostic fluorophores, opening the way towards smart imaging probes that change their wavelength upon interaction with an antibody.

  11. Human Machine Interfaces for Teleoperators and Virtual Environments: Conference Held in Santa Barbara, California on 4-9 March 1990.

    DTIC Science & Technology

    1990-03-01

    decided to have three kinds of sessions: invited-paper sessions, panel discussions, and poster sessions. The invited papers were divided into papers...soon followed. Applications in medicine, involving exploration and operation within the human body, are now receiving increased attention. Early... attention toward issues that may be important for the design of auditory interfaces. The importance of appropriate auditory inputs to observers with normal

  12. Multipath transport for virtual private networks

    DTIC Science & Technology

    2017-03-01

    Using a Wi-Fi and Cellular Connection . . . . . . . . . 13 Figure 2.8 OpenVPN Interaction with Kernel. Adapted from [14]. . . . . . . 17 Figure 3.1 MPTCP...to enable a client to connect to his corporate offices using a hotel Wi-Fi connection while traveling for business. Maybe a small business is...interface of the client to each interface of the server [7]. Figure 2.7 provides a simplified scenario of a MPTCP client with Wi-Fi and cellular

  13. Grasping trajectories in a virtual environment adhere to Weber's law.

    PubMed

    Ozana, Aviad; Berman, Sigal; Ganel, Tzvi

    2018-06-01

    Virtual-reality and telerobotic devices simulate local motor control of virtual objects within computerized environments. Here, we explored grasping kinematics within a virtual environment and tested whether, as in normal 3D grasping, trajectories in the virtual environment are performed analytically, violating Weber's law with respect to object's size. Participants were asked to grasp a series of 2D objects using a haptic system, which projected their movements to a virtual space presented on a computer screen. The apparatus also provided object-specific haptic information upon "touching" the edges of the virtual targets. The results showed that grasping movements performed within the virtual environment did not produce the typical analytical trajectory pattern obtained during 3D grasping. Unlike as in 3D grasping, grasping trajectories in the virtual environment adhered to Weber's law, which indicates relative resolution in size processing. In addition, the trajectory patterns differed from typical trajectories obtained during 3D grasping, with longer times to complete the movement, and with maximum grip apertures appearing relatively early in the movement. The results suggest that grasping movements within a virtual environment could differ from those performed in real space, and are subjected to irrelevant effects of perceptual information. Such atypical pattern of visuomotor control may be mediated by the lack of complete transparency between the interface and the virtual environment in terms of the provided visual and haptic feedback. Possible implications of the findings to movement control within robotic and virtual environments are further discussed.
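    Weber's law, which the grasping trajectories above were found to obey, states that the just-noticeable difference (JND) in a stimulus grows in proportion to its magnitude: delta_I = k * I, with k the Weber fraction. A minimal numeric sketch, with an illustrative (assumed, not measured) Weber fraction:

    ```python
    # Weber's law sketch: the smallest detectable change in object size
    # scales with the size itself (relative resolution), which is the
    # pattern the abstract reports for grip apertures in the virtual
    # environment. The Weber fraction below is an illustrative assumption.

    def jnd(size_mm, weber_fraction=0.05):
        """Just-noticeable size difference for an object of the given size."""
        return weber_fraction * size_mm

    small_jnd = jnd(20.0)   # 1.0 mm for a 20 mm object
    large_jnd = jnd(80.0)   # 4.0 mm for an 80 mm object
    ratio = large_jnd / small_jnd  # tracks the 80/20 size ratio
    ```

    The contrast with real 3D grasping is that grip-aperture variability there stays roughly constant across object sizes (absolute resolution, violating Weber's law), whereas the virtual-environment trajectories showed the proportional scaling computed above.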

  14. STS-134 crew and Expedition 24/25 crew member Shannon Walker

    NASA Image and Video Library

    2010-03-25

    JSC2010-E-043673 (25 March 2010) --- NASA astronauts Gregory H. Johnson, STS-134 pilot; and Shannon Walker, Expedition 24/25 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  15. STS-134 crew and Expedition 24/25 crew member Shannon Walker

    NASA Image and Video Library

    2010-03-25

    JSC2010-E-043661 (25 March 2010) --- NASA astronauts Gregory H. Johnson, STS-134 pilot; and Shannon Walker, Expedition 24/25 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  16. STS-132 crew during their MSS/SIMP EVA3 OPS 4 training

    NASA Image and Video Library

    2010-01-28

    JSC2010-E-014953 (28 Jan. 2010) --- NASA astronauts Piers Sellers, STS-132 mission specialist; and Tracy Caldwell Dyson, Expedition 23/24 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  17. STS-132 crew during their MSS/SIMP EVA3 OPS 4 training

    NASA Image and Video Library

    2010-01-28

    JSC2010-E-014949 (28 Jan. 2010) --- NASA astronauts Piers Sellers, STS-132 mission specialist; and Tracy Caldwell Dyson, Expedition 23/24 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  18. Virtual workstation - A multimodal, stereoscopic display environment

    NASA Astrophysics Data System (ADS)

    Fisher, S. S.; McGreevy, M.; Humphries, J.; Robinett, W.

    1987-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed for use in a multipurpose interface environment. The system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, application scenarios, and research directions are described.

  19. STS-132 crew during their MSS/SIMP EVA3 OPS 4 training

    NASA Image and Video Library

    2010-01-28

    JSC2010-E-014956 (28 Jan. 2010) --- NASA astronauts Ken Ham (left foreground), STS-132 commander; Michael Good, mission specialist; and Tony Antonelli (right), pilot, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  20. STS-131 crew during VR Lab MSS/EVAB SUPT3 Team 91016 training

    NASA Image and Video Library

    2009-09-25

    JSC2009-E-214346 (25 Sept. 2009) --- Japan Aerospace Exploration Agency (JAXA) astronaut Naoko Yamazaki, STS-131 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  1. STS-131 crew during VR Lab MSS/EVAB SUPT3 Team 91016 training

    NASA Image and Video Library

    2009-09-25

    JSC2009-E-214328 (25 Sept. 2009) --- Japan Aerospace Exploration Agency (JAXA) astronaut Naoko Yamazaki, STS-131 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of her duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  2. STS-132 crew during their MSS/SIMP EVA3 OPS 4 training

    NASA Image and Video Library

    2010-01-28

    JSC2010-E-014951 (28 Jan. 2010) --- NASA astronauts Michael Good (seated), Garrett Reisman (right foreground), both STS-132 mission specialists; and Tony Antonelli, pilot, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  3. STS-111 Training in VR lab with Expedition IV and V Crewmembers

    NASA Image and Video Library

    2001-10-18

    JSC2001-E-39085 (18 October 2001) --- Cosmonaut Valeri G. Korzun (left), Expedition Five mission commander, astronaut Peggy A. Whitson, Expedition Five flight engineer, and astronaut Carl E. Walz, Expedition Four flight engineer, use the virtual reality lab at the Johnson Space Center (JSC) to train for their duties on the International Space Station (ISS). This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with ISS elements. Korzun represents Rosaviakosmos.

  4. STS-133 crew training in VR Lab with replacement crew member Steve Bowen

    NASA Image and Video Library

    2011-01-24

    JSC2011-E-006293 (24 Jan. 2011) --- NASA astronaut Michael Barratt, STS-133 mission specialist, uses the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of his duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements. Photo credit: NASA or National Aeronautics and Space Administration

  5. Photographic coverage of STS-112 during EVA 3 in VR Lab.

    NASA Image and Video Library

    2002-08-21

    JSC2002-E-34625 (21 Aug. 2002) --- Astronaut Sandra H. Magnus (left), STS-112 mission specialist, uses the virtual reality lab at NASA's Johnson Space Center (JSC) to train for her duties aboard the space shuttle Atlantis. This type of computer interface paired with virtual reality training hardware and software helps to prepare the entire team for dealing with ISS elements. Lead SSRMS instructor Elizabeth C. Bloomer assisted Magnus. Astronaut Ellen Ochoa (standing) looks on. Photo credit: NASA

  6. STS-134 crew and Expedition 24/25 crew member Shannon Walker

    NASA Image and Video Library

    2010-03-25

    JSC2010-E-043662 (25 March 2010) --- NASA astronauts Gregory H. Johnson, STS-134 pilot; and Shannon Walker, Expedition 24/25 flight engineer, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare crew members for dealing with space station elements.

  7. STS-131 crew during VR Lab MSS/EVAB SUPT3 Team 91016 training

    NASA Image and Video Library

    2009-09-25

    JSC2009-E-214321 (25 Sept. 2009) --- NASA astronauts James P. Dutton Jr., STS-131 pilot; and Stephanie Wilson, mission specialist, use the virtual reality lab in the Space Vehicle Mock-up Facility at NASA's Johnson Space Center to train for some of their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements.

  8. STS-120 crew along with Expedition crew members Dan Tani and Sandra Magnus

    NASA Image and Video Library

    2007-08-09

    JSC2007-E-41538 (9 Aug. 2007) --- Astronauts Stephanie Wilson, STS-120 mission specialist; Sandra Magnus, Expedition 17 flight engineer; and Dan Tani, Expedition 16 flight engineer, use the virtual reality lab at Johnson Space Center to train for their duties aboard the space shuttle and space station. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements. A computer display is visible in the foreground.

  9. Helmet Mounted Eye Tracking for Virtual Panoramic Displays. Volume 1: Review of Current Eye Movement Measurement Technology

    DTIC Science & Technology

    1989-08-01

    paths for integration with the off-aperture and dual-mirror VPD designs. PREFACE The goal of this work was to explore integration of an eye line-of-gaze ... Relationship in one plane between point-of-gaze on a flat scene and relative eye, detector, and scene positions ... and eye line-of-gaze measurement. As a first step towards the design of an appropriate eye tracking system for interface with the virtual cockpit

  10. Design of virtual SCADA simulation system for pressurized water reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wijaksono, Umar, E-mail: umar.wijaksono@student.upi.edu; Abdullah, Ade Gafar; Hakim, Dadang Lukman

    The Virtual SCADA system is a software-based Human-Machine Interface that can visualize the process of a plant. This paper describes the results of a virtual SCADA system design that aims to convey the operating principles of a Pressurized Water Reactor nuclear power plant. The simulation uses technical data from Unit 3 of the Olkiluoto Nuclear Power Plant in Finland. The system was developed using Wonderware InTouch and is equipped with manuals for each component, animation links, alarm systems, real-time and historical trending, and a security system. The results showed that, in general, the system can clearly demonstrate the principles of energy flow and energy conversion in Pressurized Water Reactors. This virtual SCADA simulation system can therefore be used as an instructional medium for teaching Pressurized Water Reactor principles.
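    The energy-flow principle such a simulator conveys can be illustrated with a first-order primary-loop heat balance. The sketch below is a generic illustration with assumed figures, not Olkiluoto 3 design data:

```python
# First-order PWR primary-loop heat balance: Q = m_dot * c_p * (T_hot - T_cold).
# All numbers are illustrative assumptions, not Olkiluoto 3 design data.

def primary_loop_power(m_dot_kg_s, t_hot_c, t_cold_c, c_p_j_kg_k=5500.0):
    """Thermal power (W) carried by the primary coolant.

    c_p_j_kg_k is an assumed specific heat of pressurized water
    at PWR operating conditions (J/kg/K).
    """
    return m_dot_kg_s * c_p_j_kg_k * (t_hot_c - t_cold_c)

# Example: 20,000 kg/s coolant flow, 325 degC hot leg, 290 degC cold leg.
q = primary_loop_power(20_000, 325.0, 290.0)
print(f"Core thermal power ~ {q / 1e9:.2f} GW")
```

    A SCADA display would typically animate this balance live, recomputing the power as flow and leg temperatures change.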

  11. Virtual Environment User Interfaces to Support RLV and Space Station Simulations in the ANVIL Virtual Reality Lab

    NASA Technical Reports Server (NTRS)

    Dumas, Joseph D., II

    1998-01-01

    Several virtual reality I/O peripherals were successfully configured and integrated as part of the author's 1997 Summer Faculty Fellowship work. These devices, which were not supported by the developers of VR software packages, use new software drivers and configuration files developed by the author to allow them to be used with simulations developed using those software packages. The successful integration of these devices has added significant capability to the ANVIL lab at MSFC. In addition, the author was able to complete the integration of a networked virtual reality simulation of the Space Shuttle Remote Manipulator System docking Space Station modules which was begun as part of his 1996 Fellowship. The successful integration of this simulation demonstrates the feasibility of using VR technology for ground-based training as well as on-orbit operations.

  12. Graphics interfaces and numerical simulations: Mexican Virtual Solar Observatory

    NASA Astrophysics Data System (ADS)

    Hernández, L.; González, A.; Salas, G.; Santillán, A.

    2007-08-01

    Preliminary results associated with the computational development and creation of the Mexican Virtual Solar Observatory (MVSO) are presented. The MVSO prototype consists of two parts: the first is related to observations made during the past ten years at the Solar Observation Station (EOS) and at the Carl Sagan Observatory (OCS) of the Universidad de Sonora in Mexico. The second part concerns the creation and manipulation of a database produced by numerical simulations of solar phenomena, for which we are using the ZEUS-3D MHD code. The prototype was developed using MySQL, Apache, Java and VSO 1.2, following GNU and `open source' philosophy. A graphical user interface (GUI) was created in order to run web-based, remote numerical simulations. For this purpose, Mono was used, because it provides the necessary software to develop and run .NET client and server applications on Linux. Although this project is still under development, we hope to gain access, by means of this portal, to other virtual solar observatories and to be able to draw on a database built from numerical simulations or, as the case requires, perform simulations of solar phenomena.

  13. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow large-scale data to be accessed and visualized on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts, providing a visually striking platform with realistic terrain, weather information, and water simulation. It gives students an environment in which to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of its users.

  14. Virtual chromoendoscopy improves the diagnostic yield of small bowel capsule endoscopy in obscure gastrointestinal bleeding.

    PubMed

    Boal Carvalho, Pedro; Magalhães, Joana; Dias de Castro, Francisca; Gonçalves, Tiago Cúrdia; Rosa, Bruno; Moreira, Maria João; Cotter, José

    2016-02-01

    Small bowel capsule endoscopy represents the initial investigation for obscure gastrointestinal bleeding. Flexible spectral imaging colour enhancement (FICE) is a virtual chromoendoscopy technique designed to enhance mucosal lesions, available in different settings according to light wavelength: FICE1, 2 and 3. The aim was to compare the diagnostic yield of FICE1 and white light during capsule endoscopy in patients with obscure gastrointestinal bleeding. Retrospective single-centre study including 60 consecutive patients referred for small bowel capsule endoscopy for obscure gastrointestinal bleeding. Endoscopies were independently reviewed in FICE1 and white light; findings were then reviewed by another researcher, establishing a gold standard. Diagnostic yield was defined as the presence of lesions with high bleeding potential (P2): angioectasias, ulcers or tumours. Diagnostic yield using FICE1 was significantly higher than with white light (55% vs. 42%, p=0.021). A greater number of P2 lesions was detected with FICE1 (74 vs. 44, p=0.003), particularly angioectasias (54 vs. 26, p=0.002), but not ulcers or tumours. FICE1 was significantly superior to white light, resulting in a 13% improvement in diagnostic yield; potentially bleeding lesions, particularly angioectasias, were more often observed. Our results support the use of FICE1 when reviewing small bowel capsule endoscopy for obscure gastrointestinal bleeding. Copyright © 2015 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.
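    For context, diagnostic yield here is simply the proportion of exams showing at least one P2 lesion, and the paired design (every exam read under both modes) calls for McNemar's test on the discordant pairs. A minimal sketch; the discordant-pair counts in the last line are invented for illustration, since the abstract does not report them:

```python
from math import comb

def diagnostic_yield(n_positive, n_total):
    """Proportion of capsule exams with at least one P2 (high bleeding potential) lesion."""
    return n_positive / n_total

def mcnemar_exact(b, c):
    """Two-sided exact McNemar p-value from discordant pair counts:
    b = positive under FICE1 only, c = positive under white light only."""
    n = b + c
    k = min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

# Yields reported in the study: 33/60 (55%) vs. 25/60 (42%).
print(diagnostic_yield(33, 60), diagnostic_yield(25, 60))
# Hypothetical discordant counts (NOT taken from the paper):
print(round(mcnemar_exact(10, 2), 3))
```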

  15. Novel graphical environment for virtual and real-world operations of tracked mobile manipulators

    NASA Astrophysics Data System (ADS)

    Chen, ChuXin; Trivedi, Mohan M.; Azam, Mir; Lassiter, Nils T.

    1993-08-01

    A simulation, animation, visualization and interactive control (SAVIC) environment has been developed for the design and operation of an integrated mobile manipulator system. This unique system possesses the abilities for (1) multi-sensor simulation, (2) kinematics and locomotion animation, (3) dynamic motion and manipulation animation, (4) transformation between real and virtual modes within the same graphics system, (5) ease in exchanging software modules and hardware devices between real and virtual world operations, and (6) interfacing with a real robotic system. This paper describes a working system and illustrates the concepts by presenting the simulation, animation and control methodologies for a unique mobile robot with articulated tracks, a manipulator, and sensory modules.

  16. Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.

    PubMed

    Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J

    2011-11-01

    To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Visualizing vascular structures in virtual environments

    NASA Astrophysics Data System (ADS)

    Wischgoll, Thomas

    2013-01-01

    In order to learn more about the cause of coronary heart diseases and develop diagnostic tools, the extraction and visualization of vascular structures from volumetric scans for further analysis is an important step. By determining a geometric representation of the vasculature, the geometry can be inspected and additional quantitative data calculated and incorporated into the visualization of the vasculature. To provide a more user-friendly visualization tool, virtual environment paradigms can be utilized. This paper describes techniques for interactive rendering of large-scale vascular structures within virtual environments. This can be applied to almost any virtual environment configuration, such as CAVE-type displays. Specifically, the tools presented in this paper were tested on a Barco I-Space and a large 62x108 inch passive projection screen with a Kinect sensor for user tracking.

  18. Exploring virtual reality technology and the Oculus Rift for the examination of digital pathology slides

    PubMed Central

    Farahani, Navid; Post, Robert; Duboy, Jon; Ahmed, Ishtiaque; Kolowitz, Brian J.; Krinchai, Teppituk; Monaco, Sara E.; Fine, Jeffrey L.; Hartman, Douglas J.; Pantanowitz, Liron

    2016-01-01

    Background: Digital slides obtained from whole slide imaging (WSI) platforms are typically viewed in two dimensions using desktop personal computer monitors or more recently on mobile devices. To the best of our knowledge, we are not aware of any studies viewing digital pathology slides in a virtual reality (VR) environment. VR technology enables users to be artificially immersed in and interact with a computer-simulated world. Oculus Rift is among the world's first consumer-targeted VR headsets, intended primarily for enhanced gaming. Our aim was to explore the use of the Oculus Rift for examining digital pathology slides in a VR environment. Methods: An Oculus Rift Development Kit 2 (DK2) was connected to a 64-bit computer running Virtual Desktop software. Glass slides from twenty randomly selected lymph node cases (ten with benign and ten malignant diagnoses) were digitized using a WSI scanner. Three pathologists reviewed these digital slides on a 27-inch 5K display and with the Oculus Rift after a 2-week washout period. Recorded endpoints included concordance of final diagnoses and time required to examine slides. The pathologists also rated their ease of navigation, image quality, and diagnostic confidence for both modalities. Results: There was 90% diagnostic concordance when reviewing WSI using a 5K display and Oculus Rift. The time required to examine digital pathology slides on the 5K display averaged 39 s (range 10–120 s), compared to 62 s with the Oculus Rift (range 15–270 s). All pathologists confirmed that digital pathology slides were easily viewable in a VR environment. The ratings for image quality and diagnostic confidence were higher when using the 5K display. Conclusion: Using the Oculus Rift DK2 to view and navigate pathology whole slide images in a virtual environment is feasible for diagnostic purposes. However, image resolution using the Oculus Rift device was limited. 
Interactive VR technologies such as the Oculus Rift are novel tools that may be of use in digital pathology. PMID:27217972

  19. Recent Progress and Future Plans for Fusion Plasma Synthetic Diagnostics Platform

    NASA Astrophysics Data System (ADS)

    Shi, Lei; Kramer, Gerrit; Tang, William; Tobias, Benjamin; Valeo, Ernest; Churchill, Randy; Hausammann, Loic

    2015-11-01

    The Fusion Plasma Synthetic Diagnostics Platform (FPSDP) is a Python package developed at the Princeton Plasma Physics Laboratory. It is dedicated to providing an integrated programmable environment for applying a modern ensemble of synthetic diagnostics to the experimental validation of fusion plasma simulation codes. The FPSDP will allow physicists to directly compare key laboratory measurements to simulation results. This enables deeper understanding of experimental data, more realistic validation of simulation codes, quantitative assessment of existing diagnostics, and new capabilities for the design and optimization of future diagnostics. The Fusion Plasma Synthetic Diagnostics Platform now has data interfaces for the GTS and XGC-1 global particle-in-cell simulation codes with synthetic diagnostic modules including: (i) 2D and 3D Reflectometry; (ii) Beam Emission Spectroscopy; and (iii) 1D Electron Cyclotron Emission. Results will be reported on the delivery of interfaces for the global electromagnetic PIC code GTC, the extended MHD M3D-C1 code, and the electromagnetic hybrid NOVA-K eigenmode code. Progress toward development of a more comprehensive 2D Electron Cyclotron Emission module will also be discussed. This work is supported by DOE contract #DE-AC02-09CH11466.
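    The core idea of a synthetic diagnostic, sampling simulated plasma fields the way a real instrument would so simulation output can be compared directly to measurements, can be sketched in a few lines. This is a generic illustration with invented profile values and channel positions; it is not the FPSDP API:

```python
# Generic synthetic-diagnostic sketch: sample a simulated profile at the
# locations a real instrument observes. Profile and channel values are
# invented for illustration; this does not reproduce the FPSDP interfaces.

def interp_linear(xs, ys, x):
    """Piecewise-linear interpolation of (xs, ys) at x (xs ascending)."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("x outside profile domain")

# Simulated electron-temperature profile T_e(r) (assumed values, keV),
# on normalized minor radius r.
r_sim = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
te_sim = [3.0, 2.8, 2.2, 1.4, 0.6, 0.1]

# Radii of hypothetical 1D ECE channels.
channels = [0.1, 0.3, 0.5, 0.7]
synthetic_signal = [interp_linear(r_sim, te_sim, r) for r in channels]
print(synthetic_signal)
```

    Real synthetic diagnostics add the instrument physics (wave propagation, beam geometry, noise) on top of this sampling step; the comparison with measured channel data is then like-for-like.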

  20. Modeling and Design of an Electro-Rheological Fluid Based Haptic System for Tele-Operation of Space Robots

    NASA Technical Reports Server (NTRS)

    Mavroidis, Constantinos; Pfeiffer, Charles; Paljic, Alex; Celestino, James; Lennon, Jamie; Bar-Cohen, Yoseph

    2000-01-01

    For many years, the robotics community sought to develop robots that could eventually operate autonomously and eliminate the need for human operators. However, there is an increasing realization that some tasks humans can perform significantly better but, due to associated hazards, distance, physical limitations and other causes, only robots can be employed to perform them. Remotely performing these types of tasks requires operating robots as human surrogates. While current "hand master" haptic systems are able to reproduce the feeling of rigid objects, they present great difficulties in emulating the feeling of remote/virtual stiffness. In addition, they tend to be heavy and cumbersome, and usually allow only a limited operator workspace. In this paper a novel haptic interface is presented that enables human operators to "feel" and intuitively mirror the stiffness/forces at remote/virtual sites, enabling control of robots as human surrogates. This haptic interface is intended to provide human operators an intuitive feeling of the stiffness and forces at remote or virtual sites in support of space robots performing dexterous manipulation tasks (such as operating a wrench or a drill). Remote applications refer to the control of actual robots, whereas virtual applications refer to simulated operations. The developed haptic interface will be applicable to IVA-operated robotic EVA tasks to enhance human performance, extend crew capability and assure crew safety. The electrically controlled stiffness is obtained using constrained electrorheological fluids (ERF), which change their viscosity under electrical stimulation. Forces applied at the robot end-effector due to a compliant environment will be reflected to the user using this ERF device, in which the system viscosity changes in proportion to the force to be transmitted. In this paper, we present the results of our modeling, simulation, and initial testing of such an electrorheological fluid (ERF) based haptic device.

  1. Surgeon Design Interface for Patient-Specific Concentric Tube Robots

    PubMed Central

    Morimoto, Tania K.; Greer, Joseph D.; Hsieh, Michael H.; Okamura, Allison M.

    2017-01-01

    Concentric tube robots have potential for use in a wide variety of surgical procedures due to their small size, dexterity, and ability to move in highly curved paths. Unlike most existing clinical robots, the design of these robots can be developed and manufactured on a patient- and procedure-specific basis. The design of concentric tube robots typically requires significant computation and optimization, and it remains unclear how the surgeon should be involved. We propose to use a virtual reality-based design environment for surgeons to easily and intuitively visualize and design a set of concentric tube robots for a specific patient and procedure. In this paper, we describe a novel patient-specific design process in the context of the virtual reality interface. We also show a resulting concentric tube robot design, created by a pediatric urologist to access a kidney stone in a pediatric patient. PMID:28656124

  2. Surgeon Design Interface for Patient-Specific Concentric Tube Robots.

    PubMed

    Morimoto, Tania K; Greer, Joseph D; Hsieh, Michael H; Okamura, Allison M

    2016-06-01

    Concentric tube robots have potential for use in a wide variety of surgical procedures due to their small size, dexterity, and ability to move in highly curved paths. Unlike most existing clinical robots, the design of these robots can be developed and manufactured on a patient- and procedure-specific basis. The design of concentric tube robots typically requires significant computation and optimization, and it remains unclear how the surgeon should be involved. We propose to use a virtual reality-based design environment for surgeons to easily and intuitively visualize and design a set of concentric tube robots for a specific patient and procedure. In this paper, we describe a novel patient-specific design process in the context of the virtual reality interface. We also show a resulting concentric tube robot design, created by a pediatric urologist to access a kidney stone in a pediatric patient.
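    The "highly curved paths" these robots follow are commonly modeled with constant-curvature kinematics: each precurved tube section bends along a circular arc. A minimal planar sketch of that standard approximation (not the authors' design code):

```python
from math import sin, cos

# Planar constant-curvature kinematics: a standard textbook model for a
# single precurved tube section, used here only to illustrate the idea.

def arc_tip(kappa, s):
    """In-plane tip position (x, z) of a constant-curvature arc.

    kappa: curvature (1/m), set by the tube's precurvature.
    s: arc length (m). kappa == 0 degenerates to a straight segment.
    """
    if kappa == 0:
        return (0.0, s)
    return ((1 - cos(kappa * s)) / kappa, sin(kappa * s) / kappa)

# A 50 mm segment with curvature 20 1/m (50 mm radius) bends through 1 radian.
x, z = arc_tip(20.0, 0.05)
print(x, z)
```

    A design interface like the one described lets the surgeon adjust kappa and s per tube and see the resulting curved path rendered against patient anatomy, while the optimizer handles the underlying computation.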

  3. Implementation of a graphical user interface for the virtual multifrequency spectrometer: The VMS-Draw tool.

    PubMed

    Licari, Daniele; Baiardi, Alberto; Biczysko, Malgorzata; Egidi, Franco; Latouche, Camille; Barone, Vincenzo

    2015-02-15

    This article presents the setup and implementation of a graphical user interface (VMS-Draw) for a virtual multifrequency spectrometer. Special attention is paid to ease of use, generality and robustness for a panel of spectroscopic techniques and quantum mechanical approaches. Depending on the kind of data to be analyzed, VMS-Draw produces different types of graphical representations, including two-dimensional or three-dimensional (3D) plots, bar charts, or heat maps. Among other integrated features, one may cite the convolution of stick spectra to obtain realistic line-shapes. It is also possible to analyze and visualize, together with the structure, the molecular orbitals and/or the vibrational motions of molecular systems thanks to 3D interactive tools. On these grounds, VMS-Draw could represent a useful additional tool for spectroscopic studies integrating measurements and computer simulations. Copyright © 2014 Wiley Periodicals, Inc.
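    The stick-spectrum convolution mentioned above is a standard step: each computed transition (position, intensity) is broadened with a normalized line-shape function and the contributions summed on a grid. A Gaussian-broadening sketch of the technique (transition values are hypothetical; VMS-Draw's actual implementation is not shown in the article):

```python
from math import exp, log, pi, sqrt

def broaden(sticks, grid, fwhm):
    """Gaussian-broadened spectrum evaluated at each grid point.

    sticks: list of (position, intensity) transitions.
    fwhm: full width at half maximum of the Gaussian, in grid units.
    """
    sigma = fwhm / (2 * sqrt(2 * log(2)))       # FWHM -> standard deviation
    norm = 1 / (sigma * sqrt(2 * pi))           # unit-area normalization
    return [sum(inten * norm * exp(-((x - pos) ** 2) / (2 * sigma ** 2))
                for pos, inten in sticks)
            for x in grid]

# Two hypothetical transitions (cm^-1, relative intensity):
sticks = [(1000.0, 1.0), (1050.0, 0.4)]
grid = [990 + i * 0.5 for i in range(161)]      # 990..1070 cm^-1
spectrum = broaden(sticks, grid, fwhm=8.0)
```

    Because each Gaussian integrates to its stick's intensity, the broadened spectrum conserves total intensity while looking like a measured line shape.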

  4. Design and implementation of a status at a glance user interface for a power distribution expert system

    NASA Technical Reports Server (NTRS)

    Liberman, Eugene M.; Manner, David B.; Dolce, James L.; Mellor, Pamela A.

    1993-01-01

    A user interface to the power distribution expert system for Space Station Freedom is discussed. The importance of features which simplify assessing system status and which minimize navigating through layers of information are examined. Design rationale and implementation choices are also presented. The amalgamation of such design features as message linking arrows, reduced information content screens, high salience anomaly icons, and color choices with failure detection and diagnostic explanation from an expert system is shown to provide an effective status-at-a-glance monitoring system for power distribution. This user interface design offers diagnostic reasoning without compromising the monitoring of current events. The display can convey complex concepts in terms that are clear to its users.

  5. Virtual faces expressing emotions: an initial concomitant and construct validity study.

    PubMed

    Joyal, Christian C; Jacob, Laurence; Cigna, Marie-Hélène; Guay, Jean-Pierre; Renaud, Patrice

    2014-01-01

    Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities for both fundamental and clinical research. For instance, virtual faces allow real-time human-computer retroactions between physiological measures and the virtual agent. The goal of this study was an initial assessment of the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eye and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Emotions expressed by each set of stimuli were similarly recognized, both by men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in eye regions in male and female participants. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-computer interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  6. A randomized trial of teaching clinical skills using virtual and live standardized patients.

    PubMed

    Triola, M; Feldman, H; Kalet, A L; Zabar, S; Kachur, E K; Gillespie, C; Anderson, M; Griesser, C; Lipkin, M

    2006-05-01

    We developed computer-based virtual patient (VP) cases to complement an interactive continuing medical education (CME) course that emphasizes skills practice using standardized patients (SP). Virtual patient simulations have the significant advantages of requiring fewer personnel and resources, being accessible at any time, and being highly standardized. Little is known about the educational effectiveness of these new resources. We conducted a randomized trial to assess the educational effectiveness of VPs and SPs in teaching clinical skills, to determine the effectiveness of VP cases compared with live SP cases in improving clinical skills and knowledge. Randomized trial. Fifty-five health care providers (registered nurses 45%, physicians 15%, other provider types 40%) who attended a CME program. Participants were randomized to receive either 4 live cases (n=32) or 2 live and 2 virtual cases (n=23). Other aspects of the course were identical for both groups. Participants in both groups were equivalent with respect to pre-post workshop improvement in comfort level (P=.66) and preparedness to respond (P=.61), to screen (P=.79), and to care (P=.055) for patients using the skills taught. There was no difference in subjective ratings of effectiveness of the VPs and SPs by participants who experienced both (P=.79). Improvements in diagnostic ability were equivalent whether cases were experienced live or virtually. Improvements in performance and diagnostic ability were equivalent between the groups, and participants rated VP and SP cases equally. Including well-designed VPs has a potentially powerful and efficient place in clinical skills training for practicing health care workers.

  7. The ALICE Software Release Validation cluster

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Krzewicki, M.

    2015-12-01

    One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service; in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample “golden” dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are externally stored on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how Release Validation cluster deployment and disposal are completely transparent to the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any past snapshot of the operating system: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long-Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future.

  8. Measuring the spacecraft and environmental interactions of the 8-cm mercury ion thrusters on the P80-1 mission

    NASA Technical Reports Server (NTRS)

    Power, J. L.

    1981-01-01

    The subject interface measurements are described for the Ion Auxiliary Propulsion System (IAPS) flight test of two 8-cm thrusters. The diagnostic devices and the effects to be measured include: 1) quartz crystal microbalances to detect nonvolatile deposition due to thruster operation; 2) warm and cold solar cell monitors for nonvolatile and volatile (mercury) deposition; 3) retarding potential ion collectors to characterize the low energy thruster ionic efflux; and 4) a probe to measure the spacecraft potential and thruster generated electron currents to biased spacecraft surfaces. The diagnostics will also assess space environmental interactions of the spacecraft and thrusters. The diagnostic data will characterize mercury thruster interfaces and provide data useful for future applications.
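    The quartz crystal microbalances in item (1) sense deposition as a resonant-frequency shift, to first order via the Sauerbrey relation. A sketch of that relation; the material constants are standard textbook values for AT-cut quartz, and the 10 MHz base frequency is an assumption, since the abstract gives no IAPS hardware specifics:

```python
# Sauerbrey relation: delta_f = -(2 * f0^2 / sqrt(rho_q * mu_q)) * delta_m / A.
# rho_q and mu_q are textbook AT-cut quartz constants; f0 is an assumed
# crystal frequency (the abstract does not specify the IAPS hardware).

def sauerbrey_shift(delta_mass_g, area_cm2, f0_hz=10e6):
    """Frequency shift (Hz) for delta_mass_g deposited uniformly over area_cm2."""
    rho_q = 2.648       # quartz density, g/cm^3
    mu_q = 2.947e11     # quartz shear modulus, g/(cm*s^2)
    c_f = 2 * f0_hz ** 2 / (rho_q * mu_q) ** 0.5   # mass sensitivity, Hz*cm^2/g
    return -c_f * delta_mass_g / area_cm2

# 10 ng/cm^2 of deposited thruster efflux:
print(round(sauerbrey_shift(1e-8, 1.0), 2))
```

    The sensitivity scales with the square of the crystal frequency, which is why even nanogram-per-square-centimeter deposition rates from thruster efflux are detectable in flight.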

  9. Design of virtual display and testing system for moving mass electromechanical actuator

    NASA Astrophysics Data System (ADS)

    Gao, Zhigang; Geng, Keda; Zhou, Jun; Li, Peng

    2015-12-01

    Aiming at the problems of control, measurement and virtual motion display for a moving mass electromechanical actuator (MMEA), a virtual testing system for the MMEA was developed based on a PC-DAQ architecture and the LabVIEW software platform. It accomplishes comprehensive test tasks such as drive control of the MMEA, measurement of kinematic parameters, measurement of centroid position, and virtual display of movement. The system solves the alignment of acquisition times between multiple measurement channels in different DAQ cards; on this basis, the research focused on dynamic 3D virtual display in LabVIEW, where the virtual display of the MMEA can be realized either by calling DLLs or by using 3D graph drawing controls. Considering the collaboration with the virtual testing system, including the hardware drivers and the data acquisition software, the 3D graph drawing controls method was selected, which enables synchronized measurement, control and display. The system can measure the dynamic centroid position and kinematic position of the movable mass block while controlling the MMEA, and the 3D virtual display interface has a realistic appearance and smooth motion, solving the problem of displaying and replaying MMEA motion inside its closed shell.

  10. Virtual hydrology observatory: an immersive visualization of hydrology modeling

    NASA Astrophysics Data System (ADS)

    Su, Simon; Cruz-Neira, Carolina; Habib, Emad; Gerndt, Andreas

    2009-02-01

    The Virtual Hydrology Observatory will provide students with the ability to observe the integrated hydrology simulation with an instructional interface by using a desktop-based or immersive virtual reality setup. It is the goal of the virtual hydrology observatory application to facilitate the introduction of field experience and observational skills into hydrology courses through innovative virtual techniques that mimic activities during actual field visits. The simulation part of the application is developed from the integrated atmospheric forecast model, Weather Research and Forecasting (WRF), and the hydrology model, Gridded Surface/Subsurface Hydrologic Analysis (GSSHA). The outputs from both the WRF and GSSHA models are then used to generate the final visualization components of the Virtual Hydrology Observatory. The visualization data are processed with techniques provided by VTK, including 2D Delaunay triangulation and data optimization. Once all the visualization components are generated, they are integrated with the simulation data using the VRFlowVis and VR Juggler software toolkits. VR Juggler is used primarily to provide the Virtual Hydrology Observatory application with a fully immersive, real-time 3D interaction experience, while VRFlowVis provides the integration framework for the hydrologic simulation data, graphical objects, and user interaction. A six-sided CAVE(TM)-like system is used to run the Virtual Hydrology Observatory to provide the students with a fully immersive experience.

  11. [Web-based training in radiology - student course in the Virtual University of Bavaria].

    PubMed

    Grunewald, M; Gebhard, H; Jakob, C; Wagner, M; Hothorn, T; Neuhuber, W L; Bautz, W A; Greess, H R

    2004-06-01

    The ninth version of the licensing regulation for medical doctors (Approbation Regulation, AR) sets a benchmark in terms of practical experience, interdigitation of preclinical and clinical studies, interdisciplinary approach, economic efficiency, independence of students, new teaching and learning modalities, and ongoing evaluation of the progress of medical students. The aim is to implement these major points of the AR in a model course for diagnostic radiology and radiation protection within the scope of the Virtual University of Bavaria and to test them in practice. In cooperation with residents and board-certified radiologists, students developed the virtual course "Web-Based Training (WBT) Radiology" in diagnostic radiology and radiation protection for students in the first clinical semester. A representative target group taken from the student body was asked about their options for accessing the World Wide Web and their satisfaction with the configuration and content of the newly developed program. A comparison was made between the final examination results of students who used the virtual course in addition to conventional lessons and those of students who did not subscribe to the virtual course and relied exclusively on conventional lessons. In addition, a pilot study was conducted in the winter semester 2002/03 comparing students taking either the traditional lessons or the new virtual course on the Internet. Test results for the virtual course model showed a positive trend. All targeted students had Internet access. Constructive criticism was implemented immediately and contributed to rapid optimization. The learning success of the additive or alternative virtual course was in no way less than that achieved with the conventional course. The learning success as a measure of teaching quality, and the acceptance by students and teachers, justify the continuation of this course model and its expansion. Besides enabling learning in small study groups, the course "WBT Radiology" might not only help implement the major points of the new AR but might also compensate for deficiencies in the current education. Economic aspects may encourage its implementation.

  12. Interface Design Implications for Recalling the Spatial Configuration of Virtual Auditory Environments

    NASA Astrophysics Data System (ADS)

    McMullen, Kyla A.

    Although the concept of virtual spatial audio has existed for almost twenty-five years, only in the past fifteen years has modern computing technology enabled the real-time processing needed to deliver high-precision spatial audio. Furthermore, the concept of virtually walking through an auditory environment had not previously been explored. Such an interface has numerous potential applications: spatial audio could be used in ways ranging from enhancing sounds delivered in virtual gaming worlds to conveying spatial locations in real-time emergency response systems. To incorporate this technology into real-world systems, several concerns must be addressed. First, head-related transfer functions (HRTFs) must be inexpensively created for each user. The present study further investigated an HRTF subjective selection procedure previously developed within our research group: users discriminated auditory cues to subjectively select their preferred HRTF from a publicly available database. Next, the issue of training to find virtual sources was addressed. Listeners participated in a localization training experiment using their selected HRTFs. The training procedure was created from the characterization of successful search strategies in prior auditory search experiments, and search accuracy significantly improved after listeners performed it. Next, to investigate auditory spatial memory, listeners completed three search-and-recall tasks with differing recall methods. Recall accuracy significantly decreased in tasks that required storing sound source configurations in memory. To assess practical scenarios, the present work examined the performance effects of signal uncertainty, visual augmentation, and different attenuation modeling. Source uncertainty did not affect listeners' ability to recall or identify sound sources. The presence of visual reference frames significantly increased recall accuracy, and the incorporation of drastic attenuation significantly improved environment recall accuracy. Through investigating these concerns, the present study took initial steps toward guiding the design of virtual auditory environments that support spatial configuration recall.

  13. Modular mechatronic system for stationary bicycles interfaced with virtual environment for rehabilitation.

    PubMed

    Ranky, Richard G; Sivak, Mark L; Lewis, Jeffrey A; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos

    2014-06-05

    Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirement for efficacious cycling; specifically recruitment of both extremities and exercising at a high intensity. In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off the shelf electronics to monitor heart rate and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider's lower extremities. 
The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders.

  14. Fast and Efficient Radiological Interventions via a Graphical User Interface Commanded Magnetic Resonance Compatible Robotic Device

    PubMed Central

    Özcan, Alpay; Christoforou, Eftychios; Brown, Daniel; Tsekos, Nikolaos

    2011-01-01

    The graphical user interface for an MR compatible robotic device has the capability of displaying oblique MR slices in 2D and a 3D virtual environment along with the representation of the robotic arm in order to swiftly complete the intervention. Using the advantages of the MR modality the device saves time and effort, is safer for the medical staff and is more comfortable for the patient. PMID:17946067

  15. FermiGrid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yocum, D.R.; Berman, E.; Canal, P.

    2007-05-01

    As one of the founding members of the Open Science Grid Consortium (OSG), Fermilab enables coherent access to its production resources through the Grid infrastructure system called FermiGrid. This system successfully provides for centrally managed grid services, opportunistic resource access, development of OSG Interfaces for Fermilab, and an interface to the Fermilab dCache system. FermiGrid supports virtual organizations (VOs) including high energy physics experiments (USCMS, MINOS, D0, CDF, ILC), astrophysics experiments (SDSS, Auger, DES), biology experiments (GADU, Nanohub) and educational activities.

  16. The use of virtual ground to control transmembrane voltages and measure bilayer currents in serial arrays of droplet interface bilayers

    NASA Astrophysics Data System (ADS)

    Sarles, Stephen A.

    2013-09-01

    The droplet interface bilayer (DIB) is a simple technique for constructing a stable lipid bilayer at the interface of two lipid-encased water droplets submerged in oil. Networks of DIBs formed by connecting more than two droplets constitute a new form of modular biomolecular smart material, where the transduction properties of a single lipid bilayer can affect the actions performed at other interface bilayers in the network via diffusion through the aqueous environments of shared droplet connections. The passive electrical properties of a lipid bilayer and the arrangement of droplets that determine the paths for transport in the network require specific electrical control to stimulate and interrogate each bilayer. Here, we explore the use of virtual ground for electrodes inserted into specific droplets in the network and employ a multichannel patch clamp amplifier to characterize bilayer formation and ion-channel activity in a serial DIB array. Analysis of serial connections of DIBs is discussed to understand how assigning electrode connections to the measurement device can be used to measure activity across all lipid membranes within a network. Serial arrays of DIBs are assembled using the regulated attachment method within a multi-compartment flexible substrate, and wire-type electrodes inserted into each droplet compartment of the substrate enable the application of voltage and measurement of current in each droplet in the array.
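
    To first order, a serial DIB array behaves electrically like capacitors in series: the same charge sits on every membrane, so an externally applied voltage divides across the bilayers in inverse proportion to their capacitances. The sketch below illustrates this textbook relation in Python; the capacitance values are hypothetical and not taken from the paper. It also hints at why assigning electrodes (and virtual ground) to individual droplets matters: with electrodes only at the ends of the chain, just the series combination is observable.

```python
import math

def series_capacitance(caps):
    # Equivalent capacitance of N bilayers in series: 1/C_eq = sum(1/C_i)
    return 1.0 / sum(1.0 / c for c in caps)

def bilayer_voltages(caps, v_total):
    # In series, the same charge Q = C_eq * V sits on every membrane,
    # so the voltage across bilayer i is Q / C_i.
    q = series_capacitance(caps) * v_total
    return [q / c for c in caps]

caps = [100e-12, 200e-12]            # farads; hypothetical bilayer sizes
volts = bilayer_voltages(caps, 0.1)  # 100 mV applied across the chain
# The smaller (100 pF) bilayer carries twice the voltage of the larger one.
```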

  17. A study of System Interface Sets (SIS) for the host, target and integration environments of the Space Station Program (SSP)

    NASA Technical Reports Server (NTRS)

    Mckay, Charles; Auty, David; Rogers, Kathy

    1987-01-01

    System interface sets (SIS) for large, complex, non-stop, distributed systems are examined. The SIS of the Space Station Program (SSP) was selected as the focus of this study because an appropriate virtual interface specification of the SIS is believed to have the most potential to free the project from four life-cycle tyrannies, each rooted in a dependence on either a proprietary or a particular instance of: operating systems, data management systems, communications systems, and instruction set architectures. The static perspective of the common Ada programming support environment interface set (CAIS) and the portable common execution environment (PCEE) activities are discussed. Also, the dynamic perspective of the PCEE is addressed.

  18. Transforming an educational virtual reality simulation into a work of fine art.

    PubMed

    Panaiotis; Addison, Laura; Vergara, Víctor M; Hakamata, Takeshi; Alverson, Dale C; Saiki, Stanley M; Caudell, Thomas Preston

    2008-01-01

    This paper outlines user interface and interaction issues, technical considerations, and problems encountered in transforming an educational VR simulation of a reified kidney nephron into an interactive artwork appropriate for a fine arts museum.

  19. How to tell a patient's story? Influence of the case narrative design on the clinical reasoning process in virtual patients.

    PubMed

    Hege, Inga; Dietl, Anita; Kiesewetter, Jan; Schelling, Jörg; Kiesewetter, Isabel

    2018-02-28

    Virtual patients (VPs) are narrative-based educational activities to train clinical reasoning in a safe environment. Our aim was to explore the influence of the design of the narrative and level of difficulty on the clinical reasoning process, diagnostic accuracy and time-on-task. In a randomized controlled trial, we analyzed the clinical reasoning process of 46 medical students with six VPs in three different variations: (1) patients showing a friendly behavior, (2) patients showing a disruptive behavior and (3) a version without a patient story. For easy VPs, we did not see a significant difference in diagnostic accuracy. For difficult VPs, the diagnostic accuracy was significantly higher for participants who worked on the friendly VPs compared to the other two groups. Independent from VP difficulty, participants identified significantly more problems and tests for disruptive than for friendly VPs; time on task was comparable for these two groups. The extrinsic motivation of participants working on the VPs without a patient story was significantly lower than for the students working on the friendly VPs. Our results indicate that the measured VP difficulty has a higher influence on the clinical reasoning process and diagnostic accuracy than the variations in the narratives.

  20. VirGO: A Visual Browser for the ESO Science Archive Facility

    NASA Astrophysics Data System (ADS)

    Chéreau, F.

    2008-08-01

    VirGO is the next generation Visual Browser for the ESO Science Archive Facility developed by the Virtual Observatory (VO) Systems Department. It is a plug-in for the popular open source software Stellarium adding capabilities for browsing professional astronomical data. VirGO gives astronomers the possibility to easily discover and select data from millions of observations in a new visual and intuitive way. Its main feature is to perform real-time access and graphical display of a large number of observations by showing instrumental footprints and image previews, and to allow their selection and filtering for subsequent download from the ESO SAF web interface. It also allows the loading of external FITS files or VOTables, the superimposition of Digitized Sky Survey (DSS) background images, and the visualization of the sky in a `real life' mode as seen from the main ESO sites. All data interfaces are based on Virtual Observatory standards which allow access to images and spectra from external data centers, and interaction with the ESO SAF web interface or any other VO applications supporting the PLASTIC messaging system. The main website for VirGO is at http://archive.eso.org/cms/virgo.

  1. Mathematical analysis of a sharp-diffuse interfaces model for seawater intrusion

    NASA Astrophysics Data System (ADS)

    Choquet, C.; Diédhiou, M. M.; Rosier, C.

    2015-10-01

    We consider a new model mixing sharp and diffuse interface approaches for seawater intrusion phenomena in free aquifers. More precisely, a phase field model is introduced in the boundary conditions on the virtual sharp interfaces. We thus include in the model the existence of diffuse transition zones but we preserve the simplified structure allowing front tracking. The three-dimensional problem then reduces to a two-dimensional model involving a strongly coupled system of partial differential equations of parabolic type describing the evolution of the depths of the two free surfaces, that is the interface between salt- and freshwater and the water table. We prove the existence of a weak solution for the model completed with initial and boundary conditions. We also prove that the depths of the two interfaces satisfy a coupled maximum principle.

  2. Virtual Hubs for facilitating access to Open Data

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Latre, Miguel Á.; Ernst, Julia; Brumana, Raffaella; Brauman, Stefan; Nativi, Stefano

    2015-04-01

    In October 2014 the ENERGIC-OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". In ENERGIC-OD, Virtual Hubs are conceived as information systems supporting the full life cycle of Open Data: publishing, discovery and access. They facilitate the use of Open Data by lowering and possibly removing the main barriers that hamper geo-information (GI) usage by end-users and application developers. Heterogeneity of data and data services is recognized as one of the major barriers to Open Data (re-)use: it forces end-users and developers to spend considerable effort accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to address specific challenges and use-cases. Thus, beyond a certain extent, heterogeneity is irreducible, especially in interdisciplinary contexts. ENERGIC-OD Virtual Hubs address heterogeneity by adopting a mediation and brokering approach: dedicated components (brokers) harmonize service interfaces, metadata and data models, enabling seamless discovery of and access to heterogeneous infrastructures and datasets. As an innovation project, ENERGIC-OD will integrate several existing technologies to implement Virtual Hubs as single points of access to geospatial datasets provided by new or existing platforms and infrastructures, including INSPIRE-compliant systems and Copernicus services. ENERGIC-OD will deploy a set of five Virtual Hubs (VHs) at the national level in France, Germany, Italy, Poland and Spain, and an additional one at the European level. VHs will be provided according to the cloud Software-as-a-Service model. The main expected impact of the VHs is the creation of new business opportunities by opening up access to Research Data and Public Sector Information. Therefore, ENERGIC-OD addresses not only end-users, who will have the opportunity to access a VH through a geo-portal, but also application developers, who will be able to access VH functionalities through simple Application Programming Interfaces (APIs). The ENERGIC-OD Consortium will develop ten different applications on top of the deployed VHs. These aim to demonstrate how VHs facilitate the development of new and multidisciplinary applications based on the full exploitation of (open) GI, hence stimulating innovation and business activities.

  3. A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills

    NASA Astrophysics Data System (ADS)

    Choi, Kup-Sze

    This paper presents a virtual reality based simulator prototype for learning phacoemulsification in cataract surgery, focusing on the skills required to sculpt a cross-shaped trench in the cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools, and a haptic device serves as the 3D user interface. Phaco-sculpting is simulated by interactively deleting the tetrahedrons constituting the lens model. Collisions between the virtual probe and the lens are identified efficiently by partitioning the space containing the lens hierarchically with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing a trainee's performance objectively. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
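
    The octree pruning idea named in the abstract can be sketched as follows: points representing the lens model (here, hypothetical tetrahedron centroids) are stored in a hierarchy of cubes, and a probe-tip sphere query descends only into cubes it intersects. This is a generic, minimal Python illustration under assumed data, not the simulator's actual implementation.

```python
class Octree:
    """Minimal point octree over the cube [center - half, center + half]^3."""

    def __init__(self, center, half, depth=0, max_depth=4, max_points=8):
        self.center, self.half = center, half
        self.depth, self.max_depth, self.max_points = depth, max_depth, max_points
        self.points = []
        self.children = None

    def insert(self, p):
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.max_points and self.depth < self.max_depth:
                self._split()
        else:
            self._child_for(p).insert(p)

    def _split(self):
        h = self.half / 2.0
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + dx * h, cy + dy * h, cz + dz * h), h,
                   self.depth + 1, self.max_depth, self.max_points)
            for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)
        ]
        points, self.points = self.points, []
        for p in points:
            self._child_for(p).insert(p)

    def _child_for(self, p):
        i = ((4 if p[0] >= self.center[0] else 0)
             + (2 if p[1] >= self.center[1] else 0)
             + (1 if p[2] >= self.center[2] else 0))
        return self.children[i]

    def query_sphere(self, q, radius, found=None):
        """Collect stored points within `radius` of q, pruning whole cubes."""
        if found is None:
            found = []
        # Squared distance from q to this node's cube; skip the node if > r^2.
        d2 = 0.0
        for c, qi in zip(self.center, q):
            d = abs(qi - c) - self.half
            if d > 0.0:
                d2 += d * d
        if d2 > radius * radius:
            return found
        if self.children is None:
            r2 = radius * radius
            found.extend(p for p in self.points
                         if sum((a - b) ** 2 for a, b in zip(p, q)) <= r2)
        else:
            for child in self.children:
                child.query_sphere(q, radius, found)
        return found

# Hypothetical "lens": a grid of centroids in the unit cube.
tree = Octree((0.5, 0.5, 0.5), 0.5)
centroids = [(x / 4, y / 4, z / 4)
             for x in range(5) for y in range(5) for z in range(5)]
for c in centroids:
    tree.insert(c)

hits = tree.query_sphere((0.5, 0.5, 0.5), 0.3)   # probe tip and reach
```

Only the cubes overlapping the probe sphere are visited, so per-frame collision checks scale with the local density of the lens mesh rather than its total size.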

  4. The use of virtual reality simulation of head trauma in a surgical boot camp.

    PubMed

    Vergara, Victor M; Panaiotis; Kingsley, Darra; Alverson, Dale C; Godsmith, Timothy; Xia, Shan; Caudell, Thomas P

    2009-01-01

    Surgical "boot camps" provide excellent opportunities to enhance orientation, learning, and preparation of new surgery interns as they enter the clinical arena. This paper describes the utilization of an interactive virtual reality (VR) simulation and associated virtual patient (VP) as an additional tool for surgical boot camps. Complementing other forms of simulation, virtual patients (VPs) require less specialized equipment and can also provide a wide variety of medical scenarios. In this paper we discuss a study that measured the learning effectiveness of a real-world VP simulation used by a class of new surgery interns who operated it with a standard computer interface. The usability of the simulator as a learning tool has been demonstrated and measured. This study brings the use of VR simulation with VPs closer to wider application and integration into a training curriculum, such as a surgery intern boot camp.

  5. Envisioning the future of home care: applications of immersive virtual reality.

    PubMed

    Brennan, Patricia Flatley; Arnott Smith, Catherine; Ponto, Kevin; Radwin, Robert; Kreutz, Kendra

    2013-01-01

    Accelerating the design of technologies to support health in the home requires (1) a better understanding of how the household context shapes consumer health behaviors and (2) the opportunity for engineers, designers, and health professionals to systematically study the home environment. We developed the Living Environments Laboratory (LEL) with a fully immersive, six-sided virtual reality CAVE to enable recreation of a broad range of household environments. We have successfully developed a virtual apartment, including a kitchen, living space, and bathroom. Over 2000 people have visited the LEL CAVE. Participants use an electronic wand to activate common household affordances such as opening a refrigerator door or lifting a cup. Challenges currently being explored include creating natural gestures to interface with virtual objects; developing robust, simple procedures to capture actual living environments and render them in a 3D visualization; and devising systematic, stable terminologies to characterize home environments.

  6. Toward a comprehensive hybrid physical-virtual reality simulator of peripheral anesthesia with ultrasound and neurostimulator guidance.

    PubMed

    Samosky, Joseph T; Allen, Pete; Boronyak, Steve; Branstetter, Barton; Hein, Steven; Juhas, Mark; Nelson, Douglas A; Orebaugh, Steven; Pinto, Rohan; Smelko, Adam; Thompson, Mitch; Weaver, Robert A

    2011-01-01

    We are developing a simulator of peripheral nerve block utilizing a mixed-reality approach: the combination of a physical model, an MRI-derived virtual model, mechatronics and spatial tracking. Our design uses tangible (physical) interfaces to simulate surface anatomy, haptic feedback during needle insertion, mechatronic display of muscle twitch corresponding to the specific nerve stimulated, and visual and haptic feedback for the injection syringe. The twitch response is calculated incorporating the sensed output of a real neurostimulator. The virtual model is isomorphic with the physical model and is derived from segmented MRI data. This model provides the subsurface anatomy and, combined with electromagnetic tracking of a sham ultrasound probe and a standard nerve block needle, supports simulated ultrasound display and measurement of needle location and proximity to nerves and vessels. The needle tracking and virtual model also support objective performance metrics of needle targeting technique.
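
    The "proximity to nerves" metric mentioned above can be illustrated with a standard point-to-polyline distance computation: the tracked needle tip is compared against a nerve centerline approximated as a chain of segments. This is a generic geometric sketch with hypothetical coordinates, not the simulator's code.

```python
import math

def point_segment_distance(p, a, b):
    # Closest distance from point p to the segment ab in 3D.
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab)
    t = 0.0 if denom == 0.0 else sum(x * y for x, y in zip(ap, ab)) / denom
    t = max(0.0, min(1.0, t))                    # clamp onto the segment
    closest = [a[i] + t * ab[i] for i in range(3)]
    return math.dist(p, closest)

def distance_to_polyline(p, polyline):
    # Nerve centerline approximated as a polyline of sampled points.
    return min(point_segment_distance(p, a, b)
               for a, b in zip(polyline, polyline[1:]))

nerve = [(0, 0, 0), (10, 0, 0), (20, 5, 0)]   # mm; hypothetical centerline
tip = (5, 3, 0)                               # tracked needle-tip position
d = distance_to_polyline(tip, nerve)          # 3.0 mm, from the first segment
```

Evaluating this distance each frame against the segmented MRI model yields an objective needle-targeting metric, e.g. the minimum approach distance over a simulated block.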

  7. Accessing SDO data in a pipeline environment using the VSO WSDL/SOAP interface

    NASA Astrophysics Data System (ADS)

    Suarez Sola, F. I.; Hourcle, J. A.; Amezcua, A.; Bogart, R.; Davey, A. R.; Gurman, J. B.; Hill, F.; Hughitt, V. K.; Martens, P. C.; Spencer, J.; Vso Team

    2010-12-01

    As part of the Virtual Solar Observatory (VSO) effort to support Solar Dynamics Observatory (SDO) data, the VSO has brought its WSDL document and SOAP interface up to date to make them compatible with the most widely used web services engines (e.g., Axis2 and JWS). In this presentation we explore the possibilities available for searching and/or fetching data within pipeline code. We explain some of the intricacies of the WSDL/VSO-SDO interface and show how the vast amount of data available via the VSO can be tapped from IDL, Java, Perl, or C in an uncomplicated way.

  8. Loop Group Parakeet Virtual Cable Concept Demonstrator

    NASA Astrophysics Data System (ADS)

    Dowsett, T.; McNeill, T. C.; Reynolds, A. B.; Blair, W. D.

    2002-07-01

    The Parakeet Virtual Cable (PVC) concept demonstrator uses the Ethernet Local Area Network (LAN) laid for the Battle Command Support System (BCSS) to connect the Parakeet DVT(DA) (voice terminal) to the Parakeet multiplexer. This currently requires pairs of PVC interface units to be installed for each DVT(DA). To reduce the cost of a PVC installation, the concept of a Loop Group Parakeet Virtual Cable (LGPVC) was proposed. This device was designed to replace the up to 30 PVC boxes and the multiplexer at the multiplexer side of a PVC installation. While the demonstrator is largely complete, testing has revealed an incomplete understanding of how to emulate the proprietary handshaking between the circuit switch and the multiplexer. The LGPVC concept cannot yet be demonstrated.

  9. Immersive telepresence system using high-resolution omnidirectional movies and a locomotion interface

    NASA Astrophysics Data System (ADS)

    Ikeda, Sei; Sato, Tomokazu; Kanbara, Masayuki; Yokoya, Naokazu

    2004-05-01

    Technology that enables users to experience a remote site virtually is called telepresence. A telepresence system using real-environment images is expected to be used in fields such as entertainment, medicine, and education. This paper describes a novel telepresence system that enables users to walk through a photorealistic virtualized environment by actually walking. To realize such a system, a wide-angle, high-resolution movie is projected on an immersive multi-screen display to present the virtualized environment, and a treadmill is controlled according to the user's detected locomotion. In this study, we use an omnidirectional multi-camera system to acquire images of a real outdoor scene. The proposed system provides users with a rich sense of walking at a remote site.

  10. Virtual hospital--a computer-aided platform to evaluate the sense of direction.

    PubMed

    Jiang, Ching-Fen; Li, Yuan-Shyi

    2007-01-01

    This paper presents a computer-aided platform, named Virtual Hospital (VH), to evaluate the wayfinding ability that is impaired in elderly people with early dementia. The development of the VH takes advantage of virtual reality technology to make the evaluation of the sense of direction more convenient and accurate than the conventional approach. A pilot study was carried out to test its feasibility in differentiating the sense of direction between genders. The results, with significant differences in response time (p<0.05) and pointing error (p<0.01) between genders, suggest the potential of the VH for clinical use. Further improvement of the human-machine interface is necessary to make it easy for geriatric people to use.

  11. Virtual- and real-world operation of mobile robotic manipulators: integrated simulation, visualization, and control environment

    NASA Astrophysics Data System (ADS)

    Chen, ChuXin; Trivedi, Mohan M.

    1992-03-01

    This research is focused on enhancing the overall productivity of an integrated human-robot system. A simulation, animation, visualization, and interactive control (SAVIC) environment has been developed for the design and operation of an integrated robotic manipulator system. This unique system possesses the abilities for multisensor simulation, kinematics and locomotion animation, dynamic motion and manipulation animation, transformation between real and virtual modes within the same graphics system, ease in exchanging software modules and hardware devices between real and virtual world operations, and interfacing with a real robotic system. This paper describes a working system and illustrates the concepts by presenting the simulation, animation, and control methodologies for a unique mobile robot with articulated tracks, a manipulator, and sensory modules.

  12. A pilot feasibility study of virtual patient simulation to enhance social work students' brief mental health assessment skills.

    PubMed

    Washburn, Micki; Bordnick, Patrick; Rizzo, Albert Skip

    2016-10-01

    This study presents preliminary feasibility and acceptability data on the use of virtual patient (VP) simulations to develop brief assessment skills within an interdisciplinary care setting. Results support the acceptability of technology-enhanced simulations and offer preliminary evidence for an association between engagement in VP practice simulations and improvements in diagnostic accuracy and clinical interviewing skills. Recommendations and next steps for research on technology-enhanced simulations within social work are discussed.

  13. A digital atlas of breast histopathology: an application of web based virtual microscopy

    PubMed Central

    Lundin, M; Lundin, J; Helin, H; Isola, J

    2004-01-01

    Aims: To develop an educationally useful atlas of breast histopathology, using advanced web based virtual microscopy technology. Methods: By using a robotic microscope and software adopted and modified from the aerial and satellite imaging industry, a virtual microscopy system was developed that allows fully automated slide scanning and image distribution via the internet. More than 150 slides were scanned at high resolution with an oil immersion ×40 objective (numerical aperture, 1.3) and archived on an image server residing in a high speed university network. Results: A publicly available website was constructed, http://www.webmicroscope.net/breastatlas, which features a comprehensive virtual slide atlas of breast histopathology according to the World Health Organisation 2003 classification. Users can view any part of an entire specimen at any magnification within a standard web browser. The virtual slides are supplemented with concise textual descriptions, but can also be viewed without diagnostic information for self assessment of histopathology skills. Conclusions: Using the technology described here, it is feasible to develop clinically and educationally useful virtual microscopy applications. Web based virtual microscopy will probably become widely used at all levels in pathology teaching. PMID:15563669

  14. A Testbed for Data Fusion for Helicopter Diagnostics and Prognostics

    DTIC Science & Technology

    2003-03-01

    ...and algorithm design and tuning in order to develop advanced diagnostic and prognostic techniques for aircraft health monitoring. ... and development of models for diagnostics, prognostics, and anomaly detection (Figure 5: VMEP Server Browser Interface). ... detections, and prognostic prediction time horizons. The VMEP system, and in particular the web component, are ideal for performing data collection

  15. The Pathologist 2.0: An Update on Digital Pathology in Veterinary Medicine.

    PubMed

    Bertram, Christof A; Klopfleisch, Robert

    2017-09-01

    Using light microscopy to describe the microarchitecture of normal and diseased tissues has changed very little since the middle of the 19th century. While the premise of histologic analysis remains intact, our relationship with the microscope is changing dramatically. Digital pathology offers new forms of visualization, and delivery of images is facilitated in unprecedented ways. This new technology can untether us entirely from our light microscopes, with many pathologists already performing their jobs using virtual microscopy. Several veterinary colleges have integrated virtual microscopy in their curriculum, and some diagnostic histopathology labs are switching to virtual microscopy as their main tool for the assessment of histologic specimens. Considering recent technical advancements of slide scanner and viewing software, digital pathology should now be considered a serious alternative to traditional light microscopy. This review therefore intends to give an overview of the current digital pathology technologies and their potential in all fields of veterinary pathology (ie, research, diagnostic service, and education). A future integration of digital pathology in the veterinary pathologist's workflow seems to be inevitable, and therefore it is proposed that trainees should be taught in digital pathology to keep up with the unavoidable digitization of the profession.

  16. STS-116 and Expedition 12 Preflight Training, VR Lab Bldg. 9.

    NASA Image and Video Library

    2005-05-06

    JSC2005-E-18147 (6 May 2005) --- Astronauts Sunita L. Williams (left), Expedition 14 flight engineer, and Joan E. Higginbotham, STS-116 mission specialist, use the virtual reality lab at the Johnson Space Center to train for their duties aboard the space shuttle. This type of computer interface, paired with virtual reality training hardware and software, helps to prepare the entire team for dealing with space station elements. Williams will join Expedition 14 in progress and serve as a flight engineer after traveling to the station on space shuttle mission STS-116.

  17. Research on Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Lobeck, William E.

    2002-01-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  18. Enterprise Cloud Architecture for Chinese Ministry of Railway

    NASA Astrophysics Data System (ADS)

    Shan, Xumei; Liu, Hefeng

    Enterprises like the PRC Ministry of Railways (MOR) face various challenges, ranging from a highly distributed computing environment to low utilization of legacy systems; cloud computing is increasingly regarded as one workable solution to address them. This article describes a full-scale cloud solution with Intel Tashi as the virtual machine infrastructure layer, Hadoop HDFS as the computing platform, and a self-developed SaaS interface, gluing the virtual machines and HDFS together with the Xen hypervisor. As a result, on-demand computing task application and deployment have been addressed for MOR's real working scenarios, as presented at the end of the article.

  19. Research on Intelligent Synthesis Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.; Loftin, R. Bowen

    2002-12-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  20. The Virtual Solar Observatory: Progress and Diversions

    NASA Astrophysics Data System (ADS)

    Gurman, Joseph B.; Bogart, R. S.; Amezcua, A.; Hill, Frank; Oien, Niles; Davey, Alisdair R.; Hourcle, Joseph; Mansky, E.; Spencer, Jennifer L.

    2017-08-01

    The Virtual Solar Observatory (VSO) is a known and useful method for identifying and accessing solar physics data online. We review current "behind the scenes" work on the VSO, including the addition of new data providers and the return of access to data sets to which service was temporarily interrupted. We also report on the effect on software development efforts when government IT “security” initiatives impinge on finite resources. As always, we invite SPD members to identify data sets, services, and interfaces they would like to see implemented in the VSO.

  1. CliniSpace: a multiperson 3D online immersive training environment accessible through a browser.

    PubMed

    Dev, Parvati; Heinrichs, W LeRoy; Youngblood, Patricia

    2011-01-01

    Immersive online medical environments, with dynamic virtual patients, have been shown to be effective for scenario-based learning (1). However, ease of use and ease of access have been barriers to their use. We used feedback from prior evaluation of these projects to design and develop CliniSpace. To improve usability, we retained the richness of prior virtual environments but modified the user interface. To improve access, we used a Software-as-a-Service (SaaS) approach to present a richly immersive 3D environment within a web browser.

  2. Environmental Integrity of Coating/Metal Interface.

    DTIC Science & Technology

    1988-01-01

    ...AgCl accelerate disbonding by the formation of a weak fluid boundary layer at the coating/metal interface just ahead of electroosmotically produced ... pockets of electroosmotically formed electrolyte or swollen regions of the heterogeneous polymer. A time series of micrographs allowed a virtually

  3. A Tutorial on Interfacing the Object Management Group (OMG) Data Distribution Service (DDS) with LabView

    NASA Technical Reports Server (NTRS)

    Smith, Kevin

    2011-01-01

    This tutorial will explain the concepts and steps for interfacing a National Instruments LabView virtual instrument (VI) running on a Windows platform with another computer via the Object Management Group (OMG) Data Distribution Service (DDS) as implemented by the Twin Oaks Computing CoreDX. This paper is for educational purposes only and therefore, the referenced source code will be simplistic and devoid of error checking. Implementation will be accomplished using the C programming language.

  4. Direct handling of sharp interfacial energy for microstructural evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernández–Rivera, Efraín; Tikare, Veena; Noirot, Laurence

    In this study, we introduce a simplification to the previously demonstrated hybrid Potts–phase field (hPPF), which relates interfacial energies to microstructural sharp interfaces. The model defines interfacial energy by a Potts-like discrete interface approach of counting unlike neighbors, which we use to compute local curvature. The model is compared to the hPPF by studying interfacial characteristics and grain growth behavior. The models give virtually identical results, while the new model allows the simulator more direct control of interfacial energy.
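    The unlike-neighbor bookkeeping described above is straightforward to illustrate. The following is a minimal sketch (assumed grid, coupling constant, and boundary handling; not the authors' code) that computes a Potts-like interfacial energy by counting unlike nearest-neighbor pairs on a 2D grain map:

```python
import numpy as np

def interfacial_energy(spins, J=1.0):
    """Potts-like interfacial energy: J times the number of unlike
    nearest-neighbor pairs on a 2D lattice (non-periodic boundaries)."""
    unlike = 0
    # compare each site with its right neighbor, then its lower neighbor,
    # so every nearest-neighbor pair is counted exactly once
    unlike += np.count_nonzero(spins[:, :-1] != spins[:, 1:])
    unlike += np.count_nonzero(spins[:-1, :] != spins[1:, :])
    return J * unlike

# two grains separated by a flat vertical boundary of length 4
spins = np.zeros((4, 4), dtype=int)
spins[:, 2:] = 1
print(interfacial_energy(spins))  # -> 4.0
```

The same neighbor counts can feed a local curvature estimate, which is the kind of direct control over interfacial energy the model provides.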

  5. Direct handling of sharp interfacial energy for microstructural evolution

    DOE PAGES

    Hernández–Rivera, Efraín; Tikare, Veena; Noirot, Laurence; ...

    2014-08-24

    In this study, we introduce a simplification to the previously demonstrated hybrid Potts–phase field (hPPF), which relates interfacial energies to microstructural sharp interfaces. The model defines interfacial energy by a Potts-like discrete interface approach of counting unlike neighbors, which we use to compute local curvature. The model is compared to the hPPF by studying interfacial characteristics and grain growth behavior. The models give virtually identical results, while the new model allows the simulator more direct control of interfacial energy.

  6. Qualification of a multi-diagnostic detonator-output characterization procedure utilizing PMMA witness blocks

    NASA Astrophysics Data System (ADS)

    Biss, Matthew; Murphy, Michael; Lieber, Mark

    2017-06-01

    Experiments were conducted in an effort to qualify a multi-diagnostic characterization procedure for the performance output of a detonator when fired into a poly(methyl methacrylate) (PMMA) witness block. A suite of optical diagnostics was utilized in combination to both bound the shock wave interaction state at the detonator/PMMA interface and characterize the nature of the shock wave decay in PMMA. The diagnostics included the Shock Wave Image Framing Technique (SWIFT), a photocathode tube streak camera, and photonic Doppler velocimetry (PDV). High-precision, optically clear witness blocks permitted dynamic flow visualization of the shock wave in PMMA via focused shadowgraphy. SWIFT- and streak-imaging diagnostics captured the spatiotemporally evolving shock wave, providing a two-dimensional temporally discrete image set and a one-dimensional temporally continuous image, respectively. PDV provided the temporal velocity history of the detonator output along the detonator axis. By combining the results obtained, a bound was placed on the interface condition, and a more physical profile representing the shock wave decay in PMMA for an exploding-bridgewire detonator was achieved.

  7. Kinematic evaluation of virtual walking trajectories.

    PubMed

    Cirio, Gabriel; Olivier, Anne-Hélène; Marchal, Maud; Pettré, Julien

    2013-04-01

    Virtual walking, a fundamental task in Virtual Reality (VR), is greatly influenced by the locomotion interface being used, by the specificities of input and output devices, and by the way the virtual environment is represented. No matter how virtual walking is controlled, the generation of realistic virtual trajectories is absolutely required for some applications, especially those dedicated to the study of walking behaviors in VR, navigation through virtual places for architecture, rehabilitation, and training. Previous studies evaluating the realism of locomotion trajectories have mostly considered the result of the locomotion task (efficiency, accuracy) and its subjective perception (presence, cybersickness). Few have focused on the locomotion trajectory itself, and then only in situations involving a geometrically constrained task. In this paper, we study the realism of unconstrained trajectories produced during virtual walking by addressing the following question: did the user reach his destination by virtually walking along a trajectory he would have followed in similar real conditions? To this end, we propose a comprehensive evaluation framework consisting of a set of trajectographical criteria and a locomotion model to generate reference trajectories. We consider a simple locomotion task where users walk between two oriented points in space. The travel path is analyzed both geometrically and temporally in comparison to simulated reference trajectories. In addition, we demonstrate the framework through a user study that considered an initial set of common and frequent virtual walking conditions, namely different input devices, output display devices, control laws, and visualization modalities. The study provides insight into the relative contributions of each condition to the overall realism of the resulting virtual trajectories.
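    Geometric criteria of the kind used to compare a travel path against a reference trajectory can be sketched in a few lines; the functions, sampling, and values below are illustrative assumptions, not the paper's actual criteria:

```python
import numpy as np

def path_length(traj):
    """Total arc length of a 2D trajectory given as an (N, 2) array."""
    return float(np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1)))

def mean_deviation(traj, ref):
    """Mean pointwise distance between two trajectories sampled at the
    same N parameter values -- a simple geometric realism criterion."""
    return float(np.mean(np.linalg.norm(traj - ref, axis=1)))

# reference: a straight 4 m walk; user path: the same walk offset laterally
ref = np.column_stack([np.linspace(0.0, 4.0, 41), np.zeros(41)])
usr = ref + np.array([0.0, 0.1])
print(path_length(usr), mean_deviation(usr, ref))
```

Temporal criteria (speed profiles, arrival times) would be compared analogously against the simulated reference.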

  8. Virtual microscopy: merging of computer mediated communication and intuitive interfacing

    NASA Astrophysics Data System (ADS)

    de Ridder, Huib; de Ridder-Sluiter, Johanna G.; Kluin, Philip M.; Christiaans, Henri H. C. M.

    2009-02-01

    Ubiquitous computing (or Ambient Intelligence) is an upcoming technology that is usually associated with futuristic smart environments in which information is available anytime anywhere and with which humans can interact in a natural, multimodal way. However spectacular the corresponding scenarios may be, it is equally challenging to consider how this technology may enhance existing situations. This is illustrated by a case study from the Dutch medical field: central quality reviewing for pathology in child oncology. The main goal of the review is to assess the quality of the diagnosis based on patient material. The sharing of knowledge in social face-to-face interaction during such meetings is an important advantage. At the same time there is the disadvantage that the experts from the seven Dutch academic medical centers have to travel to the review meeting and that the required logistics to collect and bring patient material and data to the meeting is cumbersome and time-consuming. This paper focuses on how this time-consuming, inefficient way of reviewing can be replaced by a virtual collaboration system by merging technology supporting Computer Mediated Collaboration and intuitive interfacing. This requires insight into the preferred way of communication and collaboration, as well as knowledge about the preferred interaction style with a virtual shared workspace.

  9. Harmonizing Physics & Cosmology With Everything Else in the Universe(s)

    NASA Astrophysics Data System (ADS)

    Asija, Pal

    2006-03-01

    This paper postulates a theory of everything, including our known finite physical universe within, and as a subset of, an infinite virtual invisible universe occupying some of the same space and time. It attempts to harmonize astrophysics with everything else, including life. It compares and contrasts properties, similarities, differences, and relationships between the two universe(s). Particular attention is paid to the interface between the two and the challenges of building and/or traversing bridges between them. A number of inflection points between the two are identified. The paper also delineates their relationship to the big bang, the theory of evolution, gravity, dark matter, black holes, time travel, the speed of light, the theory of relativity, and string theory, to name a few. Several new terms are introduced and defined to discuss the proper relationship, transition, and interface between the body, soul, and spirit, as well as their relationship to brain and mind. Physical bodies and beings are compared with virtual, meta, and ultra bodies and beings, and it is discussed how the ``Virtual Inside'' relates to people, pets, plants, and particles, and their micro constituents as well as macro sets. The past, present, and potential of the concurrent universe(s) are compared and contrasted, along with many myths and misconceptions of metaphysics as well as modern physics.

  10. Linking the EarthScope Data Virtual Catalog to the GEON Portal

    NASA Astrophysics Data System (ADS)

    Lin, K.; Memon, A.; Baru, C.

    2008-12-01

    The EarthScope Data Portal provides a unified, single point of access to EarthScope data and products from the USArray, Plate Boundary Observatory (PBO), and San Andreas Fault Observatory at Depth (SAFOD) experiments. The portal features basic search and data access capabilities to allow users to discover and access EarthScope data using spatial, temporal, and other metadata-based (data type, station specific) search conditions. The portal search module is the user interface implementation of the EarthScope Data Search Web Service. This Web Service acts as a virtual catalog that in turn invokes Web services developed by IRIS (Incorporated Research Institutions for Seismology), UNAVCO (University NAVSTAR Consortium), and GFZ (German Research Center for Geosciences) to search for EarthScope data in the archives at each of these locations. These Web Services provide information about all resources (data) that match the specified search conditions. In this presentation we will describe how the EarthScope Data Search Web Service can be integrated into the GEON search application in the GEON Portal (see http://portal.geongrid.org). Thus, a search request issued at the GEON Portal will also search the EarthScope virtual catalog, thereby providing users seamless access to data in GEON as well as EarthScope via a common user interface.

  11. Implementation of a virtual laboratory for training on sound insulation testing and uncertainty calculations in acoustic tests.

    PubMed

    Asensio, C; Gasco, L; Ruiz, M; Recuero, M

    2015-02-01

    This paper describes a methodology and case study for the implementation of educational virtual laboratories for practice training on acoustic tests according to international standards. The objectives of this activity are (a) to help the students understand and apply the procedures described in the standards and (b) to familiarize the students with the uncertainty in measurement and its estimation in acoustics. The virtual laboratory will not focus on the handling and set-up of real acoustic equipment but rather on procedures and uncertainty. The case study focuses on the application of the virtual laboratory for facade sound insulation tests according to ISO 140-5:1998 (International Organization for Standardization, Geneva, Switzerland, 1998), and the paper describes the causal and stochastic models and the constraints applied in the virtual environment under consideration. With a simple user interface, the laboratory will provide measurement data that the students will have to process to report the insulation results that must converge with the "virtual true values" in the laboratory. The main advantage of the virtual laboratory is derived from the customization of factors in which the student will be instructed or examined (for instance, background noise correction, the detection of sporadic corrupted observations, and the effect of instrument precision).
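    For reference, the facade sound insulation quantity in ISO 140-5 is a standardized level difference of the form D_nT = L1 - L2 + 10·log10(T/T0), with T0 = 0.5 s. A minimal sketch of that computation (the numeric values are illustrative, not from the study):

```python
import math

def standardized_level_difference(L1, L2, T, T0=0.5):
    """Facade standardized level difference D_nT = L1 - L2 + 10*log10(T/T0),
    where L1 is the outdoor level and L2 the receiving-room level (both in dB),
    T is the receiving-room reverberation time, and T0 = 0.5 s."""
    return L1 - L2 + 10.0 * math.log10(T / T0)

# hypothetical single-band observation: 80 dB outside, 48 dB inside, T = 1 s
print(round(standardized_level_difference(80.0, 48.0, 1.0), 1))  # -> 35.0
```

A virtual laboratory can perturb L1, L2, and T stochastically (background noise, instrument precision, corrupted observations) and check the student's reported result against the "virtual true value".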

  12. Importance of Matching Physical Friction, Hardness, and Texture in Creating Realistic Haptic Virtual Surfaces.

    PubMed

    Culbertson, Heather; Kuchenbecker, Katherine J

    2017-01-01

    Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.

  13. Preoperative planning of thoracic surgery with use of three-dimensional reconstruction, rapid prototyping, simulation and virtual navigation.

    PubMed

    Heuts, Samuel; Sardari Nia, Peyman; Maessen, Jos G

    2016-01-01

    For the past decades, surgeries have become more complex, due to the increasing age of the patient population referred for thoracic surgery, more complex pathology, and the emergence of minimally invasive thoracic surgery. Together with the early detection of thoracic disease as a result of innovations in diagnostic possibilities and the paradigm shift to personalized medicine, preoperative planning is becoming an indispensable and crucial aspect of surgery. Several new techniques facilitating this paradigm shift have emerged. Preoperative marking and staining of lesions are already widely accepted methods of preoperative planning in thoracic surgery. However, three-dimensional (3D) image reconstructions, virtual simulation, and rapid prototyping (RP) are still in the development phase. These new techniques are expected to become an important part of the standard work-up of patients undergoing thoracic surgery in the future. This review aims at graphically presenting and summarizing these new diagnostic and therapeutic tools.

  14. A two-class self-paced BCI to control a robot in four directions.

    PubMed

    Ron-Angevin, Ricardo; Velasco-Alvarez, Francisco; Sancha-Ros, Salvador; da Silva-Sauer, Leandro

    2011-01-01

    In this work, an electroencephalographic analysis-based, self-paced (asynchronous) brain-computer interface (BCI) is proposed to control a mobile robot using four different navigation commands: turn right, turn left, move forward and move back. In order to reduce the probability of misclassification, the BCI is to be controlled with only two mental tasks (relaxed state versus imagination of right hand movements), using an audio-cued interface. Four healthy subjects participated in the experiment. After two sessions controlling a simulated robot in a virtual environment (which allowed the user to become familiar with the interface), three subjects successfully moved the robot in a real environment. The obtained results show that the proposed interface enables control over the robot, even for subjects with low BCI performance. © 2011 IEEE

  15. Strong modification of thin film properties due to screening across the interface

    NASA Astrophysics Data System (ADS)

    Altendorf, S. G.; Reisner, A.; Tam, B.; Meneghin, F.; Wirth, S.; Tjeng, L. H.

    2018-04-01

    We report on our investigation of the influence of screening across the interface on the properties of semiconducting thin films. Using EuO as a well-defined model material, layers of various thickness deposited on yttria-stabilized zirconia (100) substrates were covered half with Mg metal and half with the wide-band-gap insulator MgO. We observed that the Curie temperature for the thinnest films is significantly higher for the part which is interfaced with the metal compared to the part which is interfaced with the insulator. We infer that the proximity of a polarizable medium reduces the energies of virtual charge excitations and thus increases the effective exchange interactions, a strong effect that can be utilized systematically for the design of thin film and multilayer systems.

  16. Varieties of virtualization

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    1991-01-01

    Natural environments have a content, i.e., the objects in them; a geometry, i.e., a pattern of rules for positioning and displacing the objects; and a dynamics, i.e., a system of rules describing the effects of forces acting on the objects. Human interaction with most common natural environments has been optimized by centuries of evolution. Virtual environments created through the human-computer interface similarly have a content, geometry, and dynamics, but the arbitrary character of the computer simulation creating them does not insure that human interaction with these virtual environments will be natural. The interaction, indeed, could be supernatural but it also could be impossible. An important determinant of the comprehensibility of a virtual environment is the correspondence between the environmental frames of reference and those associated with the control of environmental objects. The effects of rotation and displacement of control frames of reference with respect to corresponding environmental references differ depending upon whether perceptual judgement or manual tracking performance is measured. The perceptual effects of frame of reference displacement may be analyzed in terms of distortions in the process of virtualizing the synthetic environment space. The effects of frame of reference displacement and rotation have been studied by asking subjects to estimate exocentric direction in a virtual space.
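    The effect of rotating a control frame of reference relative to the environment frame can be sketched as a plane rotation; the function below is an illustrative assumption, not the study's apparatus:

```python
import math

def to_environment_frame(dx, dy, rot_deg):
    """Map a control-frame displacement (dx, dy) into the environment
    frame when the control frame is rotated by rot_deg about the view axis."""
    r = math.radians(rot_deg)
    return (dx * math.cos(r) - dy * math.sin(r),
            dx * math.sin(r) + dy * math.cos(r))

# with a 90-degree frame rotation, "push right" moves the controlled object "up"
ex, ey = to_environment_frame(1.0, 0.0, 90.0)
print(round(ex, 6), round(ey, 6))  # -> 0.0 1.0
```

The larger this misalignment between control and environmental frames, the harder it typically is for users to predict the outcome of their inputs, which is what the perceptual and tracking measurements probe.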

  17. A New Continent of Ideas

    NASA Technical Reports Server (NTRS)

    1990-01-01

    While a new technology called 'virtual reality' is still at the 'ground floor' level, one of its basic components, 3D computer graphics, is already in wide commercial use and expanding. Other components that permit a human operator to 'virtually' explore an artificial environment and to interact with it are being demonstrated routinely at Ames and elsewhere. Virtual reality might be defined as an environment capable of being virtually entered - telepresence, it is called - or interacted with by a human. The Virtual Interface Environment Workstation (VIEW) is a head-mounted stereoscopic display system in which the display may be an artificial computer-generated environment or a real environment relayed from remote video cameras. The operator can 'step into' this environment and interact with it. The DataGlove has a series of fiber optic cables and sensors that detect any movement of the wearer's fingers and transmit the information to a host computer; a computer-generated image of the hand will move exactly as the operator is moving his gloved hand. With appropriate software, the operator can use the glove to interact with the computer scene by grasping an object. The DataSuit is a sensor-equipped full body garment that greatly increases the sphere of performance for virtual reality simulations.

  18. Human-machine interface hardware: The next decade

    NASA Technical Reports Server (NTRS)

    Marcus, Elizabeth A.

    1991-01-01

    In order to understand where human-machine interface hardware is headed, it is important to understand where we are today, how we got there, and what our goals for the future are. As computers become more capable, faster, and programs become more sophisticated, it becomes apparent that the interface hardware is the key to an exciting future in computing. How can a user interact and control a seemingly limitless array of parameters effectively? Today, the answer is most often a limitless array of controls. The link between these controls and human sensory motor capabilities does not utilize existing human capabilities to their full extent. Interface hardware for teleoperation and virtual environments is now facing a crossroad in design. Therefore, we as developers need to explore how the combination of interface hardware, human capabilities, and user experience can be blended to get the best performance today and in the future.

  19. A Mediator-Based Approach to Resolving Interface Heterogeneity of Web Services

    NASA Astrophysics Data System (ADS)

    Leitner, Philipp; Rosenberg, Florian; Michlmayr, Anton; Huber, Andreas; Dustdar, Schahram

    In theory, service-oriented architectures are based on the idea of increasing flexibility in the selection of internal and external business partners using loosely coupled services. However, in practice this flexibility is limited by the fact that partners need not only to provide the same service, but to do so via virtually the same interface in order to actually be easily interchangeable. Invocation-level mediation may be used to overcome this issue: by using mediation, interface differences can be resolved transparently at runtime. In this chapter we discuss the basic ideas of mediation, with a focus on interface-level mediation. We show how interface mediation is integrated into our dynamic Web service invocation framework DAIOS, and present three different mediation strategies: one based on structural message similarity, one based on semantically annotated WSDL, and one embedded into the VRESCo SOA runtime, a larger research project with explicit support for service mediation.
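    A mediation strategy based on structural message similarity can be sketched as a field-name matcher; the example below is a toy illustration (the field names and similarity heuristic are assumptions), not the DAIOS implementation:

```python
from difflib import SequenceMatcher

def mediate(message, target_fields):
    """Toy invocation-level mediator: map each field of the callee's
    interface to the most similarly named field of the caller's message."""
    def sim(a, b):
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return {tgt: message[max(message, key=lambda src: sim(src, tgt))]
            for tgt in target_fields}

# caller and callee expose "virtually the same" interface under different names
msg = {"custName": "ACME", "orderId": 42}
print(mediate(msg, ["customer_name", "order_id"]))
```

A real mediator would also resolve type mismatches and nested structures, or fall back to semantic annotations when names alone are ambiguous.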

  20. A Virtual Instrument Panel and Serial Interface for the Parr 1672 Thermometer

    ERIC Educational Resources Information Center

    Salter, Gail; Range, Kevin; Salter, Carl

    2005-01-01

    The various features of a Visual Basic program that implements a virtual instrument panel and serial interface for the Parr 1672 thermometer are described. The program permits remote control of the calorimetry experiment and also provides control for the flow of data and for file storage.

  1. Navigation in a Virtual Environment Using a Walking Interface

    DTIC Science & Technology

    2000-11-01

    Fukusima, 1993; Mittelstaedt & Glasauer, 1991; Schmuckler, 1995). Thus, only visual information is available for navigation by dead reckoning (Gallistel ... Washington, DC: National Academy Press. Gallistel, C.R. (1990). The Organization of Learning. Cambridge, MA: MIT Press. Iwata, H. & Matsuda, K. (1992). Haptic

  2. Using a web-based application to define the accuracy of diagnostic tests when the gold standard is imperfect.

    PubMed

    Lim, Cherry; Wannapinij, Prapass; White, Lisa; Day, Nicholas P J; Cooper, Ben S; Peacock, Sharon J; Limmathurotsakul, Direk

    2013-01-01

    Estimates of the sensitivity and specificity for new diagnostic tests based on evaluation against a known gold standard are imprecise when the accuracy of the gold standard is imperfect. Bayesian latent class models (LCMs) can be helpful under these circumstances, but the necessary analysis requires expertise in computational programming. Here, we describe open-access web-based applications that allow non-experts to apply Bayesian LCMs to their own data sets via a user-friendly interface. Applications for Bayesian LCMs were constructed on a web server using R and WinBUGS programs. The models provided (http://mice.tropmedres.ac) include two Bayesian LCMs: the two-tests in two-population model (Hui and Walter model) and the three-tests in one-population model (Walter and Irwig model). Both models are available with simplified and advanced interfaces. In the former, all settings for Bayesian statistics are fixed as defaults. Users input their data set into a table provided on the webpage. Disease prevalence and accuracy of diagnostic tests are then estimated using the Bayesian LCM, and provided on the web page within a few minutes. With the advanced interfaces, experienced researchers can modify all settings in the models as needed. These settings include correlation among diagnostic test results and prior distributions for all unknown parameters. The web pages provide worked examples with both models using the original data sets presented by Hui and Walter in 1980, and by Walter and Irwig in 1988. We also illustrate the utility of the advanced interface using the Walter and Irwig model on a data set from a recent melioidosis study. The results obtained from the web-based applications were comparable to those published previously. The newly developed web-based applications are open-access and provide an important new resource for researchers worldwide to evaluate new diagnostic tests.
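    The bias that motivates such latent class models can be shown in a few lines: assuming conditional independence between the new test and the gold standard given disease status, even a perfect new test appears imperfect when scored against a flawed gold standard. A minimal sketch with hypothetical parameter values:

```python
def apparent_accuracy(prev, se_t, sp_t, se_g, sp_g):
    """Apparent Se/Sp of a new test scored against an imperfect gold
    standard, assuming conditional independence given disease status."""
    # P(new test +, gold +) and P(gold +)
    p_tp_gp = prev * se_t * se_g + (1 - prev) * (1 - sp_t) * (1 - sp_g)
    p_gp = prev * se_g + (1 - prev) * (1 - sp_g)
    # P(new test -, gold -) and P(gold -)
    p_tn_gn = prev * (1 - se_t) * (1 - se_g) + (1 - prev) * sp_t * sp_g
    p_gn = 1.0 - p_gp
    return p_tp_gp / p_gp, p_tn_gn / p_gn

# a perfect new test (Se = Sp = 1) against a gold standard with Se 0.90, Sp 0.95
se, sp = apparent_accuracy(prev=0.3, se_t=1.0, sp_t=1.0, se_g=0.9, sp_g=0.95)
print(round(se, 3), round(sp, 3))  # -> 0.885 0.957
```

Bayesian LCMs such as the Hui-Walter model avoid this bias by treating the true disease status as a latent variable rather than trusting either test.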

  3. A Virtual Laboratory for Aviation and Airspace Prognostics Research

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan; Gorospe, George; Teubert, Christ; Quach, Cuong C.; Hogge, Edward; Darafsheh, Kaveh

    2017-01-01

    Integration of Unmanned Aerial Vehicles (UAVs), autonomy, spacecraft, and other aviation technologies into the airspace is becoming more and more complicated, and will continue to do so in the future. Inclusion of new technology and complexity in the airspace increases the importance and difficulty of safety assurance. Additionally, testing new technologies on complex aviation systems and systems of systems can be challenging, expensive, and at times unsafe when implementing real-life scenarios. The application of prognostics to aviation and airspace management may produce new tools and insight into these problems. Prognostic methodology provides an estimate of the health and risks of a component, vehicle, or airspace, and knowledge of how that will change over time. That measure is especially useful in safety determination, mission planning, and maintenance scheduling. In our research, we develop a live, distributed, hardware-in-the-loop Prognostics Virtual Laboratory testbed for aviation and airspace prognostics. The developed testbed will be used to validate prediction algorithms for the real-time safety monitoring of the National Airspace System (NAS) and the prediction of unsafe events. In our earlier work [1] we discussed the initial Prognostics Virtual Laboratory testbed development and related results for milestones 1 and 2. This paper describes the design, development, and testing of the integrated testbed, which is part of milestone 3, along with our next steps for validating this work. Through a framework consisting of software/hardware modules and associated interface clients, the distributed testbed enables safe, accurate, and inexpensive experimentation and research into airspace and vehicle prognosis that would not have been possible otherwise. The testbed modules can be used cohesively to construct complex and relevant airspace scenarios for research. Four modules are key to this research: the virtual aircraft module, which uses the X-Plane simulator and X-PlaneConnect toolbox; the live aircraft module, which connects fielded aircraft using onboard cellular communications devices; the hardware-in-the-loop (HITL) module, which connects laboratory-based bench-top hardware testbeds; and the research module, which contains diagnostics and prognostics tools for analysis of live air traffic situations and vehicle health conditions. The testbed also features other modules for data recording and playback, information visualization, and air traffic generation. Software reliability, safety, and latency are some of the critical design considerations in the development of the testbed.
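
    The prognostic prediction at the heart of such a testbed can be illustrated, in highly simplified form, as projecting a vehicle health state forward in time until it crosses a safety threshold. The linear degradation model and all names below are illustrative stand-ins, not the testbed's actual algorithms.

```python
def remaining_useful_life(health, rate, threshold, dt=1.0, max_steps=100000):
    """Project a scalar health state forward under a constant degradation
    rate and return the number of steps until it crosses `threshold`.
    A deliberately simple stand-in for model-based prediction algorithms
    like those validated on the Prognostics Virtual Laboratory testbed."""
    steps = 0
    while health > threshold and steps < max_steps:
        health -= rate * dt   # simple linear degradation model
        steps += 1
    return steps

# e.g. battery state-of-charge 1.0, drain 0.002 per step, safety cutoff 0.3
```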

  4. DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.

    PubMed

    Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques

    2008-09-08

    Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.
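
    The sd-file output step can be illustrated with a minimal helper that appends a docking score to a mol block as an SDF data field and terminates the record with the standard `$$$$` delimiter. The helper name and tag are hypothetical; DOVIS's actual writer handles full AutoDock pose data.

```python
def append_sdf_score(molblock, score, tag="DOCKING_SCORE"):
    """Append a data field to a mol block and terminate the record with
    the SDF '$$$$' delimiter, so downstream modeling tools can read the
    score. `tag` is an illustrative field name, not DOVIS's actual tag."""
    record = molblock.rstrip("\n")
    return f"{record}\n> <{tag}>\n{score}\n\n$$$$\n"
```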

  5. 3D workflow for HDR image capture of projection systems and objects for CAVE virtual environments authoring with wireless touch-sensitive devices

    NASA Astrophysics Data System (ADS)

    Prusten, Mark J.; McIntyre, Michelle; Landis, Marvin

    2006-02-01

    A 3D workflow pipeline is presented for High Dynamic Range (HDR) image capture of projected scenes or objects for presentation in CAVE virtual environments. The methods of HDR digital photography of environments vs. objects are reviewed. Samples of both types of virtual authoring, namely the actual CAVE environment and a sculpture, are shown. A series of software tools are incorporated into a pipeline called CAVEPIPE, allowing high-resolution objects and scenes to be composited together in natural illumination environments [1] and presented in our CAVE virtual reality environment. We also present a way to enhance the user interface for CAVE environments. The traditional methods of controlling navigation through virtual environments include gloves, HUDs, and 3D mouse devices. By integrating a wireless network that includes both WiFi (IEEE 802.11b/g) and Bluetooth (IEEE 802.15.1) protocols, the non-graphical input control device can be eliminated. Wireless devices can then be added, including PDAs, smartphones, Tablet PCs, portable gaming consoles, and PocketPCs.
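
    The HDR capture step merges bracketed exposures into a radiance estimate. Below is a minimal per-pixel sketch of the standard weighted merge, assuming a linear 8-bit camera response; a real pipeline recovers the camera response curve and operates on full images.

```python
def merge_hdr_pixel(values, exposure_times):
    """Estimate scene radiance for one pixel from bracketed exposures,
    assuming a linear response and 8-bit values. A triangle ('hat')
    weighting down-weights under- and over-exposed samples."""
    num = den = 0.0
    for z, t in zip(values, exposure_times):
        w = min(z, 255 - z)          # hat weight, peaks at mid-range
        if w <= 0:
            continue                 # skip fully saturated / black samples
        num += w * (z / t)           # exposure-normalised radiance
        den += w
    return num / den if den else 0.0
```

    For a pixel whose true radiance is 100 units, exposures of 0.5, 1, and 2 time units record values 50, 100, and 200, and the merge recovers 100 regardless of the weighting.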

  6. Virtual surface characteristics of a tactile display using magneto-rheological fluids.

    PubMed

    Lee, Chul-Hee; Jang, Min-Gyu

    2011-01-01

    Virtual surface characteristics of tactile displays are investigated to characterize the feeling of human touch for a haptic interface application. In order to represent the tactile feeling, a prototype tactile display incorporating Magneto-Rheological (MR) fluid has been developed. Tactile display devices stimulate the finger's skin to produce the sensations of contact, such as compliance, friction, and topography of the surface. Thus, the tactile display can provide information on the surface of an organic tissue to the surgeon in virtual reality. In order to investigate the compliance feeling of a human finger's touch, normal force responses of a tactile display under various magnetic fields have been assessed. Also, shearing friction force responses of the tactile display are investigated to simulate the action of a finger dragging on the surface. Moreover, different matrix arrays of magnetic poles are applied to form the virtual surface topography. From the results, different tactile feelings are observed according to the applied magnetic field strength as well as the combinations of magnetic pole arrays. This research presents a smart tactile display technology for virtual surfaces.

  7. Application of advanced virtual reality and 3D computer assisted technologies in tele-3D-computer assisted surgery in rhinology.

    PubMed

    Klapan, Ivica; Vranjes, Zeljko; Prgomet, Drago; Lukinović, Juraj

    2008-03-01

    The real-time requirement means that the simulation should be able to follow the actions of the user, who may be moving in the virtual environment. The computer system should also store in its memory a three-dimensional (3D) model of the virtual environment. In that case, a real-time virtual reality system will update the 3D graphic visualization as the user moves, so that up-to-date visualization is always shown on the computer screen. Upon completion of the tele-operation, the surgeon compares the preoperative and postoperative images and models of the operative field, and studies video records of the procedure itself. Using intraoperative records, animated images of the real tele-procedure performed can be designed. Virtual surgery offers the possibility of preoperative planning in rhinology. The intraoperative use of a computer in real time requires the development of appropriate hardware and software to connect the medical instrumentarium to the computer, and to operate the computer through the connected instrumentarium and sophisticated multimedia interfaces.

  8. Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars

    PubMed Central

    Patil, Shashidhar; Chintalapalli, Harinadha Reddy; Kim, Dubeom; Chai, Youngho

    2015-01-01

    In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database using hand gesture sequences from a single inertial motion sensor. PMID:26094629
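
    The dynamic-time-warping step can be sketched with the classic dynamic program; a nearest-template rule then stands in for the paper's full recognizer with automatic segmentation. Function names and the classification rule are illustrative.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences,
    using the classic O(len(a) * len(b)) dynamic program."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

def classify(gesture, templates):
    """Nearest-template gesture classification over a dict of
    name -> reference sequence."""
    return min(templates, key=lambda name: dtw_distance(gesture, templates[name]))
```

    Because DTW aligns sequences elastically in time, a gesture performed slower or faster than its template still scores a small distance, which is what makes it suitable for variable-speed hand motions.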

  9. Towards a real-time interface between a biomimetic model of sensorimotor cortex and a robotic arm

    PubMed Central

    Dura-Bernal, Salvador; Chadderdon, George L; Neymotin, Samuel A; Francis, Joseph T; Lytton, William W

    2015-01-01

    Brain-machine interfaces can greatly improve the performance of prosthetics. Utilizing biomimetic neuronal modeling in brain machine interfaces (BMI) offers the possibility of providing naturalistic motor-control algorithms for control of a robotic limb. This will allow finer control of a robot, while also giving us new tools to better understand the brain’s use of electrical signals. However, the biomimetic approach presents challenges in integrating technologies across multiple hardware and software platforms, so that the different components can communicate in real-time. We present the first steps in an ongoing effort to integrate a biomimetic spiking neuronal model of motor learning with a robotic arm. The biomimetic model (BMM) was used to drive a simple kinematic two-joint virtual arm in a motor task requiring trial-and-error convergence on a single target. We utilized the output of this model in real time to drive mirroring motion of a Barrett Technology WAM robotic arm through a user datagram protocol (UDP) interface. The robotic arm sent back information on its joint positions, which was then used by a visualization tool on the remote computer to display a realistic 3D virtual model of the moving robotic arm in real time. This work paves the way towards a full closed-loop biomimetic brain-effector system that can be incorporated in a neural decoder for prosthetic control, to be used as a platform for developing biomimetic learning algorithms for controlling real-time devices. PMID:26709323
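
    The UDP link between model and arm can be sketched as packed binary joint-angle datagrams, assuming a 7-DOF WAM; the message layout and function names here are hypothetical, not the project's actual wire format.

```python
import socket
import struct

JOINT_FMT = "<7d"   # 7 joint angles as little-endian doubles (7-DOF arm assumed)

def send_joints(sock, addr, joints):
    """Pack joint angles into a fixed-size binary datagram and send it."""
    sock.sendto(struct.pack(JOINT_FMT, *joints), addr)

def recv_joints(sock):
    """Receive one datagram and unpack it back into a tuple of angles."""
    data, _ = sock.recvfrom(struct.calcsize(JOINT_FMT))
    return struct.unpack(JOINT_FMT, data)
```

    UDP's low per-packet overhead suits this real-time mirroring use, at the cost of no delivery guarantee; a stale joint packet is simply superseded by the next one.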

  10. Integration of PGD-virtual charts into an engineering design process

    NASA Astrophysics Data System (ADS)

    Courard, Amaury; Néron, David; Ladevèze, Pierre; Ballere, Ludovic

    2016-04-01

    This article deals with the efficient construction of approximations of fields and quantities of interest used in the geometric optimisation of complex shapes that can be encountered in engineering structures. The strategy developed herein is based on the construction of virtual charts which, once computed offline, allow the structure to be optimised at negligible online CPU cost. These virtual charts can be used as a powerful numerical decision support tool during the design of industrial structures. They are built using the proper generalized decomposition (PGD), which offers a very convenient framework for solving parametrised problems. In this paper, particular attention has been paid to the integration of the procedure into a genuine engineering design process. In particular, a dedicated methodology is proposed to interface the PGD approach with commercial software.

  11. Chemical Bonding Technology: Direct Investigation of Interfacial Bonds

    NASA Technical Reports Server (NTRS)

    Koenig, J. L.; Boerio, F. J.; Plueddemann, E. P.; Miller, J.; Willis, P. B.; Cuddihy, E. F.

    1986-01-01

    This is the third Flat-Plate Solar Array (FSA) Project document reporting on chemical bonding technology for terrestrial photovoltaic (PV) modules. The impetus for this work originated in the late 1970s when PV modules employing silicone encapsulation materials were undergoing delamination during outdoor exposure. At that time, manufacturers were not employing adhesion promoters and, hence, module interfaces in common with the silicone materials were only in physical contact and therefore easily prone to separation if, for example, water were to penetrate to the interfaces. Delamination with silicone materials virtually vanished when adhesion promoters, recommended by silicone manufacturers, were used. The activities related to the direct investigation of chemically bonded interfaces are described.

  12. Tactile Data Entry System

    NASA Technical Reports Server (NTRS)

    Adams, Richard J.

    2015-01-01

    The patent-pending Glove-Enabled Computer Operations (GECO) design leverages extravehicular activity (EVA) glove design features as platforms for instrumentation and tactile feedback, enabling the gloves to function as human-computer interface devices. Flexible sensors in each finger enable control inputs that can be mapped to any number of functions (e.g., a mouse click, a keyboard strike, or a button press). Tracking of hand motion is interpreted alternatively as movement of a mouse (change in cursor position on a graphical user interface) or a change in hand position on a virtual keyboard. Programmable vibro-tactile actuators aligned with each finger enrich the interface by creating the haptic sensations associated with control inputs, such as recoil of a button press.

  13. User Interface Design for Military AR Applications

    DTIC Science & Technology

    2010-12-12

    virtual objects with the real world: seeing ultrasound imagery within the patient. In: Computer graphics (SIGGRAPH '92 proceedings), vol 26, pp 203–210 ... airborne reconnaissance and weapon delivery. In: Proceedings of symposium for image display and recording, US Air Force Avionics Laboratory, Wright

  14. Virtual Ships: NATO Standards Development and Implementation

    DTIC Science & Technology

    2009-10-01

    interfaces. Such simulations were unable to be re-used for other applications because they were too application-specific and too highly customised ... provides water flow field data (including water flow induced forces and moments and added masses) to other federates that request it ... Ship motion

  15. Human-Avatar Symbiosis for the Treatment of Auditory Verbal Hallucinations in Schizophrenia through Virtual/Augmented Reality and Brain-Computer Interfaces

    PubMed Central

    Fernández-Caballero, Antonio; Navarro, Elena; Fernández-Sotos, Patricia; González, Pascual; Ricarte, Jorge J.; Latorre, José M.; Rodriguez-Jimenez, Roberto

    2017-01-01

    This perspective paper looks to the future of alternative treatments that take a social and cognitive approach alongside pharmacological therapy for auditory verbal hallucinations (AVH) in patients with schizophrenia. AVH, the perception of voices in the absence of auditory stimulation, represent a severe mental health symptom. Virtual/augmented reality (VR/AR) and brain-computer interfaces (BCI) are technologies seeing growing use in medical and psychological applications. Our position is that their combined use in computer-based therapies offers still unforeseen possibilities for the treatment of physical and mental disabilities. This is why the paper anticipates that researchers and clinicians will pursue a pathway toward human-avatar symbiosis for AVH by taking full advantage of new technologies. This outlook requires addressing challenging issues in the understanding of non-pharmacological treatment of schizophrenia-related disorders and the exploitation of VR/AR and BCI to achieve a real human-avatar symbiosis. PMID:29209193

  16. Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind.

    PubMed

    Kim, Donghun; Kim, Kwangtaek; Lee, Sangyoun

    2014-06-13

    In this paper, we propose a new haptic-assisted virtual cane system operated by a simple finger-pointing gesture. The system is developed in two stages: development of a visual information delivery assistant (VIDA) with a stereo camera, and the addition of a tactile feedback interface with dual actuators for guidance and distance feedback. In the first stage, the user's pointing finger is automatically detected using color and disparity data from stereo images, and a 3D pointing direction of the finger is then estimated from its geometric and textural features. Finally, any object within the estimated pointing trajectory in 3D space is detected and its distance estimated in real time. In the second stage, identifiable tactile signals are designed through a series of identification experiments, and an identifiable tactile feedback interface is developed and integrated into the VIDA system. Our approach differs in that navigation guidance is provided by a simple finger-pointing gesture and the tactile distance feedback signals are fully identifiable to blind users.
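
    The real-time distance estimate from a stereo camera rests on the standard pinhole relation Z = f·B/d (focal length times baseline over disparity). A minimal sketch; the parameter values in the test are illustrative, not the VIDA system's calibration.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth of a point from stereo disparity: Z = f * B / d,
    with focal length in pixels, baseline in meters, disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

    Note the inverse relation: depth resolution degrades quadratically with distance, which is why such systems work best at the near ranges relevant to a cane.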

  17. "Virtual shear box" experiments of stress and slip cycling within a subduction interface mélange

    NASA Astrophysics Data System (ADS)

    Webber, Sam; Ellis, Susan; Fagereng, Åke

    2018-04-01

    What role does the progressive geometric evolution of subduction-related mélange shear zones play in the development of strain transients? We use a "virtual shear box" experiment, based on outcrop-scale observations from an ancient exhumed subduction interface - the Chrystalls Beach Complex (CBC), New Zealand - to constrain numerical models of slip processes within a meters-thick shear zone. The CBC is dominated by large, competent clasts surrounded by interconnected weak matrix. Under constant slip velocity boundary conditions, models of the CBC produce stress cycling behavior, accompanied by mixed brittle-viscous deformation. This occurs as a consequence of the reorganization of competent clasts, and the progressive development and breakdown of stress bridges as clasts mutually obstruct one another. Under constant shear stress boundary conditions, the models show periods of relative inactivity punctuated by aseismic episodic slip at rapid rates (meters per year). Such a process may contribute to the development of strain transients such as slow slip.

  18. Brain-computer interface using P300 and virtual reality: a gaming approach for treating ADHD.

    PubMed

    Rohani, Darius Adam; Sorensen, Helge B D; Puthusserypady, Sadasivan

    2014-01-01

    This paper presents a novel brain-computer interface (BCI) system aimed at the rehabilitation of attention-deficit/hyperactivity disorder in children. It uses the P300 potential in a series of feedback games to improve the subjects' attention. We applied a support vector machine (SVM) using temporal and template-based features to detect these P300 responses. In an experimental setup with five subjects, an average error below 30% was achieved. To make the task more challenging, the BCI system has been embedded inside an immersive 3D virtual reality (VR) classroom with simulated distractions, created by combining a low-cost infrared camera with an "off-axis perspective projection" algorithm. The system is designed for children, operating with only four electrodes and a non-intrusive VR setting. With these promising results, and considering the simplicity of the scheme, we hope to encourage future studies to adopt the techniques presented here.
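
    A template-based feature of the kind fed to such an SVM can be illustrated as the correlation of a single EEG epoch with the subject's average P300 waveform. This is a pure-Python sketch under that assumption; all names are illustrative, and the paper's actual classifier is a trained SVM, not a threshold on this feature.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def template_feature(epoch, p300_template):
    """Template-based feature: correlation of a single EEG epoch with the
    subject's average P300 waveform. High correlation suggests a target
    (attended) stimulus; the value would be one input to the SVM."""
    return pearson(epoch, p300_template)
```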

  19. A Full Body Steerable Wind Display for a Locomotion Interface.

    PubMed

    Kulkarni, Sandip D; Fisher, Charles J; Lefler, Price; Desai, Aditya; Chakravarthy, Shanthanu; Pardyjak, Eric R; Minor, Mark A; Hollerbach, John M

    2015-10-01

    This paper presents the Treadport Active Wind Tunnel (TPAWT), a full-body immersive virtual environment for the Treadport locomotion interface, designed to generate wind on a user from any frontal direction at speeds up to 20 kph. The goal is to simulate the experience of realistic wind while walking in an outdoor virtual environment. A recirculating-type wind tunnel was created around the pre-existing Treadport installation by adding a large fan, ducting, and enclosure walls. Two sheets of air in a non-intrusive design flow along the side screens of the back-projection CAVE-like visual display, where they impinge and mix at the front screen and redirect towards the user in a full-body cross-section. By varying the flow conditions of the air sheets, the direction and speed of wind at the user are controlled. Design challenges in fitting the wind tunnel into the pre-existing facility, and in managing turbulence to achieve stable and steerable flow, were overcome. The controller performance for wind speed and direction is demonstrated experimentally.

Top