Sample records for virtual environment system

  1. Using smartphone technology to deliver a virtual pedestrian environment: usability and validation.

    PubMed

    Schwebel, David C; Severson, Joan; He, Yefei

    2017-09-01

    Various programs effectively teach children to cross streets more safely, but all are labor- and cost-intensive. Recent developments in mobile phone technology offer opportunity to deliver virtual reality pedestrian environments to mobile smartphone platforms. Such an environment may offer a cost- and labor-effective strategy to teach children to cross streets safely. This study evaluated usability, feasibility, and validity of a smartphone-based virtual pedestrian environment. A total of 68 adults completed 12 virtual crossings within each of two virtual pedestrian environments, one delivered by smartphone and the other a semi-immersive kiosk virtual environment. Participants completed self-report measures of perceived realism and simulator sickness experienced in each virtual environment, plus self-reported demographic and personality characteristics. All participants followed system instructions and used the smartphone-based virtual environment without difficulty. No significant simulator sickness was reported or observed. Users rated the smartphone virtual environment as highly realistic. Convergent validity was detected, with many aspects of pedestrian behavior in the smartphone-based virtual environment matching behavior in the kiosk virtual environment. Anticipated correlations between personality and kiosk virtual reality pedestrian behavior emerged for the smartphone-based system. A smartphone-based virtual environment can be usable and valid. Future research should develop and evaluate such a training system.

  2. An Audio Architecture Integrating Sound and Live Voice for Virtual Environments

    DTIC Science & Technology

    2002-09-01

    implementation of a virtual environment. As real world training locations become scarce and training budgets are trimmed, training system developers ...look more and more towards virtual environments as the answer. Virtual environments provide training system developers with several key benefits

  3. Can virtual reality be used to conduct mass prophylaxis clinic training? A pilot program.

    PubMed

    Yellowlees, Peter; Cook, James N; Marks, Shayna L; Wolfe, Daniel; Mangin, Elanor

    2008-03-01

    To create and evaluate a pilot bioterrorism defense training environment using virtual reality technology. The present pilot project used Second Life, an internet-based virtual world system, to construct a virtual reality environment to mimic an actual setting that might be used as a Strategic National Stockpile (SNS) distribution site for northern California in the event of a bioterrorist attack. Scripted characters were integrated into the system as mock patients to analyze various clinic workflow scenarios. Users tested the virtual environment over two sessions. Thirteen users who toured the environment were asked to complete an evaluation survey. Respondents reported that the virtual reality system was relevant to their practice and had potential as a method of bioterrorism defense training. Computer simulations of bioterrorism defense training scenarios are feasible with existing personal computer technology. The use of internet-connected virtual environments holds promise for bioterrorism defense training. Recommendations are made for public health agencies regarding the implementation and benefits of using virtual reality for mass prophylaxis clinic training.

  4. Methods and systems relating to an augmented virtuality environment

    DOEpatents

    Nielsen, Curtis W; Anderson, Matthew O; McKay, Mark D; Wadsworth, Derek C; Boyce, Jodie R; Hruska, Ryan C; Koudelka, John A; Whetten, Jonathan; Bruemmer, David J

    2014-05-20

    Systems and methods relating to an augmented virtuality system are disclosed. A method of operating an augmented virtuality system may comprise displaying imagery of a real-world environment in an operating picture. The method may further include displaying a plurality of virtual icons in the operating picture representing at least some assets of a plurality of assets positioned in the real-world environment. Additionally, the method may include displaying at least one virtual item in the operating picture representing data sensed by one or more of the assets of the plurality of assets and remotely controlling at least one asset of the plurality of assets by interacting with a virtual icon associated with the at least one asset.

  5. Virtual interface environment workstations

    NASA Technical Reports Server (NTRS)

    Fisher, S. S.; Wenzel, E. M.; Coler, C.; Mcgreevy, M. W.

    1988-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed at NASA's Ames Research Center for use as a multipurpose interface environment. This Virtual Interface Environment Workstation (VIEW) system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, research scenarios, and research directions are described.

  6. LVC interaction within a mixed-reality training system

    NASA Astrophysics Data System (ADS)

    Pollock, Brice; Winer, Eliot; Gilbert, Stephen; de la Cruz, Julio

    2012-03-01

    The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. Combining the advantages of realistic training environments and virtual worlds, mixed reality LVC training systems can enable live and virtual trainee interaction as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to empower LVC interaction in a reconfigurable, mixed reality environment. This system was developed and tested in an immersive, reconfigurable, and mixed reality LVC training system for the dismounted warfighter at ISU, known as the Veldt, to overcome LVC interaction challenges and as a test bed for cutting-edge technology to meet future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and developed game engines. Evaluation involving military-trained personnel found this system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real-time. This system achieves rare LVC interaction within multiple physical and virtual immersive environments for training in real-time across many distributed systems.

  7. Virtual hand: a 3D tactile interface to virtual environments

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.; Borrel, Paul

    2008-02-01

    We introduce a novel system that allows users to experience the sensation of touch in a computer graphics environment. In this system, the user places his/her hand on an array of pins, which is moved about space on a 6 degree-of-freedom robot arm. The surface of the pins defines a surface in the virtual world. This "virtual hand" can move about the virtual world. When the virtual hand encounters an object in the virtual world, the heights of the pins are adjusted so that they represent the object's shape, surface, and texture. A control system integrates pin and robot arm motions to transmit information about objects in the computer graphics world to the user. It also allows the user to edit, change and move the virtual objects, shapes and textures. This system provides a general framework for touching, manipulating, and modifying objects in a 3-D computer graphics environment, which may be useful in a wide range of applications, including computer games, computer aided design systems, and immersive virtual worlds.
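
    A generic sketch of the pin-height update described above (an illustration only, not the authors' controller; the heightfield representation, grid cell size, and pin travel limit are assumptions):

    ```python
    # Each pin is driven to the virtual object's height sampled under the hand,
    # clipped to the pin's mechanical travel; pins off the object are retracted.
    import numpy as np

    def pin_heights(heightfield, hand_xy, pin_offsets, cell=0.01, max_travel=0.02):
        """heightfield: 2D array of object heights (m) sampled on a grid of size `cell`;
        hand_xy: (x, y) hand position in the virtual world (m);
        pin_offsets: (n, 2) pin positions relative to the hand (m)."""
        rows, cols = heightfield.shape
        targets = []
        for dx, dy in pin_offsets:
            i = int(round((hand_xy[1] + dy) / cell))   # row index from y
            j = int(round((hand_xy[0] + dx) / cell))   # column index from x
            if 0 <= i < rows and 0 <= j < cols:
                targets.append(min(float(heightfield[i, j]), max_travel))
            else:
                targets.append(0.0)                    # outside the object: retract
        return np.array(targets)
    ```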

  8. A virtual therapeutic environment with user projective agents.

    PubMed

    Ookita, S Y; Tokuda, H

    2001-02-01

    Today, we see the Internet not merely as an information infrastructure but also as a socializing place and a safe outlet for inner feelings. Many personalities develop apart from real-world life because of the Internet's anonymity. Virtual world interactions are bringing about new psychological illnesses ranging from net addiction to technostress, as well as online personality disorders and conflicts among the multiple identities that exist in the virtual world. Presently, there are no standard therapy models for the virtual environment. There are very few therapeutic environments, or tools made especially for virtual therapeutic environments. The goal of our research is to provide a therapy model and middleware tools for psychologists to use in virtual therapeutic environments. We propose the Cyber Therapy Model and Projective Agents, a tool used in the therapeutic environment. To evaluate the effectiveness of the tool, we created a prototype system, called the Virtual Group Counseling System, which is a therapeutic environment that allows the user to participate in group counseling through the eyes of their Projective Agent. Projective Agents inherit the user's personality traits. During the virtual group counseling, the user's Projective Agent interacts and collaborates with others to support the user's recovery and psychological growth. The prototype system provides a simulation environment in which psychologists can adjust the parameters and customize their own simulation environment. The model and tool are a first attempt toward simulating personalities that may exist only online and toward providing data for observation.

  9. Proceedings of the 1993 Conference on Intelligent Computer-Aided Training and Virtual Environment Technology, Volume 1

    NASA Technical Reports Server (NTRS)

    Hyde, Patricia R.; Loftin, R. Bowen

    1993-01-01

    These proceedings are organized in the same manner as the conference's contributed sessions, with the papers grouped by topic area. These areas are as follows: VE (virtual environment) training for Space Flight, Virtual Environment Hardware, Knowledge Acquisition for ICAT (Intelligent Computer-Aided Training) & VE, Multimedia in ICAT Systems, VE in Training & Education (1 & 2), Virtual Environment Software (1 & 2), Models in ICAT systems, ICAT Commercial Applications, ICAT Architectures & Authoring Systems, ICAT Education & Medical Applications, Assessing VE for Training, VE & Human Systems (1 & 2), ICAT Theory & Natural Language, ICAT Applications in the Military, VE Applications in Engineering, Knowledge Acquisition for ICAT, and ICAT Applications in Aerospace.

  10. A collaborative molecular modeling environment using a virtual tunneling service.

    PubMed

    Lee, Jun; Kim, Jee-In; Kang, Lin-Woo

    2012-01-01

    Collaborative research on three-dimensional molecular modeling can be limited by different time zones and locations. A networked virtual environment can be utilized to overcome the problems caused by temporal and spatial differences. However, traditional approaches did not sufficiently consider integration of different computing environments, which were characterized by types of applications, roles of users, and so on. We propose a collaborative molecular modeling environment to integrate different molecular modeling systems using a virtual tunneling service. We integrated Co-Coot, which is a collaborative crystallographic object-oriented toolkit, with VRMMS, which is a virtual reality molecular modeling system, through a collaborative tunneling system. The proposed system showed reliable quantitative and qualitative results through pilot experiments.

  11. A Stochastic Model of Plausibility in Live Virtual Constructive Environments

    DTIC Science & Technology

    2017-09-14

    objective in virtual environment research and design is the maintenance of adequate consistency levels in the face of limited system resources such as...provides some commentary with regard to system design considerations and future research directions. II. SYSTEM MODEL DVEs are often designed as a...exceed the system’s requirements. Research into predictive models of virtual environment consistency is needed to provide designers the tools to

  12. Advanced Collaborative Environments Supporting Systems Integration and Design

    DTIC Science & Technology

    2003-03-01

    concurrently view a virtual system or product model while maintaining natural, human communication. These virtual systems operate within a computer-generated... These environments allow multiple individuals to concurrently view a virtual system or product model while simultaneously maintaining natural, human communication. As a result, TARDEC researchers and system developers are using this advanced high-end visualization technology to develop future

  13. A Collaborative Molecular Modeling Environment Using a Virtual Tunneling Service

    PubMed Central

    Lee, Jun; Kim, Jee-In; Kang, Lin-Woo

    2012-01-01

    Collaborative research on three-dimensional molecular modeling can be limited by different time zones and locations. A networked virtual environment can be utilized to overcome the problems caused by temporal and spatial differences. However, traditional approaches did not sufficiently consider integration of different computing environments, which were characterized by types of applications, roles of users, and so on. We propose a collaborative molecular modeling environment to integrate different molecular modeling systems using a virtual tunneling service. We integrated Co-Coot, which is a collaborative crystallographic object-oriented toolkit, with VRMMS, which is a virtual reality molecular modeling system, through a collaborative tunneling system. The proposed system showed reliable quantitative and qualitative results through pilot experiments. PMID:22927721

  14. Seeing ahead: experience and language in spatial perspective.

    PubMed

    Alloway, Tracy Packiam; Corley, Martin; Ramscar, Michael

    2006-03-01

    Spatial perspective can be directed by various reference frames, as well as by the direction of motion. In the present study, we explored how ambiguity in spatial tasks can be resolved. Participants were presented with virtual reality environments in order to stimulate a spatial reference frame based on motion. They interacted with an ego-moving spatial system in Experiment 1 and an object-moving spatial system in Experiment 2. While interacting with the virtual environment, the participants were presented with either a question representing a motion system different from that of the virtual environment or a nonspatial question relating to physical features of the virtual environment. They then performed the target task: assigning the label "front" in an ambiguous spatial task. The findings indicate that the disambiguation of spatial terms can be influenced by embodied experiences, as represented by the virtual environment, as well as by linguistic context.

  15. Exploring Learner Acceptance of the Use of Virtual Reality in Medical Education: A Case Study of Desktop and Projection-Based Display Systems

    ERIC Educational Resources Information Center

    Huang, Hsiu-Mei; Liaw, Shu-Sheng; Lai, Chung-Min

    2016-01-01

    Advanced technologies have been widely applied in medical education, including human-patient simulators, immersive virtual reality Cave Automatic Virtual Environment systems, and video conferencing. Evaluating learner acceptance of such virtual reality (VR) learning environments is a critical issue for ensuring that such technologies are used to…

  16. Knowledge Acquisition and Job Training for Advanced Technical Skills Using Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Watanuki, Keiichi; Kojima, Kazuyuki

    The environment in which Japanese industry has achieved great respect is changing tremendously due to the globalization of world economies, while Asian countries are undergoing economic and technical development as well as benefiting from advances in information technology. For example, in the design of custom-made casting products, a designer who lacks knowledge of casting may not be able to produce a good design. In order to obtain a good design and manufacturing result, it is necessary to equip the designer and manufacturer with a support system related to casting design, a so-called knowledge transfer and creation system. This paper proposes a new virtual reality-based knowledge acquisition and job training system for casting design, which is composed of explicit and tacit knowledge transfer systems using synchronized multimedia and a knowledge internalization system using a portable virtual environment. In our proposed system, the educational content is displayed in the immersive virtual environment, whereby a trainee may experience work at a virtual operation site. Provided that the trainee has gained explicit and tacit knowledge of casting through the multimedia-based knowledge transfer system, the immersive virtual environment catalyzes the internalization of that knowledge and also enables the trainee to gain tacit knowledge before undergoing on-the-job training at a real operation site.

  17. System-Level Virtualization for High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vallee, Geoffroy R; Naughton, III, Thomas J; Engelmann, Christian

    2008-01-01

    System-level virtualization has been a research topic since the 70's but regained popularity during the past few years because of the availability of efficient solutions such as Xen and the implementation of hardware support in commodity processors (e.g., Intel-VT, AMD-V). However, the majority of system-level virtualization projects are guided by the server consolidation market. As a result, current virtualization solutions appear not to be suitable for high performance computing (HPC), which is typically based on large-scale systems. On the other hand, there is significant interest in exploiting virtual machines (VMs) within HPC for a number of other reasons. By virtualizing the machine, one is able to run a variety of operating systems and environments as needed by the applications. Virtualization allows users to isolate workloads, improving security and reliability. It is also possible to support non-native environments and/or legacy operating environments through virtualization. In addition, it is possible to balance workloads, use migration techniques to relocate applications from failing machines, and isolate faulty systems for repair. This document presents the challenges for the implementation of a system-level virtualization solution for HPC. It also presents a brief survey of the different approaches and techniques to address these challenges.

  18. Height effects in real and virtual environments.

    PubMed

    Simeonov, Peter I; Hsiao, Hongwei; Dotson, Brian W; Ammons, Douglas E

    2005-01-01

    The study compared human perceptions of height, danger, and anxiety, as well as skin conductance and heart rate responses and postural instability effects, in real and virtual height environments. The 24 participants (12 men, 12 women), whose average age was 23.6 years, performed "lean-over-the-railing" and standing tasks on real and comparable virtual balconies, using a surround-screen virtual reality (SSVR) system. The results indicate that the virtual display of elevation provided a realistic perceptual experience and induced some physiological responses and postural instability effects comparable to those found in a real environment. It appears that a simulation of an elevated work environment in an SSVR system, although with reduced visual fidelity, is a valid tool for safety research. Potential applications of this study include the design of virtual environments that will help in the safe evaluation of human performance at elevation, identification of risk factors leading to fall incidents, and assessment of new fall prevention strategies.

  19. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run a variety of operating systems as needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and then the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  20. A training system of orientation and mobility for blind people using acoustic virtual reality.

    PubMed

    Seki, Yoshikazu; Sato, Tetsuji

    2011-02-01

    A new auditory orientation training system was developed for blind people using acoustic virtual reality (VR) based on a head-related transfer function (HRTF) simulation. The present training system can reproduce a virtual training environment for orientation and mobility (O&M) instruction, and the trainee can walk through the virtual training environment safely by listening to sounds such as vehicles, stores, ambient noise, etc., three-dimensionally through headphones. The system can reproduce not only sound sources but also sound reflection and insulation, so that the trainee can learn both sound location and obstacle perception skills. The virtual training environment is described in extensible markup language (XML), and the O&M instructor can edit it easily according to the training curriculum. Evaluation experiments were conducted to test the efficiency of some features of the system. Thirty subjects who had not acquired O&M skills attended the experiments. The subjects were separated into three groups: a no-training group, a virtual-training group using the present system, and a real-training group in real environments. The results suggested that virtual-training can reduce "veering" more than real-training and also can reduce stress as much as real training. The subjective technical and anxiety scores also improved.
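
    A minimal sketch of the HRTF-based rendering such a system relies on (illustrative only; the impulse responses below are crude placeholders rather than measured HRIRs, and the function name is ours):

    ```python
    # Spatialize a mono source by convolving it with left/right head-related
    # impulse responses (HRIRs) for the source direction, yielding a binaural signal.
    import numpy as np

    def spatialize(mono, hrir_left, hrir_right):
        """Return an (n, 2) stereo array to be played over headphones."""
        left = np.convolve(mono, hrir_left)
        right = np.convolve(mono, hrir_right)
        n = max(len(left), len(right))
        out = np.zeros((n, 2))
        out[:len(left), 0] = left
        out[:len(right), 1] = right
        return out

    # Toy example: a noise "vehicle" source placed to the listener's right, modelled
    # by an interaural time and level difference between the two placeholder HRIRs.
    fs = 44_100
    source = np.random.randn(fs)              # 1 s of noise
    hrir_r = np.zeros(64); hrir_r[0] = 1.0    # near (right) ear: earlier, louder
    hrir_l = np.zeros(64); hrir_l[30] = 0.5   # far (left) ear: delayed, attenuated
    stereo = spatialize(source, hrir_l, hrir_r)
    ```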

  1. Ergonomic aspects of a virtual environment.

    PubMed

    Ahasan, M R; Väyrynen, S

    1999-01-01

    A virtual environment is an interactive graphic system, mediated through computer technology, that allows a certain level of reality, or a sense of presence, when accessing virtual information. To create this sense of reality in a virtual environment, ergonomics issues are explored in this paper, with the aim of informing the design of presentation formats and related information so that user-friendly applications can be attained and maintained.

  2. Digital Immersive Virtual Environments and Instructional Computing

    ERIC Educational Resources Information Center

    Blascovich, Jim; Beall, Andrew C.

    2010-01-01

    This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…

  3. Distributed virtual environment for emergency medical training

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.; Garcia, Brian W.; Godsell-Stytz, Gayl M.

    1997-07-01

    In many professions where individuals must work in a team in a high-stress environment to accomplish a time-critical task, individual and team performance can benefit from joint training using distributed virtual environments (DVEs). One professional field that lacks but needs a high-fidelity team training environment is the field of emergency medicine. Currently, emergency department (ED) medical personnel train by using words to create a mental picture of a situation for the physician and staff, who then cooperate to solve the problems portrayed by the word picture. The need in emergency medicine for realistic virtual team training is critical because ED staff typically encounter rarely occurring but life-threatening situations only once in their careers and because ED teams currently have no realistic environment in which to practice their team skills. The resulting lack of experience and teamwork makes diagnosis and treatment more difficult. Virtual environment-based training has the potential to redress these shortfalls. The objective of our research is to develop a state-of-the-art virtual environment for emergency medicine team training. The virtual emergency room (VER) allows ED physicians and medical staff to realistically prepare for emergency medical situations by performing triage, diagnosis, and treatment on virtual patients within an environment that provides them with the tools they require and the team environment they need to realistically perform these three tasks. There are several issues that must be addressed before this vision is realized. The key issues deal with distribution of computations; the doctor and staff interface to the virtual patient and ED equipment; the accurate simulation of individual patient organs' response to injury, medication, and treatment; and an accurate modeling of the symptoms and appearance of the patient while maintaining a real-time interaction capability. Our ongoing work addresses all of these issues. In this paper we report on our prototype VER system and its distributed system architecture for an emergency department distributed virtual environment for emergency medical staff training. The virtual environment enables emergency department physicians and staff to develop their diagnostic and treatment skills using the virtual tools they need to perform diagnostic and treatment tasks. Virtual human imagery and real-time virtual human response are used to create the virtual patient and present a scenario. Patient vital signs are available to the emergency department team as they manage the virtual case. The work reported here consists of the system architectures we developed for the distributed components of the virtual emergency room. The architectures we describe consist of the network level architecture as well as the software architecture for each actor within the virtual emergency room. We describe the role of distributed interactive simulation and other enabling technologies within the virtual emergency room project.

  4. Realistic terrain visualization based on 3D virtual world technology

    NASA Astrophysics Data System (ADS)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2009-09-01

    The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.

  5. Realistic terrain visualization based on 3D virtual world technology

    NASA Astrophysics Data System (ADS)

    Huang, Fengru; Lin, Hui; Chen, Bin; Xiao, Cai

    2010-11-01

    The rapid advances in information technologies, e.g., network, graphics processing, and virtual world, have provided challenges and opportunities for new capabilities in information systems, Internet applications, and virtual geographic environments, especially geographic visualization and collaboration. In order to achieve meaningful geographic capabilities, we need to explore and understand how these technologies can be used to construct virtual geographic environments to help to engage geographic research. The generation of three-dimensional (3D) terrain plays an important part in geographical visualization, computer simulation, and virtual geographic environment applications. The paper introduces concepts and technologies of virtual worlds and virtual geographic environments, explores integration of realistic terrain and other geographic objects and phenomena of natural geographic environment based on SL/OpenSim virtual world technologies. Realistic 3D terrain visualization is a foundation of construction of a mirror world or a sand box model of the earth landscape and geographic environment. The capabilities of interaction and collaboration on geographic information are discussed as well. Further virtual geographic applications can be developed based on the foundation work of realistic terrain visualization in virtual environments.

  6. A workout for virtual bodybuilders (design issues for embodiment in multi-actor virtual environments)

    NASA Technical Reports Server (NTRS)

    Benford, Steve; Bowers, John; Fahlen, Lennart E.; Greenhalgh, Chris; Snowdon, Dave

    1994-01-01

    This paper explores the issue of user embodiment within collaborative virtual environments. By user embodiment we mean the provision of users with appropriate body images so as to represent them to others and also to themselves. By collaborative virtual environments we mean multi-user virtual reality systems which support cooperative work (although we argue that the results of our exploration may also be applied to other kinds of collaborative systems). The main part of the paper identifies a list of embodiment design issues including: presence, location, identity, activity, availability, history of activity, viewpoint, action point, gesture, facial expression, voluntary versus involuntary expression, degree of presence, reflecting capabilities, manipulating the user's view of others, representation across multiple media, autonomous and distributed body parts, truthfulness and efficiency. Following this, we show how these issues are reflected in our own DIVE and MASSIVE prototype collaborative virtual environments.

  7. Virtual reality environments for post-stroke arm rehabilitation.

    PubMed

    Subramanian, Sandeep; Knaut, Luiz A; Beaudoin, Christian; McFadyen, Bradford J; Feldman, Anatol G; Levin, Mindy F

    2007-06-22

    Optimal practice and feedback elements are essential requirements for maximal motor recovery in patients with motor deficits due to central nervous system lesions. A virtual environment (VE) was created that incorporates practice and feedback elements necessary for maximal motor recovery. It permits varied and challenging practice in a motivating environment that provides salient feedback. The VE gives the user knowledge of results feedback about motor behavior and knowledge of performance feedback about the quality of pointing movements made in a virtual elevator. Movement distances are related to length of body segments. We describe an immersive and interactive experimental protocol developed in a virtual reality environment using the CAREN system. The VE can be used as a training environment for the upper limb in patients with motor impairments.

  8. Immersive Virtual Moon Scene System Based on Panoramic Camera Data of Chang'E-3

    NASA Astrophysics Data System (ADS)

    Gao, X.; Liu, J.; Mu, L.; Yan, W.; Zeng, X.; Zhang, X.; Li, C.

    2014-12-01

    The system "Immersive Virtual Moon Scene" is used to show the virtual environment of Moon surface in immersive environment. Utilizing stereo 360-degree imagery from panoramic camera of Yutu rover, the system enables the operator to visualize the terrain and the celestial background from the rover's point of view in 3D. To avoid image distortion, stereo 360-degree panorama stitched by 112 images is projected onto inside surface of sphere according to panorama orientation coordinates and camera parameters to build the virtual scene. Stars can be seen from the Moon at any time. So we render the sun, planets and stars according to time and rover's location based on Hipparcos catalogue as the background on the sphere. Immersing in the stereo virtual environment created by this imaged-based rendering technique, the operator can zoom, pan to interact with the virtual Moon scene and mark interesting objects. Hardware of the immersive virtual Moon system is made up of four high lumen projectors and a huge curve screen which is 31 meters long and 5.5 meters high. This system which take all panoramic camera data available and use it to create an immersive environment, enable operator to interact with the environment and mark interesting objects contributed heavily to establishment of science mission goals in Chang'E-3 mission. After Chang'E-3 mission, the lab with this system will be open to public. Besides this application, Moon terrain stereo animations based on Chang'E-1 and Chang'E-2 data will be showed to public on the huge screen in the lab. Based on the data of lunar exploration,we will made more immersive virtual moon scenes and animations to help the public understand more about the Moon in the future.

  9. Virtual interface environment

    NASA Technical Reports Server (NTRS)

    Fisher, Scott S.

    1986-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed for use as a multipurpose interface environment. The system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, application scenarios, and research directions are described.

  10. Digital evaluation of sitting posture comfort in human-vehicle system under Industry 4.0 framework

    NASA Astrophysics Data System (ADS)

    Tao, Qing; Kang, Jinsheng; Sun, Wenlei; Li, Zhaobo; Huo, Xiao

    2016-09-01

    Most previous studies on the vibration ride comfort of the human-vehicle system focused on only one or two aspects of the investigation. A hybrid approach is described that integrates investigation methods in both real and virtual environments. The real experimental environment includes the WBV (whole-body vibration) test, questionnaires on human subjective sensation, and motion capture. The virtual experimental environment includes the theoretical calculation of a simplified 5-DOF human body vibration model, vibration simulation and analysis within the ADAMS/Vibration module, and digital human biomechanics and occupational health analysis in Jack software. While the real experimental environment provides realistic and accurate test results, it also serves as the core and validation for the virtual experimental environment. The virtual experimental environment takes full advantage of currently available vibration simulation and digital human modelling software, and makes it possible to evaluate sitting posture comfort in a human-vehicle system with various human anthropometric parameters. How this digital evaluation system for car seat comfort design fits into the Industry 4.0 framework is also discussed.

  11. ViRPET--combination of virtual reality and PET brain imaging

    DOEpatents

    Majewski, Stanislaw; Brefczynski-Lewis, Julie

    2017-05-23

    Various methods, systems and apparatus are provided for brain imaging during virtual reality stimulation. In one example, among others, a system for virtual ambulatory environment brain imaging includes a mobile brain imager configured to obtain positron emission tomography (PET) scans of a subject in motion, and a virtual reality (VR) system configured to provide one or more stimuli to the subject during the PET scans. In another example, a method for virtual ambulatory environment brain imaging includes providing stimulation to a subject through a virtual reality (VR) system; and obtaining a positron emission tomography (PET) scan of the subject while moving in response to the stimulation from the VR system. The mobile brain imager can be positioned on the subject with an array of imaging photodetector modules distributed about the head of the subject.

  12. Inclusion of Immersive Virtual Learning Environments and Visual Control Systems to Support the Learning of Students with Asperger Syndrome

    ERIC Educational Resources Information Center

    Lorenzo, Gonzalo; Pomares, Jorge; Lledo, Asuncion

    2013-01-01

    This paper presents the use of immersive virtual reality systems in the educational intervention with Asperger students. The starting points of this study are features of these students' cognitive style that requires an explicit teaching style supported by visual aids and highly structured environments. The proposed immersive virtual reality…

  13. Verification of Emmert's law in actual and virtual environments.

    PubMed

    Nakamizo, Sachio; Imamura, Mariko

    2004-11-01

    We examined Emmert's law by measuring the perceived size of an afterimage and the perceived distance of the surface on which the afterimage was projected in actual and virtual environments. The actual environment consisted of a corridor with ample cues as to distance and depth. The virtual environment was made from the CAVE of a virtual reality system. The afterimage, disc-shaped and one degree in diameter, was produced by flashing with an electric photoflash. The observers were asked to estimate the perceived distance to surfaces located at various physical distances (1 to 24 m) by the magnitude estimation method and to estimate the perceived size of the afterimage projected on the surfaces by a matching method. The results show that the perceived size of the afterimage was directly proportional to the perceived distance in both environments; thus, Emmert's law holds in virtual as well as actual environments. We suggest that Emmert's law is a specific case of a functional principle of distance scaling by the visual system.
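
    The proportionality verified here can be written compactly (generic notation, not the authors'): for an afterimage subtending a fixed retinal angle \(\theta\), Emmert's law predicts a perceived size \(S\) proportional to the perceived distance \(D\) of the surface onto which it is projected,

    \[ S \;=\; D\,\tan\theta \;\approx\; D\,\theta \quad (\text{small } \theta), \]

    so a surface judged to be twice as far away yields an afterimage judged to be twice as large, which is the linear relation reported for both the corridor and the CAVE.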

  14. Virtual workstation - A multimodal, stereoscopic display environment

    NASA Astrophysics Data System (ADS)

    Fisher, S. S.; McGreevy, M.; Humphries, J.; Robinett, W.

    1987-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed for use in a multipurpose interface environment. The system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, application scenarios, and research directions are described.

  15. The virtual environment display system

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1991-01-01

    Virtual environment technology is a display and control technology that can surround a person in an interactive computer generated or computer mediated virtual environment. It has evolved at NASA-Ames since 1984 to serve NASA's missions and goals. The exciting potential of this technology, sometimes called Virtual Reality, Artificial Reality, or Cyberspace, has been recognized recently by the popular media, industry, academia, and government organizations. Much research and development will be necessary to bring it to fruition.

  16. Virtual Control Systems Environment (VCSE)

    ScienceCinema

    Atkins, Will

    2018-02-14

    Will Atkins, a Sandia National Laboratories computer engineer discusses cybersecurity research work for process control systems. Will explains his work on the Virtual Control Systems Environment project to develop a modeling and simulation framework of the U.S. electric grid in order to study and mitigate possible cyberattacks on infrastructure.

  17. Virtual odors to transmit emotions in virtual agents

    NASA Astrophysics Data System (ADS)

    Delgado-Mata, Carlos; Aylett, Ruth

    2003-04-01

    In this paper we describe an emotional-behavioral architecture. The emotional engine sits at a higher layer than the behavior system and can alter behavior patterns. The engine is designed to simulate emotionally intelligent agents in a virtual environment, where each agent senses its own emotions, and other creatures' emotions, through a virtual smell sensor; it also senses obstacles and other moving creatures in the environment and reacts to them. The architecture consists of an emotion engine, a behavior synthesis system, a motor layer, and a library of sensors.

  18. A Virtual Environment System for the Comparison of Dome and HMD Systems

    NASA Technical Reports Server (NTRS)

    Chen, Jian; Harm, Deboran L.; Loftin, R. Bowen; Lin, Ching-yao; Leiss, Ernst L.

    2002-01-01

    For effective astronaut training applications, choosing the right display devices to present images is crucial. In order to assess what devices are appropriate, it is important to design a successful virtual environment for a comparison study of the display devices. We present a comprehensive system for the comparison of Dome and head-mounted display (HMD) systems. In particular, we address interaction techniques and playback environments.

  19. Grasping trajectories in a virtual environment adhere to Weber's law.

    PubMed

    Ozana, Aviad; Berman, Sigal; Ganel, Tzvi

    2018-06-01

    Virtual-reality and telerobotic devices simulate local motor control of virtual objects within computerized environments. Here, we explored grasping kinematics within a virtual environment and tested whether, as in normal 3D grasping, trajectories in the virtual environment are performed analytically, violating Weber's law with respect to object's size. Participants were asked to grasp a series of 2D objects using a haptic system, which projected their movements to a virtual space presented on a computer screen. The apparatus also provided object-specific haptic information upon "touching" the edges of the virtual targets. The results showed that grasping movements performed within the virtual environment did not produce the typical analytical trajectory pattern obtained during 3D grasping. Unlike as in 3D grasping, grasping trajectories in the virtual environment adhered to Weber's law, which indicates relative resolution in size processing. In addition, the trajectory patterns differed from typical trajectories obtained during 3D grasping, with longer times to complete the movement, and with maximum grip apertures appearing relatively early in the movement. The results suggest that grasping movements within a virtual environment could differ from those performed in real space, and are subjected to irrelevant effects of perceptual information. Such atypical pattern of visuomotor control may be mediated by the lack of complete transparency between the interface and the virtual environment in terms of the provided visual and haptic feedback. Possible implications of the findings to movement control within robotic and virtual environments are further discussed.
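
    For reference, Weber's law in its generic form (our notation, not the authors') states that the just-noticeable difference \(\Delta I\) for a stimulus of magnitude \(I\) is a constant fraction of that magnitude,

    \[ \frac{\Delta I}{I} = k \quad (k \text{ is the Weber fraction}), \]

    so adherence to Weber's law in this task means that the variability of the maximum grip aperture grows in proportion to object size, whereas analytically controlled 3D grasping typically shows aperture variability that does not scale with size.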

  20. Proof-of-Concept Part Task Trainer for Close Air Support Procedures

    DTIC Science & Technology

    2016-06-01

    TVDL Tactical Video Down Link; VE Virtual Environment; VR Virtual Reality; WTI Weapons and Tactics Instructor ...in training of USMC pilots for close air support operations? • What is the feasibility of developing a prototype virtual reality (VR) system that... Chapter IV provides a review of virtual reality (VR)/virtual environment (VE) and part-task trainers currently used in military training

  1. Vision-based navigation in a dynamic environment for virtual human

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Sun, Ji-Zhou; Zhang, Jia-Wan; Li, Ming-Chu

    2004-06-01

    Intelligent virtual humans are widely required in computer games, ergonomics software, virtual environments, and so on. We present a vision-based behavior modeling method to realize smart navigation in a dynamic environment. This behavior model can be divided into three modules: vision, global planning, and local planning. Vision is the only channel through which the virtual actor gets information from the outside world. The global and local planning modules then use the A* and D* algorithms to find a path for the virtual human in a dynamic environment. Finally, experiments on our test platform (the Smart Human System) verify the feasibility of this behavior model.
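
    A minimal grid-based A* sketch of the kind such a global planner could use (illustrative only, not the authors' Smart Human System code; the incremental D* replanning used for dynamic obstacles is omitted):

    ```python
    # A* over a 4-connected occupancy grid with a Manhattan-distance heuristic.
    import heapq

    def a_star(grid, start, goal):
        """grid: 2D list, 0 = free, 1 = obstacle; start/goal: (row, col) tuples."""
        rows, cols = len(grid), len(grid[0])

        def h(cell):                        # admissible Manhattan heuristic
            return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

        open_heap = [(h(start), 0, start)]  # entries are (f, g, cell)
        came_from, g_score = {}, {start: 0}
        while open_heap:
            _, g, cell = heapq.heappop(open_heap)
            if cell == goal:                # reconstruct the path back to start
                path = [cell]
                while cell in came_from:
                    cell = came_from[cell]
                    path.append(cell)
                return path[::-1]
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nxt = (cell[0] + dr, cell[1] + dc)
                if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                    ng = g + 1
                    if ng < g_score.get(nxt, float("inf")):
                        g_score[nxt], came_from[nxt] = ng, cell
                        heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
        return None                         # goal unreachable

    if __name__ == "__main__":
        world = [[0, 0, 0, 0],
                 [1, 1, 0, 1],
                 [0, 0, 0, 0]]
        print(a_star(world, (0, 0), (2, 0)))
    ```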

  2. Nature and origins of virtual environments - A bibliographical essay

    NASA Technical Reports Server (NTRS)

    Ellis, S. R.

    1991-01-01

    Virtual environments presented via head-mounted, computer-driven displays provide a new media for communication. They may be analyzed by considering: (1) what may be meant by an environment; (2) what is meant by the process of virtualization; and (3) some aspects of human performance that constrain environmental design. Their origins are traced from previous work in vehicle simulation and multimedia research. Pointers are provided to key technical references, in the dispersed, archival literature, that are relevant to the development and evaluation of virtual-environment interface systems.

  3. Brave New World

    ERIC Educational Resources Information Center

    Panettieri, Joseph C.

    2007-01-01

    Across the globe, progressive universities are embracing any number of MUVEs (multi-user virtual environments), 3D environments, and "immersive" virtual reality tools. And within the next few months, several universities are expected to test so-called "telepresence" videoconferencing systems from Cisco Systems and other leading…

  4. VBOT: Motivating computational and complex systems fluencies with constructionist virtual/physical robotics

    NASA Astrophysics Data System (ADS)

    Berland, Matthew W.

    As scientists use the tools of computational and complex systems theory to broaden science perspectives (e.g., Bar-Yam, 1997; Holland, 1995; Wolfram, 2002), so can middle-school students broaden their perspectives using appropriate tools. The goals of this dissertation project are to build, study, evaluate, and compare activities designed to foster both computational and complex systems fluencies through collaborative constructionist virtual and physical robotics. In these activities, each student builds an agent (e.g., a robot-bird) that must interact with fellow students' agents to generate a complex aggregate (e.g., a flock of robot-birds) in a participatory simulation environment (Wilensky & Stroup, 1999a). In a participatory simulation, students collaborate by acting in a common space, teaching each other, and discussing content with one another. As a result, the students improve both their computational fluency and their complex systems fluency, where fluency is defined as the ability to both consume and produce relevant content (DiSessa, 2000). To date, several systems have been designed to foster computational and complex systems fluencies through computer programming and collaborative play (e.g., Hancock, 2003; Wilensky & Stroup, 1999b); this study suggests that, by supporting the relevant fluencies through collaborative play, they become mutually reinforcing. In this work, I will present both the design of the VBOT virtual/physical constructionist robotics learning environment and a comparative study of student interaction with the virtual and physical environments across four middle-school classrooms, focusing on the contrast in systems perspectives differently afforded by the two environments. In particular, I found that while performance gains were similar overall, the physical environment supported agent perspectives on aggregate behavior, and the virtual environment supported aggregate perspectives on agent behavior. The primary research questions are: (1) What are the relative affordances of virtual and physical constructionist robotics systems towards computational and complex systems fluencies? (2) What can middle school students learn using computational/complex systems learning environments in a collaborative setting? (3) In what ways are these environments and activities effective in teaching students computational and complex systems fluencies?
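
    To make the agent/aggregate distinction concrete, a generic boids-style update (our sketch, not the VBOT code) shows how simple per-agent rules can produce the aggregate flock discussed above:

    ```python
    # Each agent adjusts its velocity from local neighbours only (cohesion, alignment,
    # separation); the flock is an emergent, aggregate-level pattern.
    import numpy as np

    def flock_step(pos, vel, dt=0.1, r=1.0, w_coh=0.02, w_ali=0.05, w_sep=0.05):
        """pos, vel: (n, 2) arrays; each agent sees only neighbours within radius r."""
        new_vel = vel.copy()
        for i in range(len(pos)):
            d = np.linalg.norm(pos - pos[i], axis=1)
            nbr = (d < r) & (d > 0)
            if nbr.any():
                new_vel[i] += w_coh * (pos[nbr].mean(axis=0) - pos[i])   # cohesion
                new_vel[i] += w_ali * (vel[nbr].mean(axis=0) - vel[i])   # alignment
                new_vel[i] += w_sep * ((pos[i] - pos[nbr]) / d[nbr][:, None] ** 2).sum(axis=0)  # separation
        return pos + dt * new_vel, new_vel

    pos = np.random.rand(20, 2) * 5.0        # twenty agents in a 5 x 5 region
    vel = np.random.randn(20, 2) * 0.1
    for _ in range(200):
        pos, vel = flock_step(pos, vel)
    ```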

  5. A usability assessment on a virtual reality system for panic disorder treatment

    NASA Astrophysics Data System (ADS)

    Lee, Jaelin; Kawai, Takashi; Yoshida, Nahoko; Izawa, Shuhei; Nomura, Shinobu; Eames, Douglas; Kaiya, Hisanobu

    2008-02-01

    The authors have developed a virtual reality exposure system that reflects Japanese culture and environments. Specifically, the system focuses on the subway environment, which is the environment most patients receiving treatment for panic disorder at hospitals in Tokyo, Japan, tend to avoid. The system is PC-based and features realistic video images and highly interactive functionality. In particular, the system enables instant transformation of the virtual space and allows situations to be freely customized according to the condition and symptoms expressed by each patient. Positive results achieved in therapy assessments aimed at patients with panic disorder accompanied by agoraphobia indicate the possibility of indoor treatment. Full utilization of the functionality available requires that the interactive functions be easily operable. Accordingly, there appears to be a need for usability testing aimed at determining whether or not a therapist can operate the system naturally while focusing fully on treatment. In this paper, the configuration of the virtual reality exposure system focusing on the subway environment is outlined. Further, the results of usability tests aimed at assessing how naturally it can be operated while focusing fully on treatment are described.

  6. NASA Virtual Glovebox: An Immersive Virtual Desktop Environment for Training Astronauts in Life Science Experiments

    NASA Technical Reports Server (NTRS)

    Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard

    2003-01-01

    The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real- time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.

  7. Apparatus and method for modifying the operation of a robotic vehicle in a real environment, to emulate the operation of the robotic vehicle operating in a mixed reality environment

    DOEpatents

    Garretson, Justin R [Albuquerque, NM; Parker, Eric P [Albuquerque, NM; Gladwell, T Scott [Albuquerque, NM; Rigdon, J Brian [Edgewood, NM; Oppel, III, Fred J.

    2012-05-29

    Apparatus and methods for modifying the operation of a robotic vehicle in a real environment to emulate the operation of the robotic vehicle in a mixed reality environment include a vehicle sensing system having a communications module attached to the robotic vehicle for communicating operating parameters related to the robotic vehicle in a real environment to a simulation controller for simulating the operation of the robotic vehicle in a mixed (live, virtual and constructive) environment wherein the effects of virtual and constructive entities on the operation of the robotic vehicle (and vice versa) are simulated. These effects are communicated to the vehicle sensing system, which generates a modified control command for the robotic vehicle including the effects of virtual and constructive entities, causing the robot in the real environment to behave as if virtual and constructive entities existed in the real environment.

  8. Training wheelchair navigation in immersive virtual environments for patients with spinal cord injury - end-user input to design an effective system.

    PubMed

    Nunnerley, Joanne; Gupta, Swati; Snell, Deborah; King, Marcus

    2017-05-01

    A user-centred design was used to develop and test the feasibility of an immersive 3D virtual reality wheelchair training tool for people with spinal cord injury (SCI). A Wheelchair Training System was designed and modelled using the Oculus Rift headset and a Dynamic Control wheelchair joystick. The system was tested by clinicians and expert wheelchair users with SCI. Data from focus groups and individual interviews were analysed using a general inductive approach to thematic analysis. Four themes emerged: Realistic System, which described the advantages of a realistic virtual environment; a Wheelchair Training System, which described participants' thoughts on the wheelchair training applications; Overcoming Resistance to Technology, the obstacles to introducing technology within the clinical setting; and Working outside the Rehabilitation Bubble which described the protective hospital environment. The Oculus Rift Wheelchair Training System has the potential to provide a virtual rehabilitation setting which could allow wheelchair users to learn valuable community wheelchair use in a safe environment. Nausea appears to be a side effect of the system, which will need to be resolved before this can be a viable clinical tool. Implications for Rehabilitation Immersive virtual reality shows promising benefit for wheelchair training in a rehabilitation setting. Early engagement with consumers can improve product development.

  9. Virtual Satellite

    NASA Technical Reports Server (NTRS)

    Hammrs, Stephan R.

    2008-01-01

    Virtual Satellite (VirtualSat) is a computer program that creates an environment that facilitates the development, verification, and validation of flight software for a single spacecraft or for multiple spacecraft flying in formation. In this environment, enhanced functionality and autonomy of the navigation, guidance, and control systems of a spacecraft are provided by a virtual satellite, that is, a computational model that simulates the dynamic behavior of the spacecraft. Within this environment, it is possible to execute any associated software, the development of which could benefit from knowledge of, and possible interaction (typically, exchange of data) with, the virtual satellite. Examples of associated software include programs for simulating spacecraft power and thermal-management systems. This environment is independent of the flight hardware that will eventually host the flight software, making it possible to develop the software simultaneously with, or even before, the hardware is delivered. Optionally, by use of interfaces included in VirtualSat, hardware can be used in place of simulations. The flight software, coded in the C or C++ programming language, is compilable and loadable into VirtualSat without any special modifications. Thus, VirtualSat can serve as a relatively inexpensive software test-bed for development testing, integration, and post-launch maintenance of spacecraft flight software.

  10. Handbook of Research on Collaborative Teaching Practice in Virtual Learning Environments

    ERIC Educational Resources Information Center

    Panconesi, Gianni, Ed.; Guida, Maria, Ed.

    2017-01-01

    Modern technology has enhanced many aspects of life, including classroom education. By offering virtual learning experiences, educational systems can become more efficient and effective at teaching the student population. The "Handbook of Research on Collaborative Teaching Practice in Virtual Learning Environments" highlights program…

  11. Modeling the C. elegans nematode and its environment using a particle system.

    PubMed

    Rönkkö, Mauno; Wong, Garry

    2008-07-21

    A particle system, as understood in computer science, is a novel technique for modeling living organisms in their environment. Such particle systems have traditionally been used for modeling the complex dynamics of fluids and gases. In the present study, a particle system was devised to model the movement and feeding behavior of the nematode Caenorhabditis elegans in three different virtual environments: gel, liquid, and soil. The results demonstrate that distinct movements of the nematode can be attributed to its mechanical interactions with the virtual environment. These results also revealed emergent properties associated with modeling organisms within environment-based systems.
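
    A minimal particle-system update step conveys the general technique; the per-medium drag constants and the one-line dynamics below are illustrative assumptions, not the authors' model:

    ```python
    # Minimal particle-system step: each body segment is a particle whose motion
    # depends on the viscosity of the surrounding virtual medium.

    import numpy as np

    DRAG = {"gel": 2.0, "liquid": 0.5, "soil": 5.0}   # assumed per-medium drag constants

    def step_particles(pos, vel, forces, medium, dt=0.01, mass=1.0):
        """Semi-implicit Euler update with medium-dependent viscous drag."""
        drag = DRAG[medium]
        accel = (forces - drag * vel) / mass
        vel = vel + accel * dt
        pos = pos + vel * dt
        return pos, vel

    pos = np.zeros((10, 2))                 # 10 particles along the body, 2-D
    vel = np.zeros((10, 2))
    forces = np.tile([0.1, 0.0], (10, 1))   # undulation force along +x
    for _ in range(100):
        pos, vel = step_particles(pos, vel, forces, "liquid")
    print(pos[0])
    ```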

  12. Guidelines for developing distributed virtual environment applications

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    1998-08-01

    We have conducted a variety of projects that served to investigate the limits of virtual environments and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a project for observing, analyzing, and understanding the activity in a military distributed virtual environment, a project to develop a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for use in supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework and a project for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we have developed rules of thumb and insights into assembling DVE applications and the environment itself. In this paper, we open with a brief review of the applications that were the source for our insights and then present the lessons learned as a result of these projects. The lessons we have learned fall primarily into five areas. These areas are requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.

  13. Human Factors Virtual Analysis Techniques for NASA's Space Launch System Ground Support using MSFC's Virtual Environments Lab (VEL)

    NASA Technical Reports Server (NTRS)

    Searcy, Brittani

    2017-01-01

    Using virtual environments to assess complex, large-scale human tasks provides timely and cost-effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance with human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, the Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis helps identify any tools or other accommodations that may be needed to complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame-by-frame basis, while virtual reality gives the actor (the person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.

  14. Immersive telepresence system using high-resolution omnidirectional movies and a locomotion interface

    NASA Astrophysics Data System (ADS)

    Ikeda, Sei; Sato, Tomokazu; Kanbara, Masayuki; Yokoya, Naokazu

    2004-05-01

    Technology that enables users to experience a remote site virtually is called telepresence. A telepresence system using real-environment images is expected to be used in the fields of entertainment, medicine, education, and so on. This paper describes a novel telepresence system which enables users to walk through a photorealistic virtualized environment by actually walking. To realize such a system, a wide-angle high-resolution movie is projected on an immersive multi-screen display to present the virtualized environment to users, and a treadmill is controlled according to the user's detected locomotion. In this study, we use an omnidirectional multi-camera system to acquire images of a real outdoor scene. The proposed system provides users with a rich sense of walking in a remote site.

  15. Virtual reality system for treatment of the fear of public speaking using image-based rendering and moving pictures.

    PubMed

    Lee, Jae M; Ku, Jeong H; Jang, Dong P; Kim, Dong H; Choi, Young H; Kim, In Y; Kim, Sun I

    2002-06-01

    The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled us to use virtual reality (VR) for the treatment of the fear of public speaking. Two techniques have been used to construct virtual environments for the treatment of the fear of public speaking: model-based and movie-based. Virtual audiences and virtual environments made with the model-based technique are unrealistic and unnatural. The movie-based technique has the disadvantage that each virtual audience member cannot be controlled separately, because all virtual audience members are included in one moving picture file. To address this disadvantage, this paper presents a virtual environment made by using image-based rendering (IBR) and chroma keying simultaneously. IBR makes the virtual environment realistic because the images are stitched panoramically from photos taken with a digital camera. The use of chroma keying allows each virtual audience member to be controlled individually. In addition, a real-time capture technique was applied in constructing the virtual environment to give subjects more interaction, in that they can talk with a therapist or another subject.

  16. Development of visual 3D virtual environment for control software

    NASA Technical Reports Server (NTRS)

    Hirose, Michitaka; Myoi, Takeshi; Amari, Haruo; Inamura, Kohei; Stark, Lawrence

    1991-01-01

    Virtual environments for software visualization may enable complex programs to be created and maintained. A typical application might be the control of regional electric power systems. As these encompass broader computer networks than ever, construction of such systems becomes very difficult. Conventional text-oriented environments are useful in programming individual processors. However, they are obviously insufficient for programming a large and complicated system that includes large numbers of computers connected to each other; such programming is called 'programming in the large.' As a solution to this problem, the authors are developing a graphic programming environment wherein one can visualize complicated software in a virtual 3D world. One of the major features of the environment is the 3D representation of concurrent processes. The 3D representation is used to supply both network-wide interprocess programming capability (capability for 'programming in the large') and real-time programming capability. The authors' idea is to fuse both the block diagram (which is useful for checking relationships among large numbers of processes or processors) and the time chart (which is useful for checking precise timing for synchronization) into a single 3D space. The 3D representation gives us the capability for direct and intuitive planning or understanding of complicated relationships among many concurrent processes. To realize the 3D representation, a technology enabling easy handling of virtual 3D objects is a definite necessity. Using a stereo display system and a gesture input device (VPL DataGlove), a prototype of the virtual workstation has been implemented. The workstation can supply the 'sensation' of the virtual 3D space to a programmer. Software for the 3D programming environment is implemented on the workstation. According to preliminary assessments, a 50 percent reduction of programming effort is achieved by using the virtual 3D environment. The authors expect that the 3D environment has considerable potential in the field of software engineering.

  17. A Proposed Treatment for Visual Field Loss caused by Traumatic Brain Injury using Interactive Visuotactile Virtual Environment

    NASA Astrophysics Data System (ADS)

    Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella

    In this paper, we propose a novel approach of using interactive virtual environment technology in Vision Restoration Therapy for visual field loss caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds the promise of expanding the scope of existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures and can last up to six months before any modest improvements can be seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion tracking system and an eye tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand, eye, and body movements and coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.

  18. The Integrated Virtual Environment Rehabilitation Treadmill System

    PubMed Central

    Feasel, Jeff; Whitton, Mary C.; Kassler, Laura; Brooks, Frederick P.; Lewek, Michael D.

    2015-01-01

    Slow gait speed and interlimb asymmetry are prevalent in a variety of disorders. Current approaches to locomotor retraining emphasize the need for appropriate feedback during intensive, task-specific practice. This paper describes the design and feasibility testing of the integrated virtual environment rehabilitation treadmill (IVERT) system intended to provide real-time, intuitive feedback regarding gait speed and asymmetry during training. The IVERT system integrates an instrumented, split-belt treadmill with a front-projection, immersive virtual environment. The novel adaptive control system uses only ground reaction force data from the treadmill to continuously update the speeds of the two treadmill belts independently, as well as to control the speed and heading in the virtual environment in real time. Feedback regarding gait asymmetry is presented 1) visually as walking a curved trajectory through the virtual environment and 2) proprioceptively in the form of different belt speeds on the split-belt treadmill. A feasibility study involving five individuals with asymmetric gait found that these individuals could effectively control the speed of locomotion and perceive gait asymmetry during the training session. Although minimal changes in overground gait symmetry were observed immediately following a single training session, further studies should be done to determine the IVERT’s potential as a tool for rehabilitation of asymmetric gait by providing patients with congruent visual and proprioceptive feedback. PMID:21652279
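
    A plausible illustration of an adaptive split-belt update driven only by ground reaction forces follows; the step-time heuristic, gains, and heading mapping are assumptions, not the published IVERT control law:

    ```python
    # Sketch: infer per-belt stance times from vertical ground reaction forces,
    # derive a symmetry ratio, then nudge belt speeds and virtual heading.

    def stance_time(grf_samples, threshold=50.0, dt=0.001):
        """Seconds during which vertical GRF on one belt exceeds a contact threshold."""
        return sum(dt for f in grf_samples if f > threshold)

    def update_belt_speeds(left_grf, right_grf, v_left, v_right, gain=0.05):
        t_left, t_right = stance_time(left_grf), stance_time(right_grf)
        asymmetry = (t_left - t_right) / max(t_left + t_right, 1e-6)  # -1..1
        # Speed up the belt under the longer-stance limb, slow the other,
        # and steer the virtual heading in proportion to the asymmetry.
        v_left += gain * asymmetry
        v_right -= gain * asymmetry
        heading_offset_deg = 30.0 * asymmetry
        return v_left, v_right, heading_offset_deg

    left = [600.0] * 700 + [0.0] * 300     # 1 s of force samples at 1 kHz
    right = [600.0] * 550 + [0.0] * 450
    print(update_belt_speeds(left, right, 0.8, 0.8))
    ```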

  19. Virtual World Currency Value Fluctuation Prediction System Based on User Sentiment Analysis.

    PubMed

    Kim, Young Bin; Lee, Sang Hyeok; Kang, Shin Jin; Choi, Myung Jin; Lee, Jung; Kim, Chang Hun

    2015-01-01

    In this paper, we present a method for predicting the value of virtual currencies used in virtual gaming environments that support multiple users, such as massively multiplayer online role-playing games (MMORPGs). Predicting virtual currency values in a virtual gaming environment has rarely been explored; it is difficult to apply real-world methods for predicting fluctuating currency values or shares to the virtual gaming world on account of differences in domains between the two worlds. To address this issue, we herein predict virtual currency value fluctuations by collecting user opinion data from a virtual community and analyzing user sentiments or emotions from the opinion data. The proposed method is straightforward and applicable to predicting virtual currencies as well as to gaming environments, including MMORPGs. We test the proposed method using large-scale MMORPGs and demonstrate that virtual currencies can be effectively and efficiently predicted with it.
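
    The general pipeline, score posts for sentiment, aggregate per day, and regress the next-day currency change on the aggregate, can be sketched as below. The lexicon, toy data, and linear model are assumptions for illustration, not the authors' implementation:

    ```python
    # Generic sentiment-to-price-prediction sketch for a virtual game currency.

    import re
    import numpy as np

    POSITIVE = {"rally", "buy", "up", "profit"}
    NEGATIVE = {"crash", "sell", "down", "nerf"}

    def sentiment(post: str) -> int:
        words = re.findall(r"[a-z]+", post.lower())
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    def daily_sentiment(posts_by_day):
        return np.array([np.mean([sentiment(p) for p in posts]) for posts in posts_by_day])

    # Toy data: three days of forum posts and observed next-day price changes.
    posts_by_day = [["gold will rally, buy now", "big profit today"],
                    ["devs will nerf drops, sell"],
                    ["market is down, crash incoming"]]
    price_change = np.array([0.04, -0.02, -0.05])

    x = daily_sentiment(posts_by_day)
    slope, intercept = np.polyfit(x, price_change, 1)          # simple linear fit
    print(f"predicted change at sentiment 1.0: {slope * 1.0 + intercept:+.3f}")
    ```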

  20. Virtual World Currency Value Fluctuation Prediction System Based on User Sentiment Analysis

    PubMed Central

    Kim, Young Bin; Lee, Sang Hyeok; Kang, Shin Jin; Choi, Myung Jin; Lee, Jung; Kim, Chang Hun

    2015-01-01

    In this paper, we present a method for predicting the value of virtual currencies used in virtual gaming environments that support multiple users, such as massively multiplayer online role-playing games (MMORPGs). Predicting virtual currency values in a virtual gaming environment has rarely been explored; it is difficult to apply real-world methods for predicting fluctuating currency values or shares to the virtual gaming world on account of differences in domains between the two worlds. To address this issue, we herein predict virtual currency value fluctuations by collecting user opinion data from a virtual community and analyzing user sentiments or emotions from the opinion data. The proposed method is straightforward and applicable to predicting virtual currencies as well as to gaming environments, including MMORPGs. We test the proposed method using large-scale MMORPGs and demonstrate that virtual currencies can be effectively and efficiently predicted with it. PMID:26241496

  1. DELIVERing Library Resources to the Virtual Learning Environment

    ERIC Educational Resources Information Center

    Secker, Jane

    2005-01-01

    Purpose: Examines a project to integrate digital libraries and virtual learning environments (VLE) focusing on requirements for online reading list systems. Design/methodology/approach: Conducted a user needs analysis using interviews and focus groups and evaluated three reading or resource list management systems. Findings: Provides a technical…

  2. Integrating Video-Capture Virtual Reality Technology into a Physically Interactive Learning Environment for English Learning

    ERIC Educational Resources Information Center

    Yang, Jie Chi; Chen, Chih Hung; Jeng, Ming Chang

    2010-01-01

    The aim of this study is to design and develop a Physically Interactive Learning Environment, the PILE system, by integrating video-capture virtual reality technology into a classroom. The system is designed for elementary school level English classes where students can interact with the system through physical movements. The system is designed to…

  3. Live Virtual Constructive Distributed Test Environment Characterization Report

    NASA Technical Reports Server (NTRS)

    Murphy, Jim; Kim, Sam K.

    2013-01-01

    This report documents message latencies observed over various Live, Virtual, Constructive (LVC) simulation environment configurations designed to emulate possible system architectures for the Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project integrated tests. For each configuration, four scenarios with progressively increasing air traffic loads were used to determine system throughput and bandwidth impacts on message latency.
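
    The measurement idea, timestamp each message at the sender, subtract at the receiver, and summarize per traffic load, can be illustrated as follows; the emulated delays and statistics shown are assumptions, not the project's tooling, and real distributed tests would rely on clocks synchronized via NTP/PTP:

    ```python
    # Toy latency measurement over an emulated link under increasing load.

    import random
    import statistics
    import time

    def send_and_receive(n_messages: int, load_factor: float):
        """Emulate one scenario: higher load adds queuing delay to each message."""
        latencies = []
        for _ in range(n_messages):
            t_sent = time.perf_counter()
            # emulated network + queuing delay (seconds)
            time.sleep(0.001 + random.expovariate(1.0 / (0.002 * load_factor)))
            latencies.append(time.perf_counter() - t_sent)
        return latencies

    for load in (1, 2, 4, 8):                       # progressively increasing traffic
        lat = send_and_receive(50, load)
        print(f"load x{load}: mean={statistics.mean(lat)*1e3:.2f} ms "
              f"p95={statistics.quantiles(lat, n=20)[18]*1e3:.2f} ms")
    ```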

  4. Education about Hallucinations Using an Internet Virtual Reality System: A Qualitative Survey

    ERIC Educational Resources Information Center

    Yellowlees, Peter M.; Cook, James N.

    2006-01-01

    Objective: The authors evaluate an Internet virtual reality technology as an education tool about the hallucinations of psychosis. Method: This is a pilot project using Second Life, an Internet-based virtual reality system, in which a virtual reality environment was constructed to simulate the auditory and visual hallucinations of two patients…

  5. Declarative Knowledge Acquisition in Immersive Virtual Learning Environments

    ERIC Educational Resources Information Center

    Webster, Rustin

    2016-01-01

    The author investigated the interaction effect of immersive virtual reality (VR) in the classroom. The objective of the project was to develop and provide a low-cost, scalable, and portable VR system containing purposely designed and developed immersive virtual learning environments for the US Army. The purpose of the mixed design experiment was…

  6. Virtual reality and physical rehabilitation: a new toy or a new research and rehabilitation tool?

    PubMed Central

    Keshner, Emily A

    2004-01-01

    Virtual reality (VR) technology is rapidly becoming a popular application for physical rehabilitation and motor control research. But questions remain about whether this technology really extends our ability to influence the nervous system or whether moving within a virtual environment just motivates the individual to perform. I served as guest editor of this month's issue of the Journal of NeuroEngineering and Rehabilitation (JNER) for a group of papers on augmented and virtual reality in rehabilitation. These papers demonstrate a variety of approaches taken for applying VR technology to physical rehabilitation. The papers by Kenyon et al. and Sparto et al. address critical questions about how this technology can be applied to physical rehabilitation and research. The papers by Sveistrup and Viau et al. explore whether action within a virtual environment is equivalent to motor performance within the physical environment. Finally, papers by Riva et al. and Weiss et al. discuss the important characteristics of a virtual environment that will be most effective for obtaining changes in the motor system. PMID:15679943

  7. Virtualization for Cost-Effective Teaching of Assembly Language Programming

    ERIC Educational Resources Information Center

    Cadenas, José O.; Sherratt, R. Simon; Howlett, Des; Guy, Chris G.; Lundqvist, Karsten O.

    2015-01-01

    This paper describes a virtual system that emulates an ARM-based processor machine, created to replace a traditional hardware-based system for teaching assembly language. The virtual system proposed here integrates, in a single environment, all the development tools necessary to deliver introductory or advanced courses on modern assembly language…

  8. Evaluation of glucose controllers in virtual environment: methodology and sample application.

    PubMed

    Chassin, Ludovic J; Wilinska, Malgorzata E; Hovorka, Roman

    2004-11-01

    Adaptive systems to deliver medical treatment in humans are safety-critical systems and require particular care in both the testing and the evaluation phase, which are time-consuming, costly, and confounded by ethical issues. The objective of the present work is to develop a methodology to test glucose controllers of an artificial pancreas in a simulated (virtual) environment. A virtual environment comprising a model of the carbohydrate metabolism and models of the insulin pump and the glucose sensor is employed to simulate individual glucose excursions in subjects with type 1 diabetes. The performance of the control algorithm within the virtual environment is evaluated by considering treatment and operational scenarios. The developed methodology includes two dimensions: testing in relation to specific life style conditions, i.e. fasting, post-prandial, and life style (metabolic) disturbances; and testing in relation to various operating conditions, i.e. expected operating conditions, adverse operating conditions, and system failure. We define safety and efficacy criteria and describe the measures to be taken prior to clinical testing. The use of the methodology is exemplified by tuning and evaluating a model predictive glucose controller being developed for a wearable artificial pancreas focused on fasting conditions. Our methodology to test glucose controllers in a virtual environment is instrumental in anticipating the results of real clinical tests for different physiological conditions and for different operating conditions. The thorough testing in the virtual environment reduces costs and speeds up the development process.
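
    The in-silico testing idea can be conveyed with a deliberately simplified stand-in for the virtual environment; the one-compartment model, parameters, and proportional controller below are illustrative assumptions, not the published metabolic model or control algorithm:

    ```python
    # Toy in-silico evaluation of a glucose controller against a simulated patient.

    def simulate(controller, hours=8, dt_min=5):
        glucose = 9.0                      # mmol/L, starting above target
        insulin_sensitivity = 0.3          # mmol/L drop per unit of insulin
        endogenous_rise = 0.05             # mmol/L per step (meals/liver, lumped)
        trace = []
        for _ in range(int(hours * 60 / dt_min)):
            dose = controller(glucose)                         # U per step
            glucose += endogenous_rise - insulin_sensitivity * dose
            glucose = max(glucose, 2.0)
            trace.append(glucose)
        return trace

    def proportional_controller(glucose, target=6.0, gain=0.1):
        return max(0.0, gain * (glucose - target))             # no negative insulin

    trace = simulate(proportional_controller)
    time_in_range = sum(4.0 <= g <= 10.0 for g in trace) / len(trace)
    print(f"final glucose {trace[-1]:.2f} mmol/L, time in range {time_in_range:.0%}")
    ```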

  9. Human Machine Interfaces for Teleoperators and Virtual Environments Conference

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In a teleoperator system the human operator senses, moves within, and operates upon a remote or hazardous environment by means of a slave mechanism (a mechanism often referred to as a teleoperator). In a virtual environment system the interactive human-machine interface is retained, but the slave mechanism and its environment are replaced by a computer simulation. Video is replaced by computer graphics. The auditory and force sensations imparted to the human operator are similarly computer generated. In contrast to a teleoperator system, where the purpose is to extend the operator's sensorimotor system in a manner that facilitates exploration and manipulation of the physical environment, in a virtual environment system the purpose is to train, inform, alter, or study the human operator, or to modify the state of the computer and the information environment. A major application in which the human operator is the target is that of flight simulation. Although flight simulators have been around for more than a decade, they have had little impact outside aviation, presumably because the application was so specialized and so expensive.

  10. Virtual Jupiter - Real Learning

    NASA Astrophysics Data System (ADS)

    Ruzhitskaya, Lanika; Speck, A.; Laffey, J.

    2010-01-01

    How many earthlings have gone to visit Jupiter? None. How many students have visited virtual Jupiter to fulfill their introductory astronomy courses' requirements? Within the next six months, over 100 students from the University of Missouri will get a chance to explore the planet and its Galilean moons using a 3D virtual environment created especially for them to learn Kepler's and Newton's laws, eclipses, parallax, and other concepts in astronomy. The virtual world of the Jupiter system is a unique 3D environment that allows students to learn course material - physical laws and concepts in astronomy - while engaging them in exploration of the Jupiter system, encouraging their imagination, curiosity, and motivation. The virtual learning environment lets students work individually or collaborate with their teammates. The 3D world is also a great opportunity for research in astronomy education to investigate the impact of social interaction, gaming features, and use of manipulatives offered by a learning tool on students' motivation and learning outcomes. Use of the 3D environment is also a valuable source for exploring how learners' spatial awareness can be enhanced by working in a 3-dimensional environment.

  11. The Ames Virtual Environment Workstation: Implementation issues and requirements

    NASA Technical Reports Server (NTRS)

    Fisher, Scott S.; Jacoby, R.; Bryson, S.; Stone, P.; Mcdowall, I.; Bolas, M.; Dasaro, D.; Wenzel, Elizabeth M.; Coler, C.; Kerr, D.

    1991-01-01

    This presentation describes recent developments in the implementation of a virtual environment workstation in the Aerospace Human Factors Research Division of NASA's Ames Research Center. Introductory discussions are presented on the primary research objectives and applications of the system and on the system's current hardware and software configuration. Principal attention is then focused on unique issues and problems encountered in the workstation's development, with emphasis on its ability to meet the original design specifications for computational graphics performance and for the associated human factors requirements necessary to provide a compelling sense of presence and efficient interaction in the virtual environment.

  12. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archiveable”, transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but do not yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess feasibility, performance overhead, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  13. Design and implementation of dynamic hybrid Honeypot network

    NASA Astrophysics Data System (ADS)

    Qiao, Peili; Hu, Shan-Shan; Zhai, Ji-Qiang

    2013-05-01

    The method of constructing a dynamic and self-adaptive virtual network is suggested to puzzle adversaries, delay and divert attacks, exhaust attacker resources, and collect attack information. The concepts of the honeypot and of Honeyd, the framework for virtual honeypots, are introduced. Techniques of network scanning, including active fingerprint recognition, are analyzed. A dynamic virtual network system is designed and implemented. In this system, a virtual network similar to the real network topology is built according to messages collected from the real environment. By doing this, the system can confuse attackers during an attack and supports further analysis and research of the attacks. Tests of this system show that the design can successfully simulate a real network environment and can be used in network security analysis.

  14. The Effects of a Virtual Tutee System on Academic Reading Engagement in a College Classroom

    ERIC Educational Resources Information Center

    Park, Seung Won; Kim, ChanMin

    2016-01-01

    Poor student engagement with academic readings has been frequently reported in college classrooms. As an effort to improve college students' reading engagement, researchers have developed a virtual environment in which students take on the role of tutor and teach a virtual tutee, the virtual tutee system (VTS). This research examined the…

  15. Virtual environments for scene of crime reconstruction and analysis

    NASA Astrophysics Data System (ADS)

    Howard, Toby L. J.; Murta, Alan D.; Gibson, Simon

    2000-02-01

    This paper describes research conducted in collaboration with Greater Manchester Police (UK) to evaluate the utility of Virtual Environments for scene of crime analysis, forensic investigation, and law enforcement briefing and training. We present an illustrated case study of the construction of a high-fidelity virtual environment, intended to match a particular real-life crime scene as closely as possible. We describe and evaluate the combination of several approaches including: the use of the Manchester Scene Description Language for constructing complex geometrical models; the application of a radiosity rendering algorithm with several novel features based on human perceptual considerations; texture extraction from forensic photography; and experiments with interactive walkthroughs and large-screen stereoscopic display of the virtual environment implemented using the MAVERIK system. We also discuss the potential applications of Virtual Environment techniques in the Law Enforcement and Forensic communities.

  16. Proceedings of the 1993 Conference on Intelligent Computer-Aided Training and Virtual Environment Technology

    NASA Technical Reports Server (NTRS)

    Hyde, Patricia R.; Loftin, R. Bowen

    1993-01-01

    The volume 2 proceedings from the 1993 Conference on Intelligent Computer-Aided Training and Virtual Environment Technology are presented. Topics discussed include intelligent computer assisted training (ICAT) systems architectures, ICAT educational and medical applications, virtual environment (VE) training and assessment, human factors engineering and VE, ICAT theory and natural language processing, ICAT military applications, VE engineering applications, ICAT knowledge acquisition processes and applications, and ICAT aerospace applications.

  17. Distracting people from sources of discomfort in a simulated aircraft environment.

    PubMed

    Lewis, Laura; Patel, Harshada; Cobb, Sue; D'Cruz, Mirabelle; Bues, Matthias; Stefani, Oliver; Grobler, Tredeaux

    2016-07-19

    Comfort is an important factor in the acceptance of transport systems. In 2010 and 2011, the European Commission (EC) put forward its vision for air travel in the year 2050, which envisaged the use of in-flight virtual reality. This paper addressed the EC vision by investigating the effect of virtual environments on comfort. Research has shown that virtual environments can provide entertaining experiences and can be effective distracters from painful experiences. The objective was to determine the extent to which a virtual environment could distract people from sources of discomfort. Experiments involved inducing discomfort commonly experienced in-flight (e.g. limited space, noise) in order to determine the extent to which viewing a virtual environment could distract people from it. Virtual environments can fully or partially distract people from sources of discomfort, becoming more effective when they are interesting. They are also more effective at distracting people from discomfort caused by restricted space than by noise disturbances. Virtual environments have the potential to enhance passenger comfort by providing positive distractions from sources of discomfort. Further research is required to understand more fully the reasons why the effect was stronger for one source of discomfort than the other.

  18. Experience in Education Environment Virtualization within the Automated Information System "Platonus" (Kazakhstan)

    ERIC Educational Resources Information Center

    Abeldina, Zhaidary; Moldumarova, Zhibek; Abeldina, Rauza; Makysh, Gulmira; Moldumarova, Zhuldyz Ilibaevna

    2016-01-01

    This work reports on the use of virtual tools as means of learning process activation. A good result can be achieved by combining the classical learning with modern computer technology. By creating a virtual learning environment and using multimedia learning tools one can obtain a significant result while facilitating the development of students'…

  19. A specification of 3D manipulation in virtual environments

    NASA Technical Reports Server (NTRS)

    Su, S. Augustine; Furuta, Richard

    1994-01-01

    In this paper we discuss the modeling of three basic kinds of 3-D manipulations in the context of a logical hand device and our virtual panel architecture. The logical hand device is a useful software abstraction representing hands in virtual environments. The virtual panel architecture is the 3-D component of the 2-D window systems. Both of the abstractions are intended to form the foundation for adaptable 3-D manipulation.

  20. Identifying Different Registers of Digital Literacy in Virtual Learning Environments

    ERIC Educational Resources Information Center

    Knutsson, Ola; Blasjo, Mona.; Hallsten, Stina; Karlstrom, Petter

    2012-01-01

    In this paper social semiotics, and systemic functional linguistics in particular, are used in order to identify registers of digital literacy in the use of virtual learning environments. The framework of social semiotics provides means to systemize and discuss digital literacy as a linguistic and semiotic issue. The following research question…

  1. Kinematic/Dynamic Characteristics for Visual and Kinesthetic Virtual Environments

    NASA Technical Reports Server (NTRS)

    Bortolussi, Michael R. (Compiler); Adelstein, B. D.; Gold, Miriam

    1996-01-01

    Work was carried out on two topics of principal importance to current progress in virtual environment research at NASA Ames and elsewhere. The first topic was directed at maximizing the temporal dynamic response of visually presented Virtual Environments (VEs) through reorganization and optimization of system hardware and software. The final result of this portion of the work was a VE system in the Advanced Display and Spatial Perception Laboratory at NASA Ames capable of updating at 60 Hz (the maximum hardware refresh rate) with latencies approaching 30 msec. In the course of achieving this system performance, specialized hardware and software tools for measurement of VE latency, and analytic models correlating update rate and latency for different system configurations, were developed. The second area of activity was the preliminary development and analysis of a novel kinematic architecture for three Degree Of Freedom (DOF) haptic interfaces--devices that provide force feedback for manipulative interaction with virtual and remote environments. An invention disclosure was filed on this work and a patent application is being pursued by NASA Ames. Activities in these two areas are expanded upon below.

  2. An investigation of the efficacy of collaborative virtual reality systems for moderated remote usability testing.

    PubMed

    Chalil Madathil, Kapil; Greenstein, Joel S

    2017-11-01

    Collaborative virtual reality-based systems have integrated high-fidelity voice-based communication, immersive audio, and screen-sharing tools into virtual environments. Such three-dimensional collaborative virtual environments can mirror the collaboration among usability test participants and facilitators when they are physically collocated, potentially enabling moderated usability tests to be conducted effectively when the facilitator and participant are located in different places. We developed a virtual collaborative three-dimensional remote moderated usability testing laboratory and employed it in a controlled study to compare the effectiveness of moderated usability testing in a collaborative virtual reality-based environment with two other moderated usability testing methods: the traditional lab approach and Cisco WebEx, a web-based conferencing and screen-sharing approach. Using a mixed methods experimental design, 36 test participants and 12 test facilitators were asked to complete representative tasks on a simulated online shopping website. The dependent variables included the time taken to complete the tasks; the usability defects identified and their severity; and the subjective ratings on the workload index, presence, and satisfaction questionnaires. The remote moderated usability testing methodology using a collaborative virtual reality system performed similarly to the other two methodologies in terms of the total number of defects identified, the number of high-severity defects identified, and the time taken to complete the tasks. The overall workload experienced by the test participants and facilitators was lowest in the traditional lab condition. No significant differences were identified for the workload experienced with the virtual reality and the WebEx conditions. However, test participants experienced greater involvement and a more immersive experience in the virtual environment than in the WebEx condition. The ratings for the virtual environment condition were not significantly different from those for the traditional lab condition. The results of this study suggest that participants were productive and enjoyed the virtual lab condition, indicating the potential of a virtual world based approach as an alternative to conventional approaches for synchronous usability testing.

  3. HVS: an image-based approach for constructing virtual environments

    NASA Astrophysics Data System (ADS)

    Zhang, Maojun; Zhong, Li; Sun, Lifeng; Li, Yunhao

    1998-09-01

    Virtual Reality systems can construct virtual environments that provide an interactive walkthrough experience. Traditionally, walkthrough is performed by modeling and rendering 3D computer graphics in real time. Despite the rapid advance of computer graphics techniques, the rendering engine usually places a limit on scene complexity and rendering quality. This paper presents an approach that uses real-world or synthesized images to compose a virtual environment. The images can be recorded by camera, or synthesized by off-line multispectral image processing of Landsat TM (Thematic Mapper) and SPOT HRV imagery. They are digitally warped on the fly to simulate walking forward/backward, moving left/right, and 360-degree looking around. We have developed a system, HVS (Hyper Video System), based on these principles. HVS improves upon QuickTime VR and Surround Video in walking forward/backward.
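
    The on-the-fly warping idea can be illustrated with a generic perspective warp in OpenCV (the general technique only, not the HVS algorithm); the synthetic frame below stands in for a captured panorama view:

    ```python
    # Illustration: warping a captured view to approximate a small step forward.

    import cv2
    import numpy as np

    frame = np.full((480, 640, 3), 128, np.uint8)   # stand-in for a captured view
    h, w = frame.shape[:2]

    # Map the full image onto a slightly larger quad so the centre fills the frame,
    # mimicking the viewer moving forward toward the scene.
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32([[-0.05 * w, -0.05 * h], [1.05 * w, -0.05 * h],
                      [1.05 * w, 1.05 * h], [-0.05 * w, 1.05 * h]])
    M = cv2.getPerspectiveTransform(src, dst)
    step_forward = cv2.warpPerspective(frame, M, (w, h))
    cv2.imwrite("step_forward.png", step_forward)
    ```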

  4. A Virtual Environment for People Who Are Blind – A Usability Study

    PubMed Central

    Lahav, O.; Schloerb, D. W.; Kumar, S.; Srinivasan, M. A.

    2013-01-01

    For most people who are blind, exploring an unknown environment can be unpleasant, uncomfortable, and unsafe. Over the past years, the use of virtual reality as a learning and rehabilitation tool for people with disabilities has been on the rise. This research is based on the hypothesis that the supply of appropriate perceptual and conceptual information through compensatory sensorial channels may assist people who are blind with anticipatory exploration. In this research we developed and tested the BlindAid system, which allows the user to explore a virtual environment. The two main goals of the research were: (a) evaluation of different modalities (haptic and audio) and navigation tools, and (b) evaluation of spatial cognitive mapping employed by people who are blind. Our research included four participants who are totally blind. The preliminary findings confirm that the system enabled participants to develop comprehensive cognitive maps by exploring the virtual environment. PMID:24353744

  5. Use of the Remote Access Virtual Environment Network (RAVEN) for coordinated IVA-EVA astronaut training and evaluation.

    PubMed

    Cater, J P; Huffman, S D

    1995-01-01

    This paper presents a unique virtual reality training and assessment tool developed under a NASA grant, "Research in Human Factors Aspects of Enhanced Virtual Environments for Extravehicular Activity (EVA) Training and Simulation." The Remote Access Virtual Environment Network (RAVEN) was created to train and evaluate the verbal, mental and physical coordination required between the intravehicular (IVA) astronaut operating the Remote Manipulator System (RMS) arm and the EVA astronaut standing in foot restraints on the end of the RMS. The RAVEN system currently allows the EVA astronaut to approach the Hubble Space Telescope (HST) under control of the IVA astronaut and grasp, remove, and replace the Wide Field Planetary Camera drawer from its location in the HST. Two viewpoints, one stereoscopic and one monoscopic, were created and linked by Ethernet, providing the two trainees with the appropriate training environments.

  6. The development of the virtual reality system for the treatment of the fears of public speaking.

    PubMed

    Jo, H J; Ku, J H; Jang, D P; Shin, M B; Ahn, H B; Lee, J M; Cho, B H; Kim, S I

    2001-01-01

    The fear of public speaking is a kind of social phobia. Patients with a fear of public speaking show symptoms such as shame and timidity in daily personal relationships. They are afraid that the other person will be puzzled or feel insulted, and they also fear being underestimated because of their mistakes. For the treatment of the fear of public speaking, cognitive-behavioral therapy has generally been used. Cognitive-behavioral therapy is a method that has patients gradually experience situations that induce their fears and finally overcome them. Recently, virtual reality technology has been introduced as an alternative method for providing phobic situations. In this study, we developed a public speaking simulator and virtual environments for the treatment of the fear of public speaking. A head-mounted display, a head tracker, and a 3-dimensional sound system were used for the immersive virtual environment. The imagery of the virtual environment consists of a seminar room and 8 virtual audience members. The patient speaks in front of these virtual audience members, and the therapist can control the motions, facial expressions, sounds, and voice of each virtual audience member.

  7. A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery

    PubMed Central

    Zhu, Ming; Liu, Fei; Chai, Gang; Pan, Jun J.; Jiang, Taoran; Lin, Li; Xin, Yu; Zhang, Yan; Li, Qingfeng

    2017-01-01

    Augmented reality systems can combine virtual images with a real environment to ensure accurate surgery with lower risk. This study aimed to develop a novel registration and tracking technique to establish a navigation system based on augmented reality for maxillofacial surgery. Specifically, a virtual image is reconstructed from CT data using 3D software. The real environment is tracked by the augmented reality (AR) software. The novel registration strategy that we created uses an occlusal splint compounded with a fiducial marker (OSM) to establish a relationship between the virtual image and the real object. After the fiducial marker is recognized, the virtual image is superimposed onto the real environment, forming the “integrated image” on semi-transparent glass. Via the registration process, the integral image, which combines the virtual image with the real scene, is successfully presented on the semi-transparent helmet. The position error of this navigation system is 0.96 ± 0.51 mm. This augmented reality system was applied in the clinic and good surgical outcomes were obtained. The augmented reality system that we established for maxillofacial surgery has the advantages of easy manipulation and high accuracy, which can improve surgical outcomes. Thus, this system exhibits significant potential in clinical applications. PMID:28198442
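
    A generic fiducial-based registration step of this kind can be sketched with OpenCV pose estimation; this illustrates the principle only, not the authors' navigation software, and the marker geometry, camera intrinsics, and detected corner positions below are made-up values:

    ```python
    # Generic fiducial registration: estimate the marker pose, then project a
    # virtual (e.g., CT-derived) landmark into the camera image for overlay.

    import cv2
    import numpy as np

    # 3-D corners of a 40 mm square fiducial in its own coordinate frame (mm).
    object_pts = np.array([[0, 0, 0], [40, 0, 0], [40, 40, 0], [0, 40, 0]], np.float32)
    # Corresponding corner detections in the camera image (pixels) -- assumed values.
    image_pts = np.array([[320, 240], [400, 242], [398, 318], [318, 316]], np.float32)

    camera_matrix = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], np.float32)
    dist_coeffs = np.zeros(5)

    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, camera_matrix, dist_coeffs)

    # Re-project a virtual landmark 10 mm above the marker centre onto the live view.
    landmark = np.array([[20.0, 20.0, -10.0]], np.float32)
    projected, _ = cv2.projectPoints(landmark, rvec, tvec, camera_matrix, dist_coeffs)
    print("overlay pixel:", projected.ravel())
    ```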

  8. A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery.

    PubMed

    Zhu, Ming; Liu, Fei; Chai, Gang; Pan, Jun J; Jiang, Taoran; Lin, Li; Xin, Yu; Zhang, Yan; Li, Qingfeng

    2017-02-15

    Augmented reality systems can combine virtual images with a real environment to ensure accurate surgery with lower risk. This study aimed to develop a novel registration and tracking technique to establish a navigation system based on augmented reality for maxillofacial surgery. Specifically, a virtual image is reconstructed from CT data using 3D software. The real environment is tracked by the augmented reality (AR) software. The novel registration strategy that we created uses an occlusal splint compounded with a fiducial marker (OSM) to establish a relationship between the virtual image and the real object. After the fiducial marker is recognized, the virtual image is superimposed onto the real environment, forming the "integrated image" on semi-transparent glass. Via the registration process, the integral image, which combines the virtual image with the real scene, is successfully presented on the semi-transparent helmet. The position error of this navigation system is 0.96 ± 0.51 mm. This augmented reality system was applied in the clinic and good surgical outcomes were obtained. The augmented reality system that we established for maxillofacial surgery has the advantages of easy manipulation and high accuracy, which can improve surgical outcomes. Thus, this system exhibits significant potential in clinical applications.

  9. Development of a Locomotion Interface for Portable Virtual Environment Systems Using an Inertial/Magnetic Sensor-Based System and a Ranging Measurement System

    DTIC Science & Technology

    2014-03-01


  10. An intelligent control and virtual display system for evolutionary space station workstation design

    NASA Technical Reports Server (NTRS)

    Feng, Xin; Niederjohn, Russell J.; Mcgreevy, Michael W.

    1992-01-01

    Research and development of the Advanced Display and Computer Augmented Control System (ADCACS) for the space station Body-Ported Cupola Virtual Workstation (BP/VCWS) were pursued. The potential applications of body-ported virtual display and intelligent control technology for human-system interfacing in the space station environment were explored. The new system is designed to enable crew members to control and monitor a variety of space operations with greater flexibility and efficiency than existing fixed consoles. The technologies being studied include helmet-mounted virtual displays, voice and special command input devices, and microprocessor-based intelligent controllers. Several research topics, such as human factors, decision support expert systems, and wide field-of-view color displays, are being addressed. The study showed the significant advantages of this uniquely integrated display and control system, and its feasibility for human-system interfacing applications in the space station command and control environment.

  11. SU-F-P-18: Development of the Technical Training System for Patient Set-Up Considering Rotational Correction in the Virtual Environment Using Three-Dimensional Computer Graphic Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imura, K; Fujibuchi, T; Hirata, H

    Purpose: Patient set-up skills in the radiotherapy treatment room have a great influence on the treatment effect of image-guided radiotherapy. In this study, we developed a training system for improving practical set-up skills, including rotational correction, in a virtual environment away from the pressure of the actual treatment room, using a three-dimensional computer graphics (3DCG) engine. Methods: The treatment room for external beam radiotherapy was reproduced in the virtual environment using the 3DCG engine Unity. The viewpoints for performing patient set-up in the virtual treatment room were arranged on both sides of the virtual operable treatment couch to simulate actual performance by two clinical staff members. The position errors relative to the mechanical isocenter, based on alignment between the skin marker and the laser on the virtual patient model, were displayed as numerical values in SI units together with directional arrow marks. The rotational errors, calculated about a point on the virtual body axis as the center of each rotation axis in the virtual environment, were corrected by adjusting the rotational position of a body phantom fitted with a gyroscope belt on a table in real space. These rotational errors were evaluated using vector outer (cross) product operations and trigonometric functions in the script for the patient set-up technique. Results: The viewpoints in the virtual environment allowed each user to visually recognize the position discrepancy relative to the mechanical isocenter until positional errors of several millimeters were eliminated. The rotational errors between the two points, calculated about the center point, could be corrected efficiently, with the script displaying the mathematically minimal correction technique. Conclusion: By using the script to correct rotational errors, together with accurate positional recognition for the patient set-up technique, the training system developed for improving patient set-up skills enabled individual users to identify efficient positional correction methods easily.
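
    The kind of cross-product/trigonometric calculation the abstract refers to can be illustrated as follows (an illustration only, not the authors' script): the angle and axis of the rotation that takes a measured body-axis direction onto the planned one.

    ```python
    # Rotational error between a measured and a planned axis via cross/dot products.

    import numpy as np

    def rotation_error(measured_axis, planned_axis):
        u = measured_axis / np.linalg.norm(measured_axis)
        v = planned_axis / np.linalg.norm(planned_axis)
        cross = np.cross(u, v)                       # |cross| = sin(theta)
        dot = np.clip(np.dot(u, v), -1.0, 1.0)       # dot = cos(theta)
        angle_deg = np.degrees(np.arctan2(np.linalg.norm(cross), dot))
        axis = cross / (np.linalg.norm(cross) + 1e-12)
        return angle_deg, axis

    angle, axis = rotation_error(np.array([0.05, 0.02, 1.0]), np.array([0.0, 0.0, 1.0]))
    print(f"rotational error {angle:.2f} deg about axis {np.round(axis, 3)}")
    ```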

  12. Sounds of silence: How to animate virtual worlds with sound

    NASA Technical Reports Server (NTRS)

    Astheimer, Peter

    1993-01-01

    Sounds are an integral and sometimes annoying part of our daily life. Virtual worlds which imitate natural environments gain a lot of authenticity from fast, high quality visualization combined with sound effects. Sounds help to increase the degree of immersion for human dwellers in imaginary worlds significantly. The virtual reality toolkit of IGD (Institute for Computer Graphics) features a broad range of standard visual and advanced real-time audio components which interpret an object-oriented definition of the scene. The virtual reality system 'Virtual Design' realized with the toolkit enables the designer of virtual worlds to create a true audiovisual environment. Several examples on video demonstrate the usage of the audio features in Virtual Design.

  13. Novel graphical environment for virtual and real-world operations of tracked mobile manipulators

    NASA Astrophysics Data System (ADS)

    Chen, ChuXin; Trivedi, Mohan M.; Azam, Mir; Lassiter, Nils T.

    1993-08-01

    A simulation, animation, visualization and interactive control (SAVIC) environment has been developed for the design and operation of an integrated mobile manipulator system. This unique system possesses the abilities for (1) multi-sensor simulation, (2) kinematics and locomotion animation, (3) dynamic motion and manipulation animation, (4) transformation between real and virtual modes within the same graphics system, (5) ease in exchanging software modules and hardware devices between real and virtual world operations, and (6) interfacing with a real robotic system. This paper describes a working system and illustrates the concepts by presenting the simulation, animation and control methodologies for a unique mobile robot with articulated tracks, a manipulator, and sensory modules.

  14. Collaboration and Synergy among Government, Industry and Academia in M&S Domain: Turkey’s Approach

    DTIC Science & Technology

    2009-10-01

    Topics addressed include decision support system design and implementation, simulation output analysis, statistical data analysis, virtual reality, and artificial intelligence; live, virtual, and constructive visual simulation systems integrated with advanced analytical models; collaboration and synergy among government, industry, and academia in developing simulation systems that are ready to use, credible, and integrated with C4ISR systems; and the creation of synthetic environments and virtual prototypes of concepts.

  15. 3D workflow for HDR image capture of projection systems and objects for CAVE virtual environments authoring with wireless touch-sensitive devices

    NASA Astrophysics Data System (ADS)

    Prusten, Mark J.; McIntyre, Michelle; Landis, Marvin

    2006-02-01

    A 3D workflow pipeline is presented for High Dynamic Range (HDR) image capture of projected scenes or objects for presentation in CAVE virtual environments. The methods of HDR digital photography of environments vs. objects are reviewed. Samples of both types of virtual authoring, the actual CAVE environment and a sculpture, are shown. A series of software tools are incorporated into a pipeline called CAVEPIPE, allowing high-resolution objects and scenes to be composited together in natural illumination environments [1] and presented in our CAVE virtual reality environment. We also present a way to enhance the user interface for CAVE environments. The traditional methods of controlling navigation through virtual environments include glove, HUD, and 3D mouse devices. By integrating a wireless network that includes both WiFi (IEEE 802.11b/g) and Bluetooth (IEEE 802.15.1) protocols, the non-graphical input control device can be eliminated. Therefore wireless devices can be added, including PDAs, smart phones, Tablet PCs, portable gaming consoles, and Pocket PCs.
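
    A generic HDR capture-and-merge step with OpenCV conveys the core technique (not the CAVEPIPE tools themselves); the synthetic exposure bracket below stands in for real bracketed photographs:

    ```python
    # Merge a bracket of exposures into an HDR radiance map and tone-map it.

    import cv2
    import numpy as np

    times = np.array([0.01, 0.1, 1.0], dtype=np.float32)              # exposure times (s)
    rng = np.random.default_rng(0)
    scene = rng.uniform(0.05, 2.0, (480, 640, 3)).astype(np.float32)  # stand-in radiance
    images = [np.clip(scene * t * 255.0, 0, 255).astype(np.uint8) for t in times]

    calibrate = cv2.createCalibrateDebevec()      # recover the camera response curve
    response = calibrate.process(images, times)

    merge = cv2.createMergeDebevec()              # merge exposures into an HDR radiance map
    hdr = merge.process(images, times, response)

    tonemap = cv2.createTonemap(gamma=2.2)        # tone-map for display
    ldr = np.clip(tonemap.process(hdr) * 255, 0, 255).astype("uint8")
    cv2.imwrite("scene_hdr.hdr", hdr)
    cv2.imwrite("scene_preview.png", ldr)
    ```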

  16. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2011-03-01

    Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to the developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks, pervasive wireless communication; knowledge based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies, and their role in the future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The Virtual product creation environment will significantly enhance the productivity and will stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.

  17. The expert surgical assistant. An intelligent virtual environment with multimodal input.

    PubMed

    Billinghurst, M; Savage, J; Oppenheimer, P; Edmond, C

    1996-01-01

    Virtual Reality has made computer interfaces more intuitive but not more intelligent. This paper shows how an expert system can be coupled with multimodal input in a virtual environment to provide an intelligent simulation tool or surgical assistant. This is accomplished in three steps. First, voice and gestural input is interpreted and represented in a common semantic form. Second, a rule-based expert system is used to infer context and user actions from this semantic representation. Finally, the inferred user actions are matched against steps in a surgical procedure to monitor the user's progress and provide automatic feedback. In addition, the system can respond immediately to multimodal commands for navigational assistance and/or identification of critical anatomical structures. To show how these methods are used we present a prototype sinus surgery interface. The approach described here may easily be extended to a wide variety of medical and non-medical training applications by making simple changes to the expert system database and virtual environment models. Successful implementation of an expert system in both simulated and real surgery has enormous potential for the surgeon both in training and clinical practice.
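
    The step-monitoring idea described above can be illustrated with a short sketch. The procedure steps, action names, and targets below are hypothetical placeholders, not the authors' rule base or an actual sinus-surgery protocol.

      # Match inferred (action, target) pairs against an ordered procedure.
      PROCEDURE = [
          {"step": "identify landmark", "action": "point",  "target": "middle_turbinate"},
          {"step": "infiltrate site",   "action": "inject", "target": "uncinate_process"},
          {"step": "resect tissue",     "action": "resect", "target": "uncinate_process"},
      ]

      def monitor(action, target, step_index):
          """Return the next step index and a feedback message."""
          expected = PROCEDURE[step_index]
          if action == expected["action"] and target == expected["target"]:
              return step_index + 1, f"Step '{expected['step']}' completed."
          return step_index, (f"Expected '{expected['action']}' on "
                              f"'{expected['target']}', observed '{action}' on '{target}'.")

      state = 0
      state, feedback = monitor("point", "middle_turbinate", state)
      print(feedback)  # Step 'identify landmark' completed.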

  18. Adaptive Changes in Sensorimotor Coordination and Motion Sickness Following Repeated Exposures to Virtual Environments

    NASA Technical Reports Server (NTRS)

    Harm, D. L.; Taylor, L. C.; Bloomberg, J. J.

    2007-01-01

    Virtual environments offer unique training opportunities, particularly for training astronauts and preadapting them to the novel sensory conditions of microgravity. Two unresolved human factors issues in virtual reality (VR) systems are: 1) potential "cybersickness", and 2) maladaptive sensorimotor performance following exposure to VR systems. Interestingly, these aftereffects are often quite similar to adaptive sensorimotor responses observed in astronauts during and/or following space flight. Initial interpretation of novel sensory information may be inappropriate and result in perceptual errors. Active exploratory behavior in a new environment, with resulting feedback and the formation of new associations between sensory inputs and response outputs, promotes appropriate perception and motor control in the new environment. Thus, people adapt to consistent, sustained alterations of sensory input such as those produced by microgravity, unilateral labyrinthectomy and experimentally produced stimulus rearrangements. The purpose of this research was to compare disturbances in sensorimotor coordination produced by dome and head-mounted virtual environment displays and to examine the effects of exposure duration, and repeated exposures to VR systems. The first study examined disturbances in balance control, and the second study examined disturbances in eye-head-hand (EHH) and eye-head coordination.

  19. Virtual environment assessment for laser-based vision surface profiling

    NASA Astrophysics Data System (ADS)

    ElSoussi, Adnane; Al Alami, Abed ElRahman; Abu-Nabah, Bassam A.

    2015-03-01

    Oil and gas businesses have been raising the demand on original equipment manufacturers (OEMs) to implement a reliable metrology method for assessing surface profiles of welds before and after grinding. This mandates a move away from the commonly used surface measurement gauges, which are not only operator dependent but also limited to discrete measurements along the weld. Due to their potential accuracy and speed, laser-based vision surface profiling systems have been progressively adopted as part of manufacturing quality control. This effort presents a virtual environment that lends itself to developing and evaluating existing laser vision sensor (LVS) calibration and measurement techniques. A combination of two known calibration techniques is implemented to deliver a calibrated LVS system. System calibration is implemented virtually and experimentally to scan simulated and 3D-printed features of known profiles, respectively. Scanned data are inverted and compared with the input profiles to validate the virtual environment's capability for LVS surface profiling and to make a preliminary assessment of the measurement technique for weld profiling applications. Moreover, this effort brings 3D scanning capability a step closer to robust quality control applications in a manufacturing environment.
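
    A minimal sketch of the profile-comparison step, assuming the reference and scanned profiles are available as height-versus-position arrays; the data below are synthetic and only illustrate the validation idea described above.

      import numpy as np

      def profile_deviation(x_ref, z_ref, x_scan, z_scan):
          """Resample the scan onto the reference abscissa; return (RMS, peak) deviation."""
          z_interp = np.interp(x_ref, x_scan, z_scan)
          err = z_interp - z_ref
          return np.sqrt(np.mean(err ** 2)), np.max(np.abs(err))

      # Synthetic example: a 2 mm weld cap and a noisy scan of it.
      x = np.linspace(0.0, 20.0, 200)                        # mm across the weld
      z_ref = 2.0 * np.exp(-((x - 10.0) / 3.0) ** 2)         # reference height, mm
      z_scan = z_ref + np.random.normal(0.0, 0.05, x.size)   # simulated scanner noise
      rms, peak = profile_deviation(x, z_ref, x, z_scan)
      print(f"RMS deviation {rms:.3f} mm, peak deviation {peak:.3f} mm")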

  20. Software platform virtualization in chemistry research and university teaching

    PubMed Central

    2009-01-01

    Background Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, LINUX or Mac OS X. Instead of installing software on different computers, it is possible to install those applications on a single computer using Virtual Machine software. Software platform virtualization allows a single host operating system to execute multiple guest operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Results Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages, we confirmed that the computational speed penalty for using virtual machines is low, around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Conclusion Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance, the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide. PMID:20150997
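
    A rough way to reproduce the kind of speed-penalty measurement reported above is to time an identical workload on the host and inside the guest virtual machine; the sketch below is illustrative and is not the benchmark suite used by the authors.

      # Time the same numerical workload natively and in a VM; the ratio of the
      # mean wall times approximates the virtualization overhead for this workload.
      import time
      import numpy as np

      def mean_wall_time(n=2000, repeats=5):
          a, b = np.random.rand(n, n), np.random.rand(n, n)
          samples = []
          for _ in range(repeats):
              t0 = time.perf_counter()
              np.dot(a, b)
              samples.append(time.perf_counter() - t0)
          return sum(samples) / len(samples)

      if __name__ == "__main__":
          print(f"mean wall time per multiply: {mean_wall_time():.3f} s")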

  1. Software platform virtualization in chemistry research and university teaching.

    PubMed

    Kind, Tobias; Leamy, Tim; Leary, Julie A; Fiehn, Oliver

    2009-11-16

    Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, LINUX or Mac OS X. Instead of installing software on different computers, it is possible to install those applications on a single computer using Virtual Machine software. Software platform virtualization allows a single host operating system to execute multiple guest operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages, we confirmed that the computational speed penalty for using virtual machines is low, around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance, the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide.

  2. The Evolution of Sonic Ecosystems

    NASA Astrophysics Data System (ADS)

    McCormack, Jon

    This chapter describes a novel type of artistic artificial life software environment. Agents that have the ability to make and listen to sound populate a synthetic world. An evolvable, rule-based classifier system drives agent behavior. Agents compete for limited resources in a virtual environment that is influenced by the presence and movement of people observing the system. Electronic sensors create a link between the real and virtual spaces; the virtual agents evolve implicitly to try to maintain the interest of the human audience, whose presence provides them with life-sustaining food.

  3. Individual Differences in a Spatial-Semantic Virtual Environment.

    ERIC Educational Resources Information Center

    Chen, Chaomei

    2000-01-01

    Presents two empirical case studies concerning the role of individual differences in searching through a spatial-semantic virtual environment. Discusses information visualization in information systems; cognitive factors, including associative memory, spatial ability, and visual memory; user satisfaction; and cognitive abilities and search…

  4. Minimizing Input-to-Output Latency in Virtual Environment

    NASA Technical Reports Server (NTRS)

    Adelstein, Bernard D.; Ellis, Stephen R.; Hill, Michael I.

    2009-01-01

    A method and apparatus were developed to minimize latency (time delay) in virtual environment (VE) and other discrete-time computer-based systems that require real-time display in response to sensor inputs. Latency in such systems is due to the sum of the finite time required for information processing and communication within and between sensors, software, and displays.

  5. Exploring the Relationship between Virtual Learning Environment Preference, Use, and Learning Outcomes in 10th Grade Earth Science Students

    ERIC Educational Resources Information Center

    Lin, Ming-Chao; Tutwiler, M. Shane; Chang, Chun-Yen

    2011-01-01

    This study investigated the relationship between the use of a three-dimensional Virtual Reality Learning Environment for Field Trip (3DVLE[subscript (ft)]) system and the achievement levels of senior high school earth science students. The 3DVLE[subscript (ft)] system was presented in two separate formats: Teacher Demonstrated Based and Student…

  6. A Framework for Aligning Instructional Design Strategies with Affordances of CAVE Immersive Virtual Reality Systems

    ERIC Educational Resources Information Center

    Ritz, Leah T.; Buss, Alan R.

    2016-01-01

    Increasing availability of immersive virtual reality (IVR) systems, such as the Cave Automatic Virtual Environment (CAVE) and head-mounted displays, for use in education contexts is providing new opportunities and challenges for instructional designers. By highlighting the affordances of IVR specific to the CAVE, the authors emphasize the…

  7. Virtual- and real-world operation of mobile robotic manipulators: integrated simulation, visualization, and control environment

    NASA Astrophysics Data System (ADS)

    Chen, ChuXin; Trivedi, Mohan M.

    1992-03-01

    This research is focused on enhancing the overall productivity of an integrated human-robot system. A simulation, animation, visualization, and interactive control (SAVIC) environment has been developed for the design and operation of an integrated robotic manipulator system. This unique system possesses the abilities for multisensor simulation, kinematics and locomotion animation, dynamic motion and manipulation animation, transformation between real and virtual modes within the same graphics system, ease in exchanging software modules and hardware devices between real and virtual world operations, and interfacing with a real robotic system. This paper describes a working system and illustrates the concepts by presenting the simulation, animation, and control methodologies for a unique mobile robot with articulated tracks, a manipulator, and sensory modules.

  8. Simulation fidelity of a virtual environment display

    NASA Technical Reports Server (NTRS)

    Nemire, Kenneth; Jacoby, Richard H.; Ellis, Stephen R.

    1994-01-01

    We assessed the degree to which a virtual environment system produced a faithful simulation of three-dimensional space by investigating the influence of a pitched optic array on the perception of gravity-referenced eye level (GREL). We compared the results with those obtained in a physical environment. In a within-subjects factorial design, 12 subjects indicated GREL while viewing virtual three-dimensional arrays at different static orientations. A physical array biased GREL more than did a geometrically identical virtual pitched array. However, addition of two sets of orthogonal parallel lines (a grid) to the virtual pitched array resulted in as large a bias as that obtained with the physical pitched array. The increased bias was caused by longitudinal, but not the transverse, components of the grid. We discuss implications of our results for spatial orientation models and for designs of virtual displays.

  9. Virtual reality: past, present and future.

    PubMed

    Gobbetti, E; Scateni, R

    1998-01-01

    This report provides a short survey of the field of virtual reality, highlighting application domains, technological requirements, and currently available solutions. The report is organized as follows: section 1 presents the background and motivation of virtual environment research and identifies typical application domains; section 2 discusses the characteristics a virtual reality system must have in order to exploit the perceptual and spatial skills of users; section 3 surveys current input/output devices for virtual reality; section 4 surveys current software approaches to support the creation of virtual reality systems; and section 5 summarizes the report.

  10. Development of a novel visuomotor integration paradigm by integrating a virtual environment with mobile eye-tracking and motion-capture systems

    PubMed Central

    Miller, Haylie L.; Bugnariu, Nicoleta; Patterson, Rita M.; Wijayasinghe, Indika; Popa, Dan O.

    2018-01-01

    Visuomotor integration (VMI), the use of visual information to guide motor planning, execution, and modification, is necessary for a wide range of functional tasks. To comprehensively, quantitatively assess VMI, we developed a paradigm integrating virtual environments, motion-capture, and mobile eye-tracking. Virtual environments enable tasks to be repeatable, naturalistic, and varied in complexity. Mobile eye-tracking and minimally-restricted movement enable observation of natural strategies for interacting with the environment. This paradigm yields a rich dataset that may inform our understanding of VMI in typical and atypical development. PMID:29876370
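
    One practical step such a paradigm requires is aligning the eye-tracking and motion-capture streams, which are sampled at different rates. The sketch below shows one assumed approach based on timestamp interpolation; it is not the published pipeline.

      import numpy as np

      def align(t_eye, gaze, t_mocap, hand, rate_hz=120.0):
          """Resample a gaze signal and a (M, 3) hand trajectory onto a shared clock."""
          t0, t1 = max(t_eye[0], t_mocap[0]), min(t_eye[-1], t_mocap[-1])
          t = np.arange(t0, t1, 1.0 / rate_hz)
          gaze_rs = np.interp(t, t_eye, gaze)
          hand_rs = np.column_stack([np.interp(t, t_mocap, hand[:, i]) for i in range(3)])
          return t, gaze_rs, hand_rs

      # Synthetic demonstration: 60 Hz gaze and 120 Hz hand data over 10 s.
      t_eye = np.linspace(0, 10, 600)
      t_mocap = np.linspace(0, 10, 1200)
      t, gaze, hand = align(t_eye, np.sin(t_eye), t_mocap, np.random.rand(1200, 3))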

  11. Computer Applications and Virtual Environments (CAVE)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup is to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  12. ComputerApplications and Virtual Environments (CAVE)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup was to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  13. ComputerApplications and Virtual Environments (CAVE)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup was to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  14. The Comparison Of Dome And HMD Delivery Systems: A Case Study

    NASA Technical Reports Server (NTRS)

    Chen, Jian; Harm, Deborah L.; Loftin, R. Bowen; Taylor, Laura C.; Leiss, Ernst L.

    2002-01-01

    For effective astronaut training applications, choosing the right display devices to present images is crucial. In order to assess which devices are appropriate, it is important to design a successful virtual environment for a comparison study of the display devices. We present a comprehensive system, a Virtual Environment Testbed (VET), for the comparison of Dome and Head Mounted Display (HMD) systems on an SGI Onyx workstation. By writing codelets, we allow a variety of virtual scenarios and subject information to be loaded without programming or changing the code. This is part of an ongoing research project conducted by NASA/JSC.
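
    The codelet idea, loading scenario and subject descriptions from data rather than from code, might look roughly like the sketch below; the file format and field names are assumptions, not the VET implementation.

      import json

      def load_scenario(path):
          """Read a scenario description so new scenarios need no code changes."""
          with open(path) as f:
              spec = json.load(f)
          # e.g. {"display": "dome", "scene": "station_node.iv", "subject": {"id": 17}}
          return spec["display"], spec["scene"], spec.get("subject", {})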

  15. Elearn: A Collaborative Educational Virtual Environment.

    ERIC Educational Resources Information Center

    Michailidou, Anna; Economides, Anastasios A.

    Virtual Learning Environments (VLEs) that support collaboration are one of the new technologies that have attracted great interest. VLEs are learning management software systems composed of computer-mediated communication software and online methods of delivering course material. This paper presents ELearn, a collaborative VLE for teaching…

  16. Towards Gesture-Based Multi-User Interactions in Collaborative Virtual Environments

    NASA Astrophysics Data System (ADS)

    Pretto, N.; Poiesi, F.

    2017-11-01

    We present a virtual reality (VR) setup that enables multiple users to participate in collaborative virtual environments and interact via gestures. A collaborative VR session is established through a network of users that is composed of a server and a set of clients. The server manages the communication amongst clients and is created by one of the users. Each user's VR setup consists of a Head Mounted Display (HMD) for immersive visualisation, a hand tracking system to interact with virtual objects and a single-hand joypad to move in the virtual environment. We use Google Cardboard as a HMD for the VR experience and a Leap Motion for hand tracking, thus making our solution low cost. We evaluate our VR setup though a forensics use case, where real-world objects pertaining to a simulated crime scene are included in a VR environment, acquired using a smartphone-based 3D reconstruction pipeline. Users can interact using virtual gesture-based tools such as pointers and rulers.
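
    The session model described above, in which one user hosts a server that relays updates among the other clients, could be sketched as follows; this is an assumed minimal design, not the authors' networking code.

      # Relay server: every pose/gesture update from one client is forwarded to the rest.
      import socket
      import threading

      clients, lock = [], threading.Lock()

      def handle(conn):
          with lock:
              clients.append(conn)
          try:
              while True:
                  data = conn.recv(1024)        # a serialized pose or gesture update
                  if not data:
                      break
                  with lock:
                      for other in clients:
                          if other is not conn:
                              other.sendall(data)
          finally:
              with lock:
                  clients.remove(conn)
              conn.close()

      def serve(host="0.0.0.0", port=5000):
          srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
          srv.bind((host, port))
          srv.listen()
          while True:
              conn, _ = srv.accept()
              threading.Thread(target=handle, args=(conn,), daemon=True).start()

      if __name__ == "__main__":
          serve()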

  17. Extending MAM5 Meta-Model and JaCalIV E Framework to Integrate Smart Devices from Real Environments.

    PubMed

    Rincon, J A; Poza-Lujan, Jose-Luis; Julian, V; Posadas-Yagüe, Juan-Luis; Carrascosa, C

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a coupled real and virtual world, so that the virtual world not only reflects the real one but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices that access the real physical world, and a human-in-the-loop agent that places a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system in which agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system.
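
    The smart-resource idea, a single interface for agents whether the sensor is simulated or physical, can be sketched as below; the class and method names are hypothetical and are not part of MAM5 or JaCalIVE.

      from abc import ABC, abstractmethod
      import random

      class LightSensor(ABC):
          @abstractmethod
          def read_lux(self) -> float: ...

      class SimulatedSensor(LightSensor):
          def read_lux(self):
              return random.uniform(100, 900)   # value produced by the virtual world

      class SmartResourceSensor(LightSensor):
          def __init__(self, device):
              self.device = device              # wraps the physical device driver
          def read_lux(self):
              return self.device.read()

      def control_step(sensor: LightSensor, setpoint=500.0):
          """The same agent logic runs against a virtual or a real sensor."""
          return "lamp_on" if sensor.read_lux() < setpoint else "lamp_off"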

  18. Extending MAM5 Meta-Model and JaCalIV E Framework to Integrate Smart Devices from Real Environments

    PubMed Central

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a coupled real and virtual world, so that the virtual world not only reflects the real one but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices that access the real physical world, and a human-in-the-loop agent that places a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system in which agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system. PMID:26926691

  19. Building interactive virtual environments for simulated training in medicine using VRML and Java/JavaScript.

    PubMed

    Korocsec, D; Holobar, A; Divjak, M; Zazula, D

    2005-12-01

    Medicine is a difficult thing to learn. Experimenting with real patients should not be the only option; simulation deserves special attention here. The Virtual Reality Modelling Language (VRML), as a tool for building virtual objects and scenes, has a good record of educational applications in medicine, especially for static and animated visualisations of body parts and organs. However, to create computer simulations resembling situations in real environments, the required level of interactivity and dynamics is difficult to achieve. In the present paper we describe some approaches and techniques which we used to push the limits of current VRML technology further toward dynamic 3D representation of virtual environments (VEs). Our demonstration is based on the implementation of a virtual baby model whose vital signs can be controlled from an external Java application. The main contributions of this work are: (a) an outline and evaluation of the three-level VRML/Java implementation of the dynamic virtual environment, (b) a proposal for a modified VRML TimeSensor node, which greatly improves the overall control of system performance, and (c) the architecture of a prototype distributed virtual environment for training in neonatal resuscitation, comprising the interactive virtual newborn, an active bedside monitor for vital signs and a full 3D representation of the surgery room.

  20. The STRIVE-ONR Project: Stress Resistance in Virtual Environments

    DTIC Science & Technology

    2015-07-29

    from the Virtual Iraq/Afghanistan Virtual Reality Exposure Therapy (VRET) system at the University of Southern California Institute for Creative...better sense of health outcomes; that is, "how the social environment exerts a cumulative impact on the physical and mental well being of individuals...levels with functional decline in elderly men and women. Geriatrics & gerontology international, 9 3, 282-289. Goldman, N., Turra, C. M., Glei, D

  1. Two-photon calcium imaging in mice navigating a virtual reality environment.

    PubMed

    Leinweber, Marcus; Zmarz, Pawel; Buchmann, Peter; Argast, Paul; Hübener, Mark; Bonhoeffer, Tobias; Keller, Georg B

    2014-02-20

    In recent years, two-photon imaging has become an invaluable tool in neuroscience, as it allows for chronic measurement of the activity of genetically identified cells during behavior(1-6). Here we describe methods to perform two-photon imaging in mouse cortex while the animal navigates a virtual reality environment. We focus on the aspects of the experimental procedures that are key to imaging in a behaving animal in a brightly lit virtual environment. The key problems that arise in this experimental setup that we here address are: minimizing brain motion related artifacts, minimizing light leak from the virtual reality projection system, and minimizing laser induced tissue damage. We also provide sample software to control the virtual reality environment and to do pupil tracking. With these procedures and resources it should be possible to convert a conventional two-photon microscope for use in behaving mice.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markidis, S.; Rizwan, U.

    The use of a virtual nuclear control room can be an effective and powerful tool for training personnel working in nuclear power plants. Operators can experience and simulate the functioning of the plant, even in critical situations, without being in a real power plant or running any risk. 3D models can be exported to Virtual Reality formats and then displayed in the Virtual Reality environment, providing an immersive 3D experience. However, two major limitations of this approach are that the 3D models exhibit static textures and are not fully interactive, and therefore cannot be used effectively for training personnel. In this paper we first describe a possible solution for embedding the output of a computer application in a 3D virtual scene, coupling real-world applications and VR systems. The VR system reported here grabs the output of an application running on an X server, creates a texture from the output, and displays it on a screen or wall in the virtual reality environment. We then propose a simple model for providing interaction between the user in the VR system and the running simulator. This approach is based on the use of an internet-based application that can be commanded from a laptop or tablet PC added to the virtual environment. (authors)

  3. Towards a Transcription System of Sign Language for 3D Virtual Agents

    NASA Astrophysics Data System (ADS)

    Do Amaral, Wanessa Machado; de Martino, José Mario

    Accessibility is a growing concern in computer science. Since virtual information is mostly presented visually, it may seem that access for deaf people is not an issue. However, for prelingually deaf individuals, those who have been deaf since before acquiring and formally learning a language, written information is often less accessible than information presented in signing. Further, for this community, signing is their language of choice, and reading text in a spoken language is akin to using a foreign language. Sign language uses gestures and facial expressions and is widely used by deaf communities. To enable efficient production of signed content in virtual environments, it is necessary to make written records of signs. Transcription systems have been developed to describe sign languages in written form, but these systems have limitations. Since they were not originally designed with computer animation in mind, the recognition and reproduction of signs in these systems is, in general, an easy task only for those who know the system deeply. The aim of this work is to develop a transcription system to provide signed content in virtual environments. To animate a virtual avatar, a transcription system requires sufficiently explicit information, such as movement speed, sign concatenation, the sequence of each hold and movement, and facial expressions, so that articulation stays close to reality. Although many important studies of sign languages have been published, the transcription problem remains a challenge. Thus, a notation to describe, store and play signed content in virtual environments offers a multidisciplinary study and research tool, which may help linguistic studies to understand the structure and grammar of sign languages.
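
    The kind of explicit, machine-readable record such a transcription system needs might look like the following sketch; every field name here is hypothetical and is not the authors' notation.

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class SignSegment:
          handshape: str                 # e.g. "flat-B"
          location: str                  # e.g. "chin"
          movement: str                  # e.g. "arc-forward"
          speed: float = 1.0             # relative speed for the animation engine
          hold_ms: int = 0               # hold duration before the next segment
          facial_expression: str = "neutral"

      @dataclass
      class Sign:
          gloss: str                     # written label for the sign
          segments: List[SignSegment] = field(default_factory=list)

      # A sign is a sequence of hold-and-movement segments that the virtual
      # agent concatenates with neighbouring signs during playback.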

  4. Varieties of virtualization

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    1991-01-01

    Natural environments have a content, i.e., the objects in them; a geometry, i.e., a pattern of rules for positioning and displacing the objects; and a dynamics, i.e., a system of rules describing the effects of forces acting on the objects. Human interaction with most common natural environments has been optimized by centuries of evolution. Virtual environments created through the human-computer interface similarly have a content, geometry, and dynamics, but the arbitrary character of the computer simulation creating them does not insure that human interaction with these virtual environments will be natural. The interaction, indeed, could be supernatural but it also could be impossible. An important determinant of the comprehensibility of a virtual environment is the correspondence between the environmental frames of reference and those associated with the control of environmental objects. The effects of rotation and displacement of control frames of reference with respect to corresponding environmental references differ depending upon whether perceptual judgement or manual tracking performance is measured. The perceptual effects of frame of reference displacement may be analyzed in terms of distortions in the process of virtualizing the synthetic environment space. The effects of frame of reference displacement and rotation have been studied by asking subjects to estimate exocentric direction in a virtual space.

  5. A New Continent of Ideas

    NASA Technical Reports Server (NTRS)

    1990-01-01

    While a new technology called 'virtual reality' is still at the 'ground floor' level, one of its basic components, 3D computer graphics, is already in wide commercial use and expanding. Other components that permit a human operator to 'virtually' explore an artificial environment and to interact with it are being demonstrated routinely at Ames and elsewhere. Virtual reality might be defined as an environment capable of being virtually entered - telepresence, it is called - or interacted with by a human. The Virtual Interface Environment Workstation (VIEW) is a head-mounted stereoscopic display system in which the display may be an artificial computer-generated environment or a real environment relayed from remote video cameras. The operator can 'step into' this environment and interact with it. The DataGlove has a series of fiber optic cables and sensors that detect any movement of the wearer's fingers and transmit the information to a host computer; a computer-generated image of the hand will move exactly as the operator moves his gloved hand. With appropriate software, the operator can use the glove to interact with the computer scene by grasping an object. The DataSuit is a sensor-equipped full-body garment that greatly increases the sphere of performance for virtual reality simulations.

  6. VERDEX: A virtual environment demonstrator for remote driving applications

    NASA Technical Reports Server (NTRS)

    Stone, Robert J.

    1991-01-01

    One of the key areas of the National Advanced Robotics Centre's enabling technologies research program is that of the human system interface, phase 1 of which started in July 1989 and is currently addressing the potential of virtual environments to permit intuitive and natural interactions between a human operator and a remote robotic vehicle. The aim of the first 12 months of this program (to September, 1990) is to develop a virtual human-interface demonstrator for use later as a test bed for human factors experimentation. This presentation will describe the current state of development of the test bed, and will outline some human factors issues and problems for more general discussion. In brief, the virtual telepresence system for remote driving has been designed to take the following form. The human operator will be provided with a helmet-mounted stereo display assembly, facilities for speech recognition and synthesis (using the Marconi Macrospeak system), and a VPL DataGlove Model 2 unit. The vehicle to be used for the purposes of remote driving is a Cybermotion Navmaster K2A system, which will be equipped with a stereo camera and microphone pair, mounted on a motorized high-speed pan-and-tilt head incorporating a closed-loop laser ranging sensor for camera convergence control (currently under contractual development). It will be possible to relay information to and from the vehicle and sensory system via an umbilical or RF link. The aim is to develop an interactive audio-visual display system capable of presenting combined stereo TV pictures and virtual graphics windows, the latter featuring control representations appropriate for vehicle driving and interaction using a graphical 'hand,' slaved to the flex and tracking sensors of the DataGlove and an additional helmet-mounted Polhemus IsoTrack sensor. Developments planned for the virtual environment test bed include transfer of operator control between remote driving and remote manipulation, dexterous end effector integration, virtual force and tactile sensing (also the focus of a current ARRL contract, initially employing a 14-pneumatic bladder glove attachment), and sensor-driven world modeling for total virtual environment generation and operator-assistance in remote scene interrogation.

  7. Games as Distributed Teaching and Learning Systems

    ERIC Educational Resources Information Center

    Gee, Elisabeth; Gee, James Paul

    2017-01-01

    Background: Videogames and virtual worlds have frequently been studied as learning environments in isolation; that is, scholars have focused on understanding the features of games or virtual worlds as separate from or different than "real world" environments for learning. Although more recently, scholars have explored the teaching and…

  8. Integrated Data Visualization and Virtual Reality Tool

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  9. Estimating Distance in Real and Virtual Environments: Does Order Make a Difference?

    PubMed Central

    Ziemer, Christine J.; Plumert, Jodie M.; Cremer, James F.; Kearney, Joseph K.

    2010-01-01

    This investigation examined how the order in which people experience real and virtual environments influences their distance estimates. Participants made two sets of distance estimates in one of the following conditions: 1) real environment first, virtual environment second; 2) virtual environment first, real environment second; 3) real environment first, real environment second; or 4) virtual environment first, virtual environment second. In Experiment 1, participants imagined how long it would take to walk to targets in real and virtual environments. Participants’ first estimates were significantly more accurate in the real than in the virtual environment. When the second environment was the same as the first environment (real-real and virtual-virtual), participants’ second estimates were also more accurate in the real than in the virtual environment. When the second environment differed from the first environment (real-virtual and virtual-real), however, participants’ second estimates did not differ significantly across the two environments. A second experiment in which participants walked blindfolded to targets in the real environment and imagined how long it would take to walk to targets in the virtual environment replicated these results. These subtle, yet persistent order effects suggest that memory can play an important role in distance perception. PMID:19525540

  10. Development of and feedback on a fully automated virtual reality system for online training in weight management skills.

    PubMed

    Thomas, J Graham; Spitalnick, Josh S; Hadley, Wendy; Bond, Dale S; Wing, Rena R

    2015-01-01

    Virtual reality (VR) technology can provide a safe environment for observing, learning, and practicing use of behavioral weight management skills, which could be particularly useful in enhancing minimal contact online weight management programs. The Experience Success (ES) project developed a system for creating and deploying VR scenarios for online weight management skills training. Virtual environments populated with virtual actors allow users to experiment with implementing behavioral skills via a PC-based point and click interface. A culturally sensitive virtual coach guides the experience, including planning for real-world skill use. Thirty-seven overweight/obese women provided feedback on a test scenario focused on social eating situations. They reported that the scenario gave them greater skills, confidence, and commitment for controlling eating in social situations. © 2014 Diabetes Technology Society.

  11. Development of and Feedback on a Fully Automated Virtual Reality System for Online Training in Weight Management Skills

    PubMed Central

    Spitalnick, Josh S.; Hadley, Wendy; Bond, Dale S.; Wing, Rena R.

    2014-01-01

    Virtual reality (VR) technology can provide a safe environment for observing, learning, and practicing use of behavioral weight management skills, which could be particularly useful in enhancing minimal contact online weight management programs. The Experience Success (ES) project developed a system for creating and deploying VR scenarios for online weight management skills training. Virtual environments populated with virtual actors allow users to experiment with implementing behavioral skills via a PC-based point and click interface. A culturally sensitive virtual coach guides the experience, including planning for real-world skill use. Thirty-seven overweight/obese women provided feedback on a test scenario focused on social eating situations. They reported that the scenario gave them greater skills, confidence, and commitment for controlling eating in social situations. PMID:25367014

  12. The Modeling of Virtual Environment Distance Education

    NASA Astrophysics Data System (ADS)

    Xueqin, Chang

    This research presents a virtual environment that integrates, in a virtual mockup, the services available on a university campus to support communication between students and teachers at different physical locations. Advantages of this system include remote access to a variety of services and educational tools, and the representation of real structures and landscapes in an interactive 3D model that eases the localization of services and preserves the administrative organization of the university. To this end, the system implements access control for users and an interface that allows the use of existing educational equipment and resources not designed for distance education.

  13. A virtual surgical environment for rehearsal of tympanomastoidectomy.

    PubMed

    Chan, Sonny; Li, Peter; Lee, Dong Hoon; Salisbury, J Kenneth; Blevins, Nikolas H

    2011-01-01

    This article presents a virtual surgical environment whose purpose is to assist the surgeon in preparation for individual cases. The system constructs interactive anatomical models from patient-specific, multi-modal preoperative image data, and incorporates new methods for visually and haptically rendering the volumetric data. Evaluation of the system's ability to replicate temporal bone dissections for tympanomastoidectomy, using intraoperative video of the same patients as guides, showed strong correlations between virtual and intraoperative anatomy. The result is a portable and cost-effective tool that may prove highly beneficial for the purposes of surgical planning and rehearsal.

  14. Using virtual reality environment to improve joint attention associated with pervasive developmental disorder.

    PubMed

    Cheng, Yufang; Huang, Ruowen

    2012-01-01

    The focus of this study is the use of a data glove to practice joint attention skills in a virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe setting for people with PDD: when they make errors during practice, there are no harmful or dangerous consequences to deal with. Joint attention is a critical skill affected by the disorder characteristics of children with PDD, and its absence is a deficit that frequently affects their social relationships in daily life. Therefore, this study designed the Joint Attention Skills Learning (JASL) system with a data glove tool to help children with PDD practice joint attention behavior skills. JASL specifically focuses on the skills of pointing, showing, sharing things, and behavioral interaction with other children with PDD. The system is designed as a playroom scene and presented from a first-person perspective for users. Its functions include pointing and showing, moving virtual objects, 3D animation, text, speech sounds, and feedback. The study employed a single-subject multiple-probe design across subjects with visual inspection analysis, and the experimental phase took 3 months to complete. The results reveal that the participants further extended their joint attention skills in daily life after using the JASL system. The significant potential of this particular treatment of joint attention for each participant is discussed in detail in this paper. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Personalized Virtual Learning Environment from the Detection of Learning Styles

    ERIC Educational Resources Information Center

    Martínez Cartas, M. L.; Cruz Pérez, N.; Deliche Quesada, D.; Mateo Quero, S.

    2013-01-01

    Through the previous detection of existing learning styles in a classroom, a Virtual Learning Environment (VLE) has been designed for students of several Engineering degrees, using the Learning Management System (LMS) utilized in the University of Jaen, ILIAS. Learning styles of three different Knowledge Areas; Chemical Engineering, Materials…

  16. Virtual Environments Using Video Capture for Social Phobia with Psychosis

    PubMed Central

    White, Richard; Clarke, Timothy; Turner, Ruth; Fowler, David

    2013-01-01

    A novel virtual environment (VE) system was developed and used as an adjunct to cognitive behavior therapy (CBT) with six socially anxious patients recovering from psychosis. The novel aspect of the VE system is that it uses video capture so the patients can see a life-size projection of themselves interacting with a specially scripted and digitally edited filmed environment played in real time on a screen in front of them. Within-session process outcomes (subjective units of distress and belief ratings on individual behavioral experiments), as well as patient feedback, generated the hypothesis that this type of virtual environment can potentially add value to CBT by helping patients understand the role of avoidance and safety behaviors in the maintenance of social anxiety and paranoia and by boosting their confidence to carry out “real-life” behavioral experiments. PMID:23659722

  17. Integration of the virtual model of a Stewart platform with the avatar of a vehicle in a virtual reality

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2016-08-01

    The development of computer-aided design and engineering methods allows virtual tests to be conducted, including motion simulation of technical equipment. The paper presents a method of integrating a virtual model of a Stewart platform with the avatar of a vehicle moving in a virtual environment. The problem area includes issues related to the fidelity with which the operation of the analyzed technical system is mapped. The main object of investigation is a 3D model of a Stewart platform, a subsystem of a simulator intended for driving instruction for disabled persons. The platform model, prepared for motion simulation, was created in the “Motion Simulation” module of the CAD/CAE system Siemens PLM NX, whereas the virtual environment in which the passenger-car avatar moves was developed in the VR system EON Studio. The element integrating the two software environments is an application that reads information from the virtual reality (VR) scene about the current position of the car avatar and then, based on the adopted algorithm, sends control signals to the respective joints of the Stewart platform model (CAD).
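
    The integration step, mapping the avatar's pose onto the platform, can be illustrated with standard Stewart-platform inverse kinematics; the joint geometry and pose values below are hypothetical, and this is not the authors' application.

      import numpy as np

      def rotation(roll, pitch, yaw):
          cr, sr = np.cos(roll), np.sin(roll)
          cp, sp = np.cos(pitch), np.sin(pitch)
          cy, sy = np.cos(yaw), np.sin(yaw)
          rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
          ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
          rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
          return rz @ ry @ rx

      def leg_lengths(base_pts, plat_pts, translation, roll, pitch, yaw):
          """base_pts, plat_pts: (6, 3) joint coordinates; returns the six leg lengths."""
          moved = (rotation(roll, pitch, yaw) @ plat_pts.T).T + translation
          return np.linalg.norm(moved - base_pts, axis=1)

      # Hypothetical geometry (metres) and a pose read from the car avatar (radians).
      base = np.array([[np.cos(a), np.sin(a), 0.0]
                       for a in np.linspace(0, 2 * np.pi, 6, endpoint=False)])
      plat = 0.7 * base
      print(leg_lengths(base, plat, np.array([0.0, 0.0, 1.2]), 0.05, -0.02, 0.0))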

  18. Performance implications from sizing a VM on multi-core systems: A data analytic application's view

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Horey, James L; Begoli, Edmon

    In this paper, we present a quantitative performance analysis of data analytics applications running on multi-core virtual machines. Such environments form the core of cloud computing. In addition, data analytics applications, such as Cassandra and Hadoop, are becoming increasingly popular on cloud computing platforms. This convergence necessitates a better understanding of the performance and cost implications of such hybrid systems. For example, the very first step in hosting applications in virtualized environments requires the user to configure the number of virtual processors and the size of memory. To understand the performance implications of this step, we benchmarked three Yahoo Cloud Serving Benchmark (YCSB) workloads in a virtualized multi-core environment. Our measurements indicate that the performance of Cassandra for YCSB workloads does not heavily depend on the processing capacity of a system, while the size of the data set is critical to performance relative to allocated memory. We also identified a strong relationship between the running time of workloads and various hardware events (last-level cache loads, misses, and CPU migrations). From this analysis, we provide several suggestions to improve the performance of data analytics applications running on cloud computing environments.

  19. Fire training in a virtual-reality environment

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Jurgen; Bucken, Arno

    2005-03-01

    Although fire is very common in our daily environment - as a source of energy at home or as a tool in industry - most people cannot estimate the danger of a conflagration. Therefore it is important to train people in combating fire. Besides training with propane simulators or real fires and real extinguishers, fire training can be performed in virtual reality, which offers a pollution-free and fast way of training. In this paper we describe how to enhance a virtual-reality environment with real-time fire simulation and visualisation in order to establish a realistic emergency-training system. The presented approach supports extinguishing the virtual fire, including recordable performance data as needed in teletraining environments. We show how to achieve realistic impressions of fire using advanced particle simulation and how to use particles to trigger states in a modified cellular automaton used for the simulation of fire behaviour. Using particle systems that interact with cellular automata, it is possible to simulate a developing, spreading fire and its reaction to different extinguishing agents like water, CO2 or oxygen. The methods proposed in this paper have been implemented and successfully tested on Cosimir, a commercial robot and VR simulation system.
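
    The cellular-automaton part of such a simulation can be sketched in a few lines; the cell states and ignition probability below are illustrative and are not the Cosimir implementation.

      import numpy as np

      EMPTY, FUEL, BURNING, BURNT = 0, 1, 2, 3

      def step(grid, p_ignite=0.35, rng=np.random.default_rng()):
          """Burning cells ignite flammable neighbours, then burn out."""
          new = grid.copy()
          for y, x in np.argwhere(grid == BURNING):
              for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                  ny, nx = y + dy, x + dx
                  if 0 <= ny < grid.shape[0] and 0 <= nx < grid.shape[1]:
                      if grid[ny, nx] == FUEL and rng.random() < p_ignite:
                          new[ny, nx] = BURNING
              new[y, x] = BURNT          # one-step burn time in this toy model
          return new

      grid = np.full((20, 20), FUEL)
      grid[10, 10] = BURNING             # ignition point
      for _ in range(15):
          grid = step(grid)
      print((grid == BURNT).sum(), "cells burnt")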

  20. Cardiovascular effects of Zumba® performed in a virtual environment using XBOX Kinect

    PubMed Central

    Neves, Luceli Eunice Da Silva; Cerávolo, Mariza Paver Da Silva; Silva, Elisangela; De Freitas, Wagner Zeferino; Da Silva, Fabiano Fernandes; Higino, Wonder Passoni; Carvalho, Wellington Roberto Gomes; De Souza, Renato Aparecido

    2015-01-01

    [Purpose] This study evaluated the acute cardiovascular responses during a session of Zumba® Fitness in a virtual reality environment. [Subjects] Eighteen healthy volunteers were recruited. [Methods] The following cardiovascular variables: heart rate, systolic blood pressure, diastolic blood pressure, and double product were assessed before and after the practice of virtual Zumba®, which was performed as a continuous sequence of five choreographed movements lasting for 22 min. The game Zumba Fitness Core®, with the Kinect-based virtual reality system for the XBOX 360, was used to create the virtual environment. Comparisons were made among mean delta values (delta=post-Zumba® minus pre-Zumba® values) for systolic and diastolic blood pressure, heart rate, and double product using Student’s t-test for paired samples. [Results] After a single session, a significant increase was noted in all the analyzed parameters (Systolic blood pressure=18%; Diastolic blood pressure=13%; Heart rate=67%; and Double product=97%). [Conclusion] The results support the feasibility of the use of Zumba Fitness Core® with the Kinect-based virtual reality system for the XBOX 360 in physical activity programs and further favor its indication for this purpose. PMID:26504312
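
    For reference, the double product is heart rate multiplied by systolic blood pressure, and each reported delta is the post-session value minus the pre-session value; the numbers in the sketch below are invented for illustration only.

      def double_product(heart_rate_bpm, systolic_mmhg):
          return heart_rate_bpm * systolic_mmhg

      pre = {"sbp": 120, "hr": 75}        # hypothetical pre-session values
      post = {"sbp": 142, "hr": 125}      # hypothetical post-session values
      delta_dp = double_product(post["hr"], post["sbp"]) - double_product(pre["hr"], pre["sbp"])
      print(delta_dp)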

  1. NASA Virtual Glovebox (VBX): Emerging Simulation Technology for Space Station Experiment Design, Development, Training and Troubleshooting

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey D.; Twombly, I. Alexander; Maese, A. Christopher; Cagle, Yvonne; Boyle, Richard

    2003-01-01

    The International Space Station demonstrates the greatest capabilities of human ingenuity, international cooperation and technology development. The complexity of this space structure is unprecedented; and training astronaut crews to maintain all its systems, as well as perform a multitude of research experiments, requires the most advanced training tools and techniques. Computer simulation and virtual environments are currently used by astronauts to train for robotic arm manipulations and extravehicular activities; but now, with the latest computer technologies and recent successes in areas of medical simulation, the capability exists to train astronauts for more hands-on research tasks using immersive virtual environments. We have developed a new technology, the Virtual Glovebox (VGX), for simulation of experimental tasks that astronauts will perform aboard the Space Station. The VGX may also be used by crew support teams for design of experiments, testing equipment integration capability and optimizing the procedures astronauts will use. This is done through the 3D, desk-top sized, reach-in virtual environment that can simulate the microgravity environment in space. Additional features of the VGX allow for networking multiple users over the internet and operation of tele-robotic devices through an intuitive user interface. Although the system was developed for astronaut training and assisting support crews, Earth-bound applications, many emphasizing homeland security, have also been identified. Examples include training experts to handle hazardous biological and/or chemical agents in a safe simulation, operation of tele-robotic systems for assessing and diffusing threats such as bombs, and providing remote medical assistance to field personnel through a collaborative virtual environment. Thus, the emerging VGX simulation technology, while developed for space- based applications, can serve a dual use facilitating homeland security here on Earth.

  2. The Optokinetic Cervical Reflex (OKCR) in Pilots of High-Performance Aircraft.

    DTIC Science & Technology

    1997-04-01

    Coupled System virtual reality - the attempt to create a realistic, three-dimensional environment or synthetic immersive environment in which the user ...factors interface between the pilot and the flight environment. The final section is a case study of head- and helmet-mounted displays (HMD) and the impact...themselves as actually moving (flying) through a virtual environment. However, in the studies of Held, et al. (1975) and Young, et al. (1975) the

  3. Emerging Conceptual Understanding of Complex Astronomical Phenomena by Using a Virtual Solar System

    ERIC Educational Resources Information Center

    Gazit, Elhanan; Yair, Yoav; Chen, David

    2005-01-01

    This study describes high school students' conceptual development of the basic astronomical phenomena during real-time interactions with a Virtual Solar System (VSS). The VSS is a non-immersive virtual environment which has a dynamic frame of reference that can be altered by the user. Ten 10th grade students were given tasks containing a set of…

  4. Binaural room simulation

    NASA Technical Reports Server (NTRS)

    Lehnert, H.; Blauert, Jens; Pompetzki, W.

    1991-01-01

    In everyday listening, the auditory event perceived by a listener is determined not only by the sound signal that a source emits but also by a variety of environmental parameters. These parameters are the position, orientation and directional characteristics of the sound source, the listener's position and orientation, the geometrical and acoustical properties of surfaces which affect the sound field and the sound propagation properties of the surrounding fluid. A complete set of these parameters can be called an Acoustic Environment. If the auditory event perceived by a listener is manipulated in such a way that the listener is shifted acoustically into a different acoustic environment without moving himself physically, a Virtual Acoustic Environment has been created. Here, we deal with a special technique to set up nearly arbitrary Virtual Acoustic Environments, the Binaural Room Simulation. The purpose of the Binaural Room Simulation is to compute the binaural impulse response related to a virtual acoustic environment taking into account all parameters mentioned above. One possible way to describe a Virtual Acoustic Environment is the concept of virtual sound sources. Each of the virtual sources emits a certain signal which is correlated but not necessarily identical with the signal emitted by the direct sound source. If source and receiver are not moving, the acoustic environment becomes a linear time-invariant system. Then, the Binaural Impulse Response from the source to a listener's eardrums contains all relevant auditory information related to the Virtual Acoustic Environment. Listening into the simulated environment can easily be achieved by convolving the Binaural Impulse Response with dry signals and representing the results via headphones.
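
    Rendering a dry signal through a binaural impulse response reduces to one convolution per ear; the sketch below uses synthetic impulse responses and is an assumed implementation, not the authors' system.

      import numpy as np
      from scipy.signal import fftconvolve

      def auralize(dry, brir_left, brir_right):
          """Return a stereo array (N, 2) for headphone playback."""
          left = fftconvolve(dry, brir_left, mode="full")
          right = fftconvolve(dry, brir_right, mode="full")
          out = np.column_stack([left, right])
          return out / np.max(np.abs(out))      # normalize to avoid clipping

      # Synthetic data: a 1 s click train and exponentially decaying impulse responses.
      fs = 44100
      dry = np.zeros(fs); dry[::fs // 4] = 1.0
      t = np.arange(int(0.3 * fs)) / fs
      brir_l = np.random.randn(t.size) * np.exp(-t / 0.08)
      brir_r = np.random.randn(t.size) * np.exp(-t / 0.10)
      stereo = auralize(dry, brir_l, brir_r)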

  5. Achievement of Virtual and Real Objects Using a Short-Term Motor Learning Protocol in People with Duchenne Muscular Dystrophy: A Crossover Randomized Controlled Trial.

    PubMed

    Massetti, Thais; Fávero, Francis Meire; Menezes, Lilian Del Ciello de; Alvarez, Mayra Priscila Boscolo; Crocetta, Tânia Brusque; Guarnieri, Regiani; Nunes, Fátima L S; Monteiro, Carlos Bandeira de Mello; Silva, Talita Dias da

    2018-04-01

    To evaluate whether people with Duchenne muscular dystrophy (DMD) who practice a task in a virtual environment can improve performance on a similar task in a real environment, and to determine whether practice transfers from the virtual to the real environment and vice versa. Twenty-two people with DMD were evaluated and divided into two groups. The goal was to reach out and touch a red cube. Group A began with the real task and had to touch a real object, and Group B began with the virtual task and had to reach a virtual object using the Kinect system. ANOVA showed that all participants decreased their movement time from the first (M = 973 ms) to the last block of acquisition (M = 783 ms) in both the virtual and real tasks, and motor learning could be inferred from the short-term retention and transfer tasks (with increasing distance to the target). However, the evaluation of task performance demonstrated that the virtual task produced inferior performance compared with the real task in all phases of the study, and there was no effect of sequence. Both the virtual and real tasks promoted improvement of performance in the acquisition phase, short-term retention, and transfer. However, there was no transfer of learning between environments. In conclusion, the use of virtual environments for individuals with DMD needs to be considered carefully.

  6. Immersive virtual reality improves movement patterns in patients after ACL reconstruction: implications for enhanced criteria-based return-to-sport rehabilitation.

    PubMed

    Gokeler, Alli; Bisschop, Marsha; Myer, Gregory D; Benjaminse, Anne; Dijkstra, Pieter U; van Keeken, Helco G; van Raay, Jos J A M; Burgerhof, Johannes G M; Otten, Egbert

    2016-07-01

    The purpose of this study was to evaluate the influence of immersion in a virtual reality environment on knee biomechanics in patients after ACL reconstruction (ACLR). It was hypothesized that virtual reality techniques aimed at changing attentional focus would influence the altered knee flexion angle, knee extension moment and peak vertical ground reaction force (vGRF) in patients following ACLR. Twenty athletes following ACLR and 20 healthy controls (CTRL) performed a step-down task in both a non-virtual reality environment and a virtual reality environment displaying a pedestrian traffic scene. A motion analysis system and force plates were used to measure kinematics and kinetics during the step-down task to analyse each single-leg landing. No significant main effect of environment was found for knee flexion excursion (P = n.s.). Significant interactions between environment and group were found for vGRF (P = 0.004), knee moment (P < 0.001), knee angle at peak vGRF (P = 0.01) and knee flexion excursion (P = 0.03). The virtual reality environment had a larger effect on knee biomechanics in patients after ACLR than in controls. Patients after ACLR immersed in the virtual reality environment demonstrated knee joint biomechanics that approximated those of CTRL. The results of this study indicate that a realistic virtual reality scenario may distract patients after ACLR from conscious motor control. Application of clinically available technology may aid current rehabilitation programmes in targeting altered movement patterns after ACLR. Diagnostic study, Level III.

  7. Using Virtual Reality For Outreach Purposes in Planetology

    NASA Astrophysics Data System (ADS)

    Civet, François; Le Mouélic, Stéphane; Le Menn, Erwan; Beaunay, Stéphanie

    2016-10-01

    2016 was a year marked by a technological breakthrough: the availability, for the first time, of technologically mature virtual reality devices to the general public. Virtual reality consists in visually immersing a user in a 3D environment reproduced either from real and/or imaginary data, with the possibility to move and eventually interact with the different elements. In planetology, most of the places will remain inaccessible to the public for a while, but a fleet of dedicated spacecraft such as orbiters, landers and rovers makes it possible to virtually reconstruct the environments, using image processing, cartography and photogrammetry. Virtual reality can then bridge the gap to virtually "send" any user into the place and let them enjoy the exploration. We are investigating several types of devices to render orbital or ground-based data of planetological interest, mostly from Mars. The simplest system consists of a "cardboard" headset, in which the user simply uses a cellphone as the screen. A more comfortable experience is obtained with more complex systems such as the HTC Vive or Oculus Rift headsets, which include a tracking system important for minimizing motion sickness. The third environment that we have developed is based on the CAVE concept, where four 3D video projectors are used to project on three 2x3 m walls plus the ground. These systems can be used for scientific data analysis, but also prove to be perfectly suited for outreach and education purposes.

  8. Virtual Tissues and Developmental Systems Biology (book chapter)

    EPA Science Inventory

    Virtual tissue (VT) models provide an in silico environment to simulate cross-scale properties in specific tissues or organs based on knowledge of the underlying biological networks. These integrative models capture the fundamental interactions in a biological system and enable ...

  9. Rehabilitation Program Integrating Virtual Environment to Improve Orientation and Mobility Skills for People Who Are Blind

    PubMed Central

    Lahav, Orly; Schloerb, David W.; Srinivasan, Mandayam A.

    2014-01-01

    This paper presents the integration of a virtual environment (BlindAid) in an orientation and mobility rehabilitation program as a training aid for people who are blind. BlindAid allows the users to interact with different virtual structures and objects through auditory and haptic feedback. This research explores whether and how use of the BlindAid in conjunction with a rehabilitation program can help people who are blind train themselves in familiar and unfamiliar spaces. The study focused on nine participants, who were congenitally, adventitiously, and newly blind, during their orientation and mobility rehabilitation program at the Carroll Center for the Blind (Newton, Massachusetts, USA). The research was implemented using virtual environment (VE) exploration tasks and orientation tasks in virtual environments and real spaces. The methodology encompassed both qualitative and quantitative methods, including interviews, a questionnaire, videotape recording, and user computer logs. The results demonstrated that the BlindAid training gave participants additional time to explore the virtual environment systematically. Secondly, it helped elucidate several issues concerning the potential strengths of the BlindAid system as a training aid for orientation and mobility for both adults and teenagers who are congenitally, adventitiously, and newly blind. PMID:25284952

  10. [A new age of mass casualty education? The InSitu project: realistic training in virtual reality environments].

    PubMed

    Lorenz, D; Armbruster, W; Vogelgesang, C; Hoffmann, H; Pattar, A; Schmidt, D; Volk, T; Kubulus, D

    2016-09-01

    Chief emergency physicians are regarded as an important element in the care of the injured and sick following mass casualty incidents. Their education is very theoretical; practical content, in contrast, often falls short. The limiting factors are usually the very high costs of realistic (large-scale) exercises, poor reproducibility of the scenarios, and correspondingly poor results. Given the complexity of mass casualty incidents, modified training concepts are required to substantially improve the educational level; these should teach not only the theoretical but above all the practical skills considerably more intensively than at present. Modern training concepts should make it possible for the learner to realistically simulate decision processes. This article examines how interactive virtual environments are applicable to the education of emergency personnel and how they could be designed. Virtual simulation and training environments offer the possibility of simulating complex situations in an adequately realistic manner. The so-called virtual reality (VR) used in this context is an interface technology that enables free interaction in addition to a stereoscopic and spatial representation of virtual large-scale emergencies in a virtual environment. Variables in the scenarios, such as the weather, the number of wounded, and the availability of resources, can be changed at any time. The trainees are able to practice the procedures in many virtual accident scenes and act them out repeatedly, thereby testing different variants. With the aid of the "InSitu" project, it is possible to train in a virtual reality with realistically reproduced accident situations. These integrated, interactive training environments can depict very complex situations on a scale of 1:1. Because of the highly developed interactivity, the trainees can feel as if they are a direct part of the accident scene and therefore identify much more with the virtual world than is possible with desktop systems. Interactive, identifiable, and realistic training environments based on projector systems could in future enable repeated exercises with variations within a decision tree, with reproducibility, and across different occupational groups. With one hardware and software environment, numerous accident situations can be depicted and practiced. The main expense is the creation of the virtual accident scenes. As the appropriate city models and other three-dimensional geographical data are already available, this expenditure is very low compared with the planning costs of a large-scale exercise.

  11. Visual Stability of Objects and Environments Viewed through Head-Mounted Displays

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.; Adelstein, Bernard D.

    2015-01-01

    Virtual Environments (also known as Virtual Reality) are again catching the public imagination, and a number of startups (e.g. Oculus) and even not-so-startup companies (e.g. Microsoft) are trying to develop display systems to capitalize on this renewed interest. All acknowledge that this time they will get it right by providing the required dynamic fidelity, visual quality, and interesting content for the concept of VR to take off and change the world in ways it failed to do in past incarnations. Some of the surprisingly long historical background of the direct-simulation technology that underlies virtual environment and augmented reality displays will be briefly reviewed. An example of a mid-1990s augmented reality display system with good dynamic performance from our lab will be used to illustrate some of the underlying phenomena and technology concerning visual stability of virtual environments and objects during movement. In conclusion, some idealized performance characteristics for a reference system will be proposed. Interestingly, many systems more or less on the market now may actually meet many of these proposed technical requirements. This observation leads to the conclusion that the current success of the IT firms trying to commercialize the technology will depend on the hidden costs of using the systems as well as the development of interesting and compelling content.

  12. Virtual Education in Universities: A Technological Imperative.

    ERIC Educational Resources Information Center

    O'Donoghue, John; Singh, Gurmak; Dorward, Lisa

    2001-01-01

    Discusses virtual universities and virtual classrooms, exploring both the benefits and the disadvantages of technology-based delivery systems. Highlights include competitive pressures to use technology; impacts on students; the need for flexibility to meet unique student needs and learning styles; learning environments; impact on society; and…

  13. The Design, Development and Evaluation of a Virtual Reality Based Learning Environment

    ERIC Educational Resources Information Center

    Chen, Chwen Jen

    2006-01-01

    Many researchers and instructional designers increasingly recognise the benefits of utilising three dimensional virtual reality (VR) technology in instruction. In general, there are two types of VR system, the immersive system and the non-immersive system. This article focuses on the latter system that merely uses the conventional personal…

  14. Status and Roadmap of CernVM

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Blomer, J.; Buncic, P.; Charalampidis, I.; Ganis, G.; Meusel, R.

    2015-12-01

    Cloud resources nowadays contribute an essential share of the resources for computing in high-energy physics. Such resources can be provided either by private or public IaaS clouds (e.g. OpenStack, Amazon EC2, Google Compute Engine) or by volunteer computers (e.g. LHC@Home 2.0). In any case, experiments need to prepare a virtual machine image that provides the execution environment for the physics application at hand. Since version 3, the CernVM virtual machine is a minimal and versatile virtual machine image capable of booting different operating systems. The virtual machine image is less than 20 megabytes in size. The actual operating system is delivered on demand by the CernVM File System. CernVM 3 has matured from a prototype into a production environment. It is used, for instance, to run LHC applications in the cloud, to tune event generators using a network of volunteer computers, and as a container for the historic Scientific Linux 5 and Scientific Linux 4 based software environments in the course of the long-term data preservation efforts of the ALICE, CMS, and ALEPH experiments. We present experience and lessons learned from the use of CernVM at scale. We also provide an outlook on upcoming developments. These developments include adding support for Scientific Linux 7, the use of container virtualization, such as provided by Docker, and the streamlining of virtual machine contextualization towards the cloud-init industry standard.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Bowman, Doug A; Scerbo, Siroberto

    Virtual reality (VR) systems have been proposed for use in numerous training scenarios, such as room clearing, which require the trainee to maintain spatial awareness. But many VR training systems lack a fully surrounding display, requiring trainees to use a combination of physical and virtual turns to view the environment, thus decreasing spatial awareness. One solution to this problem is to amplify head rotations, such that smaller physical turns are mapped to larger virtual turns, allowing trainees to view the surrounding environment with head movements alone. For example, in a multi-monitor system covering only a 90-degree field of regard, head rotations could be amplified four times to allow the user to see the entire 360-degree surrounding environment. This solution is attractive because it can be used with lower-cost VR systems and does not require virtual turning. However, the effects of amplified head rotations on spatial awareness and training transfer are not well understood. We hypothesized that small amounts of amplification might be tolerable, but that larger amplifications might cause trainees to become disoriented and to have decreased task performance and training transfer. In this paper, we will present our findings from an experiment designed to investigate these hypotheses. The experiment placed users in a virtual warehouse and asked them to move from room to room, counting objects placed around them in space. We varied the amount of amplification applied during these trials, and also varied the type of display used (head-mounted display or CAVE). We measured task performance and spatial awareness. We then assessed training transfer in an assessment environment with a fully surrounding display and no amplification. The results of this study will inform VR training system developers about the potential negative effects of using head rotation amplification and contribute to more effective VR training system design.
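
    The amplification described in this record is, at its core, a yaw gain between tracked and rendered head orientation. The sketch below (Python; the reference-yaw convention, function names, and wrap-around handling are assumptions of this illustration) shows how a 90-degree physical field of regard can be mapped onto the full 360-degree virtual surround with a gain of four.

        # Minimal sketch of amplified head rotation: the tracked physical yaw is
        # scaled by a gain so a limited physical field of regard covers the full
        # 360-degree virtual surround.
        def amplification_factor(virtual_range_deg=360.0, physical_range_deg=90.0):
            """A 90-degree field of regard amplified four times covers 360 degrees."""
            return virtual_range_deg / physical_range_deg

        def virtual_yaw(physical_yaw_deg, reference_yaw_deg=0.0, gain=4.0):
            """Map tracked physical yaw to rendered virtual yaw.

            Rotation is amplified relative to a reference (forward) direction,
            so small physical turns produce larger virtual turns."""
            offset = physical_yaw_deg - reference_yaw_deg
            return (reference_yaw_deg + gain * offset) % 360.0

        gain = amplification_factor()        # 4.0 for a 90-degree display
        print(virtual_yaw(22.5, gain=gain))  # a 22.5-degree turn renders as 90.0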

  16. Learning Objects and Virtual Learning Environments Technical Evaluation Criteria

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Dagiene, Valentina

    2009-01-01

    The main scientific problems investigated in this article deal with the technical evaluation of quality attributes of the main components of e-Learning systems (referred to here as DLEs--Digital Libraries of Educational Resources and Services), i.e., Learning Objects (LOs) and Virtual Learning Environments (VLEs). The main research object of the work is…

  17. Avatars, Pedagogical Agents, and Virtual Environments: Social Learning Systems Online

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Martens, Jon; Dotterer, Gary; Calhoun, Pat

    2009-01-01

    This paper presents a review of literature that introduces major concepts and issues in using avatars and pedagogical agents in first- and second-person virtual environments (VEs) for learning online. In these VEs, avatars and pedagogical agents represent self and other learners/participants or serve as personal learning "guides". The…

  18. Using Immersive Virtual Environments for Certification

    NASA Technical Reports Server (NTRS)

    Lutz, R.; Cruz-Neira, C.

    1998-01-01

    Immersive virtual environment (VE) technology has matured to the point where it can be utilized as a scientific and engineering problem-solving tool. In particular, VEs are starting to be used to design and evaluate safety-critical systems that involve human operators, such as flight and driving simulators, complex machinery training, and emergency rescue strategies.

  19. Large-scale P2P network based distributed virtual geographic environment (DVGE)

    NASA Astrophysics Data System (ADS)

    Tan, Xicheng; Yu, Liang; Bian, Fuling

    2007-06-01

    The Virtual Geographic Environment (VGE) has attracted wide attention as a kind of software information system that helps us understand and analyze the real geographic environment, and it has also expanded into an application service system for distributed environments--the distributed virtual geographic environment system (DVGE)--with some notable achievements. However, limited by the massive data volumes of VGE, network bandwidth, large numbers of concurrent requests, economic factors, and so on, DVGE still faces challenges and problems which prevent it from providing the public with high-quality service under the current network mode. The rapid development of peer-to-peer network technology offers new ideas for solving the current challenges and problems of DVGE. Peer-to-peer network technology is able to effectively publish and search network resources so as to realize efficient sharing of information. Accordingly, this paper brings forth a research subject on large-scale peer-to-peer network extension of DVGE, as well as a deep study of the network framework, routing mechanism, and DVGE data management on a P2P network.

  20. Ergonomic approaches to designing educational materials for immersive multi-projection system

    NASA Astrophysics Data System (ADS)

    Shibata, Takashi; Lee, JaeLin; Inoue, Tetsuri

    2014-02-01

    Rapid advances in computer and display technologies have made it possible to present high-quality virtual reality (VR) environments. To use such virtual environments effectively, research should be performed into how users perceive and react to virtual environments with respect to particular human factors. We created a VR simulation of sea fish for science education, and we conducted an experiment to examine how observers perceive the size and depth of an object within their reach and evaluated their visual fatigue. We chose a multi-projection system for presenting the educational VR simulation, because this system can provide actual-size objects and produce stereo images located close to the observer. The results of the experiment show that estimation of size and depth was relatively accurate when subjects used physical actions to assess them. Presenting images within the observer's reach is suggested to be useful for education in VR environments. The evaluation of visual fatigue shows that the level of symptoms from viewing stereo images with a large disparity in the VR environment was low for short viewing times.

  1. Crushing virtual cigarettes reduces tobacco addiction and treatment discontinuation.

    PubMed

    Girard, Benoit; Turcotte, Vincent; Bouchard, Stéphane; Girard, Bruno

    2009-10-01

    Pilot studies revealed promising results regarding crushing virtual cigarettes to reduce tobacco addiction. In this study, 91 regular smokers were randomly assigned to two treatment conditions that differed only in the action performed in the virtual environment: crushing virtual cigarettes or grasping virtual balls. All participants also received minimal psychosocial support from nurses during each of 12 visits to the clinic. An affordable virtual reality system was used (eMagin HMD) with a virtual environment created by modifying a 3D game. Results revealed that crushing virtual cigarettes during 4 weekly sessions led to a statistically significant reduction in nicotine addiction (assessed with the Fagerström test), a higher abstinence rate (confirmed with exhaled carbon monoxide), and a lower drop-out rate from the 12-week psychosocial minimal-support treatment program. Increased retention in the program is discussed as a potential explanation for treatment success, and hypotheses are raised about self-efficacy, motivation, and learning.

  2. BIM based virtual environment for fire emergency evacuation.

    PubMed

    Wang, Bin; Li, Haijiang; Rezgui, Yacine; Bradley, Alex; Ong, Hoang N

    2014-01-01

    Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies in how to utilize BIM as a comprehensive building information provider that works with virtual reality technologies to build an adaptable immersive serious game environment providing real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment, aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management.
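
    As a rough illustration of the "BIM as information provider" idea, the sketch below queries egress-relevant elements from an IFC model so a game-engine layer could build evacuation routes from them. It assumes the open-source ifcopenshell library; the file name and the choice of element types are hypothetical and not taken from the paper.

        # Rough sketch: pull evacuation-relevant elements (doors, stairs) out of a
        # BIM model so a game-engine layer can build escape routes from them.
        # Assumes the open-source ifcopenshell library; the file name is hypothetical.
        import ifcopenshell

        model = ifcopenshell.open("office_block.ifc")

        egress_elements = []
        for element in model.by_type("IfcDoor") + model.by_type("IfcStair"):
            egress_elements.append({
                "global_id": element.GlobalId,  # stable IFC identifier
                "name": element.Name,           # human-readable label, may be None
                "type": element.is_a(),         # "IfcDoor" or "IfcStair"
            })

        # This list could be re-sent to the virtual environment whenever the model
        # changes, keeping the evacuation guidance in sync with the building data.
        print(f"{len(egress_elements)} candidate egress elements found")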

  3. The benefits of virtual reality simulator training for laparoscopic surgery.

    PubMed

    Hart, Roger; Karthigasu, Krishnan

    2007-08-01

    Virtual reality is a computer-generated system that provides a representation of an environment. This review will analyse the literature with regard to any benefit to be derived from training with virtual reality equipment and describe the equipment currently available. Virtual reality systems are not yet realistic representations of the live operating environment, because they lack tactile sensation and do not represent a complete operation. The literature suggests that virtual reality training is a valuable learning tool for gynaecologists in training, particularly those in the early stages of their careers. Furthermore, it may be of benefit for the ongoing audit of surgical skills and for the early identification of a surgeon's deficiencies before operative incidents occur. It is only a matter of time before realistic virtual reality models of most complete gynaecological operations are available, with improved haptics as a result of improved computer technology. It is inevitable that, in the modern climate of litigation, virtual reality training will become an essential part of clinical training, as evidence for its effectiveness as a training tool exists, and in many countries training by operating on live animals is not possible.

  4. An artificial reality environment for remote factory control and monitoring

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    Work has begun on the merger of two well known systems, VEOS (HITLab) and CLIPS (NASA). In the recent past, the University of Massachusetts Lowell developed a parallel version of NASA CLIPS, called P-CLIPS. This modification allows users to create smaller expert systems which are able to communicate with each other to jointly solve problems. With the merger of a VEOS message system, PCLIPS-V can now act as a group of entities working within VEOS. To display the 3D virtual world we have been using a graphics package called HOOPS, from Ithaca Software. The artificial reality environment we have set up contains actors and objects as found in our Lincoln Logs Factory of the Future project. The environment allows us to view and control the objects within the virtual world. All communication between the separate CLIPS expert systems is done through VEOS. A graphical renderer generates camera views on X-Windows devices; Head Mounted Devices are not required. This allows more people to make use of this technology. We are experimenting with different types of virtual vehicles to give the user a sense that he or she is actually moving around inside the factory looking ahead through windows and virtual monitors.

  5. A Cooperative Approach to Virtual Machine Based Fault Injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naughton III, Thomas J; Engelmann, Christian; Vallee, Geoffroy R

    Resilience investigations often employ fault injection (FI) tools to study the effects of simulated errors on a target system. It is important to keep the target system under test (SUT) isolated from the controlling environment in order to maintain control of the experiment. Virtual machines (VMs) have been used to aid these investigations due to the strong isolation properties of system-level virtualization. A key challenge in fault injection tools is to gain proper insight and context about the SUT. In VM-based FI tools, this challenge of target context is increased due to the separation between host and guest (VM). We discuss an approach to VM-based FI that leverages virtual machine introspection (VMI) methods to gain insight into the target's context running within the VM. The key to this environment is the ability to provide basic information to the FI system that can be used to create a map of the target environment. We describe a proof-of-concept implementation and a demonstration of its use to introduce simulated soft errors into an iterative solver benchmark running in user-space of a guest VM.
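
    The simulated soft errors mentioned above amount, in the simplest case, to flipping a bit in the solver's working data while it runs. The sketch below mimics that effect directly in Python on a small Jacobi iteration; it is purely illustrative, uses no virtual machine introspection, and all names and injection parameters are hypothetical.

        # Illustrative only: simulate a soft error by flipping one bit in the state
        # of a simple Jacobi solver. A real VM-based fault injector would locate this
        # state via virtual machine introspection instead of holding it directly.
        import numpy as np

        def flip_bit(x, index, bit):
            """Flip one bit of the float64 element x[index] in place."""
            raw = x.view(np.uint64)                 # reinterpret the raw bits
            raw[index] ^= np.uint64(1) << np.uint64(bit)

        def jacobi(A, b, iters=50, inject_at=25):
            x = np.zeros_like(b)
            D = np.diag(A)
            R = A - np.diagflat(D)
            for k in range(iters):
                if k == inject_at:                  # injected soft error
                    flip_bit(x, index=0, bit=52)
                x = (b - R @ x) / D
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(jacobi(A, b))                         # compare with np.linalg.solve(A, b)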

  6. ICCE/ICCAI 2000 Full & Short Papers (Virtual Lab/Classroom/School).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on virtual laboratories, classrooms, and schools from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Collaborative Learning Support System Based on Virtual Environment Server for Multiple…

  7. Virtual Collaborative Environments for System of Systems Engineering and Applications for ISAT

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    2002-01-01

    This paper describes a system of systems, or metasystems, approach and models developed to help prepare engineering organizations for distributed engineering environments. These changes in engineering enterprises include competition in increasingly global environments, new partnering opportunities enabled by advances in information and communication technologies, and virtual collaboration issues associated with dispersed teams. To help address challenges and needs in this environment, a framework is proposed that can be customized and adapted for NASA to assist in improved engineering activities conducted in distributed, enhanced engineering environments. The approach is designed to prepare engineers for such distributed collaborative environments by learning and applying e-engineering methods and tools to a real-world engineering development scenario. The approach consists of two phases: an e-engineering basics phase and an e-engineering application phase. The e-engineering basics phase addresses the skills required for e-engineering. The e-engineering application phase applies these skills in a distributed collaborative environment to system development projects.

  8. Research on elastic resource management for multi-queue under cloud computing environment

    NASA Astrophysics Data System (ADS)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of the different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the HTCondor job queues, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. Practical runs show that virtual computing resources are dynamically expanded or shrunk as computing requirements change. Additionally, the CPU utilization ratio of the computing resources was significantly increased compared with traditional resource management. The system also performs well when there are multiple HTCondor schedulers and multiple job queues.
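
    The dual-threshold behaviour described above can be summarized as: expand the virtual machine pool when queued demand exceeds an upper threshold, shrink it when idle capacity exceeds a lower one, and never exceed the experiment's quota. The sketch below captures that decision in Python; the thresholds, names, and the surrounding HTCondor/OpenStack query and provisioning calls are hypothetical placeholders, not the IHEPCloud implementation.

        # Minimal sketch of dual-threshold elastic scaling for a job queue. The
        # queue/cloud query and provisioning calls that would surround this are
        # hypothetical placeholders for HTCondor and OpenStack operations.
        def scaling_decision(idle_jobs, idle_nodes, running_nodes, quota,
                             expand_threshold=10, shrink_threshold=5):
            """Return how many virtual worker nodes to add (>0) or remove (<0)."""
            if idle_jobs > expand_threshold:
                # Expand, but never beyond the experiment's quota.
                wanted = min(idle_jobs, quota - running_nodes)
                return max(wanted, 0)
            if idle_jobs == 0 and idle_nodes > shrink_threshold:
                # Shrink: release surplus idle nodes back to the cloud.
                return -(idle_nodes - shrink_threshold)
            return 0

        # Example: 42 idle jobs, no idle nodes, 80 nodes running, quota of 100.
        delta = scaling_decision(idle_jobs=42, idle_nodes=0, running_nodes=80, quota=100)
        print(delta)  # 20 -> request 20 more virtual machines from the cloud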

  9. Design of an efficient framework for fast prototyping of customized human-computer interfaces and virtual environments for rehabilitation.

    PubMed

    Avola, Danilo; Spezialetti, Matteo; Placidi, Giuseppe

    2013-06-01

    Rehabilitation is often required after stroke, surgery, or degenerative diseases. It has to be specific for each patient and can be easily calibrated if assisted by human-computer interfaces and virtual reality. Recognition and tracking of different human body landmarks represent the basic features for the design of the next generation of human-computer interfaces. The most advanced systems for capturing human gestures are focused on vision-based techniques which, on the one hand, may require compromises in real-time performance and spatial precision and, on the other hand, ensure a natural interaction experience. The integration of vision-based interfaces with thematic virtual environments encourages the development of novel applications and services for rehabilitation activities. The algorithmic processes involved in gesture recognition, as well as the characteristics of the virtual environments, can be developed with different levels of accuracy. This paper describes the architectural aspects of a framework supporting real-time vision-based gesture recognition and virtual environments for fast prototyping of customized exercises for rehabilitation purposes. The goal is to provide the therapist with a tool for fast implementation and modification of specific rehabilitation exercises for specific patients during functional recovery. Pilot examples of designed applications and a preliminary system evaluation are reported and discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. Ames Lab 101: osgBullet

    ScienceCinema

    McCorkle, Doug

    2017-12-27

    Ames Laboratory scientist Doug McCorkle explains osgBullet, a 3-D virtual simulation software, and how it helps engineers design complex products and systems in a realistic, real-time virtual environment.

  11. A Virtual Research Environment for a Secondary Ion Mass Spectrometer (SIMS)

    NASA Astrophysics Data System (ADS)

    Wiedenbeck, M.; Schäfer, L.; Klump, J.; Galkin, A.

    2013-12-01

    Overview: This poster describes the development of a Virtual Research Environment for the Secondary Ion Mass Spectrometer (SIMS) at GFZ Potsdam. Background: Secondary Ion Mass Spectrometers (SIMS) are extremely sensitive instruments for analyzing the surfaces of solid and thin-film samples. These instruments are rare and expensive, and experienced operators are highly sought after. As such, measurement time is a precious commodity, until now accessible only to small numbers of researchers. The challenge: The Virtual SIMS Project aims to set up a Virtual Research Environment for the operation of the CAMECA IMS 1280-HR instrument at GFZ Potsdam. The objective of the VRE is to provide SIMS access not only to researchers locally present in Potsdam but also to scientists working with SIMS cooperation partners in, e.g., South Africa, Brazil or India. The requirements: The system should address the complete spectrum of laboratory procedures - from online application for measurement time, to remote access for data acquisition, to data archiving for subsequent publication and future reuse. The approach: The targeted Virtual SIMS Environment will consist of: 1. a Web Server running the Virtual SIMS website, providing general information about the project, lab access proposal forms and a calendar for the timing of project-related tasks; 2. a LIMS Server, responsible for scheduling procedures, data management and, if applicable, accounting and billing; 3. a Remote SIMS Tool, devoted to the operation of the experiment within a remote control environment; 4. a Publishing System, which supports the publication of results in cooperation with the GFZ Library services; 5. a Training Simulator, which offers the opportunity to rehearse experiments and to prepare for possible events such as power outages or interruptions to broadband services. First results: The SIMS Virtual Research Environment will be based mainly on open source software, the only exception being the CAMECA IMS 1280-HR SIMS operating under LabView. The Publishing System will be based on eSciDoc, which is already successfully used by the GFZ scientific library. For the LIMS Server we are currently testing various options. The challenge, however, is the successful integration of all the various components and, where necessary, the definition of useful interfaces between the modules.

  12. Validation of reaching in a virtual environment in typically developing children and children with mild unilateral cerebral palsy.

    PubMed

    Robert, Maxime T; Levin, Mindy F

    2018-04-01

    To compare three reaching movements made in two planes between a low-cost, game-based virtual reality and a matched physical environment in typically developing children and children with cerebral palsy (CP). To determine if differences in kinematics are related to sensory deficits. An observational study in which 27 children (typically developing, n=17, mean age 13y, [SD] 2y 2mo, range 9y 3mo-17y 2mo; CP, n=10, mean age 13y 8mo, [SD] 1y 8mo, range 11y 1mo-17y 1mo, Manual Ability Classification System levels I-II) performed 15 trials of three gestures in each of a virtual reality and a matched physical environment. Upper-limb and trunk kinematics were recorded using an electromagnetic system (G4, Polhemus, six markers, 120Hz). Compared to the physical environment, movements in virtual reality made by typically developing children were slower (p=0.002), and involved less trunk flexion (p=0.002) and rotation (p=0.026). Children with CP had more curved trajectories (p=0.005) and used less trunk flexion (p=0.003) and rotation (p=0.005). Elbow and shoulder kinematics differed from 2.8% to 155.4% between environments in both groups. Between groups, there were small, clinically insignificant differences with only the vertical gesture being longer in typically developing children. Children with CP who had greater tactile impairment used more trunk displacement. Clinicians and researchers need to be aware of differences in movement variables when setting goals or designing protocols for improving reaching in children with CP using low-cost, game-based virtual reality systems. Upper-limb kinematics differed in each group when reaching in physical versus virtual environments. There were small differences in movements made by children with mild unilateral cerebral palsy (CP) compared to typically developing children. Differences in reaching kinematics should be considered when goal setting using virtual reality interventions for children with mild unilateral CP. © 2018 Mac Keith Press.

  13. VREPAR projects: the use of virtual environments in psycho-neuro-physiological assessment and rehabilitation.

    PubMed

    Riva, G; Bacchetta, M; Baruffi, M; Borgomainerio, E; Defrance, C; Gatti, F; Galimberti, C; Fontaneto, S; Marchi, S; Molinari, E; Nugues, P; Rinaldi, S; Rovetta, A; Ferretti, G S; Tonci, A; Wann, J; Vincelli, F

    1999-01-01

    Due, in large part, to the significant advances in PC hardware that have been made over the last 3 years, PC-based virtual environments are approaching reality. Virtual Reality Environments for Psychoneurophysiological Assessment and Rehabilitation (VREPAR) are two European Community funded projects (Telematics for Health - HC 1053/HC 1055, http://www.psicologia.net) that are trying to develop a PC-based virtual reality system (PC-VRS) for the medical market that can be marketed at a price accessible to its possible end users (hospitals, universities, and research centres) and that would have the modularity, connectability, and interoperability characteristics that existing systems lack. In particular, the projects are developing three hardware/software modules for the application of the PC-VRS in psycho-neuro-physiological assessment and rehabilitation. The chosen development areas are eating disorders (bulimia, anorexia, and obesity), movement disorders (Parkinson's disease and torsion dystonia) and stroke disorders (unilateral neglect and hemiparesis). This article describes the rationale of the modules and the preliminary results obtained.

  14. Engagement of neural circuits underlying 2D spatial navigation in a rodent virtual reality system

    PubMed Central

    Aronov, Dmitriy; Tank, David W.

    2015-01-01

    Virtual reality (VR) enables precise control of an animal's environment and otherwise impossible experimental manipulations. Neural activity in navigating rodents has been studied on virtual linear tracks. However, the spatial navigation system's engagement in complete two-dimensional environments has not been shown. We describe a VR setup for rats, including control software and a large-scale electrophysiology system, which supports 2D navigation by allowing animals to rotate and walk in any direction. The entorhinal-hippocampal circuit, including place cells, grid cells, head direction cells and border cells, showed 2D activity patterns in VR similar to those in the real world. Hippocampal neurons exhibited various remapping responses to changes in the appearance or the shape of the virtual environment, including a novel form in which a VR-induced cue conflict caused remapping to lock to geometry rather than salient cues. These results suggest a general-purpose tool for novel types of experimental manipulations in navigating rats. PMID:25374363

  15. DWTP: a basis for networked VR on the Internet

    NASA Astrophysics Data System (ADS)

    Broll, Wolfgang; Schick, Daniel

    1998-04-01

    Shared virtual worlds are one of today's major research topics. While limited to particular application areas and high-speed networks in the past, they have become more and more available to a large number of users. One reason for this development was the introduction of VRML (the Virtual Reality Modeling Language), which has been established as a standard for the exchange of 3D worlds on the Internet. Although a number of prototype systems have been developed to realize shared multi-user worlds based on VRML, no suitable network protocol to support the demands of such environments has yet been established. In this paper we will introduce our approach to a network protocol for shared virtual environments: DWTP--the Distributed Worlds Transfer and communication Protocol. We will show how DWTP meets the demands of shared virtual environments on the Internet. We will further present SmallView, our prototype of a distributed multi-user VR system, to show how DWTP can be used to realize shared worlds.

  16. VERS: a virtual environment for reconstructive surgery planning

    NASA Astrophysics Data System (ADS)

    Montgomery, Kevin N.

    1997-05-01

    The virtual environment for reconstructive surgery (VERS) project at the NASA Ames Biocomputation Center is applying virtual reality technology to aid surgeons in planning surgeries. We are working with a craniofacial surgeon at Stanford to assemble and visualize the bone structure of patients requiring reconstructive surgery due either to developmental abnormalities or to trauma. This project is an extension of our previous work in 3D reconstruction, mesh generation, and immersive visualization. The current VR system, consisting of an SGI Onyx RE2, FakeSpace BOOM and ImmersiveWorkbench, Virtual Technologies CyberGlove and Ascension Technologies tracker, is currently in development and has already been used to visualize defects preoperatively. In the near future it will be used to more fully plan the surgery and compute the projected result on the soft tissue structure. This paper presents the work in progress and details the production of a high-performance, collaborative, and networked virtual environment.

  17. Two-photon calcium imaging during fictive navigation in virtual environments

    PubMed Central

    Ahrens, Misha B.; Huang, Kuo Hua; Narayan, Sujatha; Mensh, Brett D.; Engert, Florian

    2013-01-01

    A full understanding of nervous system function requires recording from large populations of neurons during naturalistic behaviors. Here we enable paralyzed larval zebrafish to fictively navigate two-dimensional virtual environments while we record optically from many neurons with two-photon imaging. Electrical recordings from motor nerves in the tail are decoded into intended forward swims and turns, which are used to update a virtual environment displayed underneath the fish. Several behavioral features—such as turning responses to whole-field motion and dark avoidance—are well-replicated in this virtual setting. We readily observed neuronal populations in the hindbrain with laterally selective responses that correlated with right or left optomotor behavior. We also observed neurons in the habenula, pallium, and midbrain with response properties specific to environmental features. Beyond single-cell correlations, the classification of network activity in such virtual settings promises to reveal principles of brainwide neural dynamics during behavior. PMID:23761738

  18. Two-photon calcium imaging during fictive navigation in virtual environments.

    PubMed

    Ahrens, Misha B; Huang, Kuo Hua; Narayan, Sujatha; Mensh, Brett D; Engert, Florian

    2013-01-01

    A full understanding of nervous system function requires recording from large populations of neurons during naturalistic behaviors. Here we enable paralyzed larval zebrafish to fictively navigate two-dimensional virtual environments while we record optically from many neurons with two-photon imaging. Electrical recordings from motor nerves in the tail are decoded into intended forward swims and turns, which are used to update a virtual environment displayed underneath the fish. Several behavioral features-such as turning responses to whole-field motion and dark avoidance-are well-replicated in this virtual setting. We readily observed neuronal populations in the hindbrain with laterally selective responses that correlated with right or left optomotor behavior. We also observed neurons in the habenula, pallium, and midbrain with response properties specific to environmental features. Beyond single-cell correlations, the classification of network activity in such virtual settings promises to reveal principles of brainwide neural dynamics during behavior.

  19. Altering User Movement Behaviour in Virtual Environments.

    PubMed

    Simeone, Adalberto L; Mavridou, Ifigeneia; Powell, Wendy

    2017-04-01

    In immersive Virtual Reality systems, users tend to move in a Virtual Environment as they would in an analogous physical environment. In this work, we investigated how user behaviour is affected when the Virtual Environment differs from the physical space. We created two sets of four environments each, plus a virtual replica of the physical environment as a baseline. The first focused on aesthetic discrepancies, such as a water surface in place of solid ground. The second focused on mixing immaterial objects together with those paired to tangible objects, for example, barring an area with walls or obstacles. We designed a study where participants had to reach three waypoints laid out in such a way as to prompt a decision on which path to follow based on the conflict between the mismatching visual stimuli and their awareness of the real layout of the room. We analysed their performance to determine whether their trajectories deviated significantly from the shortest route. Our results indicate that participants altered their trajectories in the presence of surfaces representing higher walking difficulty (for example, water instead of grass). However, when the graphical appearance was found to be ambiguous, there was no significant trajectory alteration. The environments mixing immaterial with physical objects had the most impact on trajectories, with a mean deviation from the shortest route of 60 cm against the 37 cm of environments with aesthetic alterations. The co-existence of paired and unpaired virtual objects was reported to support the idea that all objects participants saw were backed by physical props. From these results and our observations, we derive guidelines on how to alter user movement behaviour in Virtual Environments.

  20. Multi-Agent Framework for Virtual Learning Spaces.

    ERIC Educational Resources Information Center

    Sheremetov, Leonid; Nunez, Gustavo

    1999-01-01

    Discussion of computer-supported collaborative learning, distributed artificial intelligence, and intelligent tutoring systems focuses on the concept of agents, and describes a virtual learning environment that has a multi-agent system. Describes a model of interactions in collaborative learning and discusses agents for Web-based virtual…

  1. The Virtual Tablet: Virtual Reality as a Control System

    NASA Technical Reports Server (NTRS)

    Chronister, Andrew

    2016-01-01

    In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating issues with difficulty of visually determining the interface location and lack of tactile feedback discovered in the development of previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment. My contribution was to combine the pieces of hardware, write software to incorporate each piece of position or orientation data into a coherent description of the object's location in space, place the virtual analogue accordingly, and project the control interface onto it, resulting in a functioning object which has both a physical and a virtual presence. Additionally, the virtual environment was enhanced with two live video feeds from cameras mounted on the robotic device being used as an example target of the virtual interface. The working VT allows users to naturally interact with a control interface with little to no training and without the issues found in previous efforts.
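
    Fusing the two tracking sources described in this record comes down to assembling one rigid-body pose: position from the motion-capture markers, orientation from the on-board sensor package. The sketch below composes such a pose as a 4x4 transform in Python with NumPy; the quaternion convention, the use of the marker centroid, and a shared coordinate frame are assumptions of this illustration rather than details of the Virtual Tablet software.

        # Minimal sketch: fuse marker-based position with sensor-package orientation
        # into one 4x4 pose matrix for the virtual surrogate. Quaternion order
        # (w, x, y, z) and a shared coordinate frame are assumptions.
        import numpy as np

        def quat_to_matrix(q):
            """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
            w, x, y, z = q / np.linalg.norm(q)
            return np.array([
                [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
            ])

        def surrogate_pose(marker_positions, sensor_quaternion):
            """Position = centroid of tracked markers; orientation = sensor quaternion."""
            pose = np.eye(4)
            pose[:3, :3] = quat_to_matrix(np.asarray(sensor_quaternion, dtype=float))
            pose[:3, 3] = np.mean(marker_positions, axis=0)
            return pose

        markers = np.array([[0.10, 1.20, 0.50], [0.20, 1.20, 0.50], [0.15, 1.30, 0.50]])
        pose = surrogate_pose(markers, sensor_quaternion=(1.0, 0.0, 0.0, 0.0))
        # 'pose' places the virtual tablet model; the control interface is drawn on it.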

  2. Teaching Network Security in a Virtual Learning Environment

    ERIC Educational Resources Information Center

    Bergstrom, Laura; Grahn, Kaj J.; Karlstrom, Krister; Pulkkis, Goran; Astrom, Peik

    2004-01-01

    This article presents a virtual course with the topic network security. The course has been produced by Arcada Polytechnic as a part of the production team Computer Networks, Telecommunication and Telecommunication Systems in the Finnish Virtual Polytechnic. The article begins with an introduction to the evolution of the information security…

  3. Building a Virtual Learning Network for Teachers in a Suburban School District

    ERIC Educational Resources Information Center

    Kurtzworth-Keen, Kristin A.

    2011-01-01

    Emerging research indicates that learning management systems such as Moodle can function as virtual, collaborative environments, where collegial interactions promote professional learning opportunities. This study deployed a mixed methods design in order to describe and analyze teacher participation in a virtual learning network (VLN) that was…

  4. The Virtual Classroom: A Catalyst for Institutional Transformation

    ERIC Educational Resources Information Center

    Subramaniam, Nantha Kumar; Kandasamy, Maheswari

    2011-01-01

    This study explores the use of the virtual classroom which has been created in "myVLE", a learning management system used by the Open University Malaysia (OUM). The virtual classroom in "myVLE" is an asynchronous-based online learning environment that delivers course materials to learners and provides collaboration and…

  5. Rapid prototyping 3D virtual world interfaces within a virtual factory environment

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.

  6. A computer-based training system combining virtual reality and multimedia

    NASA Technical Reports Server (NTRS)

    Stansfield, Sharon A.

    1993-01-01

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment. The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.

  7. Educational Virtual Environments as a Lens for Understanding both Precise Repeatability and Specific Variation in Learning Ecologies

    ERIC Educational Resources Information Center

    Zuiker, Steven J.

    2012-01-01

    As a global cyberinfrastructure, the Internet makes authentic digital problem spaces like educational virtual environments (EVEs) available to a wide range of classrooms, schools and education systems operating under different circumstantial, practical, social and cultural conditions. And yet, if the makers and users of EVEs both have a hand in…

  8. Nursing Faculty Experiences of Virtual Learning Environments for Teaching Clinical Reasoning

    ERIC Educational Resources Information Center

    Zacharzuk-Marciano, Tara

    2017-01-01

    Nurses need sharp, clinical reasoning skills to respond to critical situations and to be successful at work in a complex and challenging healthcare system. While past research has focused on using virtual learning environments to teach clinical reasoning, there has been limited research on the experiences of nursing faculty and there is a need for…

  9. Assessing the Utility of a Virtual Environment for Enhancing Facial Affect Recognition in Adolescents with Autism

    ERIC Educational Resources Information Center

    Bekele, Esubalew; Crittendon, Julie; Zheng, Zhi; Swanson, Amy; Weitlauf, Amy; Warren, Zachary; Sarkar, Nilanjan

    2014-01-01

    Teenagers with autism spectrum disorder (ASD) and age-matched controls participated in a dynamic facial affect recognition task within a virtual reality (VR) environment. Participants identified the emotion of a facial expression displayed at varied levels of intensity by a computer generated avatar. The system assessed performance (i.e.,…

  10. Virtual Solar System Project: Learning through a Technology-Rich, Inquiry-Based, Participatory Learning Environment.

    ERIC Educational Resources Information Center

    Barab, Sasha A.; Hay, Kenneth E.; Squire, Kurt; Barnett, Michael; Schmidt, Rae; Karrigan, Kristen; Yamagata-Lynch, Lisa; Johnson, Christine

    2000-01-01

    Describes an introductory undergraduate astronomy course in which the large-lecture format was moved to one in which students were immersed in a technologically-rich, inquiry-based, participatory learning environment. Finds that virtual reality can be used effectively in regular undergraduate university courses as a tool through which students can…

  11. Use and Mastery of Virtual Learning Environment in Brazilian Open University

    ERIC Educational Resources Information Center

    Gomez, Margarita Victoria

    2014-01-01

    This paper describes and analyses the dynamics of the use and/or mastery of Virtual Learning Environments (VLEs) by educators and students of the Open University, an important part of the Brazilian Educational System. A questionnaire with 32 items was answered by 174 students/instructors/coordinators of the Media in Education and Physics courses, of two…

  12. Advanced Technology for Portable Personal Visualization

    DTIC Science & Technology

    1993-01-01

    have no cable to drag." We submitted a short article describing the ceiling tracker and the requirements demanded of trackers in see-through systems...Newspaper/Magazine Articles: "Virtual Reality: It's All in the Mind," Atlanta Constitution, 29 September 1992; "Virtual Reality: Exploring the Future...basic scientific investigation of the human haptic system or to serve as haptic interfaces for virtual environments and teleoperation. 2. Research

  13. An interactive VR system based on full-body tracking and gesture recognition

    NASA Astrophysics Data System (ADS)

    Zeng, Xia; Sang, Xinzhu; Chen, Duo; Wang, Peng; Guo, Nan; Yan, Binbin; Wang, Kuiru

    2016-10-01

    Most current virtual reality (VR) interactions rely on a hand-held input device, which leads to a low degree of presence. Other solutions use sensors such as the Leap Motion to recognize users' gestures and enable more natural interaction, but navigation in these systems remains a problem, because they fail to map actual walking to virtual walking when only part of the user's body is represented in the synthetic environment. Therefore, we propose a system in which users can walk around in the virtual environment as a humanoid model, selecting menu items and manipulating virtual objects with natural hand gestures. With a Kinect depth camera, the system tracks the joints of the user and maps them to a full virtual body that follows the movements of the tracked user. Movements of the feet are detected to determine whether the user is in a walking state, so that the walking of the model in the virtual world can be started and stopped by means of animation control in the Unity engine. This method frees the user's hands compared to traditional navigation with a hand-held device. We use the point-cloud data obtained from the Kinect depth camera to recognize user gestures such as swiping, pressing, and manipulating virtual objects. By combining full-body tracking and gesture recognition with the Kinect, we achieve an interactive VR system in the Unity engine with a high degree of presence.
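    A minimal sketch of the walking-state check described above is given below. It assumes per-frame 3D positions of the two foot joints from a depth camera; the thresholds and the exact criterion are illustrative guesses, not the authors' Unity implementation.

    ```python
    import math
    from collections import deque

    class WalkDetector:
        """Toy walking-state detector: flags walking when the tracked feet keep
        moving relative to each other, as a stand-in for the Kinect-based check
        described above. Thresholds are illustrative, not taken from the paper."""

        def __init__(self, speed_threshold=0.15, window=10):
            self.speed_threshold = speed_threshold   # metres per second
            self.history = deque(maxlen=window)      # recent foot-separation speeds
            self.prev_separation = None

        def update(self, left_foot, right_foot, dt):
            sep = math.dist(left_foot, right_foot)   # 3D distance between the feet
            if self.prev_separation is not None and dt > 0:
                self.history.append(abs(sep - self.prev_separation) / dt)
            self.prev_separation = sep
            avg_speed = sum(self.history) / len(self.history) if self.history else 0.0
            return avg_speed > self.speed_threshold  # True -> play the walk animation

    detector = WalkDetector()
    # Feed one frame of (x, y, z) joint positions from the depth camera at 30 fps:
    is_walking = detector.update((0.1, 0.05, 2.0), (-0.1, 0.05, 2.3), dt=1 / 30)
    print("walking" if is_walking else "idle")
    ```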

  14. Creation of virtual patients for midwifery education.

    PubMed

    Urbanová, Eva; Bašková, Martina; Maskálová, Erika; Kvaltínyová, Eva

    2018-07-01

    The objective of the study was to create several new, original virtual patients (VPs) in the Slovak language, especially for educational purposes in midwifery. Virtual patients have been created for the needs of university midwifery education in Slovakia. The creation of the six virtual patients consisted of three fixed stages: preparation, design and development, and implementation into the virtual environment. We used the Open Labyrinth (OL) virtual environment, an open-source system for creating VPs. The VPs include six different scenarios covering the most common problems seen in midwifery practice: preterm birth, perinatal loss, gestational diabetes, ineffective breastfeeding, postpartum bleeding, and sudden home birth. Currently, the six original virtual patients are used in university midwifery education in Slovakia, both for contact teaching and for students' self-study. They are the first VPs created in academic settings in Slovakia and the Czech Republic. The future perspective of a virtual patient as an interactive process between the student and the medium is that it can deepen and improve learning outcomes, help solve specific midwifery issues, and reduce mistakes in the clinical environment. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. SAFARI: An Environment for Creating Tutoring Systems in Industrial Training.

    ERIC Educational Resources Information Center

    Gecsei, J.; Frasson, C.

    Safari is a cooperative project involving four Quebec universities, two industrial partners (Virtual Prototypes, Inc., providing the VAPS software package, and Novasys, Inc., a consulting firm specializing in artificial intelligence and training), and government. VAPS (Virtual Applications Prototyping System) is a commercial interface-building and…

  16. An integrated pipeline to create and experience compelling scenarios in virtual reality

    NASA Astrophysics Data System (ADS)

    Springer, Jan P.; Neumann, Carsten; Reiners, Dirk; Cruz-Neira, Carolina

    2011-03-01

    One of the main barriers to creating and using compelling scenarios in virtual reality is the complexity and time-consuming effort required for modeling, element integration, and the software development needed to properly display and interact with the content on the available systems. Still today, most virtual reality applications are tedious to create and are hard-wired to the specific display and interaction system available to the developers when the application was created. Furthermore, it is not possible to alter the content or the dynamics of the content once the application has been created. We present our research on designing a software pipeline that enables the creation of compelling scenarios with a fair degree of visual and interaction complexity in a semi-automated way. Specifically, we target drivable urban scenarios, ranging from large cities to sparsely populated rural areas, that incorporate both static components (e.g., houses, trees) and dynamic components (e.g., people, vehicles) as well as events, such as explosions or ambient noise. Our pipeline has four basic components. First, an environment designer, where users sketch the overall layout of the scenario and an automated method constructs the 3D environment from the information in the sketch. Second, a scenario editor used for authoring the complete scenario: incorporating the dynamic elements and events, fine-tuning the automatically generated environment, defining the execution conditions of the scenario, and setting up any data gathering that may be necessary during its execution. Third, a run-time environment for different virtual-reality systems that provides users with the interactive experience as designed with the designer and the editor. Fourth, a bi-directional monitoring system that allows information from the virtual environment to be captured and modified. One of the interesting capabilities of our pipeline is that scenarios can be built and modified on-the-fly as they are being presented in the virtual-reality systems. Users can quickly prototype the basic scene using the designer and the editor on a control workstation. More elements can then be introduced into the scene from both the editor and the virtual-reality display. In this manner, users are able to gradually increase the complexity of the scenario with immediate feedback. The main use of this pipeline is the rapid development of scenarios for human-factors studies; however, it is applicable in a much more general context.

  17. Architecture of web services in the enhancement of real-time 3D video virtualization in cloud environment

    NASA Astrophysics Data System (ADS)

    Bada, Adedayo; Wang, Qi; Alcaraz-Calero, Jose M.; Grecos, Christos

    2016-04-01

    This paper proposes a new approach to improving the application of 3D video rendering and streaming by jointly exploring and optimizing both cloud-based virtualization and web-based delivery. The proposed web service architecture first establishes a software virtualization layer based on QEMU (Quick Emulator), an open-source virtualization package that has been able to virtualize system components except for 3D rendering, which is still in its infancy. The architecture then exploits the cloud environment to boost the speed of rendering at the QEMU software virtualization layer. The capabilities and inherent limitations of Virgil 3D, one of the most advanced 3D virtual Graphics Processing Units (GPUs) available, are analyzed through benchmarking experiments and integrated into the architecture to further speed up the rendering. Experimental results are reported and analyzed to demonstrate the benefits of the proposed approach.

  18. Hybrid Cloud Computing Environment for EarthCube and Geoscience Community

    NASA Astrophysics Data System (ADS)

    Yang, C. P.; Qin, H.

    2016-12-01

    The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resources to be synchronized and burst between the private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience community can deploy and manage applications by using base virtual machine images or customized virtual machines, analyze big datasets by using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares each project's specific needs (e.g., images, port numbers, usable cloud capacity) in advance, based on communication between ECITE and the participating projects; the scientists or IT technicians in those projects then launch one or more virtual machines, access the virtual machine(s) to set up the computing environment if needed, and migrate their code, documents, or data without having to deal with the heterogeneity in structure and operation among different cloud platforms.
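    For illustration, the kind of self-service virtual machine launch described above might look roughly like the following on the OpenStack side, using the openstacksdk Python client. The cloud name, image, flavor, and network names are hypothetical placeholders for the per-project resources prepared by the platform administrator.

    ```python
    # Minimal sketch of a self-service VM launch on an OpenStack private cloud
    # (one of the two private-cloud stacks named above). Requires openstacksdk
    # and a configured clouds.yaml entry; all resource names are hypothetical.
    import openstack

    conn = openstack.connect(cloud="ecite-private")            # credentials from clouds.yaml

    image = conn.compute.find_image("earthcube-base-vm")       # prepared base image
    flavor = conn.compute.find_flavor("m1.medium")             # allocated capacity
    network = conn.network.find_network("project-net")         # project network

    server = conn.compute.create_server(
        name="bcube-worker-01",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    server = conn.compute.wait_for_server(server)
    print("VM ready:", server.name, server.status)
    ```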

  19. Visual landmarks facilitate rodent spatial navigation in virtual reality environments

    PubMed Central

    Youngstrom, Isaac A.; Strowbridge, Ben W.

    2012-01-01

    Because many different sensory modalities contribute to spatial learning in rodents, it has been difficult to determine whether spatial navigation can be guided solely by visual cues. Rodents moving within physical environments with visual cues engage a variety of nonvisual sensory systems that cannot be easily inhibited without lesioning brain areas. Virtual reality offers a unique approach to ask whether visual landmark cues alone are sufficient to improve performance in a spatial task. We found that mice could learn to navigate between two water reward locations along a virtual bidirectional linear track using a spherical treadmill. Mice exposed to a virtual environment with vivid visual cues rendered on a single monitor increased their performance over a 3-day training regimen. Training significantly increased the percentage of time avatars controlled by the mice spent near reward locations in probe trials without water rewards. Neither improvement during training nor spatial learning of reward locations occurred with mice operating a virtual environment without vivid landmarks or with mice deprived of all visual feedback. Mice operating the vivid environment developed stereotyped avatar turning behaviors when alternating between reward zones that were positively correlated with their performance on the probe trial. These results suggest that mice are able to learn to navigate to specific locations using only visual cues presented within a virtual environment rendered on a single computer monitor. PMID:22345484

  20. A novel augmented reality system of image projection for image-guided neurosurgery.

    PubMed

    Mahvash, Mehran; Besharati Tabrizi, Leila

    2013-05-01

    Augmented reality systems combine virtual images with a real environment. The aim of this work was to design and develop an augmented reality system for image-guided surgery of brain tumors using image projection. A virtual image was created in two ways: (1) an MRI-based 3D model of the head matched with the segmented lesion of a patient using MRIcro software (version 1.4, freeware, Chris Rorden) and (2) a digital photograph-based model in which the tumor region was drawn using image-editing software. The real environment was simulated with a head phantom. For direct projection of the virtual image onto the head phantom, a commercially available video projector (PicoPix 1020, Philips) was used. The position and size of the virtual image were adjusted manually for registration, which was performed using anatomical landmarks and fiducial marker positions. An augmented reality system for image-guided neurosurgery using direct image projection has been designed successfully and implemented in a first evaluation with promising results. The virtual image could be projected onto the head phantom and was registered manually. Accurate registration (mean projection error: 0.3 mm) was achieved using anatomical landmarks and fiducial marker positions. The direct projection of a virtual image onto the patient's head, skull, or brain surface in real time is an augmented reality approach that can be used for image-guided neurosurgery. In this paper, the first evaluation of the system is presented. The encouraging first visualization results indicate that the presented augmented reality system might be an important enhancement of image-guided neurosurgery.

  1. Using virtual environment technology for preadapting astronauts to the novel sensory conditions of microgravity

    NASA Technical Reports Server (NTRS)

    Duncan, K. M.; Harm, D. L.; Crosier, W. G.; Worthington, J. W.

    1993-01-01

    A unique training device is being developed at the Johnson Space Center Neurosciences Laboratory to help reduce or eliminate Space Motion Sickness (SMS) and spatial orientation disturbances that occur during spaceflight. The Device for Orientation and Motion Environments Preflight Adaptation Trainer (DOME PAT) uses virtual reality technology to simulate some sensory rearrangements experienced by astronauts in microgravity. By exposing a crew member to this novel environment preflight, it is expected that he/she will become partially adapted, and thereby suffer fewer symptoms inflight. The DOME PAT is a 3.7 m spherical dome, within which a 170 by 100 deg field of view computer-generated visual database is projected. The visual database currently in use depicts the interior of a Shuttle spacelab. The trainee uses a six degree-of-freedom, isometric force hand controller to navigate through the virtual environment. Alternatively, the trainee can be 'moved' about within the virtual environment by the instructor, or can look about within the environment by wearing a restraint that controls scene motion in response to head movements. The computer system is comprised of four personal computers that provide the real time control and user interface, and two Silicon Graphics computers that generate the graphical images. The image generator computers use custom algorithms to compensate for spherical image distortion, while maintaining a video update rate of 30 Hz. The DOME PAT is the first such system known to employ virtual reality technology to reduce the untoward effects of the sensory rearrangement associated with exposure to microgravity, and it does so in a very cost-effective manner.

  2. Usability and Feasibility of an Internet-Based Virtual Pedestrian Environment to Teach Children to Cross Streets Safely

    PubMed Central

    Schwebel, David C.; McClure, Leslie A.; Severson, Joan

    2013-01-01

    Child pedestrian injury is a preventable global health challenge. Successful training efforts focused on child behavior, including individualized streetside training and training in large virtual pedestrian environments, are laborious and expensive. This study considers the usability and feasibility of a virtual pedestrian environment “game” application to teach children safe street-crossing behavior via the internet, a medium that could be broadly disseminated at low cost. Ten 7- and 8-year-old children participated. They engaged in an internet-based virtual pedestrian environment and completed a brief assessment survey. Researchers rated children's behavior while engaged in the game. Both self-report and researcher observations indicated the internet-based system was readily used by the children without adult support. The youth understood how to engage in the system and used it independently and attentively. The program also was feasible. It provided multiple measures of pedestrian safety that could be used for research or training purposes. Finally, the program was rated by children as engaging and educational. Researcher ratings suggested children used the program with minimal fidgeting or boredom. The pilot test suggests an internet-based virtual pedestrian environment offers a usable, feasible, engaging, and educational environment for child pedestrian safety training. If future research finds children learn the cognitive and perceptual skills needed to cross streets safely within it, internet-based training may provide a low-cost medium to broadly disseminate child pedestrian safety training. The concept may be generalized to other domains of health-related functioning such as teen driving safety, adolescent sexual risk-taking, and adolescent substance use. PMID:24678263

  3. Assessing Student Learning in a Virtual Laboratory Environment

    ERIC Educational Resources Information Center

    Wolf, T.

    2010-01-01

    Laboratory experience is a key factor in technical and scientific education. Virtual laboratories have been proposed to reduce cost and simplify maintenance of lab facilities while still providing students with access to real systems. It is important to determine if such virtual labs are still effective for student learning. In the assessment of a…

  4. The Effect of Virtual versus Traditional Learning in Achieving Competency-Based Skills

    ERIC Educational Resources Information Center

    Mosalanejad, Leili; Shahsavari, Sakine; Sobhanian, Saeed; Dastpak, Mehdi

    2012-01-01

    Background: By rapid developing of the network technology, the internet-based learning methods are substituting the traditional classrooms making them expand to the virtual network learning environment. The purpose of this study was to determine the effectiveness of virtual systems on competency-based skills of first-year nursing students.…

  5. Combined virtual and real robotic test-bed for single operator control of multiple robots

    NASA Astrophysics Data System (ADS)

    Lee, Sam Y.-S.; Hunt, Shawn; Cao, Alex; Pandya, Abhilash

    2010-04-01

    Teams of heterogeneous robots with different dynamics or capabilities can perform a variety of tasks such as multipoint surveillance, cooperative transport, and exploration in hazardous environments. In this study, we work with heterogeneous teams of semi-autonomous ground and aerial robots for contaminant localization. We developed a human interface system that links every real robot to its virtual counterpart. A novel virtual interface, integrated with Augmented Reality, can monitor the positions and sensory information from the video feeds of ground and aerial robots in the 3D virtual environment and improve user situational awareness. An operator can efficiently control the real robots by using the Drag-to-Move method on their virtual counterparts. This enables an operator to control groups of heterogeneous robots in a collaborative way, allowing more contaminant sources to be pursued simultaneously. An advanced feature of the virtual interface system is guarded teleoperation, which can be used to prevent operators from accidentally driving multiple robots into walls and other objects. Moreover, the image guidance and tracking feature reduces operator workload.
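    The guarded-teleoperation idea can be sketched in a few lines: the operator's commanded velocity is scaled down as the distance to the nearest obstacle shrinks and zeroed inside a stop distance. This is an illustrative sketch with made-up thresholds, not the authors' implementation.

    ```python
    def guard_velocity(v_cmd, obstacle_distance, stop_dist=0.3, slow_dist=1.5):
        """Scale the operator's commanded velocity based on the nearest obstacle.

        Inside stop_dist the robot refuses to move toward the obstacle; between
        stop_dist and slow_dist the command is scaled linearly; beyond slow_dist
        it passes through unchanged. Distances in metres; values illustrative."""
        if obstacle_distance <= stop_dist:
            return 0.0
        if obstacle_distance >= slow_dist:
            return v_cmd
        scale = (obstacle_distance - stop_dist) / (slow_dist - stop_dist)
        return v_cmd * scale

    # Operator pushes the virtual robot forward at 0.8 m/s while a wall is nearby:
    print(guard_velocity(0.8, 0.6))   # reduced speed
    print(guard_velocity(0.8, 0.2))   # too close -> 0.0
    ```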

  6. VERSE - Virtual Equivalent Real-time Simulation

    NASA Technical Reports Server (NTRS)

    Zheng, Yang; Martin, Bryan J.; Villaume, Nathaniel

    2005-01-01

    Distributed real-time simulations provide important timing validation and hardware-in-the-loop results for the spacecraft flight software development cycle. Occasionally, the need for higher-fidelity modeling and more comprehensive debugging capabilities, combined with a limited amount of computational resources, calls for a non-real-time simulation environment that mimics the real-time environment. By creating a non-real-time environment that accommodates simulations and flight software designed for a multi-CPU real-time system, we can save development time, cut mission costs, and reduce the likelihood of errors. This paper presents such a solution: the Virtual Equivalent Real-time Simulation Environment (VERSE). VERSE turns the real-time operating system RTAI (Real-time Application Interface) into an event-driven simulator that runs in virtual real time. Designed to keep the original RTAI architecture as intact as possible, and therefore inheriting RTAI's many capabilities, VERSE was implemented with remarkably little change to the RTAI source code. This small footprint, together with use of the same API, allows users to easily run the same application in both real-time and virtual-time environments. VERSE has been used to build a workstation testbed for NASA's Space Interferometry Mission (SIM PlanetQuest) instrument flight software. With its flexible simulation controls and inexpensive setup and replication costs, VERSE will become an invaluable tool in future mission development.
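    The notion of running in "virtual real time" can be pictured with a toy event-driven scheduler: instead of sleeping in wall-clock time, the clock simply jumps to the timestamp of the next pending event. The sketch below is a generic discrete-event loop in Python, not VERSE or RTAI code.

    ```python
    import heapq

    class VirtualTimeScheduler:
        """Toy event-driven scheduler: the clock advances directly to the next
        pending event instead of waiting in wall-clock time, which is the basic
        idea behind running real-time tasks in 'virtual' time."""

        def __init__(self):
            self.now = 0.0
            self._events = []    # heap of (time, sequence, callback)
            self._seq = 0

        def schedule(self, delay, callback):
            heapq.heappush(self._events, (self.now + delay, self._seq, callback))
            self._seq += 1

        def run(self):
            while self._events:
                self.now, _, callback = heapq.heappop(self._events)
                callback(self)

    def periodic_task(sched, period=0.01, count=[0]):
        count[0] += 1
        print(f"task tick {count[0]} at t={sched.now:.2f}s")
        if count[0] < 3:
            sched.schedule(period, lambda s: periodic_task(s, period, count))

    sched = VirtualTimeScheduler()
    sched.schedule(0.0, lambda s: periodic_task(s))
    sched.run()   # finishes instantly in wall-clock time, 0.02 s in virtual time
    ```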

  7. Exploiting virtual synchrony in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, Thomas A.

    1987-01-01

    Applications of a virtually synchronous environment are described for distributed programming, which underlies a collection of distributed programming tools in the ISIS2 system. A virtually synchronous environment allows processes to be structured into process groups, and makes events like broadcasts to the group as an entity, group membership changes, and even migration of an activity from one place to another appear to occur instantaneously, in other words, synchronously. A major advantage to this approach is that many aspects of a distributed application can be treated independently without compromising correctness. Moreover, user code that is designed as if the system were synchronous can often be executed concurrently. It is argued that this approach to building distributed and fault tolerant software is more straightforward, more flexible, and more likely to yield correct solutions than alternative approaches.

  8. An optical brain computer interface for environmental control.

    PubMed

    Ayaz, Hasan; Shewokis, Patricia A; Bunce, Scott; Onaral, Banu

    2011-01-01

    A brain computer interface (BCI) is a system that translates neurophysiological signals detected from the brain to supply input to a computer or to control a device. Volitional control of neural activity and its real-time detection through neuroimaging modalities are key constituents of BCI systems. The purpose of this study was to develop and test a new BCI design that utilizes intention-related cognitive activity within the dorsolateral prefrontal cortex using functional near infrared (fNIR) spectroscopy. fNIR is a noninvasive, safe, portable and affordable optical technique with which to monitor hemodynamic changes in the brain's cerebral cortex. Because of its portability and ease of use, fNIR is amenable to deployment in ecologically valid natural working environments. We integrated a control paradigm in a computerized 3D virtual environment to augment interactivity. Ten healthy participants volunteered for a two-day study in which they navigated a virtual environment with keyboard inputs but were required to use the fNIR-BCI for interaction with virtual objects. Results showed that participants consistently utilized the fNIR-BCI with an overall success rate of 84% and volitionally increased their cerebral oxygenation level to trigger actions within the virtual environment.
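    A simple way to picture the control paradigm is a baseline-plus-threshold trigger on the oxygenation signal: the system fires an action in the virtual environment when the current reading rises a margin above a running resting baseline. The sketch below is illustrative only; the window length and margin are invented, not taken from the study.

    ```python
    from collections import deque

    class OxygenationTrigger:
        """Toy trigger in the spirit of the fNIR-BCI above: compare the current
        oxygenation reading against a running baseline and fire when the user
        volitionally raises it past a margin. All values are illustrative."""

        def __init__(self, baseline_len=50, margin=0.5):
            self.baseline = deque(maxlen=baseline_len)   # recent resting samples
            self.margin = margin                         # change needed to trigger

        def update(self, sample):
            if len(self.baseline) < self.baseline.maxlen:
                self.baseline.append(sample)             # still building the baseline
                return False
            mean = sum(self.baseline) / len(self.baseline)
            return sample - mean > self.margin           # True -> act in the VE

    trigger = OxygenationTrigger(baseline_len=5, margin=0.5)
    readings = [0.1, 0.0, 0.2, 0.1, 0.1, 0.8, 0.9]       # last two: intentional effort
    for r in readings:
        if trigger.update(r):
            print("trigger fired at reading", r)
    ```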

  9. Comparing two types of navigational interfaces for Virtual Reality.

    PubMed

    Teixeira, Luís; Vilar, Elisângela; Duarte, Emília; Rebelo, Francisco; da Silva, Fernando Moreira

    2012-01-01

    Previous studies suggest significant differences between navigating virtual environments in a life-like walking manner (i.e., using treadmills or walk-in-place techniques) and virtual navigation (i.e., flying while actually standing still). The latter option, which usually involves hand-centric devices (e.g., joysticks), is the most common in Virtual Reality-based studies, mostly due to lower costs and smaller space and technology demands. Recently, however, new interaction devices originally conceived for videogames have become available, offering interesting potential for research. This study aimed to explore the potential of the Nintendo Wii Balance Board as a navigation interface in a virtual environment presented in an immersive Virtual Reality system. Comparing participants' performance while engaged in a simulated emergency egress allows the adequacy of such an alternative navigation interface to be determined on the basis of empirical results. Forty university students participated in this study. Results show that participants were more efficient when performing navigation tasks with the joystick than with the Balance Board. However, there were no significant differences in behavioral compliance with exit signs. Therefore, this study suggests that, at least for tasks similar to those studied, the Balance Board has good potential as a navigation interface for Virtual Reality systems.
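    A plausible mapping from a balance board to navigation is to compute the centre of pressure from the four corner load cells and use the forward/backward component as speed and the lateral component as turn rate. The sketch below illustrates that idea with invented gains and axis conventions; it is not the interface used in the study.

    ```python
    def board_to_motion(tl, tr, bl, br, speed_gain=2.0, turn_gain=1.5):
        """Map the four corner loads of a balance board (top-left, top-right,
        bottom-left, bottom-right, in kg) to a forward speed and a turn rate.
        Leaning forward moves the viewpoint ahead, leaning sideways turns it.
        A simplified illustration; gains and axes are not taken from the study."""
        total = tl + tr + bl + br
        if total <= 0:
            return 0.0, 0.0
        # Centre of pressure in the range [-1, 1] along each axis.
        cop_x = ((tr + br) - (tl + bl)) / total   # positive means leaning right
        cop_y = ((tl + tr) - (bl + br)) / total   # positive means leaning forward
        return cop_y * speed_gain, cop_x * turn_gain   # (m/s, rad/s)

    # Standing with slightly more weight on the toes and the right foot:
    forward_speed, turn_rate = board_to_motion(18, 22, 14, 16)
    print(f"forward {forward_speed:.2f} m/s, turn {turn_rate:.2f} rad/s")
    ```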

  10. The Implementation and Validation of a Virtual Environment for Training Powered Wheelchair Manoeuvres.

    PubMed

    John, Nigel W; Pop, Serban R; Day, Thomas W; Ritsos, Panagiotis D; Headleand, Christopher J

    2018-05-01

    Navigating a powered wheelchair and avoiding collisions is often a daunting task for new wheelchair users. It takes time and practice to gain the coordination needed to become a competent driver, and this can be even more of a challenge for someone with a disability. We present a cost-effective virtual reality (VR) application that takes advantage of consumer-level VR hardware. The system can be easily deployed in an assessment centre or for home use, and does not depend on a specialized high-end virtual environment such as a Powerwall or CAVE. This paper reviews previous work that has used virtual environment technology for training tasks, particularly wheelchair simulation. We then describe the implementation of our own system and the first validation study carried out with thirty-three able-bodied volunteers. The study results indicate that, at a significance level of 5 percent, there is an improvement in driving skills from the use of our VR system. We thus have the potential to develop the competency of a wheelchair user whilst avoiding the risks inherent in training in the real world. However, the occurrence of cybersickness is a particular problem in this application that will need to be addressed.

  11. The NASA Augmented/Virtual Reality Lab: The State of the Art at KSC

    NASA Technical Reports Server (NTRS)

    Little, William

    2017-01-01

    The NASA Augmented Virtual Reality (AVR) Lab at Kennedy Space Center is dedicated to the investigation of Augmented Reality (AR) and Virtual Reality (VR) technologies, with the goal of determining potential uses of these technologies as human-computer interaction (HCI) devices in an aerospace engineering context. Begun in 2012, the AVR Lab has concentrated on commercially available AR and VR devices that are gaining in popularity and use in a number of fields such as gaming, training, and telepresence. We are working with such devices as the Microsoft Kinect, the Oculus Rift, the Leap Motion, the HTC Vive, motion capture systems, and the Microsoft Hololens. The focus of our work has been on human interaction with the virtual environment, which in turn acts as a communications bridge to remote physical devices and environments which the operator cannot or should not control or experience directly. Particularly in reference to dealing with spacecraft and the oftentimes hazardous environments they inhabit, it is our hope that AR and VR technologies can be utilized to increase human safety and mission success by physically removing humans from those hazardous environments while virtually putting them right in the middle of those environments.

  12. Review of Enabling Technologies to Facilitate Secure Compute Customization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aderholdt, Ferrol; Caldwell, Blake A; Hicks, Susan Elaine

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data for a variety of users, often requiring strong separation between job allocations. There are many challenges to establishing these secure enclaves within the shared infrastructure of high-performance computing (HPC) environments. The isolation mechanisms in the system software are the basic building blocks for enabling secure compute enclaves. There are a variety of approaches, and the focus of this report is to review the different virtualization technologies that facilitate the creation of secure compute enclaves. The report reviews current operating system (OS) protection mechanisms and modern virtualization technologies to better understand the performance/isolation properties. We also examine the feasibility of running "virtualized" computing resources as non-privileged users, and providing controlled administrative permissions for standard users running within a virtualized context. Our examination includes technologies such as Linux containers (LXC [32], Docker [15]) and full virtualization (KVM [26], Xen [5]). We categorize these different approaches to virtualization into two broad groups: OS-level virtualization and system-level virtualization. OS-level virtualization uses containers to allow a single OS kernel to be partitioned to create Virtual Environments (VE), e.g., LXC. The resources within the host's kernel are only virtualized in the sense of separate namespaces. In contrast, system-level virtualization uses hypervisors to manage multiple OS kernels and virtualize the physical resources (hardware) to create Virtual Machines (VM), e.g., Xen, KVM. This terminology of VE and VM, detailed in Section 2, is used throughout the report to distinguish between the two different approaches to providing virtualized execution environments. As part of our technology review we analyzed several current virtualization solutions to assess their vulnerabilities. This included a review of common vulnerabilities and exposures (CVEs) for Xen, KVM, LXC and Docker to gauge their susceptibility to different attacks. The complete details are provided in Section 5 on page 33. Based on this review we concluded that system-level virtualization solutions have many more vulnerabilities than OS-level virtualization solutions. As such, security mechanisms like sVirt (Section 3.3) should be considered when using system-level virtualization solutions in order to protect the host against exploits. The majority of vulnerabilities related to KVM, LXC, and Docker are in specific regions of the system. Therefore, future "zero day attacks" are likely to be in the same regions, which suggests that protecting these areas can simplify the protection of the host and maintain the isolation between users. The evaluations of virtualization technologies done thus far are discussed in Section 4. This includes experiments with 'user' namespaces in VEs, which provide the ability to isolate user privileges and allow a user to run with different UIDs within the container while mapping them to non-privileged UIDs in the host. We have identified Linux namespaces as a promising mechanism to isolate shared resources, while maintaining good performance.
    In Section 4.1 we describe our tests with LXC as a non-root user, leveraging namespaces to control UID/GID mappings and support controlled sharing of parallel file systems. We highlight several of these namespace capabilities in Section 6.2.3. The other evaluations performed during this initial phase of work provide baseline performance data for comparing VEs and VMs to purely native execution. In Section 4.2 we performed tests using the High-Performance Computing Conjugate Gradient (HPCCG) benchmark to establish baseline performance for a scientific application when run on the native (host) machine in contrast with execution under Docker and KVM. Our tests verified prior studies showing roughly 2-4% overheads in application execution time and MFlops when running in hypervisor-based environments (VMs), as compared to near-native performance with VEs. For more details, see Figures 4.5 (page 28), 4.6 (page 28), and 4.7 (page 29). Additionally, in Section 4.3 we include network measurements for TCP bandwidth performance over the 10GigE interface in our testbed. The native and Docker-based tests achieved approximately 9 Gbits/sec or more, while the KVM configuration only achieved 2.5 Gbits/sec (Table 4.6 on page 32). This may be a configuration issue with our KVM installation, and is a point for further testing as we refine the network settings in the testbed. The initial network tests were done using a bridged networking configuration. The report outline is as follows: - Section 1 introduces the report and clarifies the scope of the proj...
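    The 'user' namespace mechanism highlighted in the report can be illustrated with a short Linux-only sketch: an unprivileged process creates a new user namespace and maps UID 0 inside it to its own unprivileged UID on the host, which is the basic building block behind running containers as a non-root user. This is a simplified illustration, not what LXC or Docker actually execute.

    ```python
    # Minimal, Linux-only sketch of the 'user' namespace idea: an unprivileged
    # process unshares into a new user namespace and maps UID 0 inside the
    # namespace to its own unprivileged UID on the host.
    import ctypes
    import os

    CLONE_NEWUSER = 0x10000000
    libc = ctypes.CDLL("libc.so.6", use_errno=True)

    host_uid, host_gid = os.getuid(), os.getgid()

    if libc.unshare(CLONE_NEWUSER) != 0:
        raise OSError(ctypes.get_errno(), "unshare(CLONE_NEWUSER) failed")

    # Map root inside the namespace to the unprivileged host identity.
    with open("/proc/self/setgroups", "w") as f:   # required before writing gid_map
        f.write("deny")
    with open("/proc/self/uid_map", "w") as f:
        f.write(f"0 {host_uid} 1")
    with open("/proc/self/gid_map", "w") as f:
        f.write(f"0 {host_gid} 1")

    # Inside the namespace the process now reports UID 0, while on the host it
    # still runs with its original, non-privileged identity.
    print("uid inside namespace:", os.getuid())
    ```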

  13. BIM Based Virtual Environment for Fire Emergency Evacuation

    PubMed Central

    Rezgui, Yacine; Ong, Hoang N.

    2014-01-01

    Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM-based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues in building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper lies on how to utilize BIM as a comprehensive building information provider that works with virtual reality technologies to build an adaptable immersive serious game environment providing real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious-game-based virtual reality (VR) environment, aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management. PMID:25197704

  14. Surgery applications of virtual reality

    NASA Technical Reports Server (NTRS)

    Rosen, Joseph

    1994-01-01

    Virtual reality is a computer-generated technology which allows information to be displayed in a simulated, but lifelike, environment. In this simulated 'world', users can move and interact as if they were actually a part of that world. This new technology will be useful in many different fields, including the field of surgery. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and perform surgical procedures (telesurgery), and predict the outcomes of surgery. The authors of this paper describe the basic components of a virtual reality surgical system. These components include: the virtual world, the virtual tools, the anatomical model, the software platform, the host computer, the interface, and the head-coupled display. In the chapter they also review the progress towards using virtual reality for surgical training, planning, telesurgery, and predicting outcomes. Finally, the authors present a training system being developed for the practice of new procedures in abdominal surgery.

  15. Virtual environment display for a 3D audio room simulation

    NASA Astrophysics Data System (ADS)

    Chapin, William L.; Foster, Scott

    1992-06-01

    Recent developments in virtual 3D audio and synthetic aural environments have produced a complex acoustical room simulation. The acoustical simulation models a room with walls, ceiling, and floor of selected sound reflecting/absorbing characteristics and unlimited independent localizable sound sources. This non-visual acoustic simulation, implemented with four Convolvotron™ audio processors by Crystal River Engineering and coupled to the listener with a Polhemus Isotrak™ tracking the listener's head position and orientation, and stereo headphones returning binaural sound, is quite compelling to most listeners with eyes closed. This immersive effect should be reinforced when properly integrated into a full, multi-sensory virtual environment presentation. This paper discusses the design of an interactive, visual virtual environment, complementing the acoustic model and specified to: 1) allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; 2) reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; 3) enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and 4) serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, is discussed through the development of four iterative configurations. The installed system implements a head-coupled, wide-angle, stereo-optic tracker/viewer and multi-computer simulation control. The portable demonstration system implements a head-mounted, wide-angle, stereo-optic display, separate head and pointer electro-magnetic position trackers, a heterogeneous parallel graphics processing system, and object-oriented C++ program code.
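    The binaural-rendering step performed by the Convolvotron hardware can be pictured as convolving a mono source with left- and right-ear head-related impulse responses (HRIRs) chosen for the source's direction relative to the tracked head. The sketch below uses placeholder HRIRs so it runs on its own; real HRIRs would come from a measured data set.

    ```python
    # Minimal sketch of binaural rendering by HRIR convolution. The HRIR arrays
    # built here are placeholders (a delayed, attenuated impulse per ear) so the
    # example is self-contained; real ones depend on the source direction.
    import numpy as np

    fs = 44100
    t = np.arange(fs) / fs
    source = 0.3 * np.sin(2 * np.pi * 440 * t)            # 1 s mono test tone

    hrir_left = np.zeros(128);  hrir_left[5] = 1.0
    hrir_right = np.zeros(128); hrir_right[12] = 0.7      # farther ear: later, quieter

    left = np.convolve(source, hrir_left)
    right = np.convolve(source, hrir_right)
    binaural = np.stack([left, right], axis=1)            # samples x 2, for headphones

    print("binaural buffer shape:", binaural.shape)
    ```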

  16. A virtual data language and system for scientific workflow management in data grid environments

    NASA Astrophysics Data System (ADS)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data- intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.
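    The two ideas emphasised above, separating a dataset's logical type from its physical representation and recording the derivation history of each data product, can be sketched in a few lines. The example below is an illustration only; it is not the virtual data language itself, and all dataset names, types, and URIs are invented.

    ```python
    # Illustrative sketch only (not the virtual data language): a dataset's
    # logical type is kept separate from its physical location, and each
    # derived product records how it was produced so its history can be traced.
    from dataclasses import dataclass, field

    @dataclass
    class Dataset:
        name: str
        logical_type: str                 # e.g. "ImageCollection", independent of storage
        physical_uri: str                 # e.g. a file path or object-store URL
        derived_from: list = field(default_factory=list)
        transformation: str = ""

    def apply(transformation: str, inputs: list, out_name: str,
              out_type: str, out_uri: str) -> Dataset:
        """Run (or pretend to run) a workflow step and record its provenance."""
        return Dataset(out_name, out_type, out_uri,
                       derived_from=[d.name for d in inputs],
                       transformation=transformation)

    raw = Dataset("survey_raw", "ImageCollection", "gridftp://site-a/survey/raw")
    calibrated = apply("calibrate", [raw], "survey_cal", "ImageCollection",
                       "gridftp://site-b/survey/cal")
    catalog = apply("extract_sources", [calibrated], "catalog", "SourceTable",
                    "file:///scratch/catalog.fits")

    print(catalog.transformation, "<-", catalog.derived_from)   # derivation history
    ```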

  17. The effect on lower spine muscle activation of walking on a narrow beam in virtual reality.

    PubMed

    Antley, Angus; Slater, Mel

    2011-02-01

    To what extent do people behave in immersive virtual environments as they would in similar situations in a physical environment? There are many ways to address this question, ranging from questionnaires to behavioral studies and the use of physiological measures. Here, we compare the onsets of muscle activity using surface electromyography (EMG) while participants were walking under three different conditions: on a normal floor surface, on a narrow ribbon along the floor, and on a narrow platform raised off the floor. The same situation was rendered in an immersive virtual environment (IVE) Cave-like system, and 12 participants did the three types of walking in a counter-balanced within-groups design. The mean number of EMG activity onsets per unit time followed the same pattern in the virtual environment as in the physical environment: significantly higher for walking on the platform compared to walking on the floor. Even though participants knew that they were in fact really walking at floor level in the virtual environment condition, the visual illusion of walking on a raised platform was sufficient to influence their behavior in a measurable way. This opens up the door for this technique to be used in gait- and posture-related scenarios, including rehabilitation.

  18. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual environment capable of immersing trainees in a virtual setting where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user's "sense of touch" within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.

  19. A training platform for many-dimensional prosthetic devices using a virtual reality environment

    PubMed Central

    Putrino, David; Wong, Yan T.; Weiss, Adam; Pesaran, Bijan

    2014-01-01

    Brain machine interfaces (BMIs) have the potential to assist in the rehabilitation of millions of patients worldwide. Despite recent advancements in BMI technology for the restoration of lost motor function, a training environment to restore full control of the anatomical segments of an upper limb extremity has not yet been presented. Here, we develop a virtual upper limb prosthesis with 27 independent dimensions, the anatomical dimensions of the human arm and hand, and deploy the virtual prosthesis as an avatar in a virtual reality environment (VRE) that can be controlled in real-time. The prosthesis avatar accepts kinematic control inputs that can be captured from movements of the arm and hand as well as neural control inputs derived from processed neural signals. We characterize the system performance under kinematic control using a commercially available motion capture system. We also present the performance under kinematic control achieved by two non-human primates (Macaca Mulatta) trained to use the prosthetic avatar to perform reaching and grasping tasks. This is the first virtual prosthetic device that is capable of emulating all the anatomical movements of a healthy upper limb in real-time. Since the system accepts both neural and kinematic inputs for a variety of many-dimensional skeletons, we propose it provides a customizable training platform for the acquisition of many-dimensional neural prosthetic control. PMID:24726625

  20. Dynamic Extension of a Virtualized Cluster by using Cloud Resources

    NASA Astrophysics Data System (ADS)

    Oberst, Oliver; Hauth, Thomas; Kernert, David; Riedel, Stephan; Quast, Günter

    2012-12-01

    The specific requirements concerning the software environment within the HEP community constrain the choice of resource providers for the outsourcing of computing infrastructure. The use of virtualization in HPC clusters and in the context of cloud resources is therefore a subject of recent developments in scientific computing. The dynamic virtualization of worker nodes in common batch systems provided by ViBatch serves each user with a dynamically virtualized subset of worker nodes on a local cluster. Now it can be transparently extended by the use of common open source cloud interfaces like OpenNebula or Eucalyptus, launching a subset of the virtual worker nodes within the cloud. This paper demonstrates how a dynamically virtualized computing cluster is combined with cloud resources by attaching remotely started virtual worker nodes to the local batch system.

  1. Immersive virtual reality for visualization of abdominal CT

    NASA Astrophysics Data System (ADS)

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A.; Bodenheimer, Robert E.

    2013-03-01

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.

  2. Immersive Virtual Reality for Visualization of Abdominal CT.

    PubMed

    Lin, Qiufeng; Xu, Zhoubing; Li, Bo; Baucom, Rebeccah; Poulose, Benjamin; Landman, Bennett A; Bodenheimer, Robert E

    2013-03-28

    Immersive virtual environments use a stereoscopic head-mounted display and data glove to create high fidelity virtual experiences in which users can interact with three-dimensional models and perceive relationships at their true scale. This stands in stark contrast to traditional PACS-based infrastructure in which images are viewed as stacks of two-dimensional slices, or, at best, disembodied renderings. Although there has been substantial innovation in immersive virtual environments for entertainment and consumer media, these technologies have not been widely applied in clinical applications. Here, we consider potential applications of immersive virtual environments for ventral hernia patients with abdominal computed tomography imaging data. Nearly a half million ventral hernias occur in the United States each year, and hernia repair is the most commonly performed general surgery operation worldwide. A significant problem in these conditions is communicating the urgency, degree of severity, and impact of a hernia (and potential repair) on patient quality of life. Hernias are defined by ruptures in the abdominal wall (i.e., the absence of healthy tissues) rather than a growth (e.g., cancer); therefore, understanding a hernia necessitates understanding the entire abdomen. Our environment allows surgeons and patients to view body scans at scale and interact with these virtual models using a data glove. This visualization and interaction allows users to perceive the relationship between physical structures and medical imaging data. The system provides close integration of PACS-based CT data with immersive virtual environments and creates opportunities to study and optimize interfaces for patient communication, operative planning, and medical education.

  3. LivePhantom: Retrieving Virtual World Light Data to Real Environments.

    PubMed

    Kolivand, Hoshang; Billinghurst, Mark; Sunar, Mohd Shahrizal

    2016-01-01

    To achieve realistic Augmented Reality (AR), shadows play an important role in creating a 3D impression of a scene. Casting virtual shadows on real and virtual objects is one of the topics of research being conducted in this area. In this paper, we propose a new method for creating complex AR indoor scenes using real time depth detection to exert virtual shadows on virtual and real environments. A Kinect camera was used to produce a depth map for the physical scene mixing into a single real-time transparent tacit surface. Once this is created, the camera's position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is shown and the findings are assessed drawing upon qualitative and quantitative methods making comparisons with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems.

  4. LivePhantom: Retrieving Virtual World Light Data to Real Environments

    PubMed Central

    2016-01-01

    To achieve realistic Augmented Reality (AR), shadows play an important role in creating a 3D impression of a scene. Casting virtual shadows on real and virtual objects is one of the topics of research being conducted in this area. In this paper, we propose a new method for creating complex AR indoor scenes using real time depth detection to exert virtual shadows on virtual and real environments. A Kinect camera was used to produce a depth map for the physical scene mixing into a single real-time transparent tacit surface. Once this is created, the camera’s position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is shown and the findings are assessed drawing upon qualitative and quantitative methods making comparisons with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems. PMID:27930663

  5. Virtual Vision

    NASA Astrophysics Data System (ADS)

    Terzopoulos, Demetri; Qureshi, Faisal Z.

    Computer vision and sensor networks researchers are increasingly motivated to investigate complex multi-camera sensing and control issues that arise in the automatic visual surveillance of extensive, highly populated public spaces such as airports and train stations. However, they often encounter serious impediments to deploying and experimenting with large-scale physical camera networks in such real-world environments. We propose an alternative approach called "Virtual Vision", which facilitates this type of research through the virtual reality simulation of populated urban spaces, camera sensor networks, and computer vision on commodity computers. We demonstrate the usefulness of our approach by developing two highly automated surveillance systems comprising passive and active pan/tilt/zoom cameras that are deployed in a virtual train station environment populated by autonomous, lifelike virtual pedestrians. The easily reconfigurable virtual cameras distributed in this environment generate synthetic video feeds that emulate those acquired by real surveillance cameras monitoring public spaces. The novel multi-camera control strategies that we describe enable the cameras to collaborate in persistently observing pedestrians of interest and in acquiring close-up videos of pedestrians in designated areas.

  6. Generation IV Nuclear Energy Systems Construction Cost Reductions through the Use of Virtual Environments - Task 4 Report: Virtual Mockup Maintenance Task Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Shaw; Anthony Baratta; Vaughn Whisker

    2005-02-28

    Task 4 report of a three-year DOE NERI-sponsored effort evaluating immersive virtual reality (CAVE) technology for design review, construction planning, and maintenance planning and training for next-generation nuclear power plants. The program covers the development of full-scale virtual mockups generated from 3D CAD data and presented in a CAVE visualization facility. This report focuses on using full-scale virtual mockups for nuclear power plant training applications.

  7. Novel virtual reality system integrating online self-face viewing and mirror visual feedback for stroke rehabilitation: rationale and feasibility.

    PubMed

    Shiri, Shimon; Feintuch, Uri; Lorber-Haddad, Adi; Moreh, Elior; Twito, Dvora; Tuchner-Arieli, Maya; Meiner, Zeev

    2012-01-01

    To introduce the rationale of a novel virtual reality system based on self-face viewing and mirror visual feedback, and to examine its feasibility as a rehabilitation tool for poststroke patients. A novel motion-capture virtual reality system integrating online self-face viewing and mirror visual feedback has been developed for stroke rehabilitation. The system allows the replacement of the impaired arm by a virtual arm. Upon making small movements of the paretic arm, patients view themselves virtually performing healthy full-range movements. A sample of 6 patients in the acute poststroke phase received the virtual reality treatment concomitantly with conservative rehabilitation treatment. Feasibility was assessed during 10 sessions for each participant. All participants succeeded in operating the system, demonstrating its feasibility in terms of adherence and improvement in task performance. Patients' performance within the virtual environment and a set of clinical-functional measures recorded before the virtual reality treatment, at 1 week, and after 3 months indicated improvement in neurological status and general functioning. These preliminary results indicate that this newly developed virtual reality system is safe and feasible. Future randomized controlled studies are required to assess whether this system has beneficial effects in terms of enhancing upper limb function and quality of life in poststroke patients.

  8. Evaluation of historical museum interior lighting system using fully immersive virtual luminous environment

    NASA Astrophysics Data System (ADS)

    Navvab, Mojtaba; Bisegna, Fabio; Gugliermetti, Franco

    2013-05-01

    The Saint Rocco Museum, a historical building in Venice, Italy, is used as a case study to explore the performance of its lighting system and the impact of visible light on viewing large-size artworks. The transition from three-dimensional architectural rendering to three-dimensional virtual luminance mapping and visualization within a virtual environment is described as an integrated optical method for application toward preservation of the cultural heritage of the space. Lighting simulation programs represent color as RGB triplets in a device-dependent color space such as ITU-R BT.709. A prerequisite for this is a 3D model, which can be created within this computer-aided virtual environment. The on-site measured surface luminance, chromaticity, and spectral data were used as input to an established real-time indirect illumination method and physically based algorithms to produce the best approximation of the RGB values used to generate images of the objects. Converting RGB to and from spectra has been a major undertaking, since an infinite number of spectra can produce the same colors defined by RGB in the program. The ability to simulate light intensity, candle power, and spectral power distributions provides the opportunity to examine the impact of color inter-reflections on historical paintings. VR offers an effective technique to quantify the impact of visible light on human visual performance under a precisely controlled representation of the light spectrum that can be experienced in 3D within a virtual environment as well as in historical visual archives. The system can easily be expanded to include other measurements and stimuli.
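
    The abstract notes that converting RGB to and from spectra is ill-posed because infinitely many spectra map to one RGB triplet. The toy sketch below shows one common way to make the conversion well-posed: restrict spectra to a weighted sum of a few smooth basis functions and solve for the weights that reproduce the triplet. The Gaussian basis and response curves here are illustrative placeholders, not the paper's method or measured ITU-R BT.709 data.

        import numpy as np

        # Wavelength grid (nm) and three illustrative, smooth basis spectra.
        wl = np.linspace(400, 700, 61)
        basis = np.stack([np.exp(-0.5 * ((wl - c) / 40.0) ** 2)
                          for c in (450.0, 550.0, 610.0)])        # "B", "G", "R"-ish

        # Illustrative RGB response curves (again Gaussians, not real primaries).
        response = np.stack([np.exp(-0.5 * ((wl - c) / 35.0) ** 2)
                             for c in (600.0, 545.0, 455.0)])     # R, G, B rows

        # M[i, j] = how much basis spectrum j contributes to channel i.
        M = response @ basis.T * (wl[1] - wl[0])

        def rgb_to_spectrum(rgb):
            """Spectrum built from basis weights whose integrated response matches rgb."""
            w = np.linalg.solve(M, np.asarray(rgb, dtype=float))
            return w @ basis

        def spectrum_to_rgb(spd):
            """Integrate a spectral power distribution against the response curves."""
            return response @ spd * (wl[1] - wl[0])

        if __name__ == "__main__":
            rgb = np.array([0.8, 0.5, 0.2])
            spd = rgb_to_spectrum(rgb)
            print(np.round(spectrum_to_rgb(spd), 3))   # round trip ~ [0.8, 0.5, 0.2]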

  9. Development of virtual environment for treating acrophobia.

    PubMed

    Ku, J; Jang, D; Shin, M; Jo, H; Ahn, H; Lee, J; Cho, B; Kim, S I

    2001-01-01

    Virtual Reality (VR) is a new technology that allows humans to communicate with a computer. It allows the user to see, hear, feel, and interact in a three-dimensional virtual world created graphically. Virtual Reality Therapy (VRT), based on this sophisticated technology, has recently been used in the treatment of subjects diagnosed with acrophobia, a disorder characterized by marked anxiety upon exposure to heights, avoidance of heights, and a resulting interference in functioning. Conventional virtual reality systems for the treatment of acrophobia are limited in that they rely on costly devices or somewhat unrealistic graphic scenes. The goal of this study was to develop an inexpensive and more realistic virtual environment for the exposure therapy of acrophobia. We constructed two types of virtual environment. The first consists of a bungee-jump tower in the middle of a city; it includes an open lift surrounded by props beside the tower that allow the patient to feel a sense of height. The second is composed of diving boards of various heights; it provides a view of a lower diving board and of people swimming in the pool, serving as stimuli for exposure to heights.

  10. Naver: a PC-cluster-based VR system

    NASA Astrophysics Data System (ADS)

    Park, ChangHoon; Ko, HeeDong; Kim, TaiYun

    2003-04-01

    In this paper, we present NAVER, a new framework for virtual reality applications. NAVER is based on a cluster of low-cost personal computers. Its goal is to provide a flexible, extensible, scalable, and re-configurable framework for virtual environments, defined as the integration of a 3D virtual space with external modules. External modules are various input or output devices and applications on remote hosts. From the system's point of view, the personal computers are divided into three servers according to their functions: Render Server, Device Server, and Control Server. The Device Server hosts external modules requiring event-based communication, whereas the Control Server hosts external modules requiring synchronous communication every frame. The Render Server consists of five managers: Scenario Manager, Event Manager, Command Manager, Interaction Manager, and Sync Manager. These managers support the declaration and operation of the virtual environment and its integration with external modules on remote servers.

  11. Seamless 3D interaction for virtual tables, projection planes, and CAVEs

    NASA Astrophysics Data System (ADS)

    Encarnacao, L. M.; Bimber, Oliver; Schmalstieg, Dieter; Barton, Robert J., III

    2000-08-01

    The Virtual Table presents stereoscopic graphics to a user in a workbench-like setting. This device shares with other large-screen display technologies (such as data walls and surround-screen projection systems) the lack of human-centered, unencumbered user interfaces and 3D interaction technologies. Such shortcomings present severe limitations to the application of virtual reality (VR) technology to time-critical applications as well as employment scenarios that involve heterogeneous groups of end-users without high levels of computer familiarity and expertise. Traditionally, such employment scenarios are common in planning-related application areas such as mission rehearsal and command and control. For these applications, a high degree of flexibility with respect to the system requirements (display and I/O devices), as well as the ability to seamlessly and intuitively switch between different interaction modalities, is sought. Conventional VR techniques may be insufficient to meet this challenge. This paper presents novel approaches for human-centered interfaces to Virtual Environments, focusing on the Virtual Table device. It introduces new paradigms for 3D interaction in virtual environments (VE) for a variety of application areas based on pen-and-clipboard, mirror-in-hand, and magic-lens metaphors, and introduces new concepts for combining VR and augmented reality (AR) techniques. It finally describes approaches toward hybrid and distributed multi-user interaction environments and concludes by hypothesizing on possible use cases for defense applications.

  12. VEVI: A Virtual Reality Tool For Robotic Planetary Explorations

    NASA Technical Reports Server (NTRS)

    Piguet, Laurent; Fong, Terry; Hine, Butler; Hontalas, Phil; Nygren, Erik

    1994-01-01

    The Virtual Environment Vehicle Interface (VEVI), developed by the NASA Ames Research Center's Intelligent Mechanisms Group, is a modular operator interface for direct teleoperation and supervisory control of robotic vehicles. Virtual environments enable the efficient display and visualization of complex data. This characteristic allows operators to perceive and control complex systems in a natural fashion, utilizing the highly-evolved human sensory system. VEVI utilizes real-time, interactive, 3D graphics and position / orientation sensors to produce a range of interface modalities from the flat panel (windowed or stereoscopic) screen displays to head mounted/head-tracking stereo displays. The interface provides generic video control capability and has been used to control wheeled, legged, air bearing, and underwater vehicles in a variety of different environments. VEVI was designed and implemented to be modular, distributed and easily operated through long-distance communication links, using a communication paradigm called SYNERGY.

  13. Virtual reality in surgical skills training.

    PubMed

    Palter, Vanessa N; Grantcharov, Teodor P

    2010-06-01

    With recent concerns regarding patient safety, and legislation regarding resident work hours, it is accepted that a certain amount of surgical skills training will transition to the surgical skills laboratory. Virtual reality offers enormous potential to enhance technical and non-technical skills training outside the operating room. Virtual-reality systems range from basic low-fidelity devices to highly complex virtual environments. These systems can act as training and assessment tools, with the learned skills effectively transferring to an analogous clinical situation. Recent developments include expanding the role of virtual reality to allow for holistic, multidisciplinary team training in simulated operating rooms, and focusing on the role of virtual reality in evidence-based surgical curriculum design. Copyright 2010 Elsevier Inc. All rights reserved.

  14. Scripting human animations in a virtual environment

    NASA Technical Reports Server (NTRS)

    Goldsby, Michael E.; Pandya, Abhilash K.; Maida, James C.

    1994-01-01

    The current deficiencies of virtual environments (VE) are well known: annoying lag in drawing the current view, drastically simplified environments to reduce that lag, low resolution, and a narrow field of view. Animation scripting is an application of VE technology which can be carried out successfully despite these deficiencies. The final product is a smoothly moving, high-resolution animation displaying detailed models. In this system, the user is represented by a human computer model with the same body proportions. Using magnetic tracking, the motions of the model's upper torso, head, and arms are controlled by the user's movements (18 degrees of freedom). The model's lower torso and global position and orientation are controlled by a spaceball and keypad (12 degrees of freedom). Using this system, human motion scripts can be extracted from the user's movements while immersed in a simplified virtual environment. The recorded data are used to define key frames; motion is interpolated between them, and post-processing adds a more detailed environment. The result is a considerable savings in time and a much more natural-looking movement of a human figure in a smooth and seamless animation.
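
    A minimal sketch of the key-frame step described above: recorded poses are treated as key frames and intermediate frames are obtained by interpolation (here, simple linear interpolation of joint angles; a production system would typically slerp orientations). The function name, the frame rate, and the toy data are illustrative, not the system's actual pipeline.

        import numpy as np

        def interpolate_keyframes(times, poses, fps=30.0):
            """Linearly resample recorded key-frame poses to a fixed frame rate.

            times : (K,) strictly increasing key-frame timestamps in seconds
            poses : (K, D) joint-angle vectors captured at those times
                    (D would be 18 for the tracked upper body described above)
            Returns (frame_times, frames) with one pose per output frame.
            """
            times = np.asarray(times, dtype=float)
            poses = np.asarray(poses, dtype=float)
            frame_times = np.arange(times[0], times[-1], 1.0 / fps)
            frames = np.stack([np.interp(frame_times, times, poses[:, d])
                               for d in range(poses.shape[1])], axis=1)
            return frame_times, frames

        if __name__ == "__main__":
            key_t = [0.0, 1.0, 2.5]
            key_pose = [[0.0, 10.0], [45.0, 20.0], [90.0, 0.0]]   # two joints, degrees
            t, f = interpolate_keyframes(key_t, key_pose, fps=4)
            print(t)
            print(np.round(f, 1))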

  15. Cranial implant design using augmented reality immersive system.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that utilize haptics for sculpting precise fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, the fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.

  16. Incorporating haptic effects into three-dimensional virtual environments to train the hemiparetic upper extremity

    PubMed Central

    Adamovich, Sergei; Fluet, Gerard G.; Merians, Alma S.; Mathai, Abraham; Qiu, Qinyin

    2010-01-01

    Current neuroscience has identified several constructs to increase the effectiveness of upper extremity rehabilitation. One is the use of progressive, skill acquisition-oriented training. Another approach emphasizes the use of bilateral activities. Building on these principles, this paper describes the design and feasibility testing of a robotic / virtual environment system designed to train the arm of persons who have had strokes. The system provides a variety of assistance modes, scalable workspaces and hand-robot interfaces allowing persons with strokes to train multiple joints in three dimensions. The simulations utilize assistance algorithms that adjust task difficulty both online and offline in relation to subject performance. Several distinctive haptic effects have been incorporated into the simulations. An adaptive master-slave relationship between the unimpaired and impaired arm encourages active movement of the subject's hemiparetic arm during a bimanual task. Adaptive anti-gravity support and damping stabilize the arm during virtual reaching and placement tasks. An adaptive virtual spring provides assistance to complete the movement if the subject is unable to complete the task in time. Finally, haptically rendered virtual objects help to shape the movement trajectory during a virtual placement task. A proof of concept study demonstrated this system to be safe, feasible and worthy of further study. PMID:19666345
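
    The adaptive virtual spring described above can be illustrated with a small sketch: a spring-damper force toward the target whose stiffness ramps up only as the allotted movement time runs out, so assistance appears only when the subject cannot finish in time. The gains, units, and linear ramp below are illustrative assumptions, not the study's algorithm.

        import numpy as np

        def assistance_force(pos, vel, target, t_elapsed, t_allowed,
                             k_max=50.0, damping=5.0):
            """Toy adaptive 'virtual spring' assisting a reaching movement."""
            # Stiffness ramps from 0 to k_max over the second half of the
            # allotted time (illustrative schedule).
            ramp = np.clip((t_elapsed - 0.5 * t_allowed) / (0.5 * t_allowed), 0.0, 1.0)
            k = k_max * ramp
            spring = k * (np.asarray(target) - np.asarray(pos))   # pull toward target
            damper = -damping * np.asarray(vel)                   # stabilise the arm
            return spring + damper

        if __name__ == "__main__":
            f = assistance_force(pos=[0.1, 0.0, 0.2], vel=[0.0, 0.0, 0.0],
                                 target=[0.3, 0.1, 0.2], t_elapsed=4.0, t_allowed=5.0)
            print(np.round(f, 2))   # assistance grows as the deadline approaches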

  17. The Proposed Model of Collaborative Virtual Learning Environment for Introductory Programming Course

    ERIC Educational Resources Information Center

    Othman, Mahfudzah; Othman, Muhaini

    2012-01-01

    This paper discusses the proposed model of the collaborative virtual learning system for the introductory computer programming course which uses one of the collaborative learning techniques known as the "Think-Pair-Share". The main objective of this study is to design a model for an online learning system that facilitates the…

  18. Rats in Virtual Space: The development and implementation of a multimodal virtual reality system for small animals

    NASA Astrophysics Data System (ADS)

    Aharoni, Daniel Benjamin

    The integration of multimodal sensory information into a common neural code is a critical function of all complex nervous systems. This process is required for adaptive responding to incoming stimuli as well as the formation of a cognitive map of the external sensory environment. The underlying neural mechanisms of multimodal integration are poorly understood due, in part, to the technical difficulties of manipulating multimodal sensory information in combination with simultaneous in-vivo electrophysiological recording in awake behaving animals. We therefore developed a non-invasive multimodal virtual reality system that is conducive to wired electrophysiological recording techniques. This system allows for the dynamic presentation of highly immersive audiovisual virtual environments to rats maintained in a body fixed position on top of a quiet spherical treadmill. Notably, this allows the rats to remain at the same spatial location in the real world without the need for head fixation. This method opens the door for a wide array of future studies aimed at elucidating the underlying neural mechanisms of multimodal integration.

  19. Evaluating the Usability of Pinchigator, a system for Navigating Virtual Worlds using Pinch Gloves

    NASA Technical Reports Server (NTRS)

    Hamilton, George S.; Brookman, Stephen; Dumas, Joseph D. II; Tilghman, Neal

    2003-01-01

    Appropriate design of two-dimensional user interfaces (2D U/I) utilizing the well-known WIMP (Window, Icon, Menu, Pointing device) environment for computer software is well studied, and guidance can be found in several standards. Three-dimensional U/I design is not nearly as mature as 2D U/I, and standards bodies have not reached consensus on what makes a usable interface. This is especially true when the tools for interacting with the virtual environment may include stereo viewing, real-time trackers, and pinch gloves instead of just a mouse and keyboard. Over the last several years the authors have created a 3D U/I system dubbed Pinchigator for navigating virtual worlds based on the dVise dV/Mockup visualization software, Fakespace Pinch Gloves, and Polhemus trackers. The current work is to test the usability of the system on several virtual worlds, suggest improvements to increase Pinchigator's usability, and then to generalize about what was learned and how those lessons might be applied to improve other 3D U/I systems.

  20. The Virtual Pelvic Floor, a tele-immersive educational environment.

    PubMed Central

    Pearl, R. K.; Evenhouse, R.; Rasmussen, M.; Dech, F.; Silverstein, J. C.; Prokasy, S.; Panko, W. B.

    1999-01-01

    This paper describes the development of the Virtual Pelvic Floor, a new method of teaching the complex anatomy of the pelvic region utilizing virtual reality and advanced networking technology. Virtual reality technology allows improved visualization of three-dimensional structures over conventional media because it supports stereo vision, viewer-centered perspective, large angles of view, and interactivity. Two or more ImmersaDesk systems, drafting table format virtual reality displays, are networked together providing an environment where teacher and students share a high quality three-dimensional anatomical model, and are able to converse, see each other, and to point in three dimensions to indicate areas of interest. This project was realized by the teamwork of surgeons, medical artists and sculptors, computer scientists, and computer visualization experts. It demonstrates the future of virtual reality for surgical education and applications for the Next Generation Internet. PMID:10566378

  1. Side Effects of Virtual Environments: A Review of the Literature

    DTIC Science & Technology

    2004-05-01

    Cybersickness symptoms are the unintended psychophysiological side effects of participation in virtual environments. Symptoms can occur both during...induced motion sickness, cybersickness is believed to result from sensory and perceptual mismatches between the visual and vestibular systems, and can...and the task carried out, can affect either incidence or severity of cybersickness. Taking account of these factors may avoid or minimize symptoms. This

  2. Awareware: Narrowcasting Attributes for Selective Attention, Privacy, and Multipresence

    NASA Astrophysics Data System (ADS)

    Cohen, Michael; Newton Fernando, Owen Noel

    The domain of CSCW (computer-supported collaborative work) and DSC (distributed synchronous collaboration) spans real-time interactive multiuser systems, shared information spaces, and applications for teleexistence and artificial reality, including collaborative virtual environments (CVEs) (Benford et al., 2001). As presence-awareness systems emerge, it is important to develop appropriate interfaces and architectures for managing multimodal multiuser systems. Especially in light of the persistent connectivity enabled by affordable networked communication, shared distributed environments require generalized control of media streams: techniques to control source → sink transmissions in synchronous groupware, including teleconferences and chatspaces, online role-playing games, and virtual concerts.

  3. MASCARET: creating virtual learning environments from system modelling

    NASA Astrophysics Data System (ADS)

    Querrec, Ronan; Vallejo, Paola; Buche, Cédric

    2013-03-01

    The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from the specialists' expertise, that is to say, directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present MASCARET, a meta-model which can be used to represent such system models. In order to ensure that the meta-model is capable of describing, representing and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.

  4. Phenomenology tools on cloud infrastructures using OpenStack

    NASA Astrophysics Data System (ADS)

    Campos, I.; Fernández-del-Castillo, E.; Heinemeyer, S.; Lopez-Garcia, A.; Pahlen, F.; Borges, G.

    2013-04-01

    We present a new environment for computations in particle physics phenomenology employing recent developments in cloud computing. In this environment users can create and manage "virtual" machines on which the phenomenology codes/tools can be deployed easily in an automated way. We analyze the performance of this environment based on "virtual" machines versus the utilization of physical hardware. In this way we provide a qualitative result for the influence of the host operating system on the performance of a representative set of applications for phenomenology calculations.

  5. VRUSE--a computerised diagnostic tool: for usability evaluation of virtual/synthetic environment systems.

    PubMed

    Kalawsky, R S

    1999-02-01

    A special questionnaire (VRUSE) has been designed to measure the usability of a VR system according to the attitudes and perceptions of its users. Important aspects of VR systems were carefully derived to produce key usability factors for the questionnaire. Unlike questionnaires designed for generic interfaces, VRUSE is specifically designed for evaluating virtual environments; it is a diagnostic tool providing a wealth of information about a user's view of the interface. VRUSE can be used to great effect with other evaluation techniques to pinpoint problematic areas of a VR interface. Other applications include benchmarking of competitor VR systems.

  6. A Standard-Compliant Virtual Meeting System with Active Video Object Tracking

    NASA Astrophysics Data System (ADS)

    Lin, Chia-Wen; Chang, Yao-Jen; Wang, Chih-Ming; Chen, Yung-Chang; Sun, Ming-Ting

    2002-12-01

    This paper presents an H.323 standard-compliant virtual video conferencing system. The proposed system not only serves as a multipoint control unit (MCU) for multipoint connection but also provides a gateway function between H.323 LAN (local-area network) and H.324 WAN (wide-area network) users. The proposed virtual video conferencing system provides user-friendly object compositing and manipulation features including 2D video object scaling, repositioning, rotation, and dynamic bit allocation in a 3D virtual environment. A reliable and accurate scheme based on background image mosaics is proposed for real-time extraction and tracking of foreground video objects from video captured with an active camera. Chroma-key insertion is used to facilitate video object extraction and manipulation. We have implemented a prototype of the virtual conference system with an integrated graphical user interface to demonstrate the feasibility of the proposed methods.
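
    Chroma-key extraction of the kind mentioned above can be sketched in a few lines: pixels sufficiently far from the key colour are treated as foreground and pasted onto the virtual background. The key colour, tolerance, and function names below are illustrative values, not the system's actual parameters.

        import numpy as np

        def chroma_key_mask(frame_rgb, key=(0, 255, 0), tol=90):
            """Boolean foreground mask: pixels far enough from the key colour.

            frame_rgb : (H, W, 3) uint8 image captured in front of a key-coloured
                        backdrop; key and tol are illustrative values.
            """
            diff = frame_rgb.astype(np.int32) - np.asarray(key, dtype=np.int32)
            dist = np.sqrt((diff ** 2).sum(axis=-1))
            return dist > tol          # True where the pixel belongs to the speaker

        def composite(frame_rgb, background_rgb, mask):
            """Paste the extracted video object onto a (virtual) background."""
            out = background_rgb.copy()
            out[mask] = frame_rgb[mask]
            return out

        if __name__ == "__main__":
            frame = np.zeros((2, 2, 3), np.uint8)
            frame[...] = (0, 255, 0)          # green backdrop
            frame[0, 0] = (200, 120, 100)     # one "foreground" pixel
            bg = np.full((2, 2, 3), 30, np.uint8)
            print(composite(frame, bg, chroma_key_mask(frame))[:, :, 0])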

  7. Virtual reality for health care: a survey.

    PubMed

    Moline, J

    1997-01-01

    This report surveys the state of the art in applications of virtual environments and related technologies for health care. Applications of these technologies are being developed for health care in the following areas: surgical procedures (remote surgery or telepresence, augmented or enhanced surgery, and planning and simulation of procedures before surgery); medical therapy; preventive medicine and patient education; medical education and training; visualization of massive medical databases; skill enhancement and rehabilitation; and architectural design for health-care facilities. To date, such applications have improved the quality of health care, and in the future they will result in substantial cost savings. Tools that respond to the needs of present virtual environment systems are being refined or developed. However, additional large-scale research is necessary in the following areas: user studies, use of robots for telepresence procedures, enhanced system reality, and improved system functionality.

  8. Using Virtual Reality to Improve Walking Post-Stroke: Translation to Individuals with Diabetes

    PubMed Central

    Deutsch, Judith E

    2011-01-01

    Use of virtual reality (VR) technology to improve walking for people post-stroke has been studied for its clinical application since 2004. The hardware and software used to create these systems have varied but have predominantly consisted of projected environments with users walking on treadmills. Transfer of training from the virtual environment to real-world walking has modest but positive research support. Translation of the research findings to clinical practice has been hampered by commercial availability and costs of the VR systems. Suggestions for how the work for individuals post-stroke might be applied and adapted for individuals with diabetes and other impaired ambulatory conditions include involvement of the target user groups (both practitioners and clients) early in the design and integration of activity and education into the systems. PMID:21527098

  9. Using virtual reality to improve walking post-stroke: translation to individuals with diabetes.

    PubMed

    Deutsch, Judith E

    2011-03-01

    Use of virtual reality (VR) technology to improve walking for people post-stroke has been studied for its clinical application since 2004. The hardware and software used to create these systems have varied but have predominantly consisted of projected environments with users walking on treadmills. Transfer of training from the virtual environment to real-world walking has modest but positive research support. Translation of the research findings to clinical practice has been hampered by commercial availability and costs of the VR systems. Suggestions for how the work for individuals post-stroke might be applied and adapted for individuals with diabetes and other impaired ambulatory conditions include involvement of the target user groups (both practitioners and clients) early in the design and integration of activity and education into the systems. © 2011 Diabetes Technology Society.

  10. SimCoach: An Intelligent Virtual Human System for Providing Healthcare Information and Support

    DTIC Science & Technology

    2010-08-01

    classrooms, offices, markets, etc.), the next important challenge will involve populating these environments with Virtual Human (VH) representations...in a virtual classroom (Parsons et al., 2007; Rizzo et al., 2006). Additionally, VHs have been used effectively for the conduct of social psychology...T Bowerly, J G Buckwalter and A A Rizzo (2007), A controlled clinical comparison of attention performance in children with ADHD in a virtual reality

  11. Virtual C Machine and Integrated Development Environment for ATMS Controllers.

    DOT National Transportation Integrated Search

    2000-04-01

    The overall objective of this project is to develop a prototype virtual machine that fits on current Advanced Traffic Management Systems (ATMS) controllers and provides functionality for complex traffic operations.;Prepared in cooperation with Utah S...

  12. Virtual environment display for a 3D audio room simulation

    NASA Technical Reports Server (NTRS)

    Chapin, William L.; Foster, Scott H.

    1992-01-01

    The development of a virtual environment simulation system integrating a 3D acoustic audio model with an immersive 3D visual scene is discussed. The system complements the acoustic model and is specified to: allow the listener to freely move about the space, a room of manipulable size, shape, and audio character, while interactively relocating the sound sources; reinforce the listener's feeling of telepresence in the acoustical environment with visual and proprioceptive sensations; enhance the audio with the graphic and interactive components, rather than overwhelm or reduce it; and serve as a research testbed and technology transfer demonstration. The hardware/software design of two demonstration systems, one installed and one portable, are discussed through the development of four iterative configurations.

  13. Rocinante, a virtual collaborative visualizer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, M.J.; Ice, L.G.

    1996-12-31

    With the goal of improving the ability of people around the world to share the development and use of intelligent systems, Sandia National Laboratories' Intelligent Systems and Robotics Center is developing new Virtual Collaborative Engineering (VCE) and Virtual Collaborative Control (VCC) technologies. A key area of VCE and VCC research is in shared visualization of virtual environments. This paper describes a Virtual Collaborative Visualizer (VCV), named Rocinante, that Sandia developed for VCE and VCC applications. Rocinante allows multiple participants to simultaneously view dynamic geometrically-defined environments. Each viewer can exclude extraneous detail or include additional information in the scene as desired. Shared information can be saved and later replayed in a stand-alone mode. Rocinante automatically scales visualization requirements with computer system capabilities. Models with 30,000 polygons and 4 Megabytes of texture display at 12 to 15 frames per second (fps) on an SGI Onyx and at 3 to 8 fps (without texture) on Indigo 2 Extreme computers. In its networked mode, Rocinante synchronizes its local geometric model with remote simulators and sensory systems by monitoring data transmitted through UDP packets. Rocinante's scalability and performance make it an ideal VCC tool. Users throughout the country can monitor robot motions and the thinking behind their motion planners and simulators.

  14. The use of PC based VR in clinical medicine: the VREPAR projects.

    PubMed

    Riva, G; Bacchetta, M; Baruffi, M; Borgomainerio, E; Defrance, C; Gatti, F; Galimberti, C; Fontaneto, S; Marchi, S; Molinari, E; Nugues, P; Rinaldi, S; Rovetta, A; Ferretti, G S; Tonci, A; Wann, J; Vincelli, F

    1999-01-01

    Virtual reality (VR) is an emerging technology that alters the way individuals interact with computers: a 3D computer-generated environment in which a person can move about and interact as if he or she were actually inside it. Given the high computational power required to create virtual environments, these are usually developed on expensive high-end workstations. However, the significant advances in PC hardware made over the last three years are making PC-based VR a possible solution for clinical assessment and therapy. VREPAR - Virtual Reality Environments for Psychoneurophysiological Assessment and Rehabilitation - comprises two European Community funded projects (Telematics for health - HC 1053/HC 1055 - http://www.psicologia.net) that are trying to develop a modular PC-based virtual reality system for the medical market. The paper describes the rationale of the developed modules and the preliminary results obtained.

  15. Classification of EMG signals using artificial neural networks for virtual hand prosthesis control.

    PubMed

    Mattioli, Fernando E R; Lamounier, Edgard A; Cardoso, Alexandre; Soares, Alcimar B; Andrade, Adriano O

    2011-01-01

    Computer-based training systems have been widely studied in the field of human rehabilitation. In health applications, Virtual Reality presents itself as an appropriate tool to simulate training environments without exposing the patients to risks. In particular, virtual prosthetic devices have been used to reduce the great mental effort needed by patients fitted with myoelectric prosthesis, during the training stage. In this paper, the application of Virtual Reality in a hand prosthesis training system is presented. To achieve this, the possibility of exploring Neural Networks in a real-time classification system is discussed. The classification technique used in this work resulted in a 95% success rate when discriminating 4 different hand movements.
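
    A hedged sketch of the classification step: a small multilayer perceptron trained on per-channel EMG features to discriminate four movements. The synthetic features, network size, and the scikit-learn classifier below are illustrative stand-ins for the authors' data and network, not their implementation.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Synthetic stand-in for windowed EMG features: RMS of 4 channels per
        # window, each of 4 hand movements producing a distinct activation pattern.
        patterns = np.array([[1.0, 0.2, 0.2, 0.2],
                             [0.2, 1.0, 0.2, 0.2],
                             [0.2, 0.2, 1.0, 0.2],
                             [0.2, 0.2, 0.2, 1.0]])
        X = np.vstack([p + 0.15 * rng.standard_normal((200, 4)) for p in patterns])
        y = np.repeat(np.arange(4), 200)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")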

  16. Implementing a Virtual Exchange Project for student nurses in Queensland and Nottingham.

    PubMed

    Todhunter, Fern; Hallawell, Bob; Pittaway, Debbie

    2013-09-01

    This paper discusses the design and initial progress made with a virtual learning environment to help student nurses develop an understanding of cultural awareness and globalization. Using a Web-based application the Virtual Exchange Project was designed to facilitate study-elsewhere experiences located in the student's own setting. Promotion of the United Nations' Millennium Development goals provided an opportunity for student nurses to explore changes in global health and disease patterns, nursing education and systems of nurse regulation in the United Kingdom and Australia. A pedagogical framework created for this activity acknowledges the social and academic identities that learners often use, when working together in a virtual environment. The architecture of the Virtual Exchange supports local conversations about nurse education and health and social issues across hemispheres. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Design Concerns in the Engineering of Virtual Worlds for Learning

    ERIC Educational Resources Information Center

    Rapanotti, Lucia; Hall, Jon G.

    2011-01-01

    The convergence of 3D simulation and social networking into current multi-user virtual environments has opened the door to new forms of interaction for learning in order to complement the face-to-face and Web 2.0-based systems. Yet, despite a growing user community, design knowledge for virtual worlds remains patchy, particularly when it comes to…

  18. Virtual Environments for People Who Are Visually Impaired Integrated into an Orientation and Mobility Program

    ERIC Educational Resources Information Center

    Lahav, Orly; Schloerb, David W.; Srinivasan, Mandayam A.

    2015-01-01

    Introduction: The BlindAid, a virtual system developed for orientation and mobility (O&M) training of people who are blind or have low vision, allows interaction with different virtual components (structures and objects) via auditory and haptic feedback. This research examined if and how the BlindAid that was integrated within an O&M…

  19. High Enrollment Course Success Factors in Virtual School: Factors Influencing Student Academic Achievement

    ERIC Educational Resources Information Center

    Liu, Feng; Cavanaugh, Cathy

    2011-01-01

    This paper describes a study of success factors in high enrollment courses in a K-12 virtual school learning environment. The influence of variables: time student spent in the learning management system (LMS), number of times logged into the LMS, teacher comment, participation in free or reduced lunch programs, student status in the virtual school…

  20. Control of an ER haptic master in a virtual slave environment for minimally invasive surgery applications

    NASA Astrophysics Data System (ADS)

    Han, Young-Min; Choi, Seung-Bok

    2008-12-01

    This paper presents the control performance of an electrorheological (ER) fluid-based haptic master device connected to a virtual slave environment that can be used for minimally invasive surgery (MIS). An already developed haptic joint featuring controllable ER fluid and a spherical joint mechanism is adopted for the master system. Medical forceps and an angular position measuring device are devised and integrated with the joint to establish the MIS master system. In order to embody a human organ in virtual space, a volumetric deformable object is used. The virtual object is then mathematically formulated by a shape-retaining chain-linked (S-chain) model. After evaluating the reflection force, computation time and compatibility with real-time control, the haptic architecture for MIS is established by incorporating the virtual slave with the master device so that the reflection force for the object of the virtual slave and the desired position for the master operator are transferred to each other. In order to achieve the desired force trajectories, a sliding mode controller is formulated and then experimentally realized. Tracking control performances for various force trajectories are evaluated and presented in the time domain.
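
    The sliding mode control mentioned above can be illustrated on a one-degree-of-freedom toy plant: a sliding surface combining the tracking error and its derivative, driven by a saturated switching term to limit chattering. The plant model, gains, and trajectory below are illustrative assumptions, not the paper's ER joint model or its actual controller.

        import numpy as np

        def sat(x):
            """Boundary-layer saturation used instead of sign() to reduce chattering."""
            return np.clip(x, -1.0, 1.0)

        # Plant: a unit mass with viscous damping, x'' = u - c*x', standing in for
        # the ER joint dynamics (which the paper models far more carefully).
        c, dt, T = 0.5, 1e-3, 2.0
        lam, K, phi = 20.0, 15.0, 0.05   # surface slope, switching gain, layer width

        t = np.arange(0.0, T, dt)
        xd = 0.1 * np.sin(2 * np.pi * t)                 # desired trajectory (1 Hz)
        vd = 0.1 * 2 * np.pi * np.cos(2 * np.pi * t)

        x, v = 0.0, 0.0
        err = []
        for k in range(len(t)):
            e, edot = x - xd[k], v - vd[k]
            s = edot + lam * e                           # sliding surface
            # Cancel damping and drive s toward zero; the desired-acceleration
            # term is absorbed by the robust switching action rather than fed forward.
            u = c * v - lam * edot - K * sat(s / phi)
            v += (u - c * v) * dt
            x += v * dt
            err.append(abs(e))

        print(f"max |tracking error| after 1 s: {max(err[1000:]):.4f}")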

  1. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases.

    PubMed

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-04-15

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii and its peripheral nunchuks-balance board, head mounted displays and joystick allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes in virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients' brain, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies.

  2. Virtual reality interface devices in the reorganization of neural networks in the brain of patients with neurological diseases

    PubMed Central

    Gatica-Rojas, Valeska; Méndez-Rebolledo, Guillermo

    2014-01-01

    Two key characteristics of all virtual reality applications are interaction and immersion. Systemic interaction is achieved through a variety of multisensory channels (hearing, sight, touch, and smell), permitting the user to interact with the virtual world in real time. Immersion is the degree to which a person can feel wrapped in the virtual world through a defined interface. Virtual reality interface devices such as the Nintendo® Wii and its peripheral nunchuks-balance board, head mounted displays and joystick allow interaction and immersion in unreal environments created from computer software. Virtual environments are highly interactive, generating great activation of visual, vestibular and proprioceptive systems during the execution of a video game. In addition, they are entertaining and safe for the user. Recently, incorporating therapeutic purposes in virtual reality interface devices has allowed them to be used for the rehabilitation of neurological patients, e.g., balance training in older adults and dynamic stability in healthy participants. The improvements observed in neurological diseases (chronic stroke and cerebral palsy) have been shown by changes in the reorganization of neural networks in patients’ brain, along with better hand function and other skills, contributing to their quality of life. The data generated by such studies could substantially contribute to physical rehabilitation strategies. PMID:25206907

  3. Human-Computer Interaction and Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    1995-01-01

    The proceedings of the Workshop on Human-Computer Interaction and Virtual Environments are presented along with a list of attendees. The objectives of the workshop were to assess the state-of-technology and level of maturity of several areas in human-computer interaction and to provide guidelines for focused future research leading to effective use of these facilities in the design/fabrication and operation of future high-performance engineering systems.

  4. Modeling and performance analysis using extended fuzzy-timing Petri nets for networked virtual environments.

    PubMed

    Zhou, Y; Murata, T; Defanti, T A

    2000-01-01

    Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time, and networking features in these systems. Net-VEs place high quality-of-service (QoS) demands on the network to maintain natural and real-time interactions among users. Current practice for net-VE design is basically trial and error, empirical, and entirely lacking in formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (narrative immersive constructionist/collaborative environment), to predict net-VE performance based on simulation, and to improve that performance. NICE is essentially a network of collaborative virtual reality systems called CAVEs (CAVE automatic virtual environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport-layer protocol used in NICE: the transmission control protocol (TCP). We show the possibility analysis based on the EFTN model of the CAVE. Then, using these models and Design/CPN as the simulation tool, we conducted various simulations to study the real-time behavior, network effects, and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
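
    As background to the Petri net modelling described above, the sketch below runs the basic token game of a plain place/transition net for a single request/reply exchange. The EFTN formalism used in the paper additionally attaches fuzzy firing times and delays, which this toy deliberately omits; the net itself is an illustrative stand-in, not the authors' NICE model.

        import random

        # A net as: places with token counts, and transitions given by
        # (input-place multiset, output-place multiset). This toy models a client
        # sending a request and a server replying -- one message exchange.
        marking = {"client_idle": 1, "request_sent": 0, "server_idle": 1,
                   "reply_sent": 0, "client_done": 0}

        transitions = {
            "send_request": ({"client_idle": 1}, {"request_sent": 1}),
            "serve":        ({"request_sent": 1, "server_idle": 1},
                             {"reply_sent": 1, "server_idle": 1}),
            "receive":      ({"reply_sent": 1}, {"client_done": 1}),
        }

        def enabled(marking, pre):
            return all(marking[p] >= n for p, n in pre.items())

        def fire(marking, pre, post):
            for p, n in pre.items():
                marking[p] -= n
            for p, n in post.items():
                marking[p] += n

        random.seed(0)
        while True:
            ready = [t for t, (pre, _) in transitions.items() if enabled(marking, pre)]
            if not ready:
                break
            t = random.choice(ready)   # an EFTN would attach (fuzzy) firing times here
            fire(marking, *transitions[t])
            print(f"fired {t:13s} -> {marking}")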

  5. Learning Intercultural Communication Skills with Virtual Humans: Feedback and Fidelity

    ERIC Educational Resources Information Center

    Lane, H. Chad; Hays, Matthew Jensen; Core, Mark G.; Auerbach, Daniel

    2013-01-01

    In the context of practicing intercultural communication skills, we investigated the role of fidelity in a game-based, virtual learning environment as well as the role of feedback delivered by an intelligent tutoring system. In 2 experiments, we compared variations on the game interface, use of the tutoring system, and the form of the feedback.…

  6. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow large-scale data to be accessed and visualized on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation, and offers an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of the users.

  7. Perceptual Fidelity vs. Engineering Compromises In Virtual Acoustic Displays

    NASA Technical Reports Server (NTRS)

    Wenzel, Elizabeth M.; Ahumada, Albert (Technical Monitor)

    1997-01-01

    Immersive, three-dimensional displays are increasingly becoming a goal of advanced human-machine interfaces. While the technology for achieving truly useful multisensory environments is still being developed, techniques for generating three-dimensional sound are now both sophisticated and practical enough to be applied to acoustic displays. The ultimate goal of virtual acoustics is to simulate the complex acoustic field experienced by a listener freely moving around within an environment. Of course, such complexity, freedom of movement and interactivity is not always possible in a "true" virtual environment, much less in lower-fidelity multimedia systems. However, many of the perceptual and engineering constraints (and frustrations) that researchers, engineers and listeners have experienced in virtual audio are relevant to multimedia. In fact, some of the problems that have been studied will be even more of an issue for lower fidelity systems that are attempting to address the requirements of a huge, diverse and ultimately unknown audience. Examples include individual differences in head-related transfer functions, a lack of real interactivity (head-tracking) in many multimedia displays, and perceptual degradation due to low sampling rates and/or low-bit compression. This paper discusses some of the engineering constraints faced during implementation of virtual acoustic environments and the perceptual consequences of these constraints. Specific examples are given for NASA applications such as telerobotic control, aeronautical displays, and shuttle launch communications. An attempt will also be made to relate these issues to low-fidelity implementations such as the internet.

  8. Perceptual Fidelity Versus Engineering Compromises in Virtual Acoustic Displays

    NASA Technical Reports Server (NTRS)

    Wenzel, Elizabeth M.; Ellis, Stephen R. (Technical Monitor); Frey, Mary Anne (Technical Monitor); Schneider, Victor S. (Technical Monitor)

    1997-01-01

    Immersive, three-dimensional displays are increasingly becoming a goal of advanced human-machine interfaces. While the technology for achieving truly useful multisensory environments is still being developed, techniques for generating three-dimensional sound are now both sophisticated and practical enough to be applied to acoustic displays. The ultimate goal of virtual acoustics is to simulate the complex acoustic field experienced by a listener freely moving around within an environment. Of course, such complexity, freedom of movement and interactivity is not always possible in a 'true' virtual environment, much less in lower-fidelity multimedia systems. However, many of the perceptual and engineering constraints (and frustrations) that researchers, engineers and listeners have experienced in virtual audio are relevant to multimedia. In fact, some of the problems that have been studied will be even more of an issue for lower fidelity systems that are attempting to address the requirements of a huge, diverse and ultimately unknown audience. Examples include individual differences in head-related transfer functions, a lack of real interactivity (head-tracking) in many multimedia displays, and perceptual degradation due to low sampling rates and/or low-bit compression. This paper discusses some of the engineering constraints faced during implementation of virtual acoustic environments and the perceptual consequences of these constraints. Specific examples are given for NASA applications such as telerobotic control, aeronautical displays, and shuttle launch communications. An attempt will also be made to relate these issues to low-fidelity implementations such as the internet.

  9. Concept of Operations for Commercial and Business Aircraft Synthetic Vision Systems. 1.0

    NASA Technical Reports Server (NTRS)

    Williams Daniel M.; Waller, Marvin C.; Koelling, John H.; Burdette, Daniel W.; Capron, William R.; Barry, John S.; Gifford, Richard B.; Doyle, Thomas M.

    2001-01-01

    A concept of operations (CONOPS) for the Commercial and Business (CaB) aircraft synthetic vision systems (SVS) is described. The CaB SVS is expected to provide increased safety and operational benefits in normal and low visibility conditions. Providing operational benefits will promote SVS implementation in the Net, improve aviation safety, and assist in meeting the national aviation safety goal. SVS will enhance safety and enable consistent gate-to-gate aircraft operations in normal and low visibility conditions. The goal for developing SVS is to support operational minima as low as Category 3b in a variety of environments. For departure and ground operations, the SVS goal is to enable operations with a runway visual range of 300 feet. The system is an integrated display concept that provides a virtual visual environment. The SVS virtual visual environment is composed of three components: an enhanced intuitive view of the flight environment, hazard and obstacle detection and display, and precision navigation guidance. The virtual visual environment will support enhanced operations procedures during all phases of flight - ground operations, departure, en route, and arrival. The applications selected for emphasis in this document include low visibility departures and arrivals including parallel runway operations, and low visibility airport surface operations. These particular applications were selected because of significant potential benefits afforded by SVS.

  10. Haptic interfaces: Hardware, software and human performance

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandayam A.

    1995-01-01

    Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most of the virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available in the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further developments of haptic interfaces are critical. In this paper, the status and research needs in human haptics, technology development and interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single point of contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described and the associated issues in haptic rendering are discussed.
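
    Single point-of-contact haptic rendering of the kind discussed above is often illustrated with a penalty method: when the probe penetrates a surface, a restoring force proportional to penetration depth is applied along the surface normal. The sketch below uses that generic approach with an illustrative stiffness; it is not the Phantom's actual rendering algorithm.

        import numpy as np

        def contact_force(probe_pos, plane_point, plane_normal, stiffness=800.0):
            """Penalty-based force for a single point contacting a rigid plane.

            If the haptic probe has penetrated the plane, push it back along the
            surface normal in proportion to penetration depth; otherwise no force.
            The stiffness is an illustrative value in N/m.
            """
            n = np.asarray(plane_normal, dtype=float)
            n = n / np.linalg.norm(n)
            penetration = np.dot(np.asarray(plane_point) - np.asarray(probe_pos), n)
            if penetration <= 0.0:
                return np.zeros(3)            # probe is above the surface: free space
            return stiffness * penetration * n

        if __name__ == "__main__":
            # Horizontal table at z = 0; probe 2 mm below the surface.
            print(contact_force(probe_pos=[0.0, 0.0, -0.002],
                                plane_point=[0.0, 0.0, 0.0],
                                plane_normal=[0.0, 0.0, 1.0]))   # ~1.6 N upward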

  11. Motor learning from virtual reality to natural environments in individuals with Duchenne muscular dystrophy.

    PubMed

    Quadrado, Virgínia Helena; Silva, Talita Dias da; Favero, Francis Meire; Tonks, James; Massetti, Thais; Monteiro, Carlos Bandeira de Mello

    2017-11-10

    To examine whether performance improvements in the virtual environment generalize to the natural environment, we tested 64 individuals: 32 individuals with DMD and 32 typically developing individuals. The groups practiced two coincidence timing tasks. In the more tangible button-press task, the individuals were required to 'intercept' a falling virtual object at the moment it reached the interception point by pressing a key on the computer. In the more abstract task, they were instructed to 'intercept' the virtual object by making a hand movement in a virtual environment using a webcam. For individuals with DMD, conducting a coincidence timing task in a virtual environment facilitated transfer to the real environment. However, we emphasize that a task practiced in a virtual environment should have a higher level of difficulty than a task practiced in a real environment. IMPLICATIONS FOR REHABILITATION: Virtual environments can be used to promote improved performance in 'real-world' environments. Virtual environments offer the opportunity to create paradigms similar to 'real-life' tasks, but task complexity and difficulty levels can be manipulated, graded, and enhanced to increase the likelihood of success in transfer of learning and performance. Individuals with DMD, in particular, showed immediate performance benefits after using virtual reality.

  12. A fast and flexible panoramic virtual reality system for behavioural and electrophysiological experiments.

    PubMed

    Takalo, Jouni; Piironen, Arto; Honkanen, Anna; Lempeä, Mikko; Aikio, Mika; Tuukkanen, Tuomas; Vähäsöyrinki, Mikko

    2012-01-01

    Ideally, neuronal functions would be studied by performing experiments with unconstrained animals whilst they behave in their natural environment. Although this is not feasible currently for most animal models, one can mimic the natural environment in the laboratory by using a virtual reality (VR) environment. Here we present a novel VR system based upon a spherical projection of computer generated images using a modified commercial data projector with an add-on fish-eye lens. This system provides equidistant visual stimulation with extensive coverage of the visual field, high spatio-temporal resolution and flexible stimulus generation using a standard computer. It also includes a track-ball system for closed-loop behavioural experiments with walking animals. We present a detailed description of the system and characterize it thoroughly. Finally, we demonstrate the VR system's performance whilst operating in closed-loop conditions by showing the movement trajectories of the cockroaches during exploratory behaviour in a VR forest.
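
    The closed-loop coupling described above can be sketched as a per-frame update: ball rotations measured by the treadmill sensors are converted into translation and rotation of the animal's virtual pose. The planar pose and unit gains below are illustrative simplifications, not the system's actual calibration.

        import math

        def update_pose(x, y, heading, d_forward, d_yaw, gain_t=1.0, gain_r=1.0):
            """Advance the virtual pose from one frame of ball motion.

            d_forward : forward rolling of the spherical treadmill this frame (metres)
            d_yaw     : rotation of the ball about the vertical axis (radians)
            The 1:1 gains and planar pose are illustrative simplifications.
            """
            heading += gain_r * d_yaw
            x += gain_t * d_forward * math.cos(heading)
            y += gain_t * d_forward * math.sin(heading)
            return x, y, heading

        if __name__ == "__main__":
            pose = (0.0, 0.0, 0.0)
            # A short walk: forward motion with a slight left turn each frame.
            for _ in range(5):
                pose = update_pose(*pose, d_forward=0.01, d_yaw=0.05)
            print(tuple(round(v, 4) for v in pose))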

  13. Virtualization in the Operations Environments

    NASA Technical Reports Server (NTRS)

    Pitts, Lee; Lankford, Kim; Felton, Larry; Pruitt, Robert

    2010-01-01

    Virtualization provides the opportunity to continue to do "more with less": more computing power with fewer physical boxes, thus reducing the overall hardware footprint, power and cooling requirements, software licenses, and their associated costs. This paper explores the tremendous advantages and any disadvantages of virtualization in all of the environments associated with software and systems development to operations flow. It includes the use and benefits of the Intelligent Platform Management Interface (IPMI) specification, and identifies lessons learned concerning hardware and network configurations. Using the Huntsville Operations Support Center (HOSC) at NASA Marshall Space Flight Center as an example, we demonstrate that deploying virtualized servers as a means of managing computing resources is applicable and beneficial to many areas of application, up to and including flight operations.

  14. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gabert, Kasimir; Burns, Ian; Elliott, Steven

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.
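
    As a loose illustration of the "freeze time" idea (pause every running virtual machine, then persist its state so the model can later be forked or replayed), the Python sketch below uses the libvirt bindings rather than Sandia's Firewheel/Staghorn tooling; the connection URI and output directory are assumptions, and unlike Staghorn it does not capture in-flight packets or switch state.

        import os
        import libvirt  # libvirt Python bindings (python3-libvirt); not the Firewheel/Staghorn tooling itself

        SNAP_DIR = "/var/tmp/model-snapshot"   # assumed output directory

        def snapshot_all_vms(uri="qemu:///system"):
            """Pause every running VM, then save its memory/device state to disk ("freeze time")."""
            os.makedirs(SNAP_DIR, exist_ok=True)
            conn = libvirt.open(uri)
            domains = [d for d in conn.listAllDomains() if d.isActive()]
            for dom in domains:      # pause everything first so the whole model stops advancing together
                dom.suspend()
            for dom in domains:      # then persist each VM's state; dom.save() also stops the domain
                dom.save(os.path.join(SNAP_DIR, dom.name() + ".state"))
            conn.close()

        if __name__ == "__main__":
            snapshot_all_vms()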

  15. Shared virtual environments for telerehabilitation.

    PubMed

    Popescu, George V; Burdea, Grigore; Boian, Rares

    2002-01-01

    Current VR telerehabilitation systems use offline remote monitoring from the clinic and patient-therapist videoconferencing. Such "store and forward" and video-based systems cannot implement medical services involving direct patient-therapist interaction. Real-time telerehabilitation applications (including remote therapy) can be developed using a shared Virtual Environment (VE) architecture. We developed a two-user shared VE for hand telerehabilitation. Each site has a telerehabilitation workstation with a video camera and a Rutgers Master II (RMII) force feedback glove. Each user can control a virtual hand and interact haptically with virtual objects. Simulated physical interactions between therapist and patient are implemented using hand force feedback. The therapist's graphic interface contains several virtual panels, which allow control over the rehabilitation process. These controls start a videoconferencing session, collect patient data, or apply therapy. Several experimental telerehabilitation scenarios were successfully tested on a LAN. A Web-based approach to "real-time" patient telemonitoring--the monitoring portal for hand telerehabilitation--was also developed. The therapist interface is implemented as a Java3D applet that monitors patient hand movement. The monitoring portal gives real-time performance on off-the-shelf desktop workstations.

  16. Application of advanced virtual reality and 3D computer assisted technologies in tele-3D-computer assisted surgery in rhinology.

    PubMed

    Klapan, Ivica; Vranjes, Zeljko; Prgomet, Drago; Lukinović, Juraj

    2008-03-01

    The real-time requirement means that the simulation should be able to follow the actions of the user as they move in the virtual environment. The computer system should also store in its memory a three-dimensional (3D) model of the virtual environment. In that case a real-time virtual reality system will update the 3D graphic visualization as the user moves, so that up-to-date visualization is always shown on the computer screen. Upon completion of the tele-operation, the surgeon compares the preoperative and postoperative images and models of the operative field, and studies video records of the procedure itself. Using intraoperative records, animated images of the real tele-procedure performed can be designed. Virtual surgery offers the possibility of preoperative planning in rhinology. The intraoperative use of the computer in real time requires development of appropriate hardware and software to connect the medical instrumentarium with the computer, to operate the computer by means of the connected instrumentarium, and to provide sophisticated multimedia interfaces.

  17. Tools virtualization for command and control systems

    NASA Astrophysics Data System (ADS)

    Piszczek, Marek; Maciejewski, Marcin; Pomianek, Mateusz; Szustakowski, Mieczysław

    2017-10-01

    Information management is an inseparable part of the command process. The result is that the person making decisions at the command post interacts with data-providing devices in various ways. The tools virtualization process can introduce a number of significant modifications in the design of solutions for management and command. The general idea involves replacing a physical device's user interface with its digital representation (so-called virtual instruments). A more advanced level of system "digitalization" is to use mixed reality environments. In solutions using augmented reality (AR), a customized HMI is displayed to the operator as he approaches each device. Devices are identified by image recognition of photo codes. Visualization is achieved by an (optical) see-through head mounted display (HMD). Control can be done, for example, by means of a handheld touch panel. Using an immersive virtual environment, the command center can be digitally reconstructed. The workstation then requires only a VR system (HMD) and access to the information network. The operator can interact with devices much as he would in the real world (for example, with virtual hands). Because of techniques such as central-vision analysis and eye tracking, MR systems offer the additional benefit of reducing system data-throughput requirements, since at any given moment rendering can focus on a single device. Experiments carried out using the Moverio BT-200 and SteamVR systems, and the results of experimental application testing, clearly indicate the ability to create a fully functional information system with the use of mixed reality technology.

  18. Subjective visual vertical assessment with mobile virtual reality system.

    PubMed

    Ulozienė, Ingrida; Totilienė, Milda; Paulauskas, Andrius; Blažauskas, Tomas; Marozas, Vaidotas; Kaski, Diego; Ulozas, Virgilijus

    2017-01-01

    The subjective visual vertical (SVV) is a measure of a subject's perceived verticality, and a sensitive test of vestibular dysfunction. Despite this, and consequent upon technical and logistical limitations, SVV has not entered mainstream clinical practice. The aim of the study was to develop a mobile virtual reality based system for the SVV test, evaluate the suitability of different controllers, and assess the system's usability in practical settings. In this study, we describe a novel virtual reality based system that has been developed to test SVV using integrated software and hardware, and report normative values across a healthy population. Participants wore a mobile virtual reality headset in order to observe a 3D stimulus presented across separate conditions - static, dynamic and an immersive real-world ("boat in the sea") SVV test. The virtual reality environment was controlled by the tester using Bluetooth-connected controllers. Participants controlled the movement of a vertical arrow using either a gesture control armband or a general-purpose gamepad, to indicate perceived verticality. We compared the two methods for object control in the system, determined normal values and compared them with literature data, evaluated the developed system with the help of the System Usability Scale (SUS) questionnaire, and evaluated possible virtually induced dizziness with the help of a subjective visual analog scale. There were no statistically significant differences in SVV values during static, dynamic and virtual reality stimulus conditions obtained using the two different controllers, and the results are comparable to those previously reported in the literature using alternative methodologies. The SUS scores for the system were high, with a median of 82.5 for the Myo controller and of 95.0 for the gamepad controller, representing a statistically significant difference between the two controllers (P<0.01). The median of virtual reality-induced dizziness for both devices was 0.7. The mobile virtual reality based system for the subjective visual vertical test is accurate and applicable in the clinical environment. The gamepad-based virtual object control method was preferred by the users. The tests were well tolerated, with low dizziness scores in the majority of patients. Copyright © 2018 The Lithuanian University of Health Sciences. Production and hosting by Elsevier Sp. z o.o. All rights reserved.
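
    For context, the quantity reported in an SVV test is the signed angular deviation of the participant's adjusted line from true (gravitational) vertical. A minimal Python sketch of per-condition summary statistics, using invented example values rather than the study's data:

        import statistics

        # Invented example data: signed deviations from true vertical, in degrees, per condition.
        svv_trials = {
            "static":  [0.8, -1.2, 0.5, 1.1, -0.3],
            "dynamic": [2.4, 1.9, -0.6, 3.1, 2.2],
            "boat":    [1.5, 0.9, -1.1, 2.0, 0.4],
        }

        for condition, deviations in svv_trials.items():
            mean_dev = statistics.mean(deviations)
            sd_dev = statistics.stdev(deviations)
            print(f"{condition:>8s}: mean {mean_dev:+.2f} deg, SD {sd_dev:.2f} deg")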

  19. Suitability of virtual prototypes to support human factors/ergonomics evaluation during the design.

    PubMed

    Aromaa, Susanna; Väänänen, Kaisa

    2016-09-01

    In recent years, the use of virtual prototyping has increased in product development processes, especially in the assessment of complex systems targeted at end-users. The purpose of this study was to evaluate the suitability of virtual prototyping to support human factors/ergonomics (HFE) evaluation during the design phase. Two different virtual prototypes were used: augmented reality (AR) and virtual environment (VE) prototypes of a maintenance platform of a rock crushing machine. Nineteen designers and other stakeholders were asked to assess the suitability of the prototype for HFE evaluation. Results indicate that the system model characteristics and user interface affect the experienced suitability. The VE system was valued as being more suitable to support the assessment of visibility, reach, and the use of tools than the AR system. The findings of this study can be used as guidance for implementing virtual prototypes in the product development process. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields, from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments, which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time, physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-system interaction in a desktop-sized work volume.

  1. Interface Design Implications for Recalling the Spatial Configuration of Virtual Auditory Environments

    NASA Astrophysics Data System (ADS)

    McMullen, Kyla A.

    Although the concept of virtual spatial audio has existed for almost twenty-five years, only in the past fifteen years has modern computing technology enabled the real-time processing needed to deliver high-precision spatial audio. Furthermore, until recently the concept of virtually walking through an auditory environment did not exist. Such an interface has numerous potential uses: spatial audio could be used in various manners, ranging from enhancing sounds delivered in virtual gaming worlds to conveying spatial locations in real-time emergency response systems. To incorporate this technology in real-world systems, various concerns should be addressed. First, to widely incorporate spatial audio into real-world systems, head-related transfer functions (HRTFs) must be inexpensively created for each user. The present study further investigated an HRTF subjective selection procedure previously developed within our research group. Users discriminated auditory cues to subjectively select their preferred HRTF from a publicly available database. Next, the issue of training to find virtual sources was addressed. Listeners participated in a localization training experiment using their selected HRTFs. The training procedure was created from the characterization of successful search strategies in prior auditory search experiments. Search accuracy significantly improved after listeners performed the training procedure. Next, in the investigation of auditory spatial memory, listeners completed three search and recall tasks with differing recall methods. Recall accuracy significantly decreased in tasks that required the storage of sound source configurations in memory. To assess the impacts of practical scenarios, the present work assessed the performance effects of signal uncertainty, visual augmentation, and different attenuation modeling. Fortunately, source uncertainty did not affect listeners' ability to recall or identify sound sources. The present study also found that the presence of visual reference frames significantly increased recall accuracy. Additionally, the incorporation of drastic attenuation significantly improved environment recall accuracy. Through investigating the aforementioned concerns, the present study took initial steps toward guiding the design of virtual auditory environments that support spatial configuration recall.
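
    The core rendering step behind such spatial audio is binaural filtering: a mono source is convolved with the left- and right-ear head-related impulse responses (HRIRs) for the desired direction. A minimal NumPy/SciPy sketch, assuming an HRIR pair has already been selected from a database (the dummy impulse responses below are placeholders, not real HRTF data):

        import numpy as np
        from scipy.signal import fftconvolve

        def render_binaural(mono, hrir_left, hrir_right):
            """Convolve a mono signal with an HRIR pair to place it at the HRIR's measured direction."""
            left = fftconvolve(mono, hrir_left)
            right = fftconvolve(mono, hrir_right)
            stereo = np.stack([left, right], axis=1)
            peak = np.max(np.abs(stereo))
            return stereo / peak if peak > 0 else stereo   # normalize to avoid clipping

        # Example with a synthetic source and dummy impulse responses (stand-ins for a real HRTF database).
        fs = 44100
        mono = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)
        hrir_l = np.zeros(128); hrir_l[0] = 1.0            # trivial "left ear" response
        hrir_r = np.zeros(128); hrir_r[20] = 0.6           # delayed, attenuated "right ear" response
        stereo = render_binaural(mono, hrir_l, hrir_r)
        print(stereo.shape)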

  2. Shared virtual environments for aerospace training

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Voss, Mark

    1994-01-01

    Virtual environments have the potential to significantly enhance the training of NASA astronauts and ground-based personnel for a variety of activities. A critical requirement is the need to share virtual environments, in real or near real time, between remote sites. It has been hypothesized that the training of international astronaut crews could be done more cheaply and effectively by utilizing such shared virtual environments in the early stages of mission preparation. The Software Technology Branch at NASA's Johnson Space Center has developed the capability for multiple users to simultaneously share the same virtual environment. Each user generates the graphics needed to create the virtual environment. All changes of object position and state are communicated to all users so that each virtual environment maintains its 'currency.' Examples of these shared environments will be discussed and plans for the utilization of the Department of Defense's Distributed Interactive Simulation (DIS) protocols for shared virtual environments will be presented. Finally, the impact of this technology on training and education in general will be explored.
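
    The sharing scheme described here amounts to each site broadcasting object position and state changes and applying the changes it receives, so every copy of the environment stays current. The Python sketch below is a generic illustration of such an update message; the UDP transport, peer list, and message format are assumptions, not the Software Technology Branch's protocol or the DIS standard:

        import json
        import socket

        PEERS = [("127.0.0.1", 9999)]          # placeholder address list of the other sites
        world = {}                             # object id -> {"pos": [...], "state": ...}

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

        def send_update(obj_id, pos, state):
            """Apply a local change and broadcast it so every shared environment stays current."""
            world[obj_id] = {"pos": pos, "state": state}
            msg = json.dumps({"id": obj_id, "pos": pos, "state": state}).encode()
            for peer in PEERS:
                sock.sendto(msg, peer)

        def apply_remote_update(raw_msg):
            """Apply an update received from a remote site."""
            msg = json.loads(raw_msg.decode())
            world[msg["id"]] = {"pos": msg["pos"], "state": msg["state"]}

        send_update("hatch_door", [1.2, 0.0, 3.4], "open")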

  3. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods by which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  4. BioImg.org: A Catalog of Virtual Machine Images for the Life Sciences

    PubMed Central

    Dahlö, Martin; Haziza, Frédéric; Kallio, Aleksi; Korpelainen, Eija; Bongcam-Rudloff, Erik; Spjuth, Ola

    2015-01-01

    Virtualization is becoming increasingly important in bioscience, enabling assembly and provisioning of complete computer setups, including operating system, data, software, and services packaged as virtual machine images (VMIs). We present an open catalog of VMIs for the life sciences, where scientists can share information about images and optionally upload them to a server equipped with a large file system and fast Internet connection. Other scientists can then search for and download images that can be run on the local computer or in a cloud computing environment, providing easy access to bioinformatics environments. We also describe applications where VMIs aid life science research, including distributing tools and data, supporting reproducible analysis, and facilitating education. BioImg.org is freely available at: https://bioimg.org. PMID:26401099

  5. BioImg.org: A Catalog of Virtual Machine Images for the Life Sciences.

    PubMed

    Dahlö, Martin; Haziza, Frédéric; Kallio, Aleksi; Korpelainen, Eija; Bongcam-Rudloff, Erik; Spjuth, Ola

    2015-01-01

    Virtualization is becoming increasingly important in bioscience, enabling assembly and provisioning of complete computer setups, including operating system, data, software, and services packaged as virtual machine images (VMIs). We present an open catalog of VMIs for the life sciences, where scientists can share information about images and optionally upload them to a server equipped with a large file system and fast Internet connection. Other scientists can then search for and download images that can be run on the local computer or in a cloud computing environment, providing easy access to bioinformatics environments. We also describe applications where VMIs aid life science research, including distributing tools and data, supporting reproducible analysis, and facilitating education. BioImg.org is freely available at: https://bioimg.org.

  6. The specificity of memory enhancement during interaction with a virtual environment.

    PubMed

    Brooks, B M; Attree, E A; Rose, F D; Clifford, B R; Leadbetter, A G

    1999-01-01

    Two experiments investigated differences between active and passive participation in a computer-generated virtual environment in terms of spatial memory, object memory, and object location memory. It was found that active participants, who controlled their movements in the virtual environment using a joystick, recalled the spatial layout of the virtual environment better than passive participants, who merely watched the active participants' progress. Conversely, there were no significant differences between the active and passive participants' recall or recognition of the virtual objects, nor in their recall of the correct locations of objects in the virtual environment. These findings are discussed in terms of subject-performed task research and the specificity of memory enhancement in virtual environments.

  7. Low-cost telepresence for collaborative virtual environments.

    PubMed

    Rhee, Seon-Min; Ziegler, Remo; Park, Jiyoung; Naef, Martin; Gross, Markus; Kim, Myoung-Hee

    2007-01-01

    We present a novel low-cost method for visual communication and telepresence in a CAVE-like environment, relying on 2D stereo-based video avatars. The system combines a selection of proven efficient algorithms and approximations in a unique way, resulting in a convincing stereoscopic real-time representation of a remote user acquired in a spatially immersive display. The system was designed to extend existing projection systems with acquisition capabilities requiring minimal hardware modifications and cost. The system uses infrared-based image segmentation to enable concurrent acquisition and projection in an immersive environment without a static background. The system consists of two color cameras and two additional b/w cameras used for segmentation in the near-IR spectrum. There is no need for special optics, as the mask and color image are merged using image warping based on a depth estimation. The resulting stereo image stream is compressed, streamed across a network, and displayed as a frame-sequential stereo texture on a billboard in the remote virtual environment.

  8. A Virtual Mission Operations Center: Collaborative Environment

    NASA Technical Reports Server (NTRS)

    Medina, Barbara; Bussman, Marie; Obenschain, Arthur F. (Technical Monitor)

    2002-01-01

    The intent of the Virtual Mission Operations Center - Collaborative Environment (VMOC-CE) is to provide a central access point for all the resources used in a collaborative mission operations environment, to assist mission operators in communicating on-site and off-site in the investigation and resolution of anomalies. It is a framework that, as a minimum, incorporates online chat, real-time file sharing and remote application sharing components in one central location. The use of a collaborative environment in mission operations opens up the possibility of a central framework through which other project members can access and interact with mission operations staff remotely. The goal of the Virtual Mission Operations Center (VMOC) Project is to identify, develop, and infuse technology to enable mission control by on-call personnel in geographically dispersed locations. In order to achieve this goal, the following capabilities are needed: autonomous mission control systems; automated systems to contact on-call personnel; synthesis and presentation of mission control status and history information; desktop tools for data and situation analysis; a secure mechanism for remote collaborative commanding; and a collaborative environment for remote cooperative work. The VMOC-CE is a collaborative environment that facilitates remote cooperative work. It is an application instance of the Virtual System Design Environment (VSDE), developed by NASA Goddard Space Flight Center's (GSFC) Systems Engineering Services & Advanced Concepts (SESAC) Branch. The VSDE is a web-based portal that includes a knowledge repository and collaborative environment to serve science and engineering teams in product development. It is a "one stop shop" for product design, providing users real-time access to product development data, engineering and management tools, and relevant design specifications and resources through the Internet. The initial focus of the VSDE has been to serve teams working in the early portion of the system/product lifecycle - concept development, proposal preparation, and formulation. The VMOC-CE expands the application of the VSDE into the operations portion of the system lifecycle. It will enable meaningful and real-time collaboration regardless of the geographical distribution of project team members. Team members will be able to interact in satellite operations, specifically for resolving anomalies, through access to a desktop computer and the Internet. Mission operations management will be able to participate in and monitor up-to-the-minute status of anomalies or other mission operations issues. In this paper we present the VMOC-CE project, its system capabilities, and its technologies.

  9. Validation of virtual reality as a tool to understand and prevent child pedestrian injury.

    PubMed

    Schwebel, David C; Gaines, Joanna; Severson, Joan

    2008-07-01

    In recent years, virtual reality has emerged as an innovative tool for health-related education and training. Among the many benefits of virtual reality is the opportunity for novice users to engage unsupervised in a safe environment when the real environment might be dangerous. Virtual environments are only useful for health-related research, however, if behavior in the virtual world validly matches behavior in the real world. This study was designed to test the validity of an immersive, interactive virtual pedestrian environment. A sample of 102 children and 74 adults was recruited to complete simulated road-crossings in both the virtual environment and the identical real environment. In both the child and adult samples, construct validity was demonstrated via significant correlations between behavior in the virtual and real worlds. Results also indicate construct validity through developmental differences in behavior; convergent validity by showing correlations between parent-reported child temperament and behavior in the virtual world; internal reliability of various measures of pedestrian safety in the virtual world; and face validity, as measured by users' self-reported perception of realism in the virtual world. We discuss issues of generalizability to other virtual environments, and the implications for application of virtual reality to understanding and preventing pediatric pedestrian injuries.

  10. Lessons about Virtual-Environment Software Systems from 20 years of VE building

    PubMed Central

    Taylor, Russell M.; Jerald, Jason; VanderKnyff, Chris; Wendt, Jeremy; Borland, David; Marshburn, David; Sherman, William R.; Whitton, Mary C.

    2010-01-01

    What are desirable and undesirable features of virtual-environment (VE) software architectures? What should be present (and absent) from such systems if they are to be optimally useful? How should they be structured? To help answer these questions we present experience from application designers, toolkit designers, and VE system architects along with examples of useful features from existing systems. Topics are organized under the major headings of: 3D space management, supporting display hardware, interaction, event management, time management, computation, portability, and the observation that less can be better. Lessons learned are presented as discussion of the issues, field experiences, nuggets of knowledge, and case studies. PMID:20567602

  11. From Virtual Environments to Physical Environments: Exploring Interactivity in Ubiquitous-Learning Systems

    ERIC Educational Resources Information Center

    Peng, Hsinyi; Chou, Chien; Chang, Chun-Yu

    2008-01-01

    Computing devices and applications are now used beyond the desktop, in diverse environments, and this trend toward ubiquitous computing is evolving. In this study, we re-visit the interactivity concept and its applications for interactive function design in a ubiquitous-learning system (ULS). Further, we compare interactivity dimensions and…

  12. Evolution of A Distributed Live, Virtual, Constructive Environment for Human in the Loop Unmanned Aircraft Testing

    NASA Technical Reports Server (NTRS)

    Murphy, James R.; Otto, Neil M.

    2017-01-01

    NASA's Unmanned Aircraft Systems Integration in the National Airspace System Project is conducting human in the loop simulations and flight testing intended to reduce barriers associated with enabling routine airspace access for unmanned aircraft. The primary focus of these tests is the interaction of the unmanned aircraft pilot with the display of detect and avoid alerting and guidance information. The project's integrated test and evaluation team was charged with developing the test infrastructure. As with any development effort, compromises in the underlying system architecture and design were made to allow for the rapid prototyping and open-ended nature of the research. In order to accommodate these design choices, a distributed test environment was developed incorporating Live, Virtual, Constructive (LVC) concepts. The LVC components form the core infrastructure supporting simulation of UAS operations by integrating live and virtual aircraft in a realistic air traffic environment. This LVC infrastructure enables efficient testing by leveraging existing assets distributed across multiple NASA Centers. Using standard LVC concepts enables future integration with existing simulation infrastructure.

  13. Evolution of A Distributed Live, Virtual, Constructive Environment for Human in the Loop Unmanned Aircraft Testing

    NASA Technical Reports Server (NTRS)

    Murphy, Jim; Otto, Neil

    2017-01-01

    NASA's Unmanned Aircraft Systems Integration in the National Airspace System Project is conducting human in the loop simulations and flight testing intended to reduce barriers associated with enabling routine airspace access for unmanned aircraft. The primary focus of these tests is the interaction of the unmanned aircraft pilot with the display of detect and avoid alerting and guidance information. The project's integrated test and evaluation team was charged with developing the test infrastructure. As with any development effort, compromises in the underlying system architecture and design were made to allow for the rapid prototyping and open-ended nature of the research. In order to accommodate these design choices, a distributed test environment was developed incorporating Live, Virtual, Constructive (LVC) concepts. The LVC components form the core infrastructure supporting simulation of UAS operations by integrating live and virtual aircraft in a realistic air traffic environment. This LVC infrastructure enables efficient testing by leveraging existing assets distributed across multiple NASA Centers. Using standard LVC concepts enables future integration with existing simulation infrastructure.

  14. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Li, C.; Wang, J.; Cui, C.; He, B.; Fan, D.; Yang, Y.; Chen, J.; Zhang, H.; Yu, C.; Xiao, J.; Wang, C.; Cao, Z.; Fan, Y.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Yin, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on CloudStack, an open source software platform, we set up the cloud computing environment for the AstroCloud Project. It consists of five distributed nodes across the mainland of China. Users can access and analyze data in this cloud computing environment. Based on GlusterFS, we built a scalable cloud storage system. Each user has a private space, which can be shared among different virtual machines and desktop systems. With this environment, astronomers can easily access astronomical data collected by different telescopes and data centers, and data producers can archive their datasets safely.

  15. VIPER: Virtual Intelligent Planetary Exploration Rover

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Flueckiger, Lorenzo; Nguyen, Laurent; Washington, Richard

    2001-01-01

    Simulation and visualization of rover behavior are critical capabilities for scientists and rover operators to construct, test, and validate plans for commanding a remote rover. The VIPER system links these capabilities, using a high-fidelity virtual-reality (VR) environment, a kinematically accurate simulator, and a flexible plan executive to allow users to simulate and visualize possible execution outcomes of a plan under development. This work is part of a larger vision of a science-centered rover control environment, where a scientist may inspect and explore the environment via VR tools, specify science goals, and visualize the expected and actual behavior of the remote rover. The VIPER system is constructed from three generic systems, linked together via a minimal amount of customization into the integrated system. The complete system points out the power of combining plan execution, simulation, and visualization for envisioning rover behavior; it also demonstrates the utility of developing generic technologies, which can be combined in novel and useful ways.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.

  17. Parallel-distributed mobile robot simulator

    NASA Astrophysics Data System (ADS)

    Okada, Hiroyuki; Sekiguchi, Minoru; Watanabe, Nobuo

    1996-06-01

    The aim of this project is to achieve an autonomous learning and growth function based on active interaction with the real world. The system should also be able to autonomously acquire knowledge about the context in which jobs take place and how the jobs are executed. This article describes a parallel distributed mobile robot system simulator with an autonomous learning and growth function. The autonomous learning and growth function which we are proposing is characterized by its ability to learn and grow through interaction with the real world. When the mobile robot interacts with the real world, the system compares the virtual environment simulation with the interaction result in the real world. The system then improves the virtual environment to match the real-world result more closely. In this way, the system learns and grows. It is very important that such a simulation be time-realistic. The parallel distributed mobile robot simulator was developed to simulate the space of a mobile robot system with an autonomous learning and growth function. The simulator constructs a virtual space faithful to the real world and also integrates the interfaces between the user, the actual mobile robot and the virtual mobile robot. Using an ultrafast CG (computer graphics) system (FUJITSU AG series), time-realistic 3D CG is displayed.
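
    The learning-and-growth loop described above (compare the virtual-environment prediction with the real-world outcome, then adjust the simulation toward it) can be pictured with a one-parameter Python sketch; the parameter, gain, and measurements are invented for illustration:

        # Invented illustration of the "compare simulation with reality, then improve the simulation" loop.
        wheel_radius = 0.050      # simulator's current estimate of a robot parameter (metres)
        LEARNING_RATE = 0.5

        def simulate_distance(radius, wheel_turns):
            return 2 * 3.14159 * radius * wheel_turns

        for wheel_turns, measured_distance in [(10, 3.30), (10, 3.28), (10, 3.31)]:
            predicted = simulate_distance(wheel_radius, wheel_turns)
            error = measured_distance - predicted
            # Adjust the virtual model so its next prediction matches the real world more closely.
            wheel_radius += LEARNING_RATE * error / (2 * 3.14159 * wheel_turns)
            print(f"predicted {predicted:.3f} m, measured {measured_distance:.3f} m, "
                  f"new radius {wheel_radius:.4f} m")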

  18. Increasing Accessibility to the Blind of Virtual Environments, Using a Virtual Mobility Aid Based On the "EyeCane": Feasibility Study

    PubMed Central

    Maidenbaum, Shachar; Levy-Tzedek, Shelly; Chebat, Daniel-Robert; Amedi, Amir

    2013-01-01

    Virtual worlds and environments are becoming an increasingly central part of our lives, yet they are still far from accessible to the blind. This is especially unfortunate as such environments hold great potential for them, for uses such as social interaction, online education and especially for familiarizing the visually impaired user with a real environment virtually, from the comfort and safety of their own home, before visiting it in the real world. We have implemented a simple algorithm to improve this situation using single-point depth information, enabling the blind to use a virtual cane, modeled on the “EyeCane” electronic travel aid, within any virtual environment with minimal pre-processing. Use of the Virtual-EyeCane enables this experience to potentially be used later in real-world environments with identical stimuli to those from the virtual environment. We show the fast-learned practical use of this algorithm for navigation in simple environments. PMID:23977316
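
    The single-point depth idea reduces to casting one ray from the virtual cane into the scene, taking the distance to the first hit, and mapping that distance to an output cue (for example, a beep or vibration rate that rises as obstacles get closer). The following is a minimal, engine-agnostic Python sketch with assumed ranges and sphere obstacles; it does not reproduce the actual Virtual-EyeCane parameters:

        MAX_RANGE_M = 5.0          # assumed maximum sensing range of the virtual cane

        def first_hit_distance(origin, direction, obstacles):
            """Distance to the nearest obstacle along a unit-direction ray, or None (obstacles: (centre, radius) spheres)."""
            best = None
            ox, oy, oz = origin
            dx, dy, dz = direction
            for (cx, cy, cz), r in obstacles:
                # Project the obstacle centre onto the ray, then test against the sphere radius.
                t = (cx - ox) * dx + (cy - oy) * dy + (cz - oz) * dz
                if t < 0 or t > MAX_RANGE_M:
                    continue
                px, py, pz = ox + t * dx, oy + t * dy, oz + t * dz
                if (px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2 <= r ** 2:
                    best = t if best is None else min(best, t)
            return best

        def cue_rate_hz(distance):
            """Map distance to a cue repetition rate: closer obstacles produce faster cues."""
            if distance is None:
                return 0.0
            return 10.0 * (1.0 - distance / MAX_RANGE_M)

        d = first_hit_distance((0, 0, 0), (1, 0, 0), [((2.0, 0.1, 0.0), 0.5)])
        print(d, cue_rate_hz(d))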

  19. Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review.

    PubMed

    Detmer, Felicitas J; Hettig, Julian; Schindele, Daniel; Schostak, Martin; Hansen, Christian

    2017-01-01

    Many virtual and augmented reality systems have been proposed to support renal interventions. This paper reviews such systems employed in the treatment of renal cell carcinoma and renal stones. A systematic literature search was performed. Inclusion criteria were virtual and augmented reality systems for radical or partial nephrectomy and renal stone treatment, excluding systems solely developed or evaluated for training purposes. In total, 52 research papers were identified and analyzed. Most of the identified literature (87%) deals with systems for renal cell carcinoma treatment. About 44% of the systems have already been employed in clinical practice, but only 20% in studies with ten or more patients. Main challenges remaining for future research include the consideration of organ movement and deformation, human factors issues, and the conduct of large clinical studies. Augmented and virtual reality systems have the potential to improve safety and outcomes of renal interventions. In the last ten years, many technical advances have led to more sophisticated systems, which are already applied in clinical practice. Further research is required to cope with current limitations of virtual and augmented reality assistance in clinical environments.

  20. Closed Environment Module - Modularization and extension of the Virtual Habitat

    NASA Astrophysics Data System (ADS)

    Plötner, Peter; Czupalla, Markus; Zhukov, Anton

    2013-12-01

    The Virtual Habitat (V-HAB) is a Life Support System (LSS) simulation created to perform dynamic simulation of LSSs for future human spaceflight missions. It allows the testing of LSS robustness by means of computer simulations, e.g. of worst-case scenarios.

  1. The Virtual Test Bed Project

    NASA Technical Reports Server (NTRS)

    Rabelo, Luis C.

    2002-01-01

    This is a report of my activities as a NASA Fellow during the summer of 2002 at the NASA Kennedy Space Center (KSC). The core of these activities is the assigned project: the Virtual Test Bed (VTB) from the Spaceport Engineering and Technology Directorate. The VTB Project has its foundations in the NASA Ames Research Center (ARC) Intelligent Launch & Range Operations program. The objective of the VTB project is to develop a new and unique collaborative computing environment where simulation models can be hosted and integrated in a seamless fashion. This collaborative computing environment will be used to build a Virtual Range as well as a Virtual Spaceport. This project will work as a technology pipeline to research, develop, test and validate R&D efforts against real-time operations without interfering with the actual operations or consuming the operational personnel's time. This report will also focus on the systems issues required to conceptualize and give form to a systems architecture capable of handling the different demands.

  2. G2H--graphics-to-haptic virtual environment development tool for PC's.

    PubMed

    Acosta, E; Temkin, B; Krummel, T M; Heinrichs, W L

    2000-01-01

    Existing surgical virtual environments for training and preparation have shown great improvement; however, these improvements have mostly been in the visual aspect. The incorporation of haptics into virtual reality-based surgical simulations would greatly enhance the sense of realism. To aid in the development of haptic surgical virtual environments we have created a graphics-to-haptics (G2H) virtual environment developer tool. G2H transforms graphical virtual environments (created or imported) into haptic virtual environments without programming. The G2H capability has been demonstrated using the complex 3D pelvic model of Lucy 2.0, the Stanford Visible Female. The pelvis was made haptic using G2H without any further programming effort.

  3. Interfacing modeling suite Physics Of Eclipsing Binaries 2.0 with a Virtual Reality Platform

    NASA Astrophysics Data System (ADS)

    Harriett, Edward; Conroy, Kyle; Prša, Andrej; Klassner, Frank

    2018-01-01

    To explore alternate methods for modeling eclipsing binary stars, we extend PHOEBE's (PHysics Of Eclipsing BinariEs) capabilities into a virtual reality (VR) environment to create an immersive and interactive experience for users. The application used is Vizard, a python-scripted VR development platform for environments such as the Cave Automatic Virtual Environment (CAVE) and other off-the-shelf VR headsets. Vizard allows all modeling to be precompiled without compromising functionality or usage. The system requires five arguments to be precomputed using PHOEBE's python front-end: the effective temperature, flux, relative intensity, vertex coordinates, and orbits; the user can opt to implement other features from PHOEBE to be accessed within the simulation as well. Here we present the method for making the data observables accessible in real time. An Oculus Rift will be available for a live showcase of various cases of VR rendering of PHOEBE binary systems, including detached and contact binary stars.
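
    As a rough sketch of how such precomputed inputs might be gathered with PHOEBE 2's python front-end and written to a file for the Vizard application to load: the dataset and column names below follow the PHOEBE 2 documentation as we recall it and may differ between versions, and the JSON hand-off format is purely an assumption, not the authors' interface.

        import json
        import numpy as np
        import phoebe   # PHOEBE 2 python front-end; parameter/column names below may vary between versions

        b = phoebe.default_binary()                       # default detached binary system
        times = [0.0, 0.25, 0.5, 0.75]
        b.add_dataset('lc', times=times, dataset='lc01')  # light curve -> fluxes
        b.add_dataset('mesh', times=times, dataset='mesh01', columns=['teffs'])  # surface mesh + temperatures
        b.run_compute()

        export = {
            "times": times,
            "fluxes": np.asarray(b.get_value('fluxes', dataset='lc01', context='model')).tolist(),
            # Effective temperatures of the primary's surface elements at the first time point (names assumed).
            "teffs_primary_t0": np.asarray(
                b.get_value('teffs', dataset='mesh01', component='primary', time=times[0])).tolist(),
        }

        with open('phoebe_vr_input.json', 'w') as fh:     # assumed hand-off file for the VR application
            json.dump(export, fh)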

  4. Virtual humans and formative assessment to train diagnostic skills in bulimia nervosa.

    PubMed

    Gutiérrez-Maldonado, José; Ferrer-Garcia, Marta; Pla, Joana; Andrés-Pueyo, Antonio

    2014-01-01

    Carrying out a diagnostic interview requires skills that need to be taught in a controlled environment. Virtual Reality (VR) environments are increasingly used in the training of professionals, as they offer the most realistic alternative while not requiring students to face situations for which they are as yet unprepared. The results of the training of diagnostic skills can also be generalized to any other situation in which effective communication skills play a major role. Our aim in this study has been to develop a formative assessment procedure in order to increase the effectiveness of virtual learning simulation systems, and then to assess their efficacy.

  5. Development, implementation and pilot evaluation of a Web-based Virtual Patient Case Simulation environment – Web-SP

    PubMed Central

    Zary, Nabil; Johnson, Gunilla; Boberg, Jonas; Fors, Uno GH

    2006-01-01

    Background The Web-based Simulation of Patients (Web-SP) project was initiated in order to facilitate the use of realistic and interactive virtual patients (VP) in medicine and healthcare education. Web-SP focuses on moving beyond the technology-savvy teachers, when integrating simulation-based education into health sciences curricula, by making the creation and use of virtual patients easier. The project strives to provide a common generic platform for design/creation, management, evaluation and sharing of web-based virtual patients. The aim of this study was to evaluate whether it was possible to develop a web-based virtual patient case simulation environment where the entire case authoring process might be handled by teachers and which would be flexible enough to be used in different healthcare disciplines. Results The Web-SP system was constructed to support easy authoring, management and presentation of virtual patient cases. The case authoring environment was found to make it easy for teachers to create full-fledged patient cases without the assistance of computer specialists. Web-SP was successfully implemented at several universities by taking into account key factors such as cost, access, security, scalability and flexibility. Pilot evaluations in medical, dentistry and pharmacy courses show that students regarded Web-SP as easy to use, engaging and of educational value. Cases adapted for all three disciplines were judged to be of significant educational value by the course leaders. Conclusion The Web-SP system seems to fulfil the aim of providing a common generic platform for creation, management and evaluation of web-based virtual patient cases. The responses regarding the authoring environment indicated that the system might be user-friendly enough to appeal to a majority of the academic staff. In terms of implementation strengths, Web-SP seems to fulfil most needs of course directors and teachers from various educational institutions and disciplines. The system is currently in use or under implementation in several healthcare disciplines at more than ten universities worldwide. Future aims include structuring the exchange of cases between teachers and academic institutions by building a VP library function. We intend to follow up the positive results presented in this paper with other studies looking at learning outcomes, critical thinking and patient management. Studying the potential of Web-SP as an assessment tool will also be performed. More information about Web-SP: http://websp.lime.ki.se. PMID:16504041

  6. Development, implementation and pilot evaluation of a Web-based Virtual Patient Case Simulation environment--Web-SP.

    PubMed

    Zary, Nabil; Johnson, Gunilla; Boberg, Jonas; Fors, Uno G H

    2006-02-21

    The Web-based Simulation of Patients (Web-SP) project was initiated in order to facilitate the use of realistic and interactive virtual patients (VP) in medicine and healthcare education. Web-SP focuses on moving beyond the technology-savvy teachers, when integrating simulation-based education into health sciences curricula, by making the creation and use of virtual patients easier. The project strives to provide a common generic platform for design/creation, management, evaluation and sharing of web-based virtual patients. The aim of this study was to evaluate whether it was possible to develop a web-based virtual patient case simulation environment where the entire case authoring process might be handled by teachers and which would be flexible enough to be used in different healthcare disciplines. The Web-SP system was constructed to support easy authoring, management and presentation of virtual patient cases. The case authoring environment was found to make it easy for teachers to create full-fledged patient cases without the assistance of computer specialists. Web-SP was successfully implemented at several universities by taking into account key factors such as cost, access, security, scalability and flexibility. Pilot evaluations in medical, dentistry and pharmacy courses show that students regarded Web-SP as easy to use, engaging and of educational value. Cases adapted for all three disciplines were judged to be of significant educational value by the course leaders. The Web-SP system seems to fulfil the aim of providing a common generic platform for creation, management and evaluation of web-based virtual patient cases. The responses regarding the authoring environment indicated that the system might be user-friendly enough to appeal to a majority of the academic staff. In terms of implementation strengths, Web-SP seems to fulfil most needs of course directors and teachers from various educational institutions and disciplines. The system is currently in use or under implementation in several healthcare disciplines at more than ten universities worldwide. Future aims include structuring the exchange of cases between teachers and academic institutions by building a VP library function. We intend to follow up the positive results presented in this paper with other studies looking at learning outcomes, critical thinking and patient management. Studying the potential of Web-SP as an assessment tool will also be performed. More information about Web-SP: http://websp.lime.ki.se.

  7. Using Virtual Reality Environment to Improve Joint Attention Associated with Pervasive Developmental Disorder

    ERIC Educational Resources Information Center

    Cheng, Yufang; Huang, Ruowen

    2012-01-01

    The focus of this study is using a data glove to practice joint attention skills in a virtual reality environment for people with pervasive developmental disorder (PDD). The virtual reality environment provides a safe environment for people with PDD. In particular, when they make errors during practice in the virtual reality environment, there is no suffering or…

  8. Multisensory Integration in the Virtual Hand Illusion with Active Movement

    PubMed Central

    Satoh, Satoru; Hachimura, Kozaburo

    2016-01-01

    Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. A virtual xylophone playing system which can interactively provide synchronous visual, tactile, and auditory stimulation was constructed. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality. PMID:27847822

  9. Self-attitude awareness training: An aid to effective performance in microgravity and virtual environments

    NASA Technical Reports Server (NTRS)

    Parker, Donald E.; Harm, D. L.; Florer, Faith L.

    1993-01-01

    This paper describes ongoing development of training procedures to enhance self-attitude awareness in astronaut trainees. The procedures are based on observations regarding self-attitude (perceived self-orientation and self-motion) reported by astronauts. Self-attitude awareness training is implemented on a personal computer system and consists of lesson stacks programmed using HyperTalk with MacroMind Director movie imports. Training evaluation will be accomplished by an active search task using the virtual Spacelab environment produced by the Device for Orientation and Motion Environments Preflight Adaptation Trainer (DOME-PAT), as well as by assessment of astronauts' performance and sense of well-being during orbital flight. The general purpose of self-attitude awareness training is to use as efficiently as possible the limited DOME-PAT training time available to astronauts prior to a space mission. We suggest that similar training procedures may enhance the performance of virtual environment operators.

  10. Virtual Reality: You Are There

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Telepresence, or "virtual reality," allows a person, with assistance from advanced technology devices, to figuratively project himself into another environment. This technology is marketed by several companies, among them Fakespace, Inc., a former Ames Research Center contractor. Fakespace developed a teleoperational motion platform for transmitting sounds and images from remote locations. The "Molly" matches the user's head motion and, when coupled with a stereo viewing device and appropriate software, creates the telepresence experience. Its companion piece is the BOOM, the user's viewing device that provides the sense of involvement in the virtual environment. Either system may be used alone. Because suits, gloves, headphones, etc. are not needed, a whole range of commercial applications is possible, including computer-aided design techniques and virtual reality visualizations. Customers include Sandia National Laboratories, Stanford Research Institute and Mattel Toys.

  11. Intelligent Motion and Interaction Within Virtual Environments

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R. (Editor); Slater, Mel (Editor); Alexander, Thomas (Editor)

    2007-01-01

    What makes virtual actors and objects in virtual environments seem real? How can the illusion of their reality be supported? What sorts of training or user-interface applications benefit from realistic user-environment interactions? These are some of the central questions that designers of virtual environments face. To be sure, simulation realism is not necessarily the major, or even a required, goal of a virtual environment intended to communicate specific information. But for some applications in entertainment, marketing, or aspects of vehicle simulation training, realism is essential. The following chapters will examine how a sense of truly interacting with dynamic, intelligent agents may arise in users of virtual environments. These chapters are based on presentations at the London conference on Intelligent Motion and Interaction within Virtual Environments, which was held at University College London, U.K., 15-17 September 2003.

  12. PISCES: An environment for parallel scientific computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    The Parallel Implementation of Scientific Computing Environment (PISCES) is a project to provide high-level programming environments for parallel MIMD computers. Pisces 1, the first of these environments, is a FORTRAN 77-based environment which runs under the UNIX operating system. The Pisces 1 user programs in Pisces FORTRAN, an extension of FORTRAN 77 for parallel processing. The major emphasis in the Pisces 1 design is on providing a carefully specified virtual machine that defines the run-time environment within which Pisces FORTRAN programs are executed. Each implementation then provides the same virtual machine, regardless of differences in the underlying architecture. The design is intended to be portable to a variety of architectures. Currently Pisces 1 is implemented on a network of Apollo workstations and on a DEC VAX uniprocessor via simulation of the task-level parallelism. An implementation for the Flexible Computing Corp. FLEX/32 is under construction. An introduction to the Pisces 1 virtual computer and the FORTRAN 77 extensions is presented. An example of an algorithm for the iterative solution of a system of equations is given. The most notable features of the design are the provision for several granularities of parallelism in programs and the provision of a window mechanism for distributed access to large arrays of data.
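
    For readers unfamiliar with the class of algorithm mentioned, an iterative solution of a linear system updates every unknown from the previous iterate's values, which is exactly the pattern the window mechanism distributes across tasks. A small Jacobi sweep in Python/NumPy, given as an illustration only (it is not Pisces FORTRAN):

        import numpy as np

        def jacobi(A, b, iterations=100):
            """Solve A x = b by Jacobi iteration: each unknown is updated from the previous iterate."""
            x = np.zeros_like(b, dtype=float)
            D = np.diag(A)                    # diagonal entries
            R = A - np.diagflat(D)            # off-diagonal part
            for _ in range(iterations):
                x = (b - R @ x) / D
            return x

        A = np.array([[4.0, 1.0, 0.0],
                      [1.0, 4.0, 1.0],
                      [0.0, 1.0, 4.0]])
        b = np.array([5.0, 6.0, 5.0])
        print(jacobi(A, b))                   # approaches the exact solution [1, 1, 1]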

  13. Virtual fixtures as tools to enhance operator performance in telepresence environments

    NASA Astrophysics Data System (ADS)

    Rosenberg, Louis B.

    1993-12-01

    This paper introduces the notion of virtual fixtures for use in telepresence systems and presents an empirical study which demonstrates that such virtual fixtures can greatly enhance operator performance within remote environments. Just as tools and fixtures in the real world can enhance human performance by guiding manual operations, providing localizing references, and reducing the mental processing required to perform a task, virtual fixtures are computer generated percepts overlaid on top of the reflection of a remote workspace which can provide similar benefits. Like a ruler guiding a pencil in a real manipulation task, a virtual fixture overlaid on top of a remote workspace can act to reduce the mental processing required to perform a task, limit the workload of certain sensory modalities, and most of all allow precision and performance to exceed natural human abilities. Because such perceptual overlays are virtual constructions they can be diverse in modality, abstract in form, and custom tailored to individual task or user needs. This study investigates the potential of virtual fixtures by implementing simple combinations of haptic and auditory sensations as perceptual overlays during a standardized telemanipulation task.
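
    The 'ruler guiding a pencil' idea has a compact geometric core: project the operator's commanded position onto the guide, then blend the command toward that projection with a stiffness factor. A minimal Python sketch with assumed values (stiffness 1.0 gives a hard fixture, 0.0 disables it):

        import numpy as np

        def apply_line_fixture(command, line_point, line_dir, stiffness=0.8):
            """Pull a commanded tool position toward its projection onto a guide line (a virtual ruler)."""
            d = line_dir / np.linalg.norm(line_dir)
            projection = line_point + np.dot(command - line_point, d) * d
            # stiffness = 1.0 -> hard constraint on the line; 0.0 -> fixture disabled.
            return (1.0 - stiffness) * command + stiffness * projection

        cmd = np.array([0.10, 0.03, 0.00])                 # operator's commanded hand position (metres)
        guided = apply_line_fixture(cmd, np.zeros(3), np.array([1.0, 0.0, 0.0]))
        print(guided)                                      # lateral deviation from the ruler is reduced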

  14. Development of a Remote Accessibility Assessment System through three-dimensional reconstruction technology.

    PubMed

    Kim, Jong Bae; Brienza, David M

    2006-01-01

    A Remote Accessibility Assessment System (RAAS) that uses three-dimensional (3-D) reconstruction technology is being developed; it enables clinicians to assess the wheelchair accessibility of users' built environments from a remote location. The RAAS uses commercial software to construct 3-D virtualized environments from photographs. We developed custom screening algorithms and instruments for analyzing accessibility. Characteristics of the camera and 3-D reconstruction software chosen for the system significantly affect its overall reliability. In this study, we performed an accuracy assessment to verify that commercial hardware and software can construct accurate 3-D models, by analyzing the accuracy of dimensional measurements in a virtual environment and comparing dimensional measurements from 3-D models created with four cameras/settings. Based on these two analyses, we were able to specify a consumer-grade digital camera and PhotoModeler (EOS Systems, Inc, Vancouver, Canada) software for this system. Finally, we performed a feasibility analysis of the system in an actual environment to evaluate its ability to assess the accessibility of a wheelchair user's typical built environment. The field test resulted in an accurate accessibility assessment and thus validated our system.
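
    A hedged sketch of the kind of dimensional-accuracy check described: measurements taken in the reconstructed 3-D model are compared against tape-measured ground truth and summarized as absolute and percent error. The feature names and values below are hypothetical, not data from the study.

      # Sketch: compare real vs. virtual-model dimensions (hypothetical values).
      measurements = {
          # feature: (real_cm, virtual_cm)
          "doorway_width": (81.3, 80.9),
          "threshold_height": (2.5, 2.7),
          "ramp_length": (305.0, 302.1),
      }

      errors = {k: abs(v - r) for k, (r, v) in measurements.items()}
      pct = {k: 100.0 * e / measurements[k][0] for k, e in errors.items()}

      for k in measurements:
          print(f"{k}: absolute error {errors[k]:.1f} cm ({pct[k]:.1f}%)")
      print(f"mean absolute error: {sum(errors.values()) / len(errors):.2f} cm")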

  15. Simulating the decentralized processes of the human immune system in a virtual anatomy model.

    PubMed

    Sarpe, Vladimir; Jacob, Christian

    2013-01-01

    Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments, namely (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, which can communicate across multiple local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.
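
    A minimal sketch, in the spirit of the decentralized design described: two compartments (tissue and lymph node) step independently and exchange agents through simple message queues. The class, agent names, and migration probability are illustrative assumptions, not the LINDSAY Composer API.

      # Sketch: two compartmentalized agent populations exchanging agents.
      import random
      from collections import deque

      class Compartment:
          def __init__(self, name, agents):
              self.name, self.agents, self.outbox = name, list(agents), deque()

          def step(self):
              # Each agent has a small chance of migrating to the other compartment.
              for agent in list(self.agents):
                  if random.random() < 0.1:
                      self.agents.remove(agent)
                      self.outbox.append(agent)

          def receive(self, inbox):
              while inbox:
                  self.agents.append(inbox.popleft())

      tissue = Compartment("tissue", ["dendritic_cell"] * 5 + ["virus"] * 3)
      lymph_node = Compartment("lymph_node", ["t_cell"] * 8)

      for _ in range(10):          # the real system steps the sites asynchronously;
          tissue.step()            # here they simply alternate for clarity
          lymph_node.step()
          lymph_node.receive(tissue.outbox)
          tissue.receive(lymph_node.outbox)

      print(len(tissue.agents), "agents in tissue;", len(lymph_node.agents), "in lymph node")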

  16. Internet-based distributed collaborative environment for engineering education and design

    NASA Astrophysics Data System (ADS)

    Sun, Qiuli

    2001-07-01

    This research investigates the use of the Internet for engineering education, design, and analysis through the presentation of a Virtual City environment. The main focus of this research was to provide an infrastructure for engineering education, test the concept of distributed collaborative design and analysis, develop and implement the Virtual City environment, and assess the environment's effectiveness in the real world. A three-tier architecture was adopted in the development of the prototype, which contains an online database server, a Web server as well as multi-user servers, and client browsers. The environment is composed of five components: a 3D virtual world, multiple Internet-based multimedia modules, an online database, a collaborative geometric modeling module, and a collaborative analysis module. The environment was designed using multiple Internet-based technologies, such as Shockwave, Java, Java 3D, VRML, Perl, ASP, SQL, and a database. These various technologies together formed the basis of the environment and were programmed to communicate smoothly with each other. Three assessments were conducted over a period of three semesters. The Virtual City is open to the public at www.vcity.ou.edu. The online database was designed to manage the changeable data related to the environment. The virtual world was used to implement 3D visualization and tie the multimedia modules together. Students are allowed to build segments of the 3D virtual world upon completion of appropriate undergraduate courses in civil engineering. The end result is a complete virtual world that contains designs from all of their coursework and is viewable on the Internet. The environment is a content-rich educational system, which can be used to teach multiple engineering topics with the help of 3D visualization, animations, and simulations. The concept of collaborative design and analysis using the Internet was investigated and implemented. Geographically dispersed users can build the same geometric model simultaneously over the Internet and communicate with each other through a chat room. They can also conduct finite element analysis collaboratively on the same object over the Internet. They can mesh the same object, apply and edit the same boundary conditions and forces, obtain the same analysis results, and then discuss the results through the Internet.

  17. A Practical Guide, with Theoretical Underpinnings, for Creating Effective Virtual Reality Learning Environments

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.; Domingo, Jelia

    2017-01-01

    With the advent of open source virtual environments, the associated cost reductions, and the more flexible options, avatar-based virtual reality environments are within reach of educators. By using and repurposing readily available virtual environments, instructors can bring engaging, community-building, and immersive learning opportunities to…

  18. Modular mechatronic system for stationary bicycles interfaced with virtual environment for rehabilitation.

    PubMed

    Ranky, Richard G; Sivak, Mark L; Lewis, Jeffrey A; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos

    2014-06-05

    Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirements for efficacious cycling; specifically, recruitment of both extremities and exercising at a high intensity. In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider's lower extremities. The VRACK system, a virtual reality mechatronic bicycle rehabilitation modular system, was designed to convert most bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system was successful in demonstrating that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders.
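
    A hedged sketch of one kinematic parameter the kit monitors: cadence derived from successive pedal crank-angle samples of the sort an IMU on the pedal might report. The 10 Hz sample stream and constant pedaling rate are hypothetical values, not data from the paper.

      # Sketch: estimate cadence (rev/min) from crank-angle samples (degrees, 0-360).
      def cadence_rpm(angles_deg, sample_hz):
          total = 0.0
          for prev, cur in zip(angles_deg, angles_deg[1:]):
              delta = (cur - prev) % 360.0      # unwrap, assuming forward pedaling
              total += delta
          revolutions = total / 360.0
          duration_min = (len(angles_deg) - 1) / sample_hz / 60.0
          return revolutions / duration_min

      samples = [(i * 36.0) % 360.0 for i in range(21)]  # 36 deg per 0.1 s = 60 rpm
      print(f"{cadence_rpm(samples, sample_hz=10):.1f} rpm")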

  19. Evolution-based Virtual Content Insertion with Visually Virtual Interactions in Videos

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Hu; Wu, Ja-Ling

    With the development of content-based multimedia analysis, virtual content insertion has been widely used and studied for video enrichment and multimedia advertising. However, how to automatically insert a user-selected virtual content into personal videos in a less-intrusive manner, with an attractive representation, is a challenging problem. In this chapter, we present an evolution-based virtual content insertion system which can insert virtual contents into videos with evolved animations according to predefined behaviors emulating the characteristics of evolutionary biology. The videos are considered not only as carriers of message conveyed by the virtual content but also as the environment in which the lifelike virtual contents live. Thus, the inserted virtual content will be affected by the videos to trigger a series of artificial evolutions and evolve its appearances and behaviors while interacting with video contents. By inserting virtual contents into videos through the system, users can easily create entertaining storylines and turn their personal videos into visually appealing ones. In addition, it would bring a new opportunity to increase the advertising revenue for video assets of the media industry and online video-sharing websites.

  20. Opportunities for leveraging OS virtualization in high-end supercomputing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Patrick G.; Pedretti, Kevin Thomas Tauke

    2010-11-01

    This paper examines potential motivations for incorporating virtualization support in the system software stacks of high-end capability supercomputers. We advocate that this will increase the flexibility of these platforms significantly and enable new capabilities that are not possible with current fixed software stacks. Our results indicate that compute, virtual memory, and I/O virtualization overheads are low and can be further mitigated by utilizing well-known techniques such as large paging and VMM bypass. Furthermore, since the addition of virtualization support does not affect the performance of applications using the traditional native environment, there is essentially no disadvantage to its addition.

  1. Evaluating and Implementing Learning Environments: A United Kingdom Experience.

    ERIC Educational Resources Information Center

    Ingraham, Bruce; Watson, Barbara; McDowell, Liz; Brockett, Adrian; Fitzpatrick, Simon

    2002-01-01

    Reports on ongoing work at five universities in northeastern England that have been evaluating and implementing online learning environments known as virtual learning environments (VLEs) or managed learning environments (MLEs). Discusses do-it-yourself versus commercial systems; transferability; Web-based versus client-server; integration with…

  2. A Dual Track Treadmill in a Virtual Reality Environment as a Countermeasure for Neurovestibular Adaptations in Microgravity

    NASA Technical Reports Server (NTRS)

    DAndrea, Susan E.; Kahelin, Michael W.; Horowitz, Jay G.; OConnor, Philip A.

    2004-01-01

    While the neurovestibular system is capable of adapting to altered environments such as microgravity, the adaptive state achieved in space is inadequate for 1G. This leads to gait and postural instabilities when returning to a gravity environment and may create serious problems in future missions to Mars. New methods are needed to improve the understanding of the adaptive capabilities of the human neurovestibular system and to develop more effective countermeasures. The concept behind the current study is that by challenging the neurovestibular system while walking or running, a treadmill can help to readjust the relationship between the visual, vestibular and proprioceptive signals that are altered in a microgravity environment. As a countermeasure, this device could also benefit the musculoskeletal and cardiovascular systems and at the same time decrease the overall time spent exercising. The overall goal of this research is to design, develop, build and test a dual track treadmill, which utilizes virtual reality (VR) displays.

  3. A Dual Track Treadmill in a Virtual Reality Environment as a Countermeasure for Neurovestibular Adaptations in Microgravity

    NASA Technical Reports Server (NTRS)

    DAndrea, Susan E.; Kahelin, Michael W.; Horowitz, Jay G.; OConnor, Philip A.

    2004-01-01

    While the neurovestibular system is capable of adapting to altered environments such as microgravity, the adaptive state achieved in space is inadequate for 1G. This leads to gait and postural instabilities when returning to a gravity environment and may create serious problems in future missions to Mars. New methods are needed to improve the understanding of the adaptive capabilities of the human neurovestibular system and to develop more effective countermeasures. The concept behind the current study is that by challenging the neurovestibular system while walking or running, a treadmill can help to readjust the relationship between the visual, vestibular and proprioceptive signals that are altered in a microgravity environment. As a countermeasure, this device could also benefit the musculoskeletal and cardiovascular systems and at the same time decrease the overall time spent exercising. The overall goal of this research is to design, develop, build and test a dual track treadmill, which utilizes virtual reality (VR) displays.

  4. Virtual Machine Modules for Use by DoD C4I Support Centers

    DTIC Science & Technology

    2010-09-01

    This thesis focuses on the benefits and methods of using virtual machine modules, which drastically reduce the footprint required for established environments and eliminate storage requirements for older systems.

  5. Reactivity to Cannabis Cues in Virtual Reality Environments†

    PubMed Central

    Bordnick, Patrick S.; Copp, Hilary L.; Traylor, Amy; Graap, Ken M.; Carter, Brian L.; Walton, Alicia; Ferrer, Mirtha

    2014-01-01

    Virtual reality (VR) cue environments have been developed and successfully tested in nicotine, cocaine, and alcohol abusers. Aims in the current article include the development and testing of a novel VR cannabis cue reactivity assessment system. It was hypothesized that subjective craving levels and attention to cannabis cues would be higher in VR environments with cannabis cues than in VR neutral environments. Twenty nontreatment-seeking current cannabis smokers participated in the VR cue trial. During the VR cue trial, participants were exposed to four virtual environments that contained audio, visual, olfactory, and vibrotactile sensory stimuli. Two VR environments contained cannabis cues that consisted of a party room in which people were smoking cannabis and a room containing cannabis paraphernalia without people. Two VR neutral rooms without cannabis cues consisted of a digital art gallery with nature videos. Subjective craving and attention to cues were significantly higher in the VR cannabis environments compared to the VR neutral environments. These findings indicate that VR cannabis cue reactivity may offer a new technology-based method to advance addiction research and treatment. PMID:19705672

  6. Virtual environment application with partial gravity simulation

    NASA Technical Reports Server (NTRS)

    Ray, David M.; Vanchau, Michael N.

    1994-01-01

    To support manned missions to the surface of Mars and missions requiring manipulation of payloads and locomotion in space, a training facility is required to simulate the conditions of both partial and microgravity. A partial gravity simulator (Pogo) which uses pneumatic suspension is being studied for use in virtual reality training. Pogo maintains a constant partial gravity simulation with a variation of simulated body force between 2.2 and 10 percent, depending on the type of locomotion inputs. This paper is based on the concept and application of a virtual environment system with Pogo, including a head-mounted display and glove. The reality engine consists of a high-end SGI workstation and PCs which drive Pogo's sensors and the data acquisition hardware used for tracking and control. The tracking system is a hybrid of magnetic and optical trackers integrated for this application.

  7. The Fidelity of ’Feel’: Emotional Affordance in Virtual Environments

    DTIC Science & Technology

    2005-07-01

    This work examines emotional affordance in virtual environments, manipulating not only the environment but also the participant, with a focus on what emotional affordances this manipulation will provide, toward creating emotionally affective VEs. Keywords: immersive environments, virtual environments (VEs), virtual reality, emotion, affordance, fidelity, presence.

  8. Coercive Narratives, Motivation and Role Playing in Virtual Worlds

    DTIC Science & Technology

    2002-01-01

    Coercive narratives are a resource for making immersive virtual environments highly engaging, and interaction also appeals to our natural desire to discover. This work takes advantage of participants' natural tendency to prefer interaction when possible, in the context of participation in an open-ended Virtual Environment (VE), and aims to expand the potential of experience within virtual worlds. Keywords: immersive environments, virtual environments.

  9. Real and virtual explorations of the environment and interactive tracking of movable objects for the blind on the basis of tactile-acoustical maps and 3D environment models.

    PubMed

    Hub, Andreas; Hartter, Tim; Kombrink, Stefan; Ertl, Thomas

    2008-01-01

    This study describes the development of a multi-functional assistant system for the blind that combines localisation, real and virtual navigation within modelled environments, and the identification and tracking of fixed and movable objects. The approximate position of buildings is determined with a global positioning sensor (GPS); the user then establishes an exact position at a specific landmark, like a door. This location initialises indoor navigation, based on an inertial sensor, a step recognition algorithm and a map. Tracking of movable objects is provided by another inertial sensor and a head-mounted stereo camera, combined with 3D environmental models. This study developed an algorithm based on shape and colour to identify objects and used a common face detection algorithm to inform the user of the presence and position of others. The system allows blind people to determine their position with approximately 1 metre accuracy. Virtual exploration of the environment can be accomplished by moving one's finger on the touch screen of a small portable tablet PC. The names of rooms, building features and hazards, modelled objects and their positions are presented acoustically or in Braille. Given adequate environmental models, this system offers blind people the opportunity to navigate independently and safely, even within unknown environments. Additionally, the system facilitates education and rehabilitation by providing, in several languages, object names, features and relative positions.
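
    A minimal sketch of the indoor dead-reckoning idea described: starting from a known landmark such as a door, each detected step advances the position estimate along the current inertial heading. The fixed step length and the heading sequence are illustrative assumptions, not parameters from the study.

      # Sketch: step-based dead reckoning from a known landmark.
      import math

      def dead_reckon(start_xy, step_events, step_length_m=0.7):
          """step_events: iterable of headings (degrees) at each detected step."""
          x, y = start_xy
          for heading_deg in step_events:
              h = math.radians(heading_deg)
              x += step_length_m * math.sin(h)   # east component
              y += step_length_m * math.cos(h)   # north component
          return x, y

      # Ten steps north from the door, then five steps east.
      print(dead_reckon((0.0, 0.0), [0.0] * 10 + [90.0] * 5))  # -> approx (3.5, 7.0)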

  10. Virtual reality: a reality for future military pilotage?

    NASA Astrophysics Data System (ADS)

    McIntire, John P.; Martinsen, Gary L.; Marasco, Peter L.; Havig, Paul R.

    2009-05-01

    Virtual reality (VR) systems provide exciting new ways to interact with information and with the world. The visual VR environment can be synthetic (computer generated) or be an indirect view of the real world using sensors and displays. With the potential opportunities of a VR system, the question arises about what benefits or detriments a military pilot might incur by operating in such an environment. Immersive and compelling VR displays could be accomplished with an HMD (e.g., imagery on the visor), large area collimated displays, or by putting the imagery on an opaque canopy. But what issues arise when, instead of viewing the world directly, a pilot views a "virtual" image of the world? Is 20/20 visual acuity in a VR system good enough? To deliver this acuity over the entire visual field would require over 43 megapixels (MP) of display surface for an HMD or about 150 MP for an immersive CAVE system, either of which presents a serious challenge with current technology. Additionally, the same number of sensor pixels would be required to drive the displays to this resolution (and formidable network architectures required to relay this information), or massive computer clusters are necessary to create an entirely computer-generated virtual reality with this resolution. Can we presently implement such a system? What other visual requirements or engineering issues should be considered? With the evolving technology, there are many technological issues and human factors considerations that need to be addressed before a pilot is placed within a virtual cockpit.
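
    A hedged worked example of the arithmetic behind the quoted pixel counts: 20/20 acuity corresponds to roughly 1 arcminute per pixel, i.e. about 60 pixels per degree. The per-eye HMD field of 120 by 100 degrees and the surround field of 360 by 120 degrees below are illustrative assumptions, not values stated in the abstract; they simply yield figures on the order of the 43 MP and 150 MP quoted.

      % Assuming ~60 pixels per degree (1 arcmin/pixel) and illustrative fields of view:
      \begin{align*}
      N_{\text{HMD}}  &\approx (120 \cdot 60)\,(100 \cdot 60) = 7200 \times 6000 \approx 43\ \text{MP},\\
      N_{\text{CAVE}} &\approx (360 \cdot 60)\,(120 \cdot 60) = 21600 \times 7200 \approx 156\ \text{MP}.
      \end{align*}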

  11. Engagement of neural circuits underlying 2D spatial navigation in a rodent virtual reality system.

    PubMed

    Aronov, Dmitriy; Tank, David W

    2014-10-22

    Virtual reality (VR) enables precise control of an animal's environment and otherwise impossible experimental manipulations. Neural activity in rodents has been studied on virtual 1D tracks. However, 2D navigation imposes additional requirements, such as the processing of head direction and environment boundaries, and it is unknown whether the neural circuits underlying 2D representations can be sufficiently engaged in VR. We implemented a VR setup for rats, including software and large-scale electrophysiology, that supports 2D navigation by allowing rotation and walking in any direction. The entorhinal-hippocampal circuit, including place, head direction, and grid cells, showed 2D activity patterns similar to those in the real world. Furthermore, border cells were observed, and hippocampal remapping was driven by environment shape, suggesting functional processing of virtual boundaries. These results illustrate that 2D spatial representations can be engaged by visual and rotational vestibular stimuli alone and suggest a novel VR tool for studying rat navigation.

  12. Augmented Reality for Close Quarters Combat

    ScienceCinema

    None

    2018-01-16

    Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.

  13. Virtual reality hardware for use in interactive 3D data fusion and visualization

    NASA Astrophysics Data System (ADS)

    Gourley, Christopher S.; Abidi, Mongi A.

    1997-09-01

    Virtual reality has become a tool for use in many areas of research. We have designed and built a VR system for use in range data fusion and visualization. One major VR tool is the CAVE. This is the ultimate visualization tool, but comes with a large price tag. Our design uses a unique CAVE whose graphics are powered by a desktop computer instead of a larger rack machine, making it much less costly. The system consists of a screen eight feet tall by twenty-seven feet wide, giving a variable field-of-view currently set at 160 degrees. A Silicon Graphics Indigo2 MaxImpact with the Impact channel option is used for display. This gives the capability to drive three projectors at a resolution of 640 by 480 for use in displaying the virtual environment and one 640 by 480 display for a user control interface. This machine is also the first desktop package which has built-in hardware texture mapping. This feature allows us to quickly fuse the range and intensity data and other multi-sensory data. The final goal is a complete 3D texture mapped model of the environment. A dataglove, magnetic tracker, and spaceball are to be used for manipulation of the data and navigation through the virtual environment. This system gives several users the ability to interactively create 3D models from multiple range images.

  14. Human Motion Tracking and Glove-Based User Interfaces for Virtual Environments in ANVIL

    NASA Technical Reports Server (NTRS)

    Dumas, Joseph D., II

    2002-01-01

    The Army/NASA Virtual Innovations Laboratory (ANVIL) at Marshall Space Flight Center (MSFC) provides an environment where engineers and other personnel can investigate novel applications of computer simulation and Virtual Reality (VR) technologies. Among the many hardware and software resources in ANVIL are several high-performance Silicon Graphics computer systems and a number of commercial software packages, such as Division MockUp by Parametric Technology Corporation (PTC) and Jack by Unigraphics Solutions, Inc. These hardware and software platforms are used in conjunction with various VR peripheral I/O (input / output) devices, CAD (computer aided design) models, etc. to support the objectives of the MSFC Engineering Systems Department/Systems Engineering Support Group (ED42) by studying engineering designs, chiefly from the standpoint of human factors and ergonomics. One of the more time-consuming tasks facing ANVIL personnel involves the testing and evaluation of peripheral I/O devices and the integration of new devices with existing hardware and software platforms. Another important challenge is the development of innovative user interfaces to allow efficient, intuitive interaction between simulation users and the virtual environments they are investigating. As part of his Summer Faculty Fellowship, the author was tasked with verifying the operation of some recently acquired peripheral interface devices and developing new, easy-to-use interfaces that could be used with existing VR hardware and software to better support ANVIL projects.

  15. Relative Panoramic Camera Position Estimation for Image-Based Virtual Reality Networks in Indoor Environments

    NASA Astrophysics Data System (ADS)

    Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.

    2017-09-01

    Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated at lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite System (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.

  16. Linking Immersive Virtual Field Trips with an Adaptive Learning Platform

    NASA Astrophysics Data System (ADS)

    Bruce, G.; Taylor, W.; Anbar, A. D.; Semken, S. C.; Buxner, S.; Mead, C.; El-Moujaber, E.; Summons, R. E.; Oliver, C.

    2016-12-01

    The use of virtual environments in science education has been constrained by the difficulty of guiding a learner's actions within those environments. In this work, we demonstrate how advances in education software technology allow educators to create interactive learning experiences that respond and adapt intelligently to learner input within the virtual environment. This innovative technology provides a far greater capacity for delivering authentic inquiry-driven educational experiences in unique settings from around the world. Our immersive virtual field trips (iVFT) bring students virtually to geologically significant but inaccessible environments, where they learn through authentic practices of scientific inquiry. In one recent example, students explore the fossil beds in Nilpena, South Australia to learn about the Ediacaran fauna. Students interactively engage in 360° recreations of the environment, uncover the nature of the historical ecosystem by identifying fossils with a dichotomous key, explore actual fossil beds in high-resolution imagery, and reconstruct what an ecosystem might have looked like millions of years ago in an interactive simulation. With the new capacity to connect actions within the iVFT to an intelligent tutoring system, these learning experiences can be tracked, guided, and tailored individually to the immediate actions of the student. This new capacity also has great potential for learning designers to take a data-driven approach to lesson improvement and for education researchers to study learning in virtual environments. Thus, we expect iVFT will be fertile ground for novel research. Such iVFT are currently in use in several introductory classes offered online at Arizona State University in anthropology, introductory biology, and astrobiology, reaching thousands of students to date. Drawing from these experiences, we are designing a curriculum for historical geology that will be built around iVFT-based exploration of Earth history.

  17. Innovative application of virtual display technique in virtual museum

    NASA Astrophysics Data System (ADS)

    Zhang, Jiankang

    2017-09-01

    A virtual museum displays and simulates the functions of a real museum on the Internet in the form of interactive 3D virtual reality. Based on the Virtual Reality Modeling Language, building a virtual museum that interacts effectively with the offline museum relies on making full use of 3D panorama, virtual reality and augmented reality techniques, and on innovatively applying dynamic environment modeling, real-time 3D graphics generation, system integration and other key virtual reality techniques to support the overall design of the virtual museum. The 3D panorama technique, also known as panoramic photography, is based on static images of reality. Virtual reality is a computer simulation technique for creating and experiencing an interactive 3D dynamic visual world. Augmented reality, also known as mixed reality, simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience directly in reality. Together these technologies make the virtual museum possible. It will not only bring a better experience and greater convenience to the public, but also help improve the influence and cultural functions of the real museum.

  18. Productive confusions: learning from simulations of pandemic virus outbreaks in Second Life

    NASA Astrophysics Data System (ADS)

    Cárdenas, Micha; Greci, Laura S.; Hurst, Samantha; Garman, Karen; Hoffman, Helene; Huang, Ricky; Gates, Michael; Kho, Kristen; Mehrmand, Elle; Porteous, Todd; Calvitti, Alan; Higginbotham, Erin; Agha, Zia

    2011-03-01

    Users of immersive virtual reality environments have reported a wide variety of side and after effects including the confusion of characteristics of the real and virtual worlds. Perhaps this side effect of confusing the virtual and real can be turned around to explore the possibilities for immersion with minimal technological support in virtual world group training simulations. This paper will describe observations from my time working as an artist/researcher with the UCSD School of Medicine (SoM) and Veterans Administration San Diego Healthcare System (VASDHS) to develop trainings for nurses, doctors and Hospital Incident Command staff that simulate pandemic virus outbreaks. By examining moments of slippage between realities, both into and out of the virtual environment, moments of the confusion of boundaries between real and virtual, we can better understand methods for creating immersion. I will use the mixing of realities as a transversal line of inquiry, borrowing from virtual reality studies, game studies, and anthropological studies to better understand the mechanisms of immersion in virtual worlds. Focusing on drills conducted in Second Life, I will examine moments of training to learn the software interface, moments within the drill and interviews after the drill.

  19. Human responses to augmented virtual scaffolding models.

    PubMed

    Hsiao, Hongwei; Simeonov, Peter; Dotson, Brian; Ammons, Douglas; Kau, Tsui-Ying; Chiou, Sharon

    2005-08-15

    This study investigated the effect of adding real planks, in virtual scaffolding models of elevation, on human performance in a surround-screen virtual reality (SSVR) system. Twenty-four construction workers and 24 inexperienced controls performed walking tasks on real and virtual planks at three virtual heights (0, 6 m, 12 m) and two scaffolding-platform-width conditions (30, 60 cm). Gait patterns, walking instability measurements and cardiovascular reactivity were assessed. The results showed differences in human responses to real vs. virtual planks in walking patterns, instability score and heart-rate inter-beat intervals; it appeared that adding real planks to the SSVR virtual scaffolding model enhanced the quality of SSVR as a human-environment interface research tool. In addition, there were significant differences in performance between construction workers and the control group. The inexperienced participants were more unstable compared to construction workers. Both groups increased their stride length with repetitions of the task, indicating a possibly confidence- or habit-related learning effect. The practical implications of this study are in the adoption of augmented virtual models of elevated construction environments for injury prevention research, and in the development of programmes for balance-control training to reduce the risk of falls at elevation before workers enter a construction job.

  20. Toward a Learner-Centered System for Adult Learning

    ERIC Educational Resources Information Center

    Hermans, Henry; Kalz, Marco; Koper, Rob

    2013-01-01

    Purpose: The purpose of this paper is to present an e-learning system that integrates the use of concepts of virtual learning environments, personal learning environments, and social network sites. The system is based on a learning model which comprises and integrates three learning contexts for the adult learner: the formal, instructional…

  1. The virtual morphology and the main movements of the human neck simulations used for car crash studies

    NASA Astrophysics Data System (ADS)

    Ciunel, St.; Tica, B.

    2016-08-01

    The paper presents studies made on a similar biomechanical system, composed of the neck, head and thorax bones. The models were defined in a CAD environment that includes the Adams algorithm for dynamic simulations. The virtual models and the entire morphology were obtained starting from CT images of a living human subject. The main movements analyzed were axial rotation (left-right), lateral bending (left-right) and flexion-extension. After simulation, the entire biomechanical behavior was obtained in the form of data tables and diagrams. The virtual model composed of the neck and head can be included in a more complex system (such as a car model) and subjected to several impact simulations (virtual crash tests). Our research team also built the main components of a testing device for a dummy neck-head system for car crash tests, using anatomical data.

  2. The development of a collaborative virtual environment for finite element simulation

    NASA Astrophysics Data System (ADS)

    Abdul-Jalil, Mohamad Kasim

    Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features and hidden parts are often difficult to describe verbally or even using traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies have substantial potential to bridge this communication barrier. VR technology allows designers to immerse themselves in a virtual environment to view and manipulate a model just as in real life. Fast Internet connectivity has enabled fast data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed in the past decade, they are limited to high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications wherein participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding the entire process. This faster design process is achieved through the development of capabilities to better enable the multidisciplinary modeling and trade-off decisions that are so critical before launching into a formal detailed design. The features of the environment developed as a result of this research include the ability to view design models, use voice interaction, and link engineering analysis modules (such as the Finite Element Analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design purposes is obtaining pertinent simulation results in real time. This is critical so that the designers can make decisions based on these results quickly. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real time or near real time to make the virtual meeting environment realistic. In this research, the finite difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (e.g., stress, displacement), so as to demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides fast approximation and is well suited to the virtual meeting environment, where fast response time is required. The DSA-based approach is tested on several example problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and a reduction of the time required for a complex design process can be accomplished using the approach developed in this research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants confirmed a preference for the collaborative virtual environment developed in this dissertation work (CVRoom) over other modes of interaction.
It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
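
    A hedged sketch of the finite-difference sensitivity idea described above: one extra analysis at a perturbed design gives an approximate derivative of the response, which can then be reused to estimate responses for nearby designs in near real time during a virtual meeting. The response function below is a stand-in for an expensive FEA solve, not the dissertation's code, and the thickness values are illustrative.

      # Sketch: finite-difference design sensitivity used for fast re-estimation.
      def max_stress(thickness_mm):
          """Stand-in for an expensive finite element analysis (hypothetical model)."""
          return 1200.0 / thickness_mm          # stress falls as the plate thickens

      def sensitivity(response, x0, dx=1e-3):
          return (response(x0 + dx) - response(x0)) / dx

      x0 = 10.0                                  # baseline thickness, mm
      r0 = max_stress(x0)                        # one full analysis
      dr_dx = sensitivity(max_stress, x0)        # one extra analysis

      for x in (10.5, 11.0, 12.0):               # fast estimates during the design meeting
          approx = r0 + dr_dx * (x - x0)
          print(f"t={x} mm  approx {approx:.1f} MPa  exact {max_stress(x):.1f} MPa")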

  3. Consolidating Learning in Power Systems: Virtual Reality Applied to the Study of the Operation of Electric Power Transformers

    ERIC Educational Resources Information Center

    Barata, Pebertli Nils Alho; Filho, Manoel Ribeiro; Nunes, Marcus V. Alves

    2015-01-01

    Within the field of electric power systems, the study of electrical equipment can be frustrating and demotivating because of the lack of a clear vision of how this equipment functions and operates in a real environment. The use of virtual reality can provide a more concrete representation for students, who rarely have the opportunity to visit a…

  4. Effects of Exercise in Immersive Virtual Environments on Cortical Neural Oscillations and Mental State

    PubMed Central

    Vogt, Tobias; Herpers, Rainer; Askew, Christopher D.; Scherfgen, David; Strüder, Heiko K.; Schneider, Stefan

    2015-01-01

    Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment for those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of real exercise within a virtual environment alters the perceived sense of presence, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by posttrial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures and this likely contributed to an enhanced sense of presence. PMID:26366305

  5. Effects of Exercise in Immersive Virtual Environments on Cortical Neural Oscillations and Mental State.

    PubMed

    Vogt, Tobias; Herpers, Rainer; Askew, Christopher D; Scherfgen, David; Strüder, Heiko K; Schneider, Stefan

    2015-01-01

    Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment for those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of real exercise within a virtual environment alters the perceived sense of presence, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by posttrial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures and this likely contributed to an enhanced sense of presence.

  6. Semi-Immersive Virtual Turbine Engine Simulation System

    NASA Astrophysics Data System (ADS)

    Abidi, Mustufa H.; Al-Ahmari, Abdulrahman M.; Ahmad, Ali; Darmoul, Saber; Ameen, Wadea

    2018-05-01

    The design and verification of assembly operations is essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress, and has reached a stage where current environments enable rich and multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. The benefits of building and using Virtual Reality (VR) models in assembly process verification are discussed in this paper. In this paper, we present the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system. The system enables stereoscopic visuals, surround sounds, and ample and intuitive interaction with developed models. A special software architecture is suggested to describe the assembly parts and assembly sequence in VR. A collision detection mechanism is employed that provides visual feedback to check the interference between components. The system is tested for virtual prototype and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, tactile, as well as force feedback. The system is shown to be effective and efficient for validating the design of assembly, part design, and operations planning.
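
    A minimal sketch of the simplest form of the collision feedback described: an axis-aligned bounding-box (AABB) interference check between two assembly parts, used here only to trigger a visual warning. The part names and extents are illustrative assumptions, not data from the turbine-engine model.

      # Sketch: AABB interference test between two assembly parts.
      def overlaps(box_a, box_b):
          """Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax)) in metres."""
          (a_lo, a_hi), (b_lo, b_hi) = box_a, box_b
          return all(a_lo[i] <= b_hi[i] and b_lo[i] <= a_hi[i] for i in range(3))

      turbine_blade = ((0.00, 0.00, 0.00), (0.30, 0.05, 0.02))
      casing_rib    = ((0.28, 0.00, 0.00), (0.35, 0.10, 0.05))

      if overlaps(turbine_blade, casing_rib):
          print("interference detected: highlight parts in red")   # visual feedback cue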

  7. Exploring 4D Flow Data in an Immersive Virtual Environment

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Butkiewicz, T.

    2017-12-01

    Ocean models help us to understand and predict a wide range of intricate physical processes which comprise the atmospheric and oceanic systems of the Earth. Because these models output an abundance of complex time-varying three-dimensional (i.e., 4D) data, effectively conveying the myriad information from a given model poses a significant visualization challenge. The majority of the research effort into this problem has concentrated around synthesizing and examining methods for representing the data itself; by comparison, relatively few studies have looked into the potential merits of various viewing conditions and virtual environments. We seek to improve our understanding of the benefits offered by current consumer-grade virtual reality (VR) systems through an immersive, interactive 4D flow visualization system. Our dataset is a Regional Ocean Modeling System (ROMS) model representing a 12-hour tidal cycle of the currents within New Hampshire's Great Bay estuary. The model data was loaded into a custom VR particle system application using the OpenVR software library and the HTC Vive hardware, which tracks a headset and two six-degree-of-freedom (6DOF) controllers within a 5m-by-5m area. The resulting visualization system allows the user to coexist in the same virtual space as the data, enabling rapid and intuitive analysis of the flow model through natural interactions with the dataset and within the virtual environment. Whereas a traditional computer screen typically requires the user to reposition a virtual camera in the scene to obtain the desired view of the data, in virtual reality the user can simply move their head to the desired viewpoint, completely eliminating the mental context switches from data exploration/analysis to view adjustment and back. The tracked controllers become tools to quickly manipulate (reposition, reorient, and rescale) the dataset and to interrogate it by, e.g., releasing dye particles into the flow field, probing scalar velocities, placing a cutting plane through a region of interest, etc. It is hypothesized that the advantages afforded by head-tracked viewing and 6DOF interaction devices will lead to faster and more efficient examination of 4D flow data. A human factors study is currently being prepared to empirically evaluate this method of visualization and interaction.
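
    A hedged sketch of the dye-particle interrogation described: particles seeded at the controller position are advected through a time-varying velocity field with simple forward Euler steps. The analytic "tidal" field below is a toy stand-in for the ROMS model output, and the seed points, time step, and period are illustrative.

      # Sketch: release dye particles and advect them through a time-varying flow.
      import math

      def velocity(x, y, t):
          """Toy tidal-like field: rotating flow whose strength oscillates in time."""
          tide = math.sin(2.0 * math.pi * t / 12.0)        # 12-hour cycle
          return (-y * tide, x * tide)

      def advect(particles, t0, dt=0.1, steps=50):
          t = t0
          for _ in range(steps):
              particles = [(x + dt * velocity(x, y, t)[0],
                            y + dt * velocity(x, y, t)[1]) for x, y in particles]
              t += dt
          return particles

      seeds = [(1.0, 0.0), (0.0, 1.0), (0.5, 0.5)]          # released at the controller tip
      print(advect(seeds, t0=3.0))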

  8. Creating virtual humans for simulation-based training and planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stansfield, S.; Sobel, A.

    1998-05-12

    Sandia National Laboratories has developed a distributed, high fidelity simulation system for training and planning small team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to CGF to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing and training assault operations.

  9. My Science Tutor: A Conversational Multimedia Virtual Tutor

    ERIC Educational Resources Information Center

    Ward, Wayne; Cole, Ron; Bolaños, Daniel; Buchenroth-Martin, Cindy; Svirsky, Edward; Weston, Tim

    2013-01-01

    My Science Tutor (MyST) is an intelligent tutoring system designed to improve science learning by elementary school students through conversational dialogs with a virtual science tutor in an interactive multimedia environment. Marni, a lifelike 3-D character, engages individual students in spoken dialogs following classroom investigations using…

  10. Improving the Efficiency of Virtual Reality Training by Integrating Partly Observational Learning

    ERIC Educational Resources Information Center

    Yuviler-Gavish, Nirit; Rodríguez, Jorge; Gutiérrez, Teresa; Sánchez, Emilio; Casado, Sara

    2014-01-01

    The current study hypothesized that integrating partly observational learning into virtual reality training systems (VRTS) can enhance training efficiency for procedural tasks. A common approach in designing VRTS is the enactive approach, which stresses the importance of physical actions within the environment to enhance perception and improve…

  11. Intelligent Tutors in Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Yan, Peng; Slator, Brian M.; Vender, Bradley; Jin, Wei; Kariluoma, Matti; Borchert, Otto; Hokanson, Guy; Aggarwal, Vaibhav; Cosmano, Bob; Cox, Kathleen T.; Pilch, André; Marry, Andrew

    2013-01-01

    Research into virtual role-based learning has progressed over the past decade. Modern issues include gauging the difficulty of designing a goal system capable of meeting the requirements of students with different knowledge levels, and the reasonability and possibility of taking advantage of the well-designed formula and techniques served in other…

  12. 3D Virtual Environment Used to Support Lighting System Management in a Building

    NASA Astrophysics Data System (ADS)

    Sampaio, A. Z.; Ferreira, M. M.; Rosário, D. P.

    The main aim of the research project, which is in progress at the UTL, is to develop a virtual interactive model as a tool to support decision-making in the planning of construction maintenance and facilities management. The virtual model allows the user to transmit, visually and interactively, information related to the components of a building as a function of time. In addition, the analysis of solutions for repair work/substitution and their inherent costs is supported, with the results obtained interactively and visualized in the virtual environment itself. The first component of the virtual prototype concerns the management of lamps in a lighting system, and it was applied in a case study. The interactive application allows examination of the physical model, visualizing, for each element modeled in 3D and linked to a database, the corresponding technical information concerned with the use of the material, calculated for different points in time during its life. The control of a lamp stock, the constant updating of lifetime information and the planning of periodic local inspections are supported by the prototype. This is an important means of cooperation between collaborators involved in building management.
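
    A minimal sketch of the database link described: each 3D lamp element carries a record of rated life and accumulated burn hours, from which remaining life and a replacement flag can be derived. The field names, thresholds, and values are illustrative assumptions, not the project's actual schema.

      # Sketch: lamp records linked to 3D element IDs, with remaining-life flags.
      lamps = [
          {"element_id": "L-101", "rated_hours": 10000, "burn_hours": 9200},
          {"element_id": "L-102", "rated_hours": 10000, "burn_hours": 4100},
          {"element_id": "L-203", "rated_hours": 6000,  "burn_hours": 5900},
      ]

      for lamp in lamps:
          remaining = lamp["rated_hours"] - lamp["burn_hours"]
          flag = "REPLACE SOON" if remaining < 1000 else "ok"   # illustrative threshold
          print(f'{lamp["element_id"]}: {remaining} h remaining ({flag})')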

  13. The development of virtual reality therapy (VRT) system for the treatment of acrophobia and therapeutic case.

    PubMed

    Jang, Dong P; Ku, Jeong H; Choi, Young H; Wiederhold, Brenda K; Nam, San W; Kim, In Y; Kim, Sun I

    2002-09-01

    Virtual reality therapy (VRT), based on sophisticated VR technology, has recently been used in the treatment of subjects diagnosed with acrophobia, a disorder characterized by marked anxiety upon exposure to heights and avoidance of heights. Conventional VR systems for the treatment of acrophobia have limitations: overly costly devices or somewhat unrealistic graphic scenes. The goal of this study was to develop an inexpensive and more realistic virtual environment (VE) in which to perform exposure therapy for acrophobia. It is based on a personal computer and a virtual scene of a bungee-jump tower in the middle of a large city. The virtual scenario includes an open lift surrounded by props beside the tower, which allows the patient to feel a sense of height. The effectiveness of the VE was evaluated through the clinical treatment of a subject who was suffering from a fear of heights. The VR environment proved effective and realistic for overcoming acrophobia, according not only to comparisons of a variety of questionnaires before and after treatment but also to the subject's comments that the VE seemed to evoke more fearful feelings than the real situation.

  14. Modeling human behaviors and reactions under dangerous environment.

    PubMed

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers in real time in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions of different people; capturing different motion postures by the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling the character's perceptions, modeling the character's decision making, modeling the character's movements, modeling the character's interaction with the environment, and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, and safety planning in chemical factories, and in the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas integrating perception and intelligence into a virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence, the accurate modeling of human vision, smell, touch and hearing, and the diversity and effects of emotion and personality in decision making. There are three types of software platforms which could be employed to realize the motion and intelligence within one system, and their advantages and disadvantages are discussed.

  15. Virtual Reality: The Future of Animated Virtual Instructor, the Technology and Its Emergence to a Productive E-Learning Environment.

    ERIC Educational Resources Information Center

    Jiman, Juhanita

    This paper discusses the use of Virtual Reality (VR) in e-learning environments where an intelligent three-dimensional (3D) virtual person plays the role of an instructor. With the existence of this virtual instructor, it is hoped that the teaching and learning in the e-environment will be more effective and productive. This virtual 3D animated…

  16. Cybersickness and Anxiety During Simulated Motion: Implications for VRET.

    PubMed

    Bruck, Susan; Watters, Paul

    2009-01-01

    Some clinicians have suggested using virtual reality environments to deliver psychological interventions to treat anxiety disorders. However, given a significant body of work on cybersickness symptoms which may arise in virtual environments - especially those involving simulated motion - we tested (a) whether being exposed to a virtual reality environment alone causes anxiety to increase, and (b) whether exposure to simulated motion in a virtual reality environment increases anxiety. Using a repeated measures design, we used Kim's Anxiety Scale questionnaire to compare baseline anxiety, anxiety after virtual environment exposure, and anxiety after simulated motion. While there was no significant effect on anxiety for being in a virtual environment with no simulated motion, the introduction of simulated motion caused anxiety to significantly increase, but not to a severe or extreme level. The implications of this work for virtual reality exposure therapy (VRET) are discussed.

  17. Active Dentate Granule Cells Encode Experience to Promote the Addition of Adult-Born Hippocampal Neurons

    PubMed Central

    Kirschen, Gregory W.; Shen, Jia; Wang, Jia; Man, Guoming; Wu, Song

    2017-01-01

    The continuous addition of new dentate granule cells (DGCs), which is regulated exquisitely by brain activity, renders the hippocampus plastic. However, how neural circuits encode experiences to affect the addition of adult-born neurons remains unknown. Here, we used endoscopic Ca2+ imaging to track the real-time activity of individual DGCs in freely behaving mice. For the first time, we found that active DGCs responded to a novel experience by preferentially increasing their Ca2+ event frequency. This elevated activity, which we found to be associated with object exploration, returned to baseline by 1 h in the same environment, but could be dishabituated via introduction to a novel environment. To transition seamlessly between environments, we next established a freely controllable virtual reality system for unrestrained mice. We again observed increased firing of active neurons in a virtual enriched environment. Interestingly, multiple novel virtual experiences cumulatively increased the number of newborn neurons compared with a single experience. Finally, optogenetic silencing of existing DGCs during novel environmental exploration perturbed experience-induced neuronal addition. Our study shows that the adult brain conveys novel, enriched experiences to increase the addition of adult-born hippocampal neurons by increasing the firing of active DGCs. SIGNIFICANCE STATEMENT Adult brains are constantly reshaping themselves from synapses to circuits as we encounter novel experiences from moment to moment. Importantly, this reshaping includes the addition of newborn hippocampal neurons. However, it remains largely unknown how our circuits encode experience-induced brain activity to govern the addition of new hippocampal neurons. By coupling in vivo Ca2+ imaging of dentate granule neurons with a novel, unrestrained virtual reality system for rodents, we discovered that a new experience increased firing of active dentate granule neurons rapidly and robustly. Exploration in multiple novel virtual environments, compared with a single environment, promoted dentate activation and cumulatively enhanced the addition of new hippocampal neurons. Finally, silencing this activation optogenetically during novel experiences perturbed experience-induced neuronal addition. PMID:28373391

  18. Marshall Engineers Use Virtual Reality

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup is to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  19. Telearch - Integrated visual simulation environment for collaborative virtual archaeology.

    NASA Astrophysics Data System (ADS)

    Kurillo, Gregorij; Forte, Maurizio

    Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integration and collaborative interaction to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for remote collaboration among geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D and 3D video streaming technology to facilitate the remote presence of users. In this paper, we present several experimental case studies to demonstrate integration and interaction with 3D models and geographical information system (GIS) data in this collaborative environment.

  20. Digital fabrication of multi-material biomedical objects.

    PubMed

    Cheung, H H; Choi, S H

    2009-12-01

    This paper describes a multi-material virtual prototyping (MMVP) system for modelling and digital fabrication of discrete and functionally graded multi-material objects for biomedical applications. The MMVP system consists of a DMMVP module, an FGMVP module and a virtual reality (VR) simulation module. The DMMVP module is used to model discrete multi-material (DMM) objects, while the FGMVP module is for functionally graded multi-material (FGM) objects. The VR simulation module integrates these two modules to perform digital fabrication of multi-material objects, which can be subsequently visualized and analysed in a virtual environment to optimize MMLM processes for fabrication of product prototypes. Using the MMVP system, two biomedical objects, including a DMM human spine and an FGM intervertebral disc spacer are modelled and digitally fabricated for visualization and analysis in a VR environment. These studies show that the MMVP system is a practical tool for modelling, visualization, and subsequent fabrication of biomedical objects of discrete and functionally graded multi-materials for biomedical applications. The system may be adapted to control MMLM machines with appropriate hardware for physical fabrication of biomedical objects.

  1. Thundercloud: Domain specific information security training for the smart grid

    NASA Astrophysics Data System (ADS)

    Stites, Joseph

    In this paper, we describe a cloud-based virtual smart grid test bed, ThunderCloud, which is intended to be used for domain-specific security training applicable to the smart grid environment. The test bed consists of virtual machines connected using a virtual internal network. ThunderCloud is remotely accessible, allowing students to undergo educational exercises online. We also describe a series of practical exercises that we have developed for providing domain-specific training using ThunderCloud. The training exercises and attacks are designed to be realistic and to reflect known vulnerabilities and attacks reported in the smart grid environment. We were able to use ThunderCloud to offer practical domain-specific security training for the smart grid environment to computer science students at little or no cost to the department and no risk to any real networks or systems.

  2. High sensitivity to multisensory conflicts in agoraphobia exhibited by virtual reality.

    PubMed

    Viaud-Delmon, Isabelle; Warusfel, Olivier; Seguelas, Angeline; Rio, Emmanuel; Jouvent, Roland

    2006-10-01

    The primary aim of this study was to evaluate the effect of auditory feedback in a VR system planned for clinical use and to address the different factors that should be taken into account in building a bimodal virtual environment (VE). We conducted an experiment in which we assessed spatial performance in agoraphobic patients and normal subjects, comparing two kinds of VEs, visual alone (Vis) and auditory-visual (AVis), during separate sessions. Subjects were equipped with a head-mounted display coupled with an electromagnetic sensor system and immersed in a virtual town. Their task was to locate different landmarks and become familiar with the town. In the AVis condition, subjects were equipped with the head-mounted display and headphones, which delivered a soundscape updated in real time according to their movement in the virtual town. While general performance remained comparable across the conditions, the reported feeling of immersion was more compelling in the AVis environment. However, patients exhibited more cybersickness symptoms in this condition. The results of this study point to the multisensory integration deficit of agoraphobic patients and underline the need for further research on multimodal VR systems for clinical use.

  3. Multi-modal virtual environment research at Armstrong Laboratory

    NASA Technical Reports Server (NTRS)

    Eggleston, Robert G.

    1995-01-01

    One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.

  4. Exploring Learning through Audience Interaction in Virtual Reality Dome Theaters

    NASA Astrophysics Data System (ADS)

    Apostolellis, Panagiotis; Daradoumis, Thanasis

    Informal learning in public spaces like museums, science centers and planetariums has become increasingly popular in recent years. Recent advancements in large-scale displays have allowed contemporary technology-enhanced museums to be equipped with digital domes, some with real-time capabilities like Virtual Reality systems. Through an extensive literature review we have come to the conclusion that little to no research has been carried out on the learning outcomes that the combination of VR and audience interaction can provide in the immersive environments of dome theaters. Thus, we propose that audience collaboration in immersive virtual reality environments presents a promising approach to support effective learning in groups of school-aged children.

  5. A collaborative virtual reality environment for neurosurgical planning and training.

    PubMed

    Kockro, Ralf A; Stadie, Axel; Schwandt, Eike; Reisch, Robert; Charalampaki, Cleopatra; Ng, Ivan; Yeo, Tseng Tsai; Hwang, Peter; Serra, Luis; Perneczky, Axel

    2007-11-01

    We have developed a highly interactive virtual environment that enables collaborative examination of stereoscopic three-dimensional (3-D) medical imaging data for planning, discussing, or teaching neurosurgical approaches and strategies. The system consists of an interactive console with which the user manipulates 3-D data using hand-held and tracked devices within a 3-D virtual workspace and a stereoscopic projection system. The projection system displays the 3-D data on a large screen while the user is working with it. This setup allows users to interact intuitively with complex 3-D data while sharing this information with a larger audience. We have been using this system on a routine clinical basis and during neurosurgical training courses to collaboratively plan and discuss neurosurgical procedures with 3-D reconstructions of patient-specific magnetic resonance and computed tomographic imaging data or with a virtual model of the temporal bone. Working collaboratively with the 3-D information of a large, interactive, stereoscopic projection provides an unambiguous way to analyze and understand the anatomic spatial relationships of different surgical corridors. In our experience, the system creates a unique forum for open and precise discussion of neurosurgical approaches. We believe the system provides a highly effective way to work with 3-D data in a group, and it significantly enhances teaching of neurosurgical anatomy and operative strategies.

  6. A Web-Based Virtual Classroom System Model

    ERIC Educational Resources Information Center

    Adewale, Olumide S.; Ibam, Emmanuel O.; Alese, B. K.

    2012-01-01

    The population of students all over the world is growing without a proportionate increase in teaching/learning resources/infrastructure. There is also much quest for learning in an environment that provides equal opportunities to all learners. The need to provide an equal opportunity learning environment that will hitherto improve the system of…

  7. Novel design of interactive multimodal biofeedback system for neurorehabilitation.

    PubMed

    Huang, He; Chen, Y; Xu, W; Sundaram, H; Olson, L; Ingalls, T; Rikakis, T; He, Jiping

    2006-01-01

    A previous design of a biofeedback system for neurorehabilitation in an interactive multimodal environment demonstrated the potential of engaging stroke patients in task-oriented neuromotor rehabilitation. This report explores a new concept and alternative designs for multimedia-based biofeedback systems. In this system, the new interactive multimodal environment was constructed with an abstract presentation of movement parameters. Scenery images or pictures, and their clarity and orientation, are used to reflect the arm movement and its position relative to the target, instead of an animated arm. The multiple biofeedback parameters were classified into hierarchical levels according to the importance of each movement parameter to performance. A new quantified measurement for these parameters was developed to assess the patient's performance both in real time and offline. These parameters were represented by combined visual and auditory presentations using various distinct musical instruments. Overall, the objective of the newly designed system is to explore what information to feed back, and how to feed it back, in an interactive virtual environment in order to enhance the sensorimotor integration that may facilitate the efficient design and application of virtual environment-based therapeutic interventions.

  8. Latency and User Performance in Virtual Environments and Augmented Reality

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    2009-01-01

    System rendering latency has been recognized by senior researchers, such as Professor Frederick Brooks of UNC (Turing Award 1999), as a major factor limiting the realism and utility of head-referenced display systems. Latency has been shown to reduce the user's sense of immersion within a virtual environment, disturb user interaction with virtual objects, and contribute to motion sickness during some simulation tasks. Latency, however, is not just an issue for external display systems, since finite nerve conduction rates and variation in transduction times in the human body's sensors also pose problems for latency management within the nervous system. Some of the phenomena arising from the brain's handling of sensory asynchrony due to latency will be discussed as a prelude to consideration of the effects of latency in interactive displays. The causes and consequences of the erroneous movement that appears in displays due to latency will be illustrated with examples of the user performance impact provided by several experiments. These experiments will review the generality of user sensitivity to latency when users judge either object or environment stability. Hardware and signal processing countermeasures will also be discussed. In particular, the tuning of a simple extrapolative predictive filter not using a dynamic movement model will be presented. Results show that it is possible to adjust this filter so that the appearance of some latencies may be hidden without the introduction of perceptual artifacts such as overshoot. Several examples of the effects on user performance will be illustrated by three-dimensional tracking and tracing tasks executed in virtual environments. These experiments demonstrate classic phenomena known from work on manual control and show the need for very responsive systems if they are intended to support precise manipulation. The practical benefits of removing interfering latencies from interactive systems will be emphasized with some classic final examples from surgical telerobotics and human-computer interaction.
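
    The simple extrapolative predictor mentioned above can be sketched as follows (a hedged Python illustration; the gain, latency and sample values are placeholders, not figures from this work): estimate velocity from the last two tracker samples and project the pose forward by the known display latency, with a gain that trades residual lag against overshoot.

        # Minimal extrapolative predictor: no dynamic movement model, just a
        # finite-difference velocity estimate projected ahead by the latency.
        def predict_pose(samples, latency_s, gain=1.0):
            """samples: list of (time_s, value) tracker readings, newest last."""
            (t0, x0), (t1, x1) = samples[-2], samples[-1]
            velocity = (x1 - x0) / (t1 - t0)
            # gain < 1 trades residual lag for less overshoot when motion reverses
            return x1 + gain * velocity * latency_s

        history = [(0.000, 10.0), (0.010, 10.4)]       # head yaw in degrees
        print(predict_pose(history, latency_s=0.025))  # pose expected at display time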

  9. Identification of Program Signatures from Cloud Computing System Telemetry Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, Nicole M.; Greaves, Mark T.; Smith, William P.

    Malicious cloud computing activity can take many forms, including running unauthorized programs in a virtual environment. Detection of these malicious activities while preserving the privacy of the user is an important research challenge. Prior work has shown the potential viability of using cloud service billing metrics as a mechanism for proxy identification of malicious programs. Previously this novel detection method has been evaluated in a synthetic and isolated computational environment. In this paper we demonstrate the ability of billing metrics to identify programs, in an active cloud computing environment, including multiple virtual machines running on the same hypervisor. The open-source cloud computing platform OpenStack is used for private cloud management at Pacific Northwest National Laboratory. OpenStack provides a billing tool (Ceilometer) to collect system telemetry measurements. We identify four different programs running on four virtual machines under the same cloud user account. Programs were identified with up to 95% accuracy. This accuracy is dependent on the distinctiveness of telemetry measurements for the specific programs we tested. Future work will examine the scalability of this approach for a larger selection of programs to better understand the uniqueness needed to identify a program. Additionally, future work should address the separation of signatures when multiple programs are running on the same virtual machine.
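
    The underlying idea, identifying a program from a window of billing/telemetry metrics, can be sketched as follows (metric names and vectors are illustrative, not drawn from the paper's Ceilometer data): treat the metrics as a feature vector and match it to the nearest known signature.

        # Nearest-signature matching over normalized telemetry features
        # (e.g. CPU time, disk I/O, network bytes); values are synthetic.
        import numpy as np

        signatures = {                       # mean metric vectors per known program
            "matrix_solver": np.array([0.92, 0.10, 0.05]),
            "web_crawler":   np.array([0.20, 0.15, 0.90]),
            "file_indexer":  np.array([0.35, 0.85, 0.10]),
        }

        def identify(observed: np.ndarray) -> str:
            """Return the known signature closest (Euclidean) to the observed metrics."""
            return min(signatures,
                       key=lambda name: np.linalg.norm(signatures[name] - observed))

        print(identify(np.array([0.25, 0.20, 0.85])))  # -> "web_crawler"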

  10. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose: Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods: A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results: The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion: A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  11. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
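
    To illustrate the decoupling idea in the two records above, the sketch below (plain Python with threads; a generic illustration, not the framework's actual implementation) runs a ~1,000 Hz controller loop for the haptic tool alongside a ~60 Hz viewer loop, with both sharing one scene state.

        # Decoupled simulation: the haptics "controller" and the graphics "viewer"
        # run at their own rates and synchronize only on a shared state object.
        import threading
        import time

        state = {"tool_pos": 0.0}
        lock = threading.Lock()
        running = True

        def haptics_loop():                     # controller: ~1000 Hz force updates
            while running:
                with lock:
                    state["tool_pos"] += 0.001  # integrate device input (placeholder)
                time.sleep(0.001)

        def render_loop():                      # viewer: ~60 Hz display updates
            while running:
                with lock:
                    pos = state["tool_pos"]
                print(f"render tool at {pos:.3f}")
                time.sleep(1 / 60)

        threading.Thread(target=haptics_loop, daemon=True).start()
        threading.Thread(target=render_loop, daemon=True).start()
        time.sleep(0.1)
        running = False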

  12. Challenges to the development of complex virtual reality surgical simulations.

    PubMed

    Seymour, N E; Røtnes, J S

    2006-11-01

    Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.

  13. Virtual Reality Training System for Anytime/Anywhere Acquisition of Surgical Skills: A Pilot Study.

    PubMed

    Zahiri, Mohsen; Booton, Ryan; Nelson, Carl A; Oleynikov, Dmitry; Siu, Ka-Chun

    2018-03-01

    This article presents a hardware/software simulation environment suitable for anytime/anywhere surgical skills training. It blends the advantages of physical hardware and task analogs with the flexibility of virtual environments. This is further enhanced by a web-based implementation of training feedback accessible to both trainees and trainers. Our training system provides a self-paced and interactive means to attain proficiency in basic tasks that could potentially be applied across a spectrum of trainees from first responder field medical personnel to physicians. This results in a powerful training tool for surgical skills acquisition relevant to helping injured warfighters.

  14. Dynamic control of a moving platform using the CAREN system to optimize walking in virtual reality environments.

    PubMed

    Makssoud, Hassan El; Richards, Carol L; Comeau, François

    2009-01-01

    Virtual reality (VR) technology offers the opportunity to expose patients to complex physical environments without physical danger and thus provides a wide range of opportunities for locomotor training or the study of human postural and walking behavior. A VR-based locomotor training system has been developed for gait rehabilitation post-stroke. A clinical study has shown that persons after stroke are able to adapt to and benefit from this novel system, wherein they walk through virtual environments (VEs) on a self-paced treadmill mounted on a platform with 6 degrees of freedom. This platform is programmed to mimic changes in the terrain encountered in the VEs. While engaging in these VEs, excessive trunk movements and speed alterations have been observed, especially during the pitch perturbations accompanying uphill or downhill terrain changes. An in-depth study of the subject's behavior in relation to the platform movements revealed that the platform rotational axes need to be modified, as previously shown by Barton et al., and that the existing control scheme did not take the subject's position on the treadmill into account. The aim of this study was to determine an optimal solution for simulating real-life walking while engaging with VEs.
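
    The geometric point at issue can be illustrated with a small sketch (hypothetical coordinates, not the CAREN control code): pitching the platform about its own centre translates the walker as well as tilting the terrain, whereas pitching about an axis through the subject's measured position leaves the subject in place.

        # Rotate a point about a pitch axis passing through an arbitrary pivot
        # in the sagittal (x-z) plane; values are illustrative only.
        import numpy as np

        def pitch_about_point(point, pivot, pitch_rad):
            c, s = np.cos(pitch_rad), np.sin(pitch_rad)
            R = np.array([[c, -s], [s, c]])       # 2D rotation in the x-z plane
            return pivot + R @ (point - pivot)

        subject = np.array([0.3, 1.0])            # subject's position on the treadmill
        platform_centre = np.array([0.0, 0.0])

        # Pitching about the platform centre displaces the subject...
        print(pitch_about_point(subject, platform_centre, np.radians(10)))
        # ...pitching about the subject's own position does not.
        print(pitch_about_point(subject, subject, np.radians(10)))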

  15. Hybrid Reality Lab Capabilities - Video 2

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2016-01-01

    Our Hybrid Reality and Advanced Operations Lab is developing highly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects have moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the video or window view. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, by creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects. The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in the virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video accompanying this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.

  16. A Data Management System for International Space Station Simulation Tools

    NASA Technical Reports Server (NTRS)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.

  17. CaveCAD: a tool for architectural design in immersive virtual environments

    NASA Astrophysics Data System (ADS)

    Schulze, Jürgen P.; Hughes, Cathleen E.; Zhang, Lelin; Edelstein, Eve; Macagno, Eduardo

    2014-02-01

    Existing 3D modeling tools were designed to run on desktop computers with monitor, keyboard and mouse. To make 3D modeling possible with mouse and keyboard, many 3D interactions, such as point placement or translations of geometry, had to be mapped to the 2D parameter space of the mouse, possibly supported by mouse buttons or keyboard keys. We hypothesize that, had the designers of these existing systems been able to assume immersive virtual reality systems as their target platforms, they would have designed 3D interactions much more intuitively. In collaboration with professional architects, we created a simple but complete 3D modeling tool for virtual environments from the ground up, using direct 3D interaction wherever possible and adequate. In this publication, we present our approaches for interactions for typical 3D modeling functions, such as geometry creation, modification of existing geometry, and assignment of surface materials. We also discuss preliminary user experiences with this system.

  18. Game-Based Virtual Worlds as Decentralized Virtual Activity Systems

    NASA Astrophysics Data System (ADS)

    Scacchi, Walt

    There is widespread interest in the development and use of decentralized systems and virtual world environments as possible new places for engaging in collaborative work activities. Similarly, there is widespread interest in stimulating new technological innovations that enable people to come together through social networking, file/media sharing, and networked multi-player computer game play. A decentralized virtual activity system (DVAS) is a networked computer supported work/play system whose elements and social activities can be both virtual and decentralized (Scacchi et al. 2008b). Massively multi-player online games (MMOGs) such as World of Warcraft and online virtual worlds such as Second Life are each popular examples of a DVAS. Furthermore, these systems are beginning to be used for research, development, and education activities in different science, technology, and engineering domains (Bainbridge 2007, Bohannon et al. 2009; Rieber 2005; Scacchi and Adams 2007; Shaffer 2006), which are also of interest here. This chapter explores two case studies of DVASs developed at the University of California at Irvine that employ game-based virtual worlds to support collaborative work/play activities in different settings. The settings include those that model and simulate practical or imaginative physical worlds in different domains of science, technology, or engineering through alternative virtual worlds where players/workers engage in different kinds of quests or quest-like workflows (Jakobsson 2006).

  19. Evolving a Neural Olfactorimotor System in Virtual and Real Olfactory Environments

    PubMed Central

    Rhodes, Paul A.; Anderson, Todd O.

    2012-01-01

    To provide a platform to enable the study of simulated olfactory circuitry in context, we have integrated a simulated neural olfactorimotor system with a virtual world which simulates both computational fluid dynamics as well as a robotic agent capable of exploring the simulated plumes. A number of the elements which we developed for this purpose have not, to our knowledge, been previously assembled into an integrated system, including: control of a simulated agent by a neural olfactorimotor system; continuous interaction between the simulated robot and the virtual plume; the inclusion of multiple distinct odorant plumes and background odor; the systematic use of artificial evolution driven by olfactorimotor performance (e.g., time to locate a plume source) to specify parameter values; the incorporation of the realities of an imperfect physical robot using a hybrid model where a physical robot encounters a simulated plume. We close by describing ongoing work toward engineering a high dimensional, reversible, low power electronic olfactory sensor which will allow olfactorimotor neural circuitry evolved in the virtual world to control an autonomous olfactory robot in the physical world. The platform described here is intended to better test theories of olfactory circuit function, as well as provide robust odor source localization in realistic environments. PMID:23112772
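
    The evolutionary loop described here can be sketched in a few lines (a generic Python illustration; the fitness function below is a placeholder, not the paper's plume simulation): candidate parameter vectors are scored by time-to-source and the fittest are mutated to form the next generation.

        # Artificial evolution driven by olfactorimotor performance: shorter
        # (simulated) time to locate the plume source means higher fitness.
        import random

        def time_to_source(params):
            # Placeholder fitness: pretend the optimum lies at (0.5, 0.5).
            return (params[0] - 0.5) ** 2 + (params[1] - 0.5) ** 2

        def evolve(pop_size=20, generations=30, sigma=0.05):
            population = [[random.random(), random.random()] for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=time_to_source)        # faster localisation = fitter
                parents = population[: pop_size // 4]
                population = [
                    [p + random.gauss(0, sigma) for p in random.choice(parents)]
                    for _ in range(pop_size)
                ]
            return min(population, key=time_to_source)

        print(evolve())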

  20. Effects of virtualization on a scientific application - Running a hyperspectral radiative transfer code on virtual machines.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tikotekar, Anand A; Vallee, Geoffroy R; Naughton III, Thomas J

    2008-01-01

    The topic of system-level virtualization has recently begun to receive interest for high performance computing (HPC). This is in part due to the isolation and encapsulation offered by the virtual machine. These traits enable applications to customize their environments and maintain consistent software configurations in their virtual domains. Additionally, there are mechanisms that can be used for fault tolerance, like live virtual machine migration. Given these attractive benefits of virtualization, a fundamental question arises: how does this affect my scientific application? We use this as the premise for our paper and observe a real-world scientific code running on a Xen virtual machine. We studied the effects of running a radiative transfer simulation, Hydrolight, on a virtual machine. We discuss our methodology and report observations regarding the usage of virtualization with this application.

  1. Virtual Education: Guidelines for Using Games Technology

    ERIC Educational Resources Information Center

    Schofield, Damian

    2014-01-01

    Advanced three-dimensional virtual environment technology, similar to that used by the film and computer games industry, can allow educational developers to rapidly create realistic online virtual environments. This technology has been used to generate a range of interactive Virtual Reality (VR) learning environments across a spectrum of…

  2. Virtual Rehabilitation with Children: Challenges for Clinical Adoption [From the Field].

    PubMed

    Glegg, Stephanie

    2017-01-01

    Virtual, augmented, and mixed reality environments are increasingly being developed and used to address functional rehabilitation goals related to physical, cognitive, social, and psychological impairments. For example, a child with an acquired brain injury may participate in virtual rehabilitation to address impairments in balance, attention, turn taking, and engagement in therapy. The trend toward virtual rehabilitation first gained momentum with the adoption of commercial off-the-shelf active video gaming consoles (e.g., Nintendo Wii and XBox). Now, we are seeing the rapid emergence of customized rehabilitation-specific systems that integrate technological advances in virtual reality, visual effects, motion tracking, physiological monitoring, and robotics.

  3. The Role of Semantics in Next-Generation Online Virtual World-Based Retail Store

    NASA Astrophysics Data System (ADS)

    Sharma, Geetika; Anantaram, C.; Ghosh, Hiranmay

    Online virtual environments are increasingly becoming popular for entrepreneurship. While interactions are primarily between avatars, some interactions could occur through intelligent chatbots. Such interactions require connecting to backend business applications to obtain information, carry out real-world transactions etc. In this paper, we focus on integrating business application systems with virtual worlds. We discuss the probable features of a next-generation online virtual world-based retail store and the technologies involved in realizing the features of such a store. In particular, we examine the role of semantics in integrating popular virtual worlds with business applications to provide natural language based interactions.

  4. Temporally coherent 4D video segmentation for teleconferencing

    NASA Astrophysics Data System (ADS)

    Ehmann, Jana; Guleryuz, Onur G.

    2013-09-01

    We develop an algorithm for 4-D (RGB+Depth) video segmentation targeting immersive teleconferencing applications on emerging mobile devices. Our algorithm extracts users from their environments and places them onto virtual backgrounds similar to green-screening. The virtual backgrounds increase immersion and interactivity, relieving the users of the system from distractions caused by disparate environments. Commodity depth sensors, while providing useful information for segmentation, result in noisy depth maps with a large number of missing depth values. By combining depth and RGB information, our work significantly improves the otherwise very coarse segmentation. Further imposing temporal coherence yields compositions where the foregrounds seamlessly blend with the virtual backgrounds with minimal flicker and other artifacts. We achieve said improvements by correcting the missing information in depth maps before fast RGB-based segmentation, which operates in conjunction with temporal coherence. Simulation results indicate the efficacy of the proposed system in video conferencing scenarios.
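
    The main steps of such a pipeline can be sketched as follows (illustrative Python with placeholder thresholds and data, not the authors' implementation): fill the missing depth values, take a coarse foreground mask by depth thresholding, and blend with the previous frame's mask for temporal coherence.

        # Depth-hole filling, coarse depth-based foreground masking, and temporal
        # smoothing of the mask; parameter values are illustrative only.
        import numpy as np

        def fill_depth_holes(depth, invalid=0.0):
            """Replace missing depth values with the median of the valid pixels."""
            filled = depth.copy()
            filled[depth == invalid] = np.median(depth[depth != invalid])
            return filled

        def segment(depth, prev_mask=None, fg_max_depth=1.5, blend=0.7):
            depth = fill_depth_holes(depth)
            mask = (depth < fg_max_depth).astype(float)   # coarse user/foreground mask
            if prev_mask is not None:                     # temporal coherence
                mask = blend * mask + (1.0 - blend) * prev_mask
            return mask

        frame0 = np.array([[0.0, 1.2], [2.8, 1.1]])       # metres; 0.0 = missing depth
        m0 = segment(frame0)
        m1 = segment(np.array([[1.3, 1.2], [2.9, 1.0]]), prev_mask=m0)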

  5. Robot Teleoperation and Perception Assistance with a Virtual Holographic Display

    NASA Technical Reports Server (NTRS)

    Goddard, Charles O.

    2012-01-01

    Teleoperation of robots in space from Earth has historically been difficult. Speed-of-light delays make direct joystick-type control infeasible, so it is desirable to command a robot in a very high-level fashion. However, in order to provide such an interface, knowledge of what objects are in the robot's environment and how they can be interacted with is required. In addition, many tasks that would be desirable to perform are highly spatial, requiring some form of six-degree-of-freedom input. These two issues can be combined, allowing the user to assist the robot's perception by identifying the locations of objects in the scene. The zSpace system, a virtual holographic environment, provides a virtual three-dimensional space superimposed over real space, together with a stylus whose position and rotation are tracked within it. Using this system, a possible interface for this sort of robot control is proposed.

  6. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    PubMed

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    To examine reaction time when human subjects process information presented in the visual channel, under both direct vision and a virtual rehabilitation environment, while walking. Visual stimuli comprised eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment (CAREN)) and in a direct vision environment. Subjects were required to verbally report the results of these math calculations in a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment. Participants' reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program with a virtual training environment. Implications for rehabilitation: Eye tracking is a reliable tool that can be employed in rehabilitation virtual environments. Reaction time changes between direct vision and virtual environments.

  7. Using EMG to anticipate head motion for virtual-environment applications

    NASA Technical Reports Server (NTRS)

    Barniv, Yair; Aguilar, Mario; Hasanbelliu, Erion

    2005-01-01

    In virtual environment (VE) applications, where virtual objects are presented in a see-through head-mounted display, virtual images must be continuously stabilized in space in response to the user's head motion. Time delays in head-motion compensation cause virtual objects to "swim" around instead of being stable in space, which results in misalignment errors when overlaying virtual and real objects. Visual update delays are a critical technical obstacle for implementing head-mounted displays in applications such as battlefield simulation/training, telerobotics, and telemedicine. Head motion is currently measurable by a head-mounted 6-degrees-of-freedom inertial measurement unit. However, even given this information, overall VE-system latencies cannot be reduced below about 25 ms. We present a novel approach to eliminating latencies, which is premised on the fact that myoelectric signals from a muscle precede its exertion of force, and thereby limb or head acceleration. We thus suggest utilizing neck-muscles' myoelectric signals to anticipate head motion. We trained a neural network to map such signals onto equivalent time-advanced inertial outputs. The resulting network can achieve time advances of up to 70 ms.

  8. Using EMG to anticipate head motion for virtual-environment applications.

    PubMed

    Barniv, Yair; Aguilar, Mario; Hasanbelliu, Erion

    2005-06-01

    In virtual environment (VE) applications, where virtual objects are presented in a see-through head-mounted display, virtual images must be continuously stabilized in space in response to the user's head motion. Time delays in head-motion compensation cause virtual objects to "swim" around instead of being stable in space, which results in misalignment errors when overlaying virtual and real objects. Visual update delays are a critical technical obstacle for implementing head-mounted displays in applications such as battlefield simulation/training, telerobotics, and telemedicine. Head motion is currently measurable by a head-mounted 6-degrees-of-freedom inertial measurement unit. However, even given this information, overall VE-system latencies cannot be reduced below about 25 ms. We present a novel approach to eliminating latencies, which is premised on the fact that myoelectric signals from a muscle precede its exertion of force, and thereby limb or head acceleration. We thus suggest utilizing neck-muscles' myoelectric signals to anticipate head motion. We trained a neural network to map such signals onto equivalent time-advanced inertial outputs. The resulting network can achieve time advances of up to 70 ms.
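
    A simplified stand-in for the learned mapping in the two records above might look like the sketch below (synthetic data; ordinary least squares replaces the paper's neural network): a window of neck-EMG samples is regressed onto the inertial signal shifted roughly 70 ms into the future.

        # Fit a predictor from EMG (which leads motion by construction here)
        # to the head-motion signal 70 ms ahead; all data are synthetic.
        import numpy as np

        rate_hz, advance_ms = 1000, 70
        shift = int(rate_hz * advance_ms / 1000)              # samples of look-ahead

        t = np.arange(0, 5, 1 / rate_hz)
        head_accel = np.sin(2 * np.pi * 0.5 * t)               # pretend inertial output
        emg = np.roll(head_accel, -shift) + 0.05 * np.random.randn(t.size)

        X = np.column_stack([emg[:-shift], np.ones(t.size - shift)])  # EMG + bias
        y = head_accel[shift:]                                        # future motion
        w, *_ = np.linalg.lstsq(X, y, rcond=None)                     # fit the predictor
        predicted = X @ w                                             # ~70 ms-ahead estimate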

  9. Toward cognitive robotics

    NASA Astrophysics Data System (ADS)

    Laird, John E.

    2009-05-01

    Our long-term goal is to develop autonomous robotic systems that have the cognitive abilities of humans, including communication, coordination, adapting to novel situations, and learning through experience. Our approach rests on the recent integration of the Soar cognitive architecture with both virtual and physical robotic systems. Soar has been used to develop a wide variety of knowledge-rich agents for complex virtual environments, including distributed training environments and interactive computer games. For development and testing in robotic virtual environments, Soar interfaces to a variety of robotic simulators and a simple mobile robot. We have recently made significant extensions to Soar that add new memories and new non-symbolic reasoning to Soar's original symbolic processing, which should significantly improve Soar's abilities for the control of robots. These extensions include episodic memory, semantic memory, reinforcement learning, and mental imagery. Episodic memory and semantic memory support the learning and recalling of prior events and situations as well as facts about the world. Reinforcement learning gives the system the ability to tune its procedural knowledge - knowledge about how to do things. Mental imagery supports the use of diagrammatic and visual representations that are critical to support spatial reasoning. We speculate on the future of unmanned systems and the need for cognitive robotics to support dynamic instruction and taskability.

  10. Around Marshall

    NASA Image and Video Library

    1993-09-15

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup is to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  11. Around Marshall

    NASA Image and Video Library

    1993-12-15

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup is to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  12. Wireless physiological monitoring and ocular tracking: 3D calibration in a fully-immersive virtual health care environment.

    PubMed

    Zhang, Lelin; Chi, Yu Mike; Edelstein, Eve; Schulze, Jurgen; Gramann, Klaus; Velasquez, Alvaro; Cauwenberghs, Gert; Macagno, Eduardo

    2010-01-01

    Wireless physiological/neurological monitoring in virtual reality (VR) offers a unique opportunity for unobtrusively quantifying human responses to precisely controlled and readily modulated VR representations of health care environments. Here we present such a wireless, light-weight head-mounted system for measuring electrooculogram (EOG) and electroencephalogram (EEG) activity in human subjects interacting with and navigating in the Calit2 StarCAVE, a five-sided immersive 3-D visualization VR environment. The system can be easily expanded to include other measurements, such as cardiac activity and galvanic skin responses. We demonstrate the capacity of the system to track focus of gaze in 3-D and report a novel calibration procedure for estimating eye movements from responses to the presentation of a set of dynamic visual cues in the StarCAVE. We discuss cyber and clinical applications that include a 3-D cursor for visual navigation in VR interactive environments, and the monitoring of neurological and ocular dysfunction in vision/attention disorders.
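
    A calibration step of this general kind might be sketched as follows (illustrative channel count and values, not the StarCAVE procedure itself): record the EOG channels while the subject fixates cues at known positions, then fit a least-squares map from EOG amplitudes to gaze coordinates.

        # Least-squares calibration from EOG amplitudes to gaze position;
        # the data and two-channel setup are hypothetical.
        import numpy as np

        # Each row: horizontal/vertical EOG amplitudes recorded during one fixation.
        eog = np.array([[0.0, 0.0], [1.0, 0.1], [0.1, 1.0], [1.1, 1.2]])
        targets = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.4], [0.5, 0.45]])  # cue positions

        A = np.column_stack([eog, np.ones(len(eog))])     # add a bias column
        W, *_ = np.linalg.lstsq(A, targets, rcond=None)   # calibration matrix

        def gaze_from_eog(sample):
            """Map a new EOG reading to an estimated gaze position."""
            return np.append(sample, 1.0) @ W

        print(gaze_from_eog(np.array([0.6, 0.5])))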

  13. Virtual reality simulation in neurosurgery: technologies and evolution.

    PubMed

    Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H

    2013-01-01

    Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.

  14. Cognitive factors associated with immersion in virtual environments

    NASA Technical Reports Server (NTRS)

    Psotka, Joseph; Davison, Sharon

    1993-01-01

    Immersion into the dataspace provided by a computer, and the feeling of really being there or 'presence', are commonly acknowledged as the uniquely important features of virtual reality environments. How immersed one feels appears to be determined by a complex set of physical components and affordances of the environment, and as yet poorly understood psychological processes. Pimentel and Teixeira say that the experience of being immersed in a computer-generated world involves the same mental shift of 'suspending your disbelief for a period of time' as 'when you get wrapped up in a good novel or become absorbed in playing a computer game'. That sounds as if it could be right, but it would be good to get some evidence for these important conclusions. It might be even better to try to connect these statements with theoretical positions that try to do justice to complex cognitive processes. The basic precondition for understanding Virtual Reality (VR) is understanding the spatial representation systems that localize our bodies or egocenters in space. The effort to understand these cognitive processes is being driven with new energy by the pragmatic demands of successful virtual reality environments, but the literature is largely sparse and anecdotal.

  15. Effects on Training Using Illumination in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian

    1999-01-01

    Camera-based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high-contrast shadowing and glare, are a factor in performance. Computer-based training using virtual environments is a common tool used to make and keep CTW members proficient. If computer-based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer-based training increases proficiency if one trains for a camera-based task using computer-generated virtual environments with enhanced lighting conditions, such as shadows and glare, rather than the color-shaded computer images normally used in simulators. Previous experiments were conducted using a two-degree-of-freedom docking system. Test subjects had to align a boresight camera using a hand controller with two axes of control. Two sets of subjects were trained on two computer simulations using computer-generated virtual environments, one with lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.

  16. Exploring the Strategies for a Community College Transition into a Cloud-Computing Environment

    ERIC Educational Resources Information Center

    DeBary, Narges

    2017-01-01

    The use of the Internet has resulted in the birth of an innovative virtualization technology called cloud computing. Virtualization can tremendously improve the instructional and operational systems of a community college. Although the incidental adoption of the cloud solutions in the community colleges of higher education has been increased,…

  17. Immersive Training Systems: Virtual Reality and Education and Training.

    ERIC Educational Resources Information Center

    Psotka, Joseph

    1995-01-01

    Describes virtual reality (VR) technology and VR research on education and training. Focuses on immersion as the key added value of VR, analyzes cognitive variables connected to immersion, how it is generated in synthetic environments and its benefits. Discusses value of tracked, immersive visual displays over nonimmersive simulations. Contains 78…

  18. An Adaptive Display to Treat Stress-Related Disorders: EMMA's World

    ERIC Educational Resources Information Center

    Banos, R. M.; Botella, C.; Guillen, V.; Garcia-Palacios, A.; Quero, S.; Breton-Lopez, J.; Alcaniz, M.

    2009-01-01

    Most of the virtual environments currently available in the field of psychological treatments were designed to solve a specific problem. Our research group has developed a versatile virtual reality system (an adaptive display) called "EMMA's world", which can address a wide range of problems. It was designed to assist in clinical situations where…

  19. Application of Semantic Approaches and Interactive Virtual Technology to Improve Teaching Effectiveness

    ERIC Educational Resources Information Center

    Jou, Min; Liu, Chi-Chia

    2012-01-01

    This article describes an implementation of interactive virtual environments that have been designed for supporting the education of technical skills in material processing technology. The developed web-based systems provide the capability to train students in the technical skills of material processing technology without the need to work on…

  20. Brain Activity on Navigation in Virtual Environments.

    ERIC Educational Resources Information Center

    Mikropoulos, Tassos A.

    2001-01-01

    Assessed the cognitive processing that takes place in virtual environments by measuring electrical brain activity using Fast Fourier Transform analysis. University students performed the same task in a real and a virtual environment, and eye movement measurements showed that all subjects were more attentive when navigating in the virtual world.…

  1. A Virtual Education: Guidelines for Using Games Technology

    ERIC Educational Resources Information Center

    Schofield, Damian

    2014-01-01

    Advanced three-dimensional virtual environment technology, similar to that used by the film and computer games industry, can allow educational developers to rapidly create realistic online virtual environments. This technology has been used to generate a range of interactive Virtual Reality (VR) learning environments across a spectrum of…

  2. Migrating EO/IR sensors to cloud-based infrastructure as service architectures

    NASA Astrophysics Data System (ADS)

    Berglie, Stephen T.; Webster, Steven; May, Christopher M.

    2014-06-01

    The Night Vision Image Generator (NVIG), a product of US Army RDECOM CERDEC NVESD, is a visualization tool used widely throughout Army simulation environments to provide fully attributed, synthesized, full motion video using physics-based sensor and environmental effects. The NVIG relies heavily on contemporary hardware-based acceleration and GPU processing techniques, which push the envelope of both enterprise and commodity-level hypervisor support for providing virtual machines with direct access to hardware resources. The NVIG has successfully been integrated into fully virtual environments where system architectures leverage cloud-based technologies to various extents in order to streamline infrastructure and service management. This paper details the challenges presented to engineers seeking to migrate GPU-bound processes, such as the NVIG, to virtual machines and, ultimately, cloud-based Infrastructure as a Service (IaaS) architectures. In addition, it presents the path that led to success for the NVIG. A brief overview of cloud-based infrastructure management tool sets is provided, and several virtual desktop solutions are outlined. A distinction is made between general-purpose virtual desktop technologies and technologies that expose GPU-specific capabilities, including direct rendering and hardware-based video encoding. Candidate hypervisor/virtual machine configurations that nominally satisfy the virtualized hardware-level GPU requirements of the NVIG are presented, and each is subsequently reviewed in light of its implications on higher-level cloud management techniques. Implementation details are included from the hardware level, through the operating system, to the 3D graphics APIs required by the NVIG and similar GPU-bound tools.
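
    As a hedged illustration of the kind of device assignment discussed above, the sketch below uses the libvirt Python bindings to hand a physical GPU to an existing virtual machine by PCI passthrough. The domain name ("nvig-vm") and the PCI address are hypothetical placeholders; it is not the paper's actual hypervisor configuration.

        # Minimal sketch of PCI GPU passthrough using the libvirt Python bindings.
        # The domain name "nvig-vm" and the PCI address 0000:01:00.0 are hypothetical
        # placeholders, not values taken from the paper.
        import libvirt

        GPU_HOSTDEV_XML = """
        <hostdev mode='subsystem' type='pci' managed='yes'>
          <source>
            <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
          </source>
        </hostdev>
        """

        def attach_gpu(domain_name="nvig-vm"):
            # Connect to the local QEMU/KVM hypervisor.
            conn = libvirt.open("qemu:///system")
            try:
                dom = conn.lookupByName(domain_name)
                # Persist the device assignment in the domain configuration so the
                # guest sees the physical GPU on its next boot.
                dom.attachDeviceFlags(GPU_HOSTDEV_XML, libvirt.VIR_DOMAIN_AFFECT_CONFIG)
            finally:
                conn.close()

        if __name__ == "__main__":
            attach_gpu()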

  3. Mobile e-Learning for Next Generation Communication Environment

    ERIC Educational Resources Information Center

    Wu, Tin-Yu; Chao, Han-Chieh

    2008-01-01

    This article develops an environment for mobile e-learning that includes an interactive course, virtual online labs, an interactive online test, and lab-exercise training platform on the fourth generation mobile communication system. The Next Generation Learning Environment (NeGL) promotes the term "knowledge economy." Inter-networking…

  4. Comparison of grasping movements made by healthy subjects in a 3-dimensional immersive virtual versus physical environment.

    PubMed

    Magdalon, Eliane C; Michaelsen, Stella M; Quevedo, Antonio A; Levin, Mindy F

    2011-09-01

    Virtual reality (VR) technology is being used with increasing frequency as a training medium for motor rehabilitation. However, before addressing training effectiveness in virtual environments (VEs), it is necessary to identify if movements made in such environments are kinematically similar to those made in physical environments (PEs) and the effect of provision of haptic feedback on these movement patterns. These questions are important since reach-to-grasp movements may be inaccurate when visual or haptic feedback is altered or absent. Our goal was to compare kinematics of reaching and grasping movements to three objects performed in an immersive three-dimensional (3D) VE with haptic feedback (cyberglove/grasp system) viewed through a head-mounted display to those made in an equivalent physical environment (PE). We also compared movements in PE made with and without wearing the cyberglove/grasp haptic feedback system. Ten healthy subjects (8 women, 62.1±8.8years) reached and grasped objects requiring 3 different grasp types (can, diameter 65.6mm, cylindrical grasp; screwdriver, diameter 31.6mm, power grasp; pen, diameter 7.5mm, precision grasp) in PE and visually similar virtual objects in VE. Temporal and spatial arm and trunk kinematics were analyzed. Movements were slower and grip apertures were wider when wearing the glove in both the PE and the VE compared to movements made in the PE without the glove. When wearing the glove, subjects used similar reaching trajectories in both environments, preserved the coordination between reaching and grasping and scaled grip aperture to object size for the larger object (cylindrical grasp). However, in VE compared to PE, movements were slower and had longer deceleration times, elbow extension was greater when reaching to the smallest object and apertures were wider for the power and precision grip tasks. Overall, the differences in spatial and temporal kinematics of movements between environments were greater than those due only to wearing the cyberglove/grasp system. Differences in movement kinematics due to the viewing environment were likely due to a lack of prior experience with the virtual environment, an uncertainty of object location and the restricted field-of-view when wearing the head-mounted display. The results can be used to inform the design and disposition of objects within 3D VEs for the study of the control of prehension and for upper limb rehabilitation. Copyright © 2011 Elsevier B.V. All rights reserved.
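
    The abstract does not give the authors' processing pipeline, but the kinematic measures it reports (grip aperture, peak velocity, deceleration time) can be sketched from 3D marker data as follows; the sampling rate, array shapes, and synthetic example data are assumptions for illustration only.

        # Illustrative post-processing of reach-to-grasp kinematics (not the authors'
        # actual pipeline): grip aperture as thumb-index marker distance, plus peak
        # speed and deceleration time of the wrist trajectory.
        import numpy as np

        FS = 100.0  # assumed sampling frequency in Hz

        def grip_aperture(thumb_xyz, index_xyz):
            """Euclidean distance between thumb and index markers, frame by frame."""
            return np.linalg.norm(thumb_xyz - index_xyz, axis=1)

        def reach_metrics(wrist_xyz):
            """Peak tangential speed, movement time, and deceleration time."""
            vel = np.gradient(wrist_xyz, 1.0 / FS, axis=0)    # per-axis velocity
            speed = np.linalg.norm(vel, axis=1)               # tangential speed
            i_peak = int(np.argmax(speed))
            movement_time = len(speed) / FS
            decel_time = (len(speed) - i_peak) / FS           # time after peak speed
            return speed[i_peak], movement_time, decel_time

        # Example with synthetic data: 200 frames (2 s) of 3-D positions in mm.
        t = np.linspace(0, 2, 200)[:, None]
        wrist = np.hstack([300 * np.sin(np.pi * t / 4), np.zeros_like(t), np.zeros_like(t)])
        thumb = wrist + np.array([0.0, 40.0, 0.0])
        index = wrist + np.array([0.0, -40.0, 0.0])
        print(reach_metrics(wrist))
        print(grip_aperture(thumb, index)[:3])   # constant 80 mm aperture in this toy example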

  5. A Fully Immersive Set-Up for Remote Interaction and Neurorehabilitation Based on Virtual Body Ownership

    PubMed Central

    Perez-Marcos, Daniel; Solazzi, Massimiliano; Steptoe, William; Oyekoya, Oyewole; Frisoli, Antonio; Weyrich, Tim; Steed, Anthony; Tecchia, Franco; Slater, Mel; Sanchez-Vives, Maria V.

    2012-01-01

    Although telerehabilitation systems represent one of the most technologically appealing clinical solutions for the immediate future, they still present limitations that prevent their standardization. Here we propose an integrated approach that includes three key and novel factors: (a) fully immersive virtual environments, including virtual body representation and ownership; (b) multimodal interaction with remote people and virtual objects including haptic interaction; and (c) a physical representation of the patient at the hospital through embodiment agents (e.g., as a physical robot). The importance of secure and rapid communication between the nodes is also stressed and an example implemented solution is described. Finally, we discuss the proposed approach with reference to the existing literature and systems. PMID:22787454

  6. Design strategies and functionality of the Visual Interface for Virtual Interaction Development (VIVID) tool

    NASA Technical Reports Server (NTRS)

    Nguyen, Lac; Kenney, Patrick J.

    1993-01-01

    Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs - most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool, an X-Windows based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. In conjunction with its existing interfaces and predefined suite of behaviors, future development plans for VIVID will be described. These include incorporation of dual user virtual environment enhancements, tool expansion, and additional behaviors.

  7. Derived virtual devices: a secure distributed file system mechanism

    NASA Technical Reports Server (NTRS)

    VanMeter, Rodney; Hotz, Steve; Finn, Gregory

    1996-01-01

    This paper presents the design of derived virtual devices (DVDs). DVDs are the mechanism used by the Netstation Project to provide secure shared access to network-attached peripherals distributed in an untrusted network environment. DVDs improve Input/Output efficiency by allowing user processes to perform I/O operations directly from devices without intermediate transfer through the controlling operating system kernel. The security enforced at the device through the DVD mechanism includes resource boundary checking, user authentication, and restricted operations, e.g., read-only access. To illustrate the application of DVDs, we present the interactions between a network-attached disk and a file system designed to exploit the DVD abstraction. We further discuss third-party transfer as a mechanism intended to provide for efficient data transfer in a typical network-attached peripheral (NAP) environment. We show how DVDs facilitate third-party transfer, and provide the security required in a more open network environment.
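
    The following sketch illustrates, in schematic form, the style of per-request checks a derived virtual device could enforce at the storage device (user authentication, resource boundary checking, and read-only restriction). The class and field names are illustrative only and are not the Netstation interface.

        # Sketch of the checks a derived virtual device (DVD) might enforce at the
        # device; names and fields are illustrative, not the actual Netstation API.
        from dataclasses import dataclass

        @dataclass
        class DerivedVirtualDevice:
            user_token: str        # credential presented with each request
            first_block: int       # start of the region this DVD may touch
            num_blocks: int        # size of that region
            read_only: bool = True

            def check_request(self, token: str, op: str, block: int, count: int) -> bool:
                """Return True only if the request passes authentication,
                operation, and boundary checks."""
                if token != self.user_token:                 # user authentication
                    return False
                if op == "write" and self.read_only:         # restricted operations
                    return False
                in_bounds = (block >= self.first_block and
                             block + count <= self.first_block + self.num_blocks)
                return in_bounds                             # resource boundary check

        # Example: a read-only DVD covering blocks 1000-1999.
        dvd = DerivedVirtualDevice(user_token="secret", first_block=1000, num_blocks=1000)
        print(dvd.check_request("secret", "read", 1500, 8))   # True
        print(dvd.check_request("secret", "write", 1500, 8))  # False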

  8. Evaluation of the Next-Gen Exercise Software Interface in the NEEMO Analog

    NASA Technical Reports Server (NTRS)

    Hanson, Andrea; Kalogera, Kent; Sandor, Aniko; Hardy, Marc; Frank, Andrew; English, Kirk; Williams, Thomas; Perera, Jeevan; Amonette, William

    2017-01-01

    NSBRI (National Space Biomedical Research Institute) funded research grant to develop the 'NextGen' exercise software for the NEEMO (NASA Extreme Environment Mission Operations) analog. Develop a software architecture to integrate instructional, motivational and socialization techniques into a common portal to enhance exercise countermeasures in remote environments. Increase user efficiency and satisfaction, and institute commonality across multiple exercise systems. Utilized GUI (Graphical User Interface) design principles focused on intuitive ease of use to minimize training time and realize early user efficiency. Project requirement to test the software in an analog environment. Top Level Project Aims: 1) Improve the usability of crew interface software to exercise CMS (Crew Management System) through common app-like interfaces. 2) Introduce virtual instructional motion training. 3) Use virtual environment to provide remote socialization with family and friends, improve exercise technique, adherence, motivation and ultimately performance outcomes.

  9. Teleoperation with virtual force feedback

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, R.J.

    1993-08-01

    In this paper we describe an algorithm for generating virtual forces in a bilateral teleoperator system. The virtual forces are generated from a world model and are used to provide real-time obstacle avoidance and guidance capabilities. The algorithm requires that the slave's tool and every object in the environment be decomposed into convex polyhedral primitives. Intrusion distance and extraction vectors are then derived at every time step by applying Gilbert's polyhedra distance algorithm, which has been adapted for the task. This information is then used to determine the compression and location of nonlinear virtual spring-dampers whose total force is summed and applied to the manipulator/teleoperator system. Experimental results validate the whole approach, showing that it is possible to compute the algorithm and generate realistic, useful pseudo-forces for a bilateral teleoperator system using standard VME bus hardware.
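
    A minimal sketch of the force computation described above is given below: a repulsive force is produced from an intrusion depth and a unit extraction vector by a nonlinear virtual spring-damper, and the contributions of all contacts are summed. The gains and the particular nonlinearity are assumptions, not the values used in the original system.

        # Illustrative virtual spring-damper force from an intrusion depth and a unit
        # extraction vector, summed over all contacts. Gains and the quadratic
        # stiffening term are assumptions, not the original system's parameters.
        import numpy as np

        K_LIN = 500.0    # N/m, linear stiffness (assumed)
        K_QUAD = 2000.0  # N/m^2, nonlinear stiffening (assumed)
        B = 20.0         # N*s/m, damping (assumed)

        def virtual_force(intrusion, intrusion_rate, extraction_dir):
            """Repulsive force pushing the tool out along the extraction vector."""
            if intrusion <= 0.0:
                return np.zeros(3)                           # no contact, no force
            stiffness_term = K_LIN * intrusion + K_QUAD * intrusion ** 2
            damping_term = B * max(intrusion_rate, 0.0)      # damp only while penetrating deeper
            return (stiffness_term + damping_term) * np.asarray(extraction_dir)

        # Total guidance force is the sum over all primitive pairs currently in contact.
        contacts = [(0.004, 0.02, [0.0, 0.0, 1.0]), (0.002, -0.01, [1.0, 0.0, 0.0])]
        total = sum((virtual_force(d, dr, n) for d, dr, n in contacts), np.zeros(3))
        print(total)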

  10. Usability Studies In Virtual And Traditional Computer Aided Design Environments For Spatial Awareness

    DTIC Science & Technology

    2017-08-08

    Usability Studies in Virtual and Traditional Computer Aided Design Environments for Spatial Awareness. Dr. Syed Adeel Ahmed, Xavier University of... ...virtual environment with wand interfaces compared directly with a workstation non-stereoscopic traditional CAD interface with keyboard and mouse. In... ...navigate through a virtual environment. The wand interface provides a significantly improved means of interaction. This study quantitatively measures the...

  11. A virtual reality environment for telescope operation

    NASA Astrophysics Data System (ADS)

    Martínez, Luis A.; Villarreal, José L.; Ángeles, Fernando; Bernal, Abel

    2010-07-01

    Astronomical observatories and telescopes are becoming increasingly large and complex systems, requiring any potential user to acquire a great amount of information before accessing them. At present, the most common way to manage that information is to implement larger graphical user interfaces and computer monitors to increase the display area. Tonantzintla Observatory has a 1-m telescope with a remote observing system. As a step forward in the improvement of the telescope software, we have designed a Virtual Reality (VR) environment that works as an extension of the remote system and allows us to operate the telescope. In this work we explore this alternative technology, which is suggested here as a software platform for the operation of the 1-m telescope.

  12. Scratchpads 2.0: a Virtual Research Environment supporting scholarly collaboration, communication and data publication in biodiversity science

    PubMed Central

    Smith, Vincent S.; Rycroft, Simon D.; Brake, Irina; Scott, Ben; Baker, Edward; Livermore, Laurence; Blagoderov, Vladimir; Roberts, David

    2011-01-01

    The Scratchpad Virtual Research Environment (http://scratchpads.eu/) is a flexible system for people to create their own research networks supporting natural history science. Here we describe Version 2 of the system characterised by the move to Drupal 7 as the Scratchpad core development framework and timed to coincide with the fifth year of the project’s operation in late January 2012. The development of Scratchpad 2 reflects a combination of technical enhancements that make the project more sustainable, combined with new features intended to make the system more functional and easier to use. A roadmap outlining strategic plans for development of the Scratchpad project over the next two years concludes this article. PMID:22207806

  13. Scratchpads 2.0: a Virtual Research Environment supporting scholarly collaboration, communication and data publication in biodiversity science.

    PubMed

    Smith, Vincent S; Rycroft, Simon D; Brake, Irina; Scott, Ben; Baker, Edward; Livermore, Laurence; Blagoderov, Vladimir; Roberts, David

    2011-01-01

    The Scratchpad Virtual Research Environment (http://scratchpads.eu/) is a flexible system for people to create their own research networks supporting natural history science. Here we describe Version 2 of the system characterised by the move to Drupal 7 as the Scratchpad core development framework and timed to coincide with the fifth year of the project's operation in late January 2012. The development of Scratchpad 2 reflects a combination of technical enhancements that make the project more sustainable, combined with new features intended to make the system more functional and easier to use. A roadmap outlining strategic plans for development of the Scratchpad project over the next two years concludes this article.

  14. Virtual Earth System Laboratory (VESL): A Virtual Research Environment for The Visualization of Earth System Data and Process Simulations

    NASA Astrophysics Data System (ADS)

    Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.

    2017-12-01

    The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.

  15. The RoboCup Mixed Reality League - A Case Study

    NASA Astrophysics Data System (ADS)

    Gerndt, Reinhard; Bohnen, Matthias; da Silva Guerra, Rodrigo; Asada, Minoru

    In typical mixed reality systems there is only a one-way interaction from real to virtual. A human user or the physics of a real object may influence the behavior of virtual objects, but real objects usually cannot be influenced by the virtual world. By introducing real robots into the mixed reality system, we allow a true two-way interaction between virtual and real worlds. Our system has been used since 2007 to implement the RoboCup mixed reality soccer games and other applications for research and edutainment. Our framework system is freely programmable to generate any virtual environment, which may then be further supplemented with virtual and real objects. The system allows for control of any real object based on differential drive robots. The robots may be adapted for different applications, e.g., with markers for identification or with covers to change shape and appearance. They may also be “equipped” with virtual tools. In this chapter we present the hardware and software architecture of our system and some applications. The authors believe this can be seen as a first implementation of Ivan Sutherland’s 1965 idea of the ultimate display: “The ultimate display would, of course, be a room within which the computer can control the existence of matter …” (Sutherland, 1965, Proceedings of IFIPS Congress 2:506-508).
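
    Since the framework controls real objects built on differential-drive robots, the standard differential-drive mapping from a commanded body velocity to wheel speeds is sketched below; the wheel radius and track width are placeholder values, not the dimensions of the RoboCup robots.

        # Standard differential-drive kinematics: a sketch of how a commanded motion
        # for a (possibly virtual) object maps onto a real two-wheeled robot.
        # Wheel radius and track width are placeholder values.
        import math

        WHEEL_RADIUS = 0.03   # m (assumed)
        TRACK_WIDTH = 0.10    # m, distance between the two wheels (assumed)

        def wheel_speeds(v, omega):
            """Map body linear velocity v (m/s) and angular velocity omega (rad/s)
            to left/right wheel angular velocities (rad/s)."""
            v_left = v - omega * TRACK_WIDTH / 2.0
            v_right = v + omega * TRACK_WIDTH / 2.0
            return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

        # Example: drive a gentle arc at 0.2 m/s while turning at 30 deg/s.
        print(wheel_speeds(0.2, math.radians(30.0)))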

  16. Validation of a method for real time foot position and orientation tracking with Microsoft Kinect technology for use in virtual reality and treadmill based gait training programs.

    PubMed

    Paolini, Gabriele; Peruzzi, Agnese; Mirelman, Anat; Cereatti, Andrea; Gaukrodger, Stephen; Hausdorff, Jeffrey M; Della Croce, Ugo

    2014-09-01

    The use of virtual reality for the provision of motor-cognitive gait training has been shown to be effective for a variety of patient populations. The interaction between the user and the virtual environment is achieved by tracking the motion of the body parts and replicating it in the virtual environment in real time. In this paper, we present the validation of a novel method for tracking foot position and orientation in real time, based on the Microsoft Kinect technology, to be used for gait training combined with virtual reality. The validation of the motion tracking method was performed by comparing the tracking performance of the new system against a stereo-photogrammetric system used as gold standard. Foot position errors were in the order of a few millimeters (average RMSD from 4.9 to 12.1 mm in the medio-lateral and vertical directions, from 19.4 to 26.5 mm in the anterior-posterior direction); the foot orientation errors were also small (average %RMSD from 5.6% to 8.8% in the medio-lateral and vertical directions, from 15.5% to 18.6% in the anterior-posterior direction). The results suggest that the proposed method can be effectively used to track feet motion in virtual reality and treadmill-based gait training programs.
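
    A sketch of the agreement metric reported above (RMSD between the Kinect-based and reference foot positions, per axis) is shown below; it assumes the two recordings have already been time-aligned and resampled to a common rate, which the actual validation protocol handles separately.

        # Root-mean-square deviation between Kinect-based and reference
        # (stereo-photogrammetric) positions, computed per anatomical axis.
        # Assumes time-aligned, equally sampled Nx3 arrays.
        import numpy as np

        def rmsd_per_axis(kinect_xyz, reference_xyz):
            """RMSD along each axis (e.g., medio-lateral, vertical, antero-posterior)."""
            diff = np.asarray(kinect_xyz) - np.asarray(reference_xyz)
            return np.sqrt(np.mean(diff ** 2, axis=0))

        # Example with synthetic, millimetre-scale noise around a reference trajectory.
        rng = np.random.default_rng(0)
        reference = rng.normal(size=(500, 3)) * 100.0          # mm
        kinect = reference + rng.normal(scale=8.0, size=(500, 3))
        print(rmsd_per_axis(kinect, reference))                 # roughly 8 mm per axis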

  17. The perception of spatial layout in real and virtual worlds.

    PubMed

    Arthur, E J; Hancock, P A; Chrysler, S T

    1997-01-01

    As human-machine interfaces grow more immersive and graphically-oriented, virtual environment systems become more prominent as the medium for human-machine communication. Often, virtual environments (VE) are built to provide exact metrical representations of existing or proposed physical spaces. However, it is not known how individuals develop representational models of these spaces in which they are immersed and how those models may be distorted with respect to both the virtual and real-world equivalents. To evaluate the process of model development, the present experiment examined participants' ability to reproduce a complex spatial layout of objects having experienced them previously under different viewing conditions. The layout consisted of nine common objects arranged on a flat plane. These objects could be viewed in a free binocular virtual condition, a free binocular real-world condition, and a static monocular view of the real world. The first two allowed active exploration of the environment while the latter condition allowed the participant only a passive opportunity to observe from a single viewpoint. Viewing conditions were a between-subject variable with 10 participants randomly assigned to each condition. Performance was assessed using mapping accuracy and triadic comparisons of relative inter-object distances. Mapping results showed a significant effect of viewing condition where, interestingly, the static monocular condition was superior to both the active virtual and real binocular conditions. Results for the triadic comparisons showed a significant interaction of gender by viewing condition in which males were more accurate than females. These results suggest that the situation model resulting from interaction with a virtual environment was indistinguishable from interaction with real objects, at least within the constraints of the present procedure.

  18. Development of an Interactive Augmented Environment and Its Application to Autonomous Learning for Quadruped Robots

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hayato; Osaki, Tsugutoyo; Okuyama, Tetsuro; Gramm, Joshua; Ishino, Akira; Shinohara, Ayumi

    This paper describes an interactive experimental environment for autonomous soccer robots, which is a soccer field augmented by utilizing camera input and projector output. This environment, in a sense, plays an intermediate role between simulated environments and real environments. We can simulate some parts of real environments, e.g., real objects such as robots or a ball, and reflect simulated data into the real environments, e.g., to visualize the positions on the field, so as to create a situation that allows easy debugging of robot programs. The significant point compared with analogous work is that virtual objects are touchable in this system owing to projectors. We also show the portable version of our system that does not require ceiling cameras. As an application in the augmented environment, we address the learning of goalie strategies on real quadruped robots in penalty kicks. We make our robots utilize virtual balls in order to perform only quadruped locomotion in real environments, which is quite difficult to simulate accurately. Our robots autonomously learn and acquire more beneficial strategies without human intervention in our augmented environment than those in a fully simulated environment.

  19. Shoulder Kinematics and Spatial Pattern of Trapezius Electromyographic Activity in Real and Virtual Environments

    PubMed Central

    Samani, Afshin; Pontonnier, Charles; Dumont, Georges; Madeleine, Pascal

    2015-01-01

    The design of an industrial workstation tends to include ergonomic assessment steps based on a digital mock-up and a virtual reality setup. Lack of interaction and system fidelity is often reported as a main issue in such virtual reality applications. This limitation is a crucial issue as thorough ergonomic analysis is required for an investigation of the biomechanics. In the current study, we investigated the biomechanical responses of the shoulder joint in a simulated assembly task for comparison with the biomechanical responses in virtual environments. Sixteen male healthy novice subjects performed the task on three different platforms: real (RE), virtual (VE), and virtual environment with force feedback (VEF) with low and high precision demands. The subjects repeated the task 12 times (i.e., 12 cycles). High density electromyography from the upper trapezius and rotation angles of the shoulder joint were recorded and split into the cycles. The angular trajectories and velocity profiles of the shoulder joint angles over a cycle were computed in 3D. The inter-subject similarity in terms of normalized mutual information on kinematics and electromyography was investigated. Compared with RE the task in VE and VEF was characterized by lower kinematic maxima. The inter-subject similarity in RE compared with intra-subject similarity across the platforms was lower in terms of movement trajectories and greater in terms of trapezius muscle activation. The precision demand resulted in lower inter- and intra-subject similarity across platforms. The proposed approach identifies biomechanical differences in the shoulder joint in both VE and VEF compared with the RE platform, but these differences are less marked in VE mostly due to technical limitations of co-localizing the force feedback system in the VEF platform. PMID:25768123
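
    The similarity measure named above, normalized mutual information, can be sketched for two one-dimensional signals as follows; the histogram binning and the normalization by the mean of the marginal entropies are assumptions, and the authors' exact estimator may differ.

        # Normalized mutual information (NMI) between two 1-D signals, of the kind
        # used to quantify inter-subject similarity. Binning and normalization are
        # assumptions; the paper's exact estimator may differ.
        import numpy as np

        def normalized_mutual_information(x, y, bins=32):
            joint, _, _ = np.histogram2d(x, y, bins=bins)
            p_xy = joint / joint.sum()
            p_x = p_xy.sum(axis=1)
            p_y = p_xy.sum(axis=0)

            def entropy(p):
                p = p[p > 0]
                return -np.sum(p * np.log(p))

            h_x, h_y, h_xy = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
            mutual_info = h_x + h_y - h_xy
            return 2.0 * mutual_info / (h_x + h_y)   # ~0 = unrelated, 1 = identical

        # Example: a trajectory compared with a noisy copy of itself.
        t = np.linspace(0, 2 * np.pi, 1000)
        a = np.sin(t)
        b = a + np.random.default_rng(1).normal(scale=0.2, size=t.size)
        print(normalized_mutual_information(a, b))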

  20. Virtual operating room for team training in surgery.

    PubMed

    Abelson, Jonathan S; Silverman, Elliott; Banfelder, Jason; Naides, Alexandra; Costa, Ricardo; Dakin, Gregory

    2015-09-01

    We proposed to develop a novel virtual reality (VR) team training system. The objective of this study was to determine the feasibility of creating a VR operating room to simulate a surgical crisis scenario and evaluate the simulator for construct and face validity. We modified ICE STORM (Integrated Clinical Environment; Systems, Training, Operations, Research, Methods), a VR-based system capable of modeling a variety of health care personnel and environments. ICE STORM was used to simulate a standardized surgical crisis scenario, whereby participants needed to correct 4 elements responsible for loss of laparoscopic visualization. The construct and face validity of the environment were measured. Thirty-three participants completed the VR simulation. Attendings completed the simulation in less time than trainees (271 vs 201 seconds, P = .032). Participants felt the training environment was realistic and had a favorable impression of the simulation. All participants felt the workload of the simulation was low. Creation of a VR-based operating room for team training in surgery is feasible and can afford a realistic team training environment. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Ecological validity of virtual environments to assess human navigation ability

    PubMed Central

    van der Ham, Ineke J. M.; Faber, Annemarie M. E.; Venselaar, Matthijs; van Kreveld, Marc J.; Löffler, Maarten

    2015-01-01

    Route memory is frequently assessed in virtual environments. These environments can be presented in a fully controlled manner and are easy to use. Yet they lack the physical involvement that participants have when navigating real environments. For some aspects of route memory this may result in reduced performance in virtual environments. We assessed route memory performance in four different environments: real, virtual, virtual with directional information (compass), and hybrid. In the hybrid environment, participants walked the route outside on an open field, while all route information (i.e., path, landmarks) was shown simultaneously on a handheld tablet computer. Results indicate that performance in the real life environment was better than in the virtual conditions for tasks relying on survey knowledge, like pointing to start and end point, and map drawing. Performance in the hybrid condition however, hardly differed from real life performance. Performance in the virtual environment did not benefit from directional information. Given these findings, the hybrid condition may offer the best of both worlds: the performance level is comparable to that of real life for route memory, yet it offers full control of visual input during route learning. PMID:26074831

  2. The Effect of Desktop Illumination Realism on a User's Sense of Presence in a Virtual Learning Environment

    ERIC Educational Resources Information Center

    Ehrlich, Justin

    2010-01-01

    The application of virtual reality is becoming ever more important as technology reaches new heights allowing virtual environments (VE) complete with global illumination. One successful application of virtual environments is educational interventions meant to treat individuals with autism spectrum disorder (ASD). VEs are effective with these…

  3. Virtual Virtuosos: A Case Study in Learning Music in Virtual Learning Environments in Spain

    ERIC Educational Resources Information Center

    Alberich-Artal, Enric; Sangra, Albert

    2012-01-01

    In recent years, the development of Information and Communication Technologies (ICT) has contributed to the generation of a number of interesting initiatives in the field of music education and training in virtual learning environments. However, music education initiatives employing virtual learning environments have replicated and perpetuated the…

  4. Eliciting nicotine craving with virtual smoking cues.

    PubMed

    Gamito, Pedro; Oliveira, Jorge; Baptista, André; Morais, Diogo; Lopes, Paulo; Rosa, Pedro; Santos, Nuno; Brito, Rodrigo

    2014-08-01

    Craving is a strong desire to consume that emerges in every case of substance addiction. Previous studies have shown that eliciting craving with an exposure cues protocol can be a useful option for the treatment of nicotine dependence. Thus, the main goal of this study was to develop a virtual platform in order to induce craving in smokers. Fifty-five undergraduate students were randomly assigned to two different virtual environments: high arousal contextual cues and low arousal contextual cues scenarios (17 smokers with low nicotine dependency were excluded). An eye-tracker system was used to evaluate attention toward these cues. Eye fixation on smoking-related cues differed between smokers and nonsmokers, indicating that smokers focused more often on smoking-related cues than nonsmokers. Self-reports of craving are in agreement with these results and suggest a significant increase in craving after exposure to smoking cues. In sum, these data support the use of virtual environments for eliciting craving.

  5. Using virtual reality technology for aircraft visual inspection training: presence and comparison studies.

    PubMed

    Vora, Jeenal; Nair, Santosh; Gramopadhye, Anand K; Duchowski, Andrew T; Melloy, Brian J; Kanki, Barbara

    2002-11-01

    The aircraft maintenance industry is a complex system consisting of several interrelated human and machine components. Recognizing this, the Federal Aviation Administration (FAA) has pursued human factors related research. In the maintenance arena the research has focused on the aircraft inspection process and the aircraft inspector. Training has been identified as the primary intervention strategy to improve the quality and reliability of aircraft inspection. If training is to be successful, it is critical that we provide aircraft inspectors with appropriate training tools and environment. In response to this need, the paper outlines the development of a virtual reality (VR) system for aircraft inspection training. VR has generated much excitement but little formal proof that it is useful. However, since VR interfaces are difficult and expensive to build, the computer graphics community needs to be able to predict which applications will benefit from VR. To address this important issue, this research measured the degree of immersion and presence felt by subjects in a virtual environment simulator. Specifically, it conducted two controlled studies using the VR system developed for visual inspection task of an aft-cargo bay at the VR Lab of Clemson University. Beyond assembling the visual inspection virtual environment, a significant goal of this project was to explore subjective presence as it affects task performance. The results of this study indicated that the system scored high on the issues related to the degree of presence felt by the subjects. As a next logical step, this study, then, compared VR to an existing PC-based aircraft inspection simulator. The results showed that the VR system was better and preferred over the PC-based training tool.

  6. Transfer of motor learning from virtual to natural environments in individuals with cerebral palsy.

    PubMed

    de Mello Monteiro, Carlos Bandeira; Massetti, Thais; da Silva, Talita Dias; van der Kamp, John; de Abreu, Luiz Carlos; Leone, Claudio; Savelsbergh, Geert J P

    2014-10-01

    With the growing accessibility of computer-assisted technology, rehabilitation programs for individuals with cerebral palsy (CP) increasingly use virtual reality environments to enhance motor practice. Thus, it is important to examine whether performance improvements in the virtual environment generalize to the natural environment. To examine this issue, we had 64 individuals, 32 of whom were individuals with CP and 32 typically developing individuals, practice two coincidence-timing tasks. In the more tangible button-press task, the individuals were required to 'intercept' a falling virtual object at the moment it reached the interception point by pressing a key. In the more abstract, less tangible task, they were instructed to 'intercept' the virtual object by making a hand movement in a virtual environment. The results showed that individuals with CP timed less accurately than typically developing individuals, especially for the more abstract task in the virtual environment. The individuals with CP did, as did their typically developing peers, improve coincidence timing with practice on both tasks. Importantly, however, these improvements were specific to the practice environment; there was no transfer of learning. It is concluded that the implementation of virtual environments for motor rehabilitation in individuals with CP should not be taken for granted but needs to be considered carefully. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Defense applications of the CAVE (CAVE automatic virtual environment)

    NASA Astrophysics Data System (ADS)

    Isabelle, Scott K.; Gilkey, Robert H.; Kenyon, Robert V.; Valentino, George; Flach, John M.; Spenny, Curtis H.; Anderson, Timothy R.

    1997-07-01

    The CAVE is a multi-person, room-sized, high-resolution, 3D video and auditory environment, which can be used to present very immersive virtual environment experiences. This paper describes the CAVE technology and the capability of the CAVE system as originally developed at the Electronics Visualization Laboratory of the University of Illinois-Chicago and as more recently implemented by Wright State University (WSU) in the Armstrong Laboratory at Wright-Patterson Air Force Base (WPAFB). One planned use of the WSU/WPAFB CAVE is research addressing the appropriate design of display and control interfaces for controlling uninhabited aerial vehicles. The WSU/WPAFB CAVE has a number of features that make it well-suited to this work: (1) 360 degrees surround, plus floor, high resolution visual displays, (2) virtual spatialized audio, (3) the ability to integrate real and virtual objects, and (4) rapid and flexible reconfiguration. However, even though the CAVE is likely to have broad utility for military applications, it does have certain limitations that may make it less well-suited to applications that require 'natural' haptic feedback, vestibular stimulation, or an ability to interact with close detailed objects.

  8. Tele Hyper Virtuality

    NASA Technical Reports Server (NTRS)

    Terashima, Nobuyoshi

    1994-01-01

    In the future, remote images sent over communication lines will be reproduced in virtual reality (VR). This form of virtual telecommunications, which will allow observers to engage in an activity as though it were real, is the focus of considerable attention. Taken a step further, real and unreal objects will be placed in a single space to create an extremely realistic environment. Here, imaginary and other life forms as well as people and animals in remote locations will gather via telecommunication lines that create a common environment where life forms can work and interact together. Words, gestures, diagrams and other forms of communication will be used freely in performing work. Actual construction of a system based on this new concept will not only provide people with experiences that would have been impossible in the past, but will also inspire new applications in which people will function in environments where it would have been difficult if not impossible for them to function until now. This paper describes the Tele Hyper Virtuality concept: its definition, applications, the key technologies needed to accomplish it, and future prospects.

  9. A Three-Dimensional Virtual Simulator for Aircraft Flyover Presentation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Sullivan, Brenda M.; Sandridge, Christopher A.

    2003-01-01

    This paper presents a system developed at NASA Langley Research Center to render aircraft flyovers in a virtual reality environment. The present system uses monaural recordings of actual aircraft flyover noise and presents these binaurally using head tracking information. The three-dimensional audio is simultaneously rendered with a visual presentation using a head-mounted display (HMD). The final system will use flyover noise synthesized using data from various analytical and empirical modeling systems. This will permit presentation of flyover noise from candidate low-noise flight operations to subjects for psychoacoustical evaluation.

  10. Simulation Environment Synchronizing Real Equipment for Manufacturing Cell

    NASA Astrophysics Data System (ADS)

    Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro

    Manufacturing industries currently face various problems, such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead-time of manufacturing system construction. At the manufacturing system implementation stage, it is important to rapidly create and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs). However, methods to evaluate these facility control programs before the manufacturing systems are implemented, by mixing and synchronizing real equipment with virtual factory models on computers, have not been developed. This difficulty is caused by the complexity of manufacturing systems composed of a great variety of equipment, and it has prevented precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) to support manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is presented in detail. MCSE enables facility control programs to be created and evaluated using virtual factory models on computers before the manufacturing systems are implemented.

  11. Development of a virtual speaking simulator using Image Based Rendering.

    PubMed

    Lee, J M; Kim, H; Oh, M J; Ku, J H; Jang, D P; Kim, I Y; Kim, S I

    2002-01-01

    The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled the use of virtual reality (VR) for the treatment of the fear of public speaking. There are two techniques for building virtual environments for the treatment of this fear: a model-based and a movie-based method. Both methods have the weakness that they are unrealistic and cannot be controlled individually. To address these disadvantages, this paper presents a virtual environment produced with Image Based Rendering (IBR) and chroma-keying simultaneously. IBR enables the creation of realistic virtual environments in which photos taken with a digital camera are stitched into a panorama, and chroma-keying puts virtual audience members under individual control within the environment. In addition, a real-time capture technique is used in constructing the virtual environments, enabling spoken interaction between the subject and a therapist or another subject.
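
    A hedged sketch of the chroma-key step is given below using OpenCV: pixels of a green-screen capture that fall inside an assumed green range are replaced by the IBR background, so that an individually controlled audience member can be composited into the scene. The threshold values and file names are illustrative only.

        # Chroma-key compositing sketch with OpenCV: green-screen pixels of the
        # captured frame are replaced by the panoramic background. Thresholds and
        # file names are illustrative assumptions.
        import cv2
        import numpy as np

        def composite_over_background(frame_bgr, background_bgr,
                                      lower_green=(35, 80, 80), upper_green=(85, 255, 255)):
            """Replace green-screen pixels of `frame_bgr` with `background_bgr`."""
            hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
            green_mask = cv2.inRange(hsv,
                                     np.array(lower_green, dtype=np.uint8),
                                     np.array(upper_green, dtype=np.uint8))
            person_mask = cv2.bitwise_not(green_mask)
            person = cv2.bitwise_and(frame_bgr, frame_bgr, mask=person_mask)
            backdrop = cv2.bitwise_and(background_bgr, background_bgr, mask=green_mask)
            return cv2.add(person, backdrop)

        # Example usage with two same-sized images (hypothetical file names):
        # frame = cv2.imread("audience_greenscreen.png")
        # background = cv2.imread("panorama_view.png")
        # cv2.imwrite("composited.png", composite_over_background(frame, background))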

  12. Development of a system based on 3D vision, interactive virtual environments, ergonometric signals and a humanoid for stroke rehabilitation.

    PubMed

    Ibarra Zannatha, Juan Manuel; Tamayo, Alejandro Justo Malo; Sánchez, Angel David Gómez; Delgado, Jorge Enrique Lavín; Cheu, Luis Eduardo Rodríguez; Arévalo, Wilson Alexander Sierra

    2013-11-01

    This paper presents a stroke rehabilitation (SR) system for the upper limbs, developed as an interactive virtual environment (IVE) based on a commercial 3D vision system (a Microsoft Kinect), a humanoid robot (an Aldebaran Nao), and devices producing ergonometric signals. In one environment, the rehabilitation routines, developed by specialists, are presented to the patient simultaneously by the humanoid and an avatar inside the IVE. The patient follows the rehabilitation task, while the avatar copies the patient's gestures as captured by the Kinect 3D vision system. Information about the patient's movements, together with the signals obtained from the ergonometric measurement devices, is also used to supervise and evaluate the rehabilitation progress. The IVE can also present an RGB image of the patient. In another environment that uses the same base elements, four game routines (Touch the balls 1 and 2, Simon says, and Follow the point) are used for rehabilitation. These environments are designed to create a positive influence on the rehabilitation process, reduce costs, and engage the patient. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. Binocular vision in a virtual world: visual deficits following the wearing of a head-mounted display.

    PubMed

    Mon-Williams, M; Wann, J P; Rushton, S

    1993-10-01

    The short-term effects on binocular stability of wearing a conventional head-mounted display (HMD) to explore a virtual reality environment were examined. Twenty adult subjects (aged 19-29 years) wore a commercially available HMD for 10 min while cycling around a computer generated 3-D world. The twin screen presentations were set to suit the average interpupillary distance of our subject population, to mimic the conditions of public access virtual reality systems. Subjects were examined before and after exposure to the HMD and there were clear signs of induced binocular stress for a number of the subjects. The implications of introducing such HMDs into the workplace and entertainment environments are discussed.

  14. Research on Modeling Technology of Virtual Robot Based on LabVIEW

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Huo, J. L.; Y Sun, L.; Y Hao, X.

    2017-12-01

    Because of the dangerous working environment, the underwater operation robot for nuclear power stations requires manual teleoperation. During operation, it is necessary to guide the position and orientation of the robot in real time. In this paper, geometric modeling of the virtual robot and its working environment is accomplished using SolidWorks software, and accurate modeling and assembly of the robot are realized. LabVIEW software is used to read the model, the forward and inverse kinematics models of the manipulator are established, and hierarchical modeling of the virtual robot together with computer graphics modeling is realized. Experimental results show that the method studied in this paper can be successfully applied to the robot control system.
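
    The paper implements its kinematics in LabVIEW and does not give the manipulator's geometry, so the forward and inverse kinematics step is illustrated below for a planar two-link arm with hypothetical link lengths.

        # Forward/inverse kinematics sketch for a planar two-link arm, standing in
        # for the manipulator model described above; link lengths are hypothetical.
        import numpy as np

        L1, L2 = 0.40, 0.30   # link lengths in metres (assumed)

        def forward_kinematics(theta1, theta2):
            """End-effector (x, y) for joint angles in radians."""
            x = L1 * np.cos(theta1) + L2 * np.cos(theta1 + theta2)
            y = L1 * np.sin(theta1) + L2 * np.sin(theta1 + theta2)
            return x, y

        def inverse_kinematics(x, y, elbow_up=True):
            """Joint angles reaching (x, y), using the standard two-link solution."""
            c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
            c2 = np.clip(c2, -1.0, 1.0)
            theta2 = np.arccos(c2) * (1 if elbow_up else -1)
            theta1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(theta2), L1 + L2 * np.cos(theta2))
            return theta1, theta2

        print(forward_kinematics(*inverse_kinematics(0.5, 0.2)))   # ~ (0.5, 0.2)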

  15. Modular mechatronic system for stationary bicycles interfaced with virtual environment for rehabilitation

    PubMed Central

    2014-01-01

    Background Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirement for efficacious cycling, specifically recruitment of both extremities and exercising at a high intensity. Methods In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. Results The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider’s lower extremities. Conclusions The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system was successful in demonstrating that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders. PMID:24902780

  16. A comparative analysis of dynamic grids vs. virtual grids using the A3pviGrid framework.

    PubMed

    Shankaranarayanan, Avinas; Amaldas, Christine

    2010-11-01

    With the proliferation of quad/multi-core microprocessors in mainstream platforms such as desktops and workstations, a large number of unused CPU cycles can be utilized for running virtual machines (VMs) as dynamic nodes in distributed environments. Grid services and their service-oriented business broker, now termed cloud computing, could deploy image-based virtualization platforms enabling agent-based resource management and dynamic fault management. In this paper we present an efficient way of utilizing heterogeneous virtual machines on idle desktops as an environment for consumption of high performance grid services. Spurious and exponential increases in the size of datasets are constant concerns in the medical and pharmaceutical industries due to the constant discovery and publication of large sequence databases. Traditional algorithms are not designed to handle large data sizes under sudden and dynamic changes in the execution environment, as previously discussed. This research was undertaken to compare our previous results with those obtained by running the same test dataset on a virtual Grid platform using virtual machines (virtualization). The implemented architecture, A3pviGrid, utilizes game-theoretic optimization and agent-based team formation (coalition) algorithms to improve scalability with respect to team formation. Due to the dynamic nature of distributed systems (as discussed in our previous work), all interactions were made local within a team transparently. This paper is a proof of concept of an experimental mini-Grid test-bed compared to running the platform on local virtual machines on a local test cluster. This was done to give every agent its own execution platform, enabling anonymity and better control of the dynamic environmental parameters. We also analyze the performance and scalability of BLAST in a multiple virtual node setup and present our findings. This paper is an extension of our previous research on improving the BLAST application framework using dynamic Grids on virtualization platforms such as VirtualBox.

  17. Implementation Issues of Virtual Desktop Infrastructure and Its Case Study for a Physician's Round at Seoul National University Bundang Hospital.

    PubMed

    Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-12-01

    The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physician's rounds as a case study. Empirical implementation issues were derived from a 910-bed tertiary national university hospital that recently launched a VDI system. During the physicians' rounds, we surveyed patient satisfaction levels with the VDI-based mobile consultation service with the iPad and the relationship between these levels of satisfaction and hospital revisits, hospital recommendations, and the hospital brand image. Thirty-five inpatients (including their next-of-kin) and seven physicians participated in the survey. Implementation issues pertaining to the VDI system arose with regard to the high-availability system architecture, wireless network infrastructure, and screen resolution of the system. Other issues were related to privacy and security, mobile device management, and user education. When the system was used in rounds, patients and their next-of-kin expressed high satisfaction levels, and a positive relationship was noted as regards patients' decisions to revisit the hospital and whether the use of the VDI system improved the brand image of the hospital. Mobile hospital environments have the potential to benefit both physicians and patients. The issues related to the implementation of the VDI system discussed here should be examined in advance for its successful adoption and implementation.

  18. Implementation Issues of Virtual Desktop Infrastructure and Its Case Study for a Physician's Round at Seoul National University Bundang Hospital

    PubMed Central

    Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb

    2012-01-01

    Objectives The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physician's rounds as a case study. Methods Empirical implementation issues were derived from a 910-bed tertiary national university hospital that recently launched a VDI system. During the physicians' rounds, we surveyed patient satisfaction levels with the VDI-based mobile consultation service with the iPad and the relationship between these levels of satisfaction and hospital revisits, hospital recommendations, and the hospital brand image. Thirty-five inpatients (including their next-of-kin) and seven physicians participated in the survey. Results Implementation issues pertaining to the VDI system arose with regard to the high-availability system architecture, wireless network infrastructure, and screen resolution of the system. Other issues were related to privacy and security, mobile device management, and user education. When the system was used in rounds, patients and their next-of-kin expressed high satisfaction levels, and a positive relationship was noted as regards patients' decisions to revisit the hospital and whether the use of the VDI system improved the brand image of the hospital. Conclusions Mobile hospital environments have the potential to benefit both physicians and patients. The issues related to the implementation of the VDI system discussed here should be examined in advance for its successful adoption and implementation. PMID:23346476

  19. Open Source Meets Virtual Reality--An Instructor's Journey Unearths New Opportunities for Learning, Community, and Academia

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.

    2015-01-01

    Opening with the history, recent advances, and emerging ways to use avatar-based virtual reality, an instructor who has used virtual environments since 2007 shares how these environments bring more options to community building, teaching, and education. With the open-source movement, where the source code for virtual environments was made…

  20. Perturbed Communication in a Virtual Environment to Train Medical Team Leaders.

    PubMed

    Huguet, Lauriane; Lourdeaux, Domitile; Sabouret, Nicolas; Ferrer, Marie-Hélène

    2016-01-01

    The VICTEAMS project aims at designing a virtual environment for training medical team leaders in non-technical skills. The virtual environment is populated with autonomous virtual agents who are able to make mistakes (in action or communication) in order to train rescue team leaders and make them able to adapt to all kinds of situations and teams.

  1. Around Marshall

    NASA Image and Video Library

    1993-09-15

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  2. Around Marshall

    NASA Image and Video Library

    1993-09-15

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  3. A haptic interface for virtual simulation of endoscopic surgery.

    PubMed

    Rosenberg, L B; Stredney, D

    1996-01-01

    Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.

  4. Enhancing Adaptability of U.S. Military Forces. Part A: Main Report

    DTIC Science & Technology

    2011-01-01

    virtual ... Research and Engineering, Rapid Capability Fielding Toolbox Study, stated that virtual environment tools can ... the "approved" basis for requirements is divorced from reality, and setting system ...

  5. Teaching Students about Biodiversity by Studying the Correlation between Plants & Arthropods

    ERIC Educational Resources Information Center

    Richardson, Matthew L.; Hari, Janice

    2008-01-01

    On Earth there is a huge diversity of arthropods, many of which are highly adaptive and able to exploit virtually every terrestrial habitat. Because of their prevalence even in urban environments, they make an excellent model system for any life science class. Since plants also exploit virtually every terrestrial habitat, studying the relationship…

  6. Towards General Models of Effective Science Inquiry in Virtual Performance Assessments

    ERIC Educational Resources Information Center

    Baker, R. S.; Clarke-Midura, J.; Ocumpaugh, J.

    2016-01-01

    Recent interest in online assessment of scientific inquiry has led to several new online systems that attempt to assess these skills, but producing models that detect when students are successfully practising these skills can be challenging. In this paper, we study models that assess student inquiry in an immersive virtual environment, where a…

  7. Working Memory in Wayfinding--A Dual Task Experiment in a Virtual City

    ERIC Educational Resources Information Center

    Meilinger, Tobias; Knauff, Markus; Bulthoff, Heinrich H.

    2008-01-01

    This study examines the working memory systems involved in human wayfinding. In the learning phase, 24 participants learned two routes in a novel photorealistic virtual environment displayed on a 220-degree screen while they were disrupted by a visual, a spatial, a verbal, or--in a control group--no secondary task. In the following wayfinding…

  8. Recalibrating Reference within a Dual-Space Interaction Environment

    ERIC Educational Resources Information Center

    Zemel, Alan; Koschmann, Timothy

    2013-01-01

    In this paper we examine how two groups of middle school students arrive at shared understandings of and solutions to mathematical problems. Our data consists of logs of student participation in the Virtual Math Teams (VMT) system as they work on math problems. The project supports interaction both through chat and through a virtual whiteboard. We…

  9. Mechanism for Promoting Motivation, Confidence, and Autonomy through Synchronic Communication Sessions in Virtual Learning Environments

    ERIC Educational Resources Information Center

    Valencia, Jorge Andrick Parra; Dallos, Adriana Rocío Lizcano; Ballesteros, Eliécer Pineda

    2017-01-01

    This study presents a mechanism which explains the effect of synchronous communication on students' perception of the training process in virtual learning methodology used in a postgraduate programme at the University of Santander. We use System Dynamics to design a mechanism that integrates motivation, confidence, trust, and autonomy in students.…

  10. Three dimensional tracking with misalignment between display and control axes

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.; Tyler, Mitchell; Kim, Won S.; Stark, Lawrence

    1992-01-01

    Human operators confronted with misaligned display and control frames of reference performed three-dimensional pursuit tracking in virtual environment and virtual space simulations. Analysis of the components of the tracking errors in the perspective displays presenting virtual space showed that error components due to visual-motor misalignment may be linearly separated from those associated with the mismatch between display and control coordinate systems. Tracking performance improved with several hours of practice, despite previous reports that such improvement did not take place.
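
    The decomposition described above can be illustrated with a toy calculation: if the display and control frames differ by a known rotation, the error contributed by that misalignment alone is the difference between the motion the operator intends and the motion the rotated frame produces. The sketch below assumes a single-axis (yaw) misalignment and hypothetical values; it only illustrates the idea, not the study's analysis method.

```python
import numpy as np

def rotation_z(angle_deg):
    """Rotation matrix about the z-axis (display/control frame misalignment)."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# Hypothetical 45-degree misalignment between control and display frames.
misalignment = rotation_z(45.0)

# Operator intends to move the cursor straight along the display x-axis.
intended_display_motion = np.array([1.0, 0.0, 0.0])

# Because the control frame is rotated, the same hand motion appears rotated on the display.
actual_display_motion = misalignment @ intended_display_motion

# The error component attributable to the misalignment alone:
misalignment_error = actual_display_motion - intended_display_motion
print(misalignment_error)
```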

  11. Determining Virtual Environment "Fit": The Relationship between Navigation Style in a Virtual Field Trip, Student Self-Reported Desire to Visit the Field Trip Site in the Real World, and the Purposes of Science Education

    ERIC Educational Resources Information Center

    Tutwiler, M. Shane; Lin, Ming-Chao; Chang, Chun-Yen

    2013-01-01

    In this study, a follow-up analysis of the data reported in Lin et al. ("Learn Media Technol." doi: 10.1080/17439884.2011.629660 , 2011), we investigated the relationship between student use of a virtual field trip (VFT) system and the probability of students reporting wanting to visit the national park site upon which the VFT was modeled,…

  12. Aerospace applications of virtual environment technology.

    PubMed

    Loftin, R B

    1996-11-01

    The uses of virtual environment technology in the space program are examined with emphasis on training for the Hubble Space Telescope Repair and Maintenance Mission in 1993. Project ScienceSpace at the Virtual Environment Technology Lab is discussed.

  13. Master-slave system with force feedback based on dynamics of virtual model

    NASA Technical Reports Server (NTRS)

    Nojima, Shuji; Hashimoto, Hideki

    1994-01-01

    A master-slave system can extend the manipulating and sensing capabilities of a human operator to a remote environment. But the master-slave system has two serious problems: one is the mechanically large impedance of the system; the other is the mechanical complexity of the slave required for complex remote tasks. These two problems reduce the efficiency of the system. If the slave has local intelligence, it can help the human operator by exploiting its strengths, such as fast computation and large memory. The authors suggest that the slave be a dextrous hand with many degrees of freedom able to manipulate an object of known shape. It is further suggested that the dimensions of the remote work space be shared by the human operator and the slave. The effect of the large impedance of the system can be reduced with a virtual model, a physical model constructed in a computer with physical parameters as if it were in the real world. A method to determine the damping parameter of the virtual model dynamically is proposed. Experimental results show that this virtual model performs better than a virtual model with fixed damping.
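
    As a rough illustration of the kind of virtual model the abstract describes, the sketch below integrates a one-degree-of-freedom mass-spring-damper whose damping coefficient is chosen anew at every step. The velocity-dependent rule used here is a placeholder assumption, not the authors' proposed method.

```python
import numpy as np

def simulate_virtual_model(force_profile, dt=0.001, m=1.0, k=50.0):
    """Integrate a 1-DOF virtual mass-spring-damper driven by operator force.

    The damping coefficient is recomputed each step from the current velocity;
    the specific rule below is a placeholder, not the authors' actual method.
    """
    x, v = 0.0, 0.0
    trajectory = []
    for f in force_profile:
        # Hypothetical rule: damp harder at high speed to keep the model stable.
        b = 2.0 + 10.0 * abs(v)
        a = (f - b * v - k * x) / m
        v += a * dt
        x += v * dt
        trajectory.append(x)
    return np.array(trajectory)

# Constant 5 N push applied for one second.
positions = simulate_virtual_model(np.full(1000, 5.0))
print(positions[-1])
```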

  14. Some criteria for teleoperators and virtual environments from experiences with vehicle/operator simulation

    NASA Technical Reports Server (NTRS)

    Jex, Henry R.

    1991-01-01

    A review is given of a wide range of simulations involving operator steering control of a vehicle; the dominant cues, closed-loop bandwidth, measured operator effective time delay, and ratio of bandwidth to inverse delay are summarized. A correlation of kinetosis with dynamic scene field-of-view is shown. The use of moving-base simulators to improve the validity of locomotion teleoperations is discussed. Some rules of thumb for good 'feel-system' simulation, such as for control manipulanda, are given. Finally, simulation tests of teleoperators and virtual environments should include three types of measures: system performance, operator (or robot) 'behavior', and mental workload evaluations.

  15. NASA's Hybrid Reality Lab: One Giant Leap for Full Dive

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2017-01-01

    This presentation demonstrates how NASA is using consumer VR headsets, game engine technology, and NVIDIA's GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; the integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.

  16. Mental rotation - A key to mitigation of motion sickness in the virtual environments?

    NASA Technical Reports Server (NTRS)

    Parker, Donald E.; Harm, Deborah L.

    1992-01-01

    If mental rotation is important for reducing motion sickness and supporting performance in virtual environments (VEs), the use of the Howard (1982) mental rotation test battery may identify individuals with lower susceptibility to VE-induced motion sickness and therefore a greater probability of success as VE operators. An apparatus and its associated procedures are currently under development for astronaut microgravity training aimed at reducing motion sickness; it is hypothesized that this system may be of significance to VE testing and training.

  17. Virtual reality systems

    NASA Technical Reports Server (NTRS)

    Johnson, David W.

    1992-01-01

    Virtual realities are a type of human-computer interface (HCI) and as such may be understood from a historical perspective. In the earliest era, the computer was a very simple, straightforward machine. Interaction was human manipulation of an inanimate object, little more than the provision of an explicit instruction set to be carried out without deviation. In short, control resided with the user. In the second era of HCI, some level of intelligence and control was imparted to the system to enable a dialogue with the user. Simple context-sensitive help systems are early examples, while more sophisticated expert system designs typify this era. Control was shared more equally. In the third era of HCI, the system emulates a particular environment, constructed with rules and knowledge about 'reality'. Control is, in part, outside the realm of the human-computer dialogue. Virtual reality systems are discussed.

  18. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and to interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities in the various visualization and interaction modes.
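
    The water simulation mentioned above is typically some variant of a height-field scheme that GPUs evaluate in parallel. The CPU sketch below shows one common explicit update of such a height field; it is an illustrative stand-in under assumed grid size and damping, not the system's actual shader code.

```python
import numpy as np

def step_heightfield_water(h, v, damping=0.99):
    """One explicit update of a height-field water surface.

    h: 2D array of water column heights; v: matching array of vertical velocities.
    Each column accelerates toward the average of its four neighbours, which
    propagates ripples across the grid.
    """
    padded = np.pad(h, 1, mode="edge")
    neighbours = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    v = (v + (neighbours - h)) * damping
    return h + v, v

# A small grid with an initial "raindrop" disturbance in the middle.
h = np.zeros((64, 64))
v = np.zeros_like(h)
h[32, 32] = 1.0
for _ in range(100):
    h, v = step_heightfield_water(h, v)
print(float(h.max()))
```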

  19. Virtualizing Resources for the Application Services and Framework Team

    NASA Technical Reports Server (NTRS)

    Varner, Justin T.; Crawford, Linda K.

    2010-01-01

    Virtualization is an emerging technology that will undoubtedly have a major impact on the future of Information Technology. It allows for the centralization of resources in an enterprise system without the need to make any changes to the host operating system, file system, or registry. In turn, this significantly reduces cost and administration, and provides a much greater level of security, compatibility, and efficiency. This experiment examined the practicality, methodology, challenges, and benefits of implementing the technology for the Launch Control System (LCS), and more specifically the Application Services (AS) group of the National Aeronautics and Space Administration (NASA) at the Kennedy Space Center (KSC). In order to carry out this experiment, I used several tools from the virtualization company known as VMWare; these programs included VMWare ThinApp, VMWare Workstation, and VMWare ACE. Used in conjunction, these utilities provided the engine necessary to virtualize and deploy applications in a desktop environment on any Windows platform available. The results clearly show that virtualization is a viable technology that can, when implemented properly, dramatically cut costs, enhance stability and security, and provide easier management for administrators.

  20. Intra-operative 3D imaging system for robot-assisted fracture manipulation.

    PubMed

    Dagnino, G; Georgilas, I; Tarassoli, P; Atkins, R; Dogramadzi, S

    2015-01-01

    Reduction is a crucial step in the treatment of broken bones. Achieving precise anatomical alignment of bone fragments is essential for a good and fast healing process. Percutaneous techniques are associated with faster recovery times and lower infection risk. However, deducing the desired reduction position intra-operatively is quite challenging with the currently available technology. The 2D nature of this technology (i.e. the image intensifier) doesn't provide enough information to the surgeon regarding fracture alignment and rotation, which is actually a three-dimensional problem. This paper describes the design and development of a 3D imaging system for the intra-operative virtual reduction of joint fractures. The proposed imaging system is able to receive and segment CT scan data of the fracture, generate the 3D models of the bone fragments, and display them on a GUI. A commercial optical tracker was included in the system to track the actual pose of the bone fragments in the physical space and generate the corresponding pose relations in the virtual environment of the imaging system. The surgeon virtually reduces the fracture in the 3D virtual environment, and a robotic manipulator connected to the fracture through an orthopedic pin executes the physical reduction accordingly. The system is evaluated here through fracture reduction experiments, demonstrating a reduction accuracy of 1.04 ± 0.69 mm (translational RMSE) and 0.89 ± 0.71° (rotational RMSE).
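
    The quoted accuracy figures are summary statistics over repeated reduction trials. A minimal sketch of how translational and rotational errors between target and achieved fragment poses might be computed is given below; the trial data and the mean-and-standard-deviation summary are illustrative assumptions, not the study's measurements or exact statistical procedure.

```python
import numpy as np

def reduction_errors(target_positions, achieved_positions,
                     target_angles, achieved_angles):
    """Per-trial translational and rotational errors between target and achieved poses.

    Positions are (N, 3) arrays in millimetres; angles are (N,) arrays in degrees.
    Summarising these per-trial errors as mean +/- standard deviation is one
    plausible way to arrive at figures like those quoted in the abstract.
    """
    trans_err = np.linalg.norm(achieved_positions - target_positions, axis=1)
    rot_err = np.abs(achieved_angles - target_angles)
    return trans_err, rot_err

# Illustrative data for three reduction trials (not the study's measurements).
t_pos = np.zeros((3, 3))
a_pos = np.array([[0.8, 0.3, 0.2], [1.1, 0.5, 0.1], [0.4, 0.9, 0.3]])
t_ang = np.zeros(3)
a_ang = np.array([0.7, 1.2, 0.5])

te, re = reduction_errors(t_pos, a_pos, t_ang, a_ang)
print(te.mean(), te.std(), re.mean(), re.std())
```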

  1. STRIVE: Stress Resilience In Virtual Environments: a pre-deployment VR system for training emotional coping skills and assessing chronic and acute stress responses.

    PubMed

    Rizzo, Albert; Buckwalter, J Galen; John, Bruce; Newman, Brad; Parsons, Thomas; Kenny, Patrick; Williams, Josh

    2012-01-01

    The incidence of posttraumatic stress disorder (PTSD) in returning OEF/OIF military personnel is creating a significant healthcare challenge. This has served to motivate research on how to better develop and disseminate evidence-based treatments for PTSD. One emerging form of treatment for combat-related PTSD that has shown promise involves the delivery of exposure therapy using immersive Virtual Reality (VR). Initial outcomes from open clinical trials have been positive, and fully randomized controlled trials are currently in progress to further validate this approach. Based on our research group's initial positive outcomes using VR to emotionally engage and successfully treat persons undergoing exposure therapy for PTSD, we have begun development of a similar VR-based approach to deliver stress resilience training to military service members prior to their initial deployment. The Stress Resilience In Virtual Environments (STRIVE) project aims to create a set of combat simulations (derived from our existing Virtual Iraq/Afghanistan exposure therapy system) that are part of a multi-episode narrative experience. Users can be immersed within challenging combat contexts and interact with virtual characters within these episodes as part of an experiential learning approach for training a range of psychoeducational and cognitive-behavioral emotional coping strategies believed to enhance stress resilience. The STRIVE project aims to present this approach to service members prior to deployment as part of a program designed to better prepare military personnel for the types of emotional challenges that are inherent in the combat environment. During these virtual training experiences, users are monitored physiologically as part of a larger investigation into the biomarkers of the stress response. One such construct, Allostatic Load, is being directly investigated via physiological and neuro-hormonal analysis from specimen collections taken immediately before and after engagement in the STRIVE virtual experience.

  2. Social Interaction Development through Immersive Virtual Environments

    ERIC Educational Resources Information Center

    Beach, Jason; Wendt, Jeremy

    2014-01-01

    The purpose of this pilot study was to determine if participants could improve their social interaction skills by participating in a virtual immersive environment. The participants used a developing virtual reality head-mounted display to engage themselves in a fully-immersive environment. While in the environment, participants had an opportunity…

  3. Sonic intelligence as a virtual therapeutic environment.

    PubMed

    Tarnanas, Ioannis; Adam, Dimitrios

    2003-06-01

    This paper reports on the results of a research project comparing two virtual collaborative environments, one with first-person visual immersion (first-perspective interaction) and a second in which the user interacts through a sound-kinetic virtual representation of himself (avatar), as stress-coping environments in real-life situations. Recent developments in coping research propose a shift from a trait-oriented approach to coping toward a more situation-specific treatment. We defined a real-life situation as a target-oriented situation that demands a complex coping skills inventory of high self-efficacy and internal or external "locus of control" strategies. The participants were 90 normal adults with healthy or impaired coping skills, 25-40 years of age, randomly spread across two groups. There was the same number of participants across groups and gender balance within groups. Both groups went through two phases. In Phase I, Solo, one participant was assessed using a three-stage assessment inspired by the transactional stress theory of Lazarus and the stress inoculation theory of Meichenbaum. In Phase I, each participant was given a coping skills measurement within the time course of various hypothetical stressful encounters performed in two different conditions and a control condition. In Condition A, the participant was given a virtual stress assessment scenario from a first-person perspective (VRFP). In Condition B, the participant was given a virtual stress assessment scenario using a behaviorally realistic motion-controlled avatar with sonic feedback (VRSA). In Condition C, the No Treatment Condition (NTC), the participant received just an interview. In Phase II, all three groups were mixed and exercised the same tasks but with two participants working in pairs. The results showed that the VRSA group performed notably better in terms of cognitive appraisals, emotions and attributions than the other two groups in Phase I (VRSA, 92%; VRFP, 85%; NTC, 34%). In Phase II, the difference again favored the VRSA group over the other two. These results indicate that a virtual collaborative environment seems to be a consistent coping environment, tapping two classes of stress: (a) aversive or ambiguous situations, and (b) loss or failure situations in relation to the stress inoculation theory. In terms of coping behaviors, a distinction is made between self-directed and environment-directed strategies. A great advantage of the virtual collaborative environment with the behaviorally enhanced sound-kinetic avatar is the consideration of team coping intentions at different stages. Even if the aim is to tap transactional processes in real-life situations, it might be better to conduct research using a sound-kinetic avatar-based collaborative environment than a virtual first-person perspective scenario alone. The VE consisted of two dual-processor PC systems, a video splitter, a digital camera and two stereoscopic CRT displays. The system was programmed in C++ and VRScape Immersive Cluster from VRCO, which created an artificial environment that encodes the user's motion from a video camera targeted at the face of the users and from physiological sensors attached to the body.

  4. Virtual environments simulation in research reactor

    NASA Astrophysics Data System (ADS)

    Muhamad, Shalina Bt. Sheik; Bahrin, Muhammad Hannan Bin

    2017-01-01

    Virtual reality based simulations are interactive and engaging, and they have useful potential for improving safety training. Virtual reality technology can be used to train workers who are unfamiliar with the physical layout of an area. In this study, a simulation program based on a virtual environment of a research reactor was developed. The platform used for the virtual simulation is the 3DVia software, whose rendering capabilities, physics for movement and collision, and interactive navigation features have been taken advantage of. A real research reactor was virtually modelled and simulated, with avatar models adopted to simulate walking. Collision detection algorithms were developed for various parts of the 3D building and the avatars to restrain the avatars to certain regions of the virtual environment. A user can control an avatar to move around inside the virtual environment. Thus, this work can assist in the training of personnel and in evaluating the radiological safety of the research reactor facility.

  5. Real-time, interactive, visually updated simulator system for telepresence

    NASA Technical Reports Server (NTRS)

    Schebor, Frederick S.; Turney, Jerry L.; Marzwell, Neville I.

    1991-01-01

    Time delays and limited sensory feedback of remote telerobotic systems tend to disorient teleoperators and dramatically decrease the operator's performance. To remove the effects of time delays, key components were designed and developed of a prototype forward simulation subsystem, the Global-Local Environment Telerobotic Simulator (GLETS) that buffers the operator from the remote task. GLETS totally immerses an operator in a real-time, interactive, simulated, visually updated artificial environment of the remote telerobotic site. Using GLETS, the operator will, in effect, enter into a telerobotic virtual reality and can easily form a gestalt of the virtual 'local site' that matches the operator's normal interactions with the remote site. In addition to use in space based telerobotics, GLETS, due to its extendable architecture, can also be used in other teleoperational environments such as toxic material handling, construction, and undersea exploration.

  6. NEDE: an open-source scripting suite for developing experiments in 3D virtual environments.

    PubMed

    Jangraw, David C; Johri, Ansh; Gribetz, Meron; Sajda, Paul

    2014-09-30

    As neuroscientists endeavor to understand the brain's response to ecologically valid scenarios, many are leaving behind hyper-controlled paradigms in favor of more realistic ones. This movement has made the use of 3D rendering software an increasingly compelling option. However, mastering such software and scripting rigorous experiments requires a daunting amount of time and effort. To reduce these startup costs and make virtual environment studies more accessible to researchers, we demonstrate a naturalistic experimental design environment (NEDE) that allows experimenters to present realistic virtual stimuli while still providing tight control over the subject's experience. NEDE is a suite of open-source scripts built on the widely used Unity3D game development software, giving experimenters access to powerful rendering tools while interfacing with eye tracking and EEG, randomizing stimuli, and providing custom task prompts. Researchers using NEDE can present a dynamic 3D virtual environment in which randomized stimulus objects can be placed, allowing subjects to explore in search of these objects. NEDE interfaces with a research-grade eye tracker in real-time to maintain precise timing records and sync with EEG or other recording modalities. Python offers an alternative for experienced programmers who feel comfortable mastering and integrating the various toolboxes available. NEDE combines many of these capabilities with an easy-to-use interface and, through Unity's extensive user base, a much more substantial body of assets and tutorials. Our flexible, open-source experimental design system lowers the barrier to entry for neuroscientists interested in developing experiments in realistic virtual environments. Copyright © 2014 Elsevier B.V. All rights reserved.
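
    NEDE itself is a set of Unity3D scripts, so the following Python sketch is not its API; it only illustrates, under assumed names and ranges, two of the bookkeeping tasks such a system performs: reproducible random placement of stimulus objects and timestamped event logging for later synchronisation with EEG or eye-tracking streams.

```python
import random
import time

def place_stimuli(object_names, x_range=(-10.0, 10.0), z_range=(-10.0, 10.0), seed=42):
    """Assign each stimulus object a random position on the ground plane.

    A fixed seed keeps the layout reproducible across subjects, which is one
    common requirement for controlled virtual-environment experiments.
    """
    rng = random.Random(seed)
    return {name: (rng.uniform(*x_range), 0.0, rng.uniform(*z_range))
            for name in object_names}

event_log = []

def log_event(label):
    """Record an experiment event with a high-resolution timestamp for later sync."""
    event_log.append((time.perf_counter(), label))

layout = place_stimuli(["target_car", "distractor_tree", "distractor_sign"])
log_event("trial_start")
log_event("stimulus_visible:target_car")
print(layout)
print(event_log)
```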

  7. Virtual environment and computer-aided technologies used for system prototyping and requirements development

    NASA Technical Reports Server (NTRS)

    Logan, Cory; Maida, James; Goldsby, Michael; Clark, Jim; Wu, Liew; Prenger, Henk

    1993-01-01

    The Space Station Freedom (SSF) Data Management System (DMS) consists of distributed hardware and software which monitor and control the many onboard systems. Virtual environment and off-the-shelf computer technologies can be used at critical points in project development to aid in objectives and requirements development. Geometric models (images) coupled with off-the-shelf hardware and software technologies were used in the Space Station Mockup and Trainer Facility (SSMTF) Crew Operational Assessment Project. Rapid prototyping is shown to be a valuable tool for operational procedure and system hardware and software requirements development. The project objectives, hardware and software technologies used, data gained, current activities, and future development and training objectives shall be discussed. The importance of defining prototyping objectives and staying focused while maintaining schedules is discussed, along with project pitfalls.

  8. Evaluation of the cognitive effects of travel technique in complex real and virtual environments.

    PubMed

    Suma, Evan A; Finkelstein, Samantha L; Reid, Myra; V Babu, Sabarish; Ulinski, Amy C; Hodges, Larry F

    2010-01-01

    We report a series of experiments conducted to investigate the effects of travel technique on information gathering and cognition in complex virtual environments. In the first experiment, participants completed a non-branching multilevel 3D maze at their own pace using either real walking or one of two virtual travel techniques. In the second experiment, we constructed a real-world maze with branching pathways and modeled an identical virtual environment. Participants explored either the real or virtual maze for a predetermined amount of time using real walking or a virtual travel technique. Our results across experiments suggest that for complex environments requiring a large number of turns, virtual travel is an acceptable substitute for real walking if the goal of the application involves learning or reasoning based on information presented in the virtual world. However, for applications that require fast, efficient navigation or travel that closely resembles real-world behavior, real walking has advantages over common joystick-based virtual travel techniques.

  9. Building intuitive 3D interfaces for virtual reality systems

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Seitel, Mathias; Mullick, Rakesh

    2007-03-01

    An exploration of techniques for developing intuitive and efficient user interfaces for virtual reality systems. This work seeks to understand which paradigms from the better-understood world of 2D user interfaces remain viable within 3D environments. To establish this, a new user interface was created that applied various understood principles of interface design. A user study was then performed in which it was compared with an earlier interface on a series of medical visualization tasks.

  10. Butterfly valve in a virtual environment

    NASA Astrophysics Data System (ADS)

    Talekar, Aniruddha; Patil, Saurabh; Thakre, Prashant; Rajkumar, E.

    2017-11-01

    Assembly of components is one of the processes involved in product design and development. The present paper deals with the assembly of the components of a simple butterfly valve in a virtual environment. The assembly has been carried out using virtual reality software by a trial-and-error method. The parts are modelled using parametric software (SolidWorks), meshed accordingly, and then imported into the virtual environment for assembly.

  11. The feasibility and acceptability of virtual environments in the treatment of childhood social anxiety disorder.

    PubMed

    Sarver, Nina Wong; Beidel, Deborah C; Spitalnick, Josh S

    2014-01-01

    Two significant challenges for the dissemination of social skills training programs are the need to assure generalizability and provide sufficient practice opportunities. In the case of social anxiety disorder, virtual environments may provide one strategy to address these issues. This study evaluated the utility of an interactive virtual school environment for the treatment of social anxiety disorder in preadolescent children. Eleven children between 8 and 12 years old with a primary diagnosis of social anxiety disorder participated in this initial feasibility trial. All children were treated with Social Effectiveness Therapy for Children, an empirically supported treatment for children with social anxiety disorder. However, the in vivo peer generalization sessions and standard parent-assisted homework assignments were substituted by practice in a virtual environment. Overall, the virtual environment programs were acceptable, feasible, and credible treatment components. Both children and clinicians were satisfied with using the virtual environment technology, and children believed it was a high-quality program overall. In addition, parents were satisfied with the virtual environment augmented treatment and indicated that they would recommend the program to family and friends. Findings indicate that the virtual environments are viewed as acceptable and credible by potential recipients. Furthermore, they are easy to implement by even novice users and appear to be useful adjunctive elements for the treatment of childhood social anxiety disorder.

  12. Human Rights and Private Ordering in Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Oosterbaan, Olivier

    This paper explores the application of human rights in (persistent) virtual world environments. The paper begins by describing a number of elements that most virtual environments share and that are relevant to the application of human rights in such a setting, and by describing in general terms the application of human rights between private individuals. The paper then continues by discussing the application in virtual environments of two universally recognized human rights, namely freedom of expression and freedom from discrimination. As these specific rights are discussed, a number of more general conclusions on the application of human rights in virtual environments are drawn. The first general conclusion is that, because virtual worlds are private environments, participants are subject to private ordering. The second is that participants and non-participants alike have to accept at times that in-world expressions are, to an extent, private speech. The third is that, where participants represent themselves in-world, other participants cannot assume that such in-world representations share the characteristics of the human player; and that, where virtual environments contain game elements, participants and non-participants alike should not take everything that happens in the virtual environment at face value or literally, which does not, however, amount to having to accept a higher level of infringement of their rights for things that happen in such an environment.

  13. The feasibility and acceptability of virtual environments in the treatment of childhood social anxiety disorder

    PubMed Central

    Wong, Nina; Beidel, Deborah C.; Spitalnick, Josh

    2013-01-01

    Objective Two significant challenges for the dissemination of social skills training programs are the need to assure generalizability and provide sufficient practice opportunities. In the case of social anxiety disorder, virtual environments may provide one strategy to address these issues. This study evaluated the utility of an interactive virtual school environment for the treatment of social anxiety disorder in preadolescent children. Method Eleven children between 8 and 12 years old with a primary diagnosis of social anxiety disorder participated in this initial feasibility trial. All children were treated with Social Effectiveness Therapy for Children, an empirically supported treatment for children with social anxiety disorder. However, the in vivo peer generalization sessions and standard parent-assisted homework assignments were substituted by practice in a virtual environment. Results Overall, the virtual environment programs were acceptable, feasible, and credible treatment components. Both children and clinicians were satisfied with using the virtual environment technology, and children believed it was a high-quality program overall. Additionally, parents were satisfied with the virtual environment augmented treatment and indicated that they would recommend the program to family and friends. Conclusion Virtual environments are viewed as acceptable and credible by potential recipients. Furthermore, they are easy to implement by even novice users and appear to be useful adjunctive elements for the treatment of childhood social anxiety disorder. PMID:24144182

  14. 3D Virtual Reality for Teaching Astronomy

    NASA Astrophysics Data System (ADS)

    Speck, Angela; Ruzhitskaya, L.; Laffey, J.; Ding, N.

    2012-01-01

    We are developing 3D virtual learning environments (VLEs) as learning materials for an undergraduate astronomy course, which will utilize advances both in available technologies and in our understanding of the social nature of learning. These learning materials will be used to test whether such VLEs can indeed augment science learning so that it is more engaging, active, visual and effective. Our project focuses on the challenges and requirements of introductory college astronomy classes. Here we present our virtual world of the Jupiter system and how we plan to implement it to allow students to learn course material - physical laws and concepts in astronomy - while engaging them in exploration of the Jupiter system, encouraging their imagination, curiosity, and motivation. The VLE can allow students to work individually or collaboratively. The 3D world also provides an opportunity for research in astronomy education to investigate the impact of social interaction, gaming features, and use of manipulatives offered by a learning tool on students’ motivation and learning outcomes. Use of this VLE is also a valuable source for exploring how learners’ spatial awareness can be enhanced by working in a 3D environment. We will present the Jupiter-system environment along with a preliminary study of the efficacy and usability of our Jupiter 3D VLE.

  15. Applying Virtual Reality to commercial Edutainment

    NASA Technical Reports Server (NTRS)

    Grissom, F.; Goza, Sharon P.; Goza, S. Michael

    1994-01-01

    Virtual reality (VR), when defined as a computer-generated, immersive, three-dimensional graphics environment which provides varying degrees of interactivity, remains an expensive, highly specialized application yet to find its way into the school, home, or business. As a novel approach to a theme park-type attraction, though, its use can be justified. This paper describes how a virtual reality 'tour of the human digestive system' was created for the Omniplex Science Museum of Oklahoma City, Oklahoma. The customer's main objectives were: (1) to educate; (2) to entertain; (3) to draw visitors; and (4) to generate revenue. The 'Edutainment' system ultimately delivered met these goals. As more such systems come into existence, the resulting library of licensable programs will greatly reduce development costs for individual institutions.

  16. Soft Systems Methodology for Personalized Learning Environment

    ERIC Educational Resources Information Center

    Nair, Uday

    2015-01-01

    There are two sides to a coin when it comes to implementing technology at universities; on one side, the university uses technologies via a virtual learning environment that seems outdated relative to the digital needs of the students, and on the other side, while implementing technology in the university learning environment the focus…

  17. A synthetic computational environment: To control the spread of respiratory infections in a virtual university

    NASA Astrophysics Data System (ADS)

    Ge, Yuanzheng; Chen, Bin; liu, Liang; Qiu, Xiaogang; Song, Hongbin; Wang, Yong

    2018-02-01

    An individual-based computational environment provides an effective way to study complex social events by reconstructing scenarios. Challenges remain in reconstructing the virtual scenarios and reproducing their complex evolution. In this paper, we propose a framework to reconstruct a synthetic computational environment, reproduce an epidemic outbreak, and evaluate management interventions in a virtual university. The reconstructed computational environment includes four fundamental components: the synthetic population, behavior algorithms, multiple social networks, and the geographic campus environment. In the virtual university, influenza H1N1 transmission experiments are conducted, and gradually enhanced interventions are evaluated and compared quantitatively. The experimental results indicate that the reconstructed virtual environment provides a solution for reproducing complex emergencies and evaluating policies to be executed in the real world.
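
    To make the individual-based idea concrete, the toy sketch below simulates influenza spread in a randomly mixing population with assumed parameters (contact rate, per-contact transmission probability, recovery time). It is far simpler than the paper's campus model, which layers behaviour algorithms and multiple social networks on a synthetic population, and is included only as an illustration of the general approach.

```python
import random

def simulate_outbreak(n_agents=500, contacts_per_day=8, p_transmit=0.05,
                      recovery_days=5, days=60, seed=1):
    """Toy individual-based influenza outbreak in a randomly mixing population.

    All parameters are illustrative placeholders, not the study's values.
    """
    rng = random.Random(seed)
    state = ["S"] * n_agents          # S: susceptible, I: infectious, R: recovered
    days_infected = [0] * n_agents
    state[0] = "I"                    # index case
    history = []
    for _ in range(days):
        infectious = [i for i, s in enumerate(state) if s == "I"]
        for i in infectious:
            for _ in range(contacts_per_day):
                j = rng.randrange(n_agents)
                if state[j] == "S" and rng.random() < p_transmit:
                    state[j] = "I"
            days_infected[i] += 1
            if days_infected[i] >= recovery_days:
                state[i] = "R"
        history.append(state.count("I"))
    return history

print(max(simulate_outbreak()))       # peak number of simultaneously infectious agents
```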

  18. Designing of a technological line in the context of controlling with the use of integration of the virtual controller with the mechatronics concept designer module of the PLM Siemens NX software

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work examines the sequential control system of a technological line forming the final part of an internal transport system. The process of designing this technological line using a computer-aided approach ran concurrently in two different program environments. In the Mechatronics Concept Designer module of the PLM Siemens NX software, a 3D model of the technological line was developed and prepared for verifying the logical interrelations implemented in the control system. For this purpose, the sub-system of actuators and sensors was distinguished from the whole technological line, because its correct operation determines the correct operation of the whole system. In the virtual controller application, the algorithms governing the operation of the planned line were implemented. Both program environments were then integrated using an OPC server, which enables the exchange of data between the considered systems. The data on the state of the object and the data defining the manner and sequence of operation of the technological line are exchanged between the virtual controller and the 3D model of the technological line in real time.
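
    The data exchange described above amounts to a cyclic mirroring of signals through the OPC server: sensor states observed in the 3D model are written to the virtual controller's inputs, and the controller's outputs drive the model's actuators. The sketch below illustrates that loop with an in-memory tag table and hypothetical tag names standing in for a real OPC client; the actual tags, cycle time, and client library used in the work are not specified in the abstract.

```python
import time

# In-memory stand-in for the OPC server's address space (hypothetical tag names).
tags = {
    "conveyor_end_switch": False, "pallet_present": True,
    "plc_input/conveyor_end_switch": False, "plc_input/pallet_present": False,
    "plc_output/conveyor_motor_on": True, "plc_output/pusher_extend": False,
    "model_actuator/conveyor_motor_on": False, "model_actuator/pusher_extend": False,
}

def read_tag(name):
    """Placeholder for an OPC read; a real client would fetch the node value here."""
    return tags[name]

def write_tag(name, value):
    """Placeholder for an OPC write; a real client would set the node value here."""
    tags[name] = value

def exchange_cycle():
    """One cycle: mirror sensor states to the controller, commands back to the model."""
    for tag in ("conveyor_end_switch", "pallet_present"):
        write_tag("plc_input/" + tag, read_tag(tag))
    for tag in ("conveyor_motor_on", "pusher_extend"):
        write_tag("model_actuator/" + tag, read_tag("plc_output/" + tag))

for _ in range(3):        # a few cycles of the real-time exchange described in the abstract
    exchange_cycle()
    time.sleep(0.05)
print(tags["model_actuator/conveyor_motor_on"])
```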

  19. Evaluation of a low-cost 3D sound system for immersive virtual reality training systems.

    PubMed

    Doerr, Kai-Uwe; Rademacher, Holger; Huesgen, Silke; Kubbat, Wolfgang

    2007-01-01

    Since Head Mounted Displays (HMD), datagloves, tracking systems, and powerful computer graphics resources are nowadays in an affordable price range, the usage of PC-based "Virtual Training Systems" becomes very attractive. However, due to the limited field of view of HMD devices, additional modalities have to be provided to benefit from 3D environments. A 3D sound simulation can improve the capabilities of VR systems dramatically. Unfortunately, realistic 3D sound simulations are expensive and demand a tremendous amount of computational power to calculate reverberation, occlusion, and obstruction effects. To use 3D sound in a PC-based training system as a way to direct and guide trainees to observe specific events in 3D space, a cheaper alternative has to be provided, so that a broader range of applications can take advantage of this modality. To address this issue, we focus in this paper on the evaluation of a low-cost 3D sound simulation that is capable of providing traceable 3D sound events. We describe our experimental system setup using conventional stereo headsets in combination with a tracked HMD device and present our results with regard to precision, speed, and used signal types for localizing simulated sound events in a virtual training environment.
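
    One inexpensive way to make sound events traceable over a conventional stereo headset is amplitude panning plus distance attenuation computed from the tracked head pose. The sketch below shows such a constant-power panning rule; it is an assumed approximation for illustration, not necessarily the simulation evaluated in the paper, which would also need the reverberation, occlusion, and obstruction effects discussed above.

```python
import numpy as np

def stereo_gains(source_pos, head_pos, head_yaw_deg):
    """Constant-power stereo panning with inverse-distance attenuation.

    Positions are (x, y) in metres; yaw is the head's facing direction in degrees.
    Returns (left_gain, right_gain). This is a cheap localisation cue only; a full
    simulation would add reverberation, occlusion, and obstruction effects.
    """
    dx, dy = np.subtract(source_pos, head_pos)
    # Bearing of the source in the head frame: 0 = straight ahead, positive = to the left.
    azimuth = np.arctan2(dy, dx) - np.radians(head_yaw_deg)
    distance = max(float(np.hypot(dx, dy)), 1e-3)
    attenuation = 1.0 / distance
    # Map the bearing to a pan value in [0, 1]: 0 = hard left, 1 = hard right.
    pan = 0.5 * (1.0 - np.clip(np.sin(azimuth), -1.0, 1.0))
    left_gain = attenuation * np.cos(pan * np.pi / 2.0)
    right_gain = attenuation * np.sin(pan * np.pi / 2.0)
    return left_gain, right_gain

# Source two metres ahead and slightly to the left of a listener facing along +x.
print(stereo_gains(source_pos=(2.0, 0.5), head_pos=(0.0, 0.0), head_yaw_deg=0.0))
```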

  20. An exploration of the ecological validity of the Virtual Action Planning-Supermarket (VAP-S) with people with schizophrenia.

    PubMed

    Aubin, Ginette; Béliveau, Marie-France; Klinger, Evelyne

    2018-07-01

    People with schizophrenia often have functional limitations that affect their daily activities due to executive function deficits. One way to assess these deficits is through the use of virtual reality programmes that reproduce real-life instrumental activities of daily living (IADLs). One such programme is the Virtual Action Planning-Supermarket (VAP-S). This exploratory study aimed to examine the ecological validity of this programme, specifically, how task performance in both virtual and natural environments compares. Case studies were used and involved five participants with schizophrenia, who were familiar with grocery shopping. They were assessed during both the VAP-S shopping task and a real-life grocery shopping task using an observational assessment tool, the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. The results show that when difficulties were present in the virtual task, difficulties were also observed in the real-life task. For some participants, greater difficulties were observed in the virtual task. These difficulties could be explained by the presence of perceptual deficits and problems remembering the required sequenced actions in the virtual task. In conclusion, performance on the VAP-S by these five participants was generally comparable to the performance in a natural environment.

  1. Virtual Scavenger Hunt: An AI-Powered Virtual Environment Designed for Training Individuals in Effective Teamwork, and Analyzing Cross-Cultural Behavior

    DTIC Science & Technology

    2009-03-20

    involved the development of an environment within the Multiverse virtual world, oriented toward allowing individuals to acquire and reinforce skills via ... PetBrain software G2: Creation of a scavenger hunt scenario in the Multiverse virtual world, in which humans and AIs can collaboratively play scavenger ... carried out by Novamente LLC for AOARD during June 2008 to February 2009. It involved the development of an environment within the Multiverse virtual world

  2. Validation of smoking-related virtual environments for cue exposure therapy.

    PubMed

    García-Rodríguez, Olaya; Pericot-Valverde, Irene; Gutiérrez-Maldonado, José; Ferrer-García, Marta; Secades-Villa, Roberto

    2012-06-01

    Craving is considered one of the main factors responsible for relapse after smoking cessation. Cue exposure therapy (CET) consists of controlled and repeated exposure to drug-related stimuli in order to extinguish associated responses. The main objective of this study was to assess the validity of 7 virtual reality environments for producing craving in smokers that can be used within the CET paradigm. Forty-six smokers and 44 never-smokers were exposed to 7 complex virtual environments with smoking-related cues that reproduce typical situations in which people smoke, and to a neutral virtual environment without smoking cues. Self-reported subjective craving and psychophysiological measures were recorded during the exposure. All virtual environments with smoking-related cues were able to generate subjective craving in smokers, while no increase was observed for the neutral environment. The most sensitive psychophysiological variable to craving increases was heart rate. The findings provide evidence of the utility of virtual reality for simulating real situations capable of eliciting craving. We also discuss how CET for smoking cessation can be improved through these virtual tools. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Blended learning in paediatric emergency medicine: preliminary analysis of a virtual learning environment.

    PubMed

    Spedding, Ruth; Jenner, Rachel; Potier, Katherine; Mackway-Jones, Kevin; Carley, Simon

    2013-04-01

    Paediatric emergency medicine (PEM) currently faces many competing educational challenges. Recent changes to working patterns have made the delivery of effective teaching to trainees extremely difficult. We developed a virtual learning environment, based on socioconstructivist principles, which allows learning to take place regardless of time or location. The aim was to evaluate the effectiveness of a blended e-learning approach for PEM training. We evaluated the experiences of ST3 trainees in PEM using a multimodal approach. We classified and analysed message board discussions over a 6-month period to look for evidence of practice change and learning. We conducted semistructured qualitative interviews with trainees approximately 5 months after they completed the course. Trainees embraced the virtual learning environment and had positive experiences of the blended approach to learning. Socioconstructivist learning did take place through the use of message boards on the virtual learning environment. Despite their initial unfamiliarity with the online learning system, the participants found it easy to access and use. The participants found the learning relevant, and there was an overlap between shop floor learning and the online content. Clinical discussion was often led by trainees on the forums, and these were described as enjoyable and informative. A blended approach to e-learning in basic PEM is effective and enjoyable for trainees.

  4. The Design and Evaluation of a Large-Scale Real-Walking Locomotion Interface

    PubMed Central

    Peck, Tabitha C.; Fuchs, Henry; Whitton, Mary C.

    2014-01-01

    Redirected Free Exploration with Distractors (RFED) is a large-scale real-walking locomotion interface developed to enable people to walk freely in virtual environments that are larger than the tracked space in their facility. This paper describes the RFED system in detail and reports on a user study that evaluated RFED by comparing it to walking-in-place and joystick interfaces. The RFED system is composed of two major components, redirection and distractors. This paper discusses design challenges, implementation details, and lessons learned during the development of two working RFED systems. The evaluation study examined the effect of the locomotion interface on users’ cognitive performance on navigation and wayfinding measures. The results suggest that participants using RFED were significantly better at navigating and wayfinding through virtual mazes than participants using walking-in-place and joystick interfaces. Participants traveled shorter distances, made fewer wrong turns, pointed to hidden targets more accurately and more quickly, and were able to place and label targets on maps more accurately, and more accurately estimate the virtual environment size. PMID:22184262
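
    Redirection techniques of the kind RFED uses generally work by subtly scaling the user's real rotations so that the virtual path stays inside the tracked space. The sketch below applies a per-frame rotation gain under assumed, hypothetical gain limits; RFED's actual redirection algorithm and validated detection thresholds differ.

```python
def redirected_yaw(virtual_yaw, real_yaw_delta, toward_center_sign,
                   gain_min=0.8, gain_max=1.3):
    """Apply a rotation gain to one frame's real head-yaw change (degrees).

    toward_center_sign is +1 when amplifying the rotation steers the user back
    toward the centre of the tracked space, otherwise -1. The gain limits here
    are illustrative placeholders, not validated detection thresholds.
    """
    gain = gain_max if toward_center_sign * real_yaw_delta > 0 else gain_min
    return virtual_yaw + gain * real_yaw_delta

# The user turns 10 degrees; the turn is amplified because it steers them away from a wall.
print(redirected_yaw(virtual_yaw=90.0, real_yaw_delta=10.0, toward_center_sign=+1))
```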

  5. A Concept for Optimizing Behavioural Effectiveness & Efficiency

    NASA Astrophysics Data System (ADS)

    Barca, Jan Carlo; Rumantir, Grace; Li, Raymond

    Both humans and machines exhibit strengths and weaknesses that can be enhanced by merging the two entities. This research aims to provide a broader understanding of how closer interactions between these two entities can facilitate more optimal goal-directed performance through the use of artificial extensions of the human body. Such extensions may assist us in adapting to and manipulating our environments in a more effective way than any system known today. To demonstrate this concept, we have developed a simulation in which a semi-interactive virtual spider can be navigated through an environment consisting of several obstacles and a virtual predator capable of killing the spider. The virtual spider can be navigated through the use of three different control systems that assist in optimising overall goal-directed performance. The first two control systems use an onscreen button interface and a touch sensor, respectively, to facilitate human navigation of the spider. The third control system provides autonomous navigation through machine intelligence embedded in the spider. This system enables the spider to navigate and react to changes in its local environment. The results of this study indicate that machines should be allowed to override human control in order to maximise the benefits of collaboration between man and machine. This research further indicates that the development of strong machine intelligence, sensor systems that engage all human senses, extra-sensory input systems, physical remote manipulators, multiple intelligent extensions of the human body, as well as a tighter symbiosis between man and machine, can support an upgrade of the human form.

  6. Development and early evaluation of the Virtual Iraq/Afghanistan exposure therapy system for combat-related PTSD.

    PubMed

    Rizzo, Albert Skip; Difede, JoAnn; Rothbaum, Barbara O; Reger, Greg; Spitalnick, Josh; Cukor, Judith; McLay, Rob

    2010-10-01

    Numerous reports indicate that the growing incidence of posttraumatic stress disorder (PTSD) in returning Operation Enduring Freedom (OEF)/Operation Iraqi Freedom (OIF) military personnel is creating a significant health care and economic challenge. These findings have served to motivate research on how to better develop and disseminate evidence-based treatments for PTSD. Virtual reality-delivered exposure therapy for PTSD has been previously used with reports of positive outcomes. The current paper will detail the development and early results from use of the Virtual Iraq/Afghanistan exposure therapy system. The system consists of a series of customizable virtual scenarios designed to represent relevant Middle Eastern contexts for exposure therapy, including a city and desert road convoy environment. The process for gathering user-centered design feedback from returning OEF/OIF military personnel and from a system deployed in Iraq (as was needed to iteratively evolve the system) will be discussed, along with a brief summary of results from an open clinical trial using Virtual Iraq with 20 treatment completers, which indicated that 16 no longer met PTSD checklist-military criteria for PTSD after treatment. © 2010 Association for Research in Nervous and Mental Disease.

  7. Virtual Habitat -a dynamic simulation of closed life support systems -human model status

    NASA Astrophysics Data System (ADS)

    Markus Czupalla, M. Sc.; Zhukov, Anton; Hwang, Su-Au; Schnaitmann, Jonas

    In order to optimize Life Support Systems (LSS) on a system level, stability questions must be investigated. To do so, the exploration group of the Technical University of Munich (TUM) is developing the "Virtual Habitat" (V-HAB) dynamic LSS simulation software. V-HAB shall provide the possibility to conduct dynamic simulations of entire mission scenarios for any given LSS configuration. The Virtual Habitat simulation tool consists of four main modules: the Closed Environment Module (CEM), which monitors compounds in a closed environment; the Crew Module (CM), a dynamic human simulation; the P/C Systems Module (PCSM), covering dynamic P/C subsystems; and the Plant Module (PM), a dynamic plant simulation. The core module of the simulation is the dynamic and environment-sensitive human module. Introduced in its basic version in 2008, the human module has been significantly updated since, increasing its capabilities and maturity. In this paper, three newly added human model subsystems (thermal regulation, digestion, and a schedule controller) are introduced, touching also on the human stress subsystem, which is currently under development. Upon the introduction of these new subsystems, their integration into the overall V-HAB human model is discussed, highlighting the impact on the most important interfaces. The overall human model capabilities shall further be summarized and presented based on meaningful test cases. In addition to the presentation of the results, the correlation strategy for the Virtual Habitat human model shall be introduced, assessing the model's current confidence level and giving an outlook on the future correlation strategy. Last but not least, the remaining V-HAB modules shall be introduced briefly, showing how the human model is integrated into the overall simulation.

  8. Virtual Reality Training: "Cybersickness" and Effects on Sensorimotor Functions

    NASA Technical Reports Server (NTRS)

    Harm, Deborah L.; Taylor, Laura C.

    2003-01-01

    The overall goal of this study is to examine the extent to which exposure to virtual reality (VR) systems produces motion sickness and disrupts sensorimotor functions. Two of the major problems in using VRs are: 1) potential "cybersickness", a form of motion sickness, and 2) maladaptive sensorimotor coordination following virtual environment (VE) training. It is likely that users will eventually adapt to any unpleasant perceptual experiences in a virtual environment. However, the most critical problem for training applications is that sensorimotor coordination strategies learned in the VE may not be similar to the responses required in the real environment. This study will evaluate and compare responses to the two types of VR delivery systems (head-mounted display [HMD] and a dome-projection system [DOME]), two exposure duration periods (30 minutes or 60 minutes), and repeated exposures (3 sessions). Specific responses that we will examine include cybersickness severity and symptom patterns, and several sensorimotor functions (eye-head and eye-head-hand coordination, and postural equilibrium). To date, all hardware and software acquisition, development, integration and testing has been completed. A database has been developed and tested for the input, management and storage of all questionnaire data. All data analysis scripts have been developed and tested. Data were collected from 20 subjects in a pilot study that was conducted to determine the amount of training necessary to achieve a stable performance level. Seven subjects are currently enrolled in the study designed to examine the effects of exposure to VE systems on postural control. Data have been collected from two subjects, and it is expected that the results from ten subjects will be presented.

  9. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    NASA Astrophysics Data System (ADS)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    Building fires are hazardous events that can lead to disaster and massive destruction. The management and disposal of building fires has always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, the application requirements and modelling objectives of the building fire scene were analysed in this paper. Then, the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES), and the indoor crowd) were implemented, and the relationships between the elements were also discussed. Finally, with the theory and framework of VGE, the building fire scene system was designed across the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.
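
    As a purely illustrative aside, the four core scene elements named in the abstract (building space environment, fire event, indoor Fire Extinguishing System, indoor crowd) could be grouped into a single scene model roughly as sketched below; the field names and values are hypothetical and are not the authors' schema.

      # Illustrative sketch only: one possible way to group the four core scene
      # elements named in the abstract into a single fire-scene model.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class Room:
          name: str
          floor: int
          exits: List[str] = field(default_factory=list)

      @dataclass
      class FireEvent:
          origin_room: str
          ignition_time_s: float
          heat_release_kw: float

      @dataclass
      class Sprinkler:
          room: str
          activation_temp_c: float = 68.0

      @dataclass
      class Occupant:
          occupant_id: int
          room: str
          walking_speed_mps: float = 1.2

      @dataclass
      class FireScene:
          rooms: List[Room]                      # building space environment
          fire: FireEvent                        # fire event
          extinguishing_system: List[Sprinkler]  # indoor FES
          crowd: List[Occupant]                  # indoor crowd

      scene = FireScene(
          rooms=[Room("lobby", 1, exits=["main_door"]), Room("office_201", 2, exits=["stair_a"])],
          fire=FireEvent(origin_room="office_201", ignition_time_s=0.0, heat_release_kw=500.0),
          extinguishing_system=[Sprinkler(room="office_201")],
          crowd=[Occupant(1, "lobby"), Occupant(2, "office_201")],
      )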

  10. Virtual environment architecture for rapid application development

    NASA Technical Reports Server (NTRS)

    Grinstein, Georges G.; Southard, David A.; Lee, J. P.

    1993-01-01

    We describe the MITRE Virtual Environment Architecture (VEA), a product of nearly two years of investigation and prototyping of virtual environment technology. This paper discusses the requirements for rapid prototyping, and an architecture we are developing to support virtual environment construction. VEA supports rapid application development by providing a variety of pre-built modules that can be reconfigured for each application session. The modules supply interfaces for several types of interactive I/O devices, in addition to large-screen or head-mounted displays.
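
    The "pre-built modules reconfigured per session" idea can be illustrated with a small registry pattern. The sketch below is not MITRE's VEA code; the module names and registry mechanism are assumptions intended only to show how a session configuration might select which device and display modules to assemble.

      # Hedged sketch of reconfigurable, pre-built modules: each module class
      # registers under a name, and a per-session configuration list selects
      # which ones to instantiate and start.
      from typing import Callable, Dict, List

      MODULE_REGISTRY: Dict[str, Callable[[], "Module"]] = {}

      def register(name: str):
          def decorator(cls):
              MODULE_REGISTRY[name] = cls
              return cls
          return decorator

      class Module:
          def start(self):
              print(f"{self.__class__.__name__} started")

      @register("hmd_display")
      class HMDDisplay(Module): pass

      @register("wall_display")
      class WallDisplay(Module): pass

      @register("glove_input")
      class GloveInput(Module): pass

      def build_session(config: List[str]) -> List[Module]:
          """Instantiate only the modules named in this session's configuration."""
          return [MODULE_REGISTRY[name]() for name in config]

      for module in build_session(["hmd_display", "glove_input"]):
          module.start()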

  11. Simple force feedback for small virtual environments

    NASA Astrophysics Data System (ADS)

    Schiefele, Jens; Albert, Oliver; van Lier, Volker; Huschka, Carsten

    1998-08-01

    In today's civil flight training simulators, only the cockpit and all its interaction devices exist as physical mockups. All other elements such as flight behavior, motion, sound, and the visual system are virtual. As an extension to this approach, 'Virtual Flight Simulation' tries to substitute a 3D computer-generated image for the cockpit mockup. The complete cockpit, including the exterior view, is displayed on a Head Mounted Display (HMD), a BOOM, or a Cave Automatic Virtual Environment (CAVE). In most applications a dataglove or virtual pointers are used as input devices. A basic problem of such a Virtual Cockpit (VC) simulation is missing force feedback. A pilot cannot touch and feel the buttons, knobs, dials, etc., that they try to manipulate. As a result, it is very difficult to generate realistic inputs into VC systems. 'Seating Bucks' are used in the automotive industry to overcome the problem of missing force feedback. Only a seat, steering wheel, pedals, stick shift, and radio panel are physically available. All other geometry is virtual and therefore untouchable but visible in the output device. Extending this concept, a 'Seating Buck' for commercial transport aircraft cockpits was developed. Pilot seat, side stick, pedals, thrust levers, and flaps lever are physically available. All other panels are simulated by simple flat plastic panels. They are located at the same locations as their real counterparts, only lacking the real input devices. A pilot sees the entire photorealistic cockpit in an HMD as 3D geometry but can only touch the physical parts and plastic panels. In order to determine task performance with the developed Seating Buck, a test series was conducted. Users pressed buttons, adjusted dials, and turned knobs. In a first test, a completely virtual environment was used. The second setting had plastic panels replacing all input devices. Finally, as a cross-reference, the participants had to repeat the test with a complete physical mockup of the input devices. All panels and physical devices can be easily relocated to simulate a different type of cockpit. A maximum of 30 minutes is needed for a complete adaptation. So far, an Airbus A340 and a generic cockpit are supported.

  12. Clandestine Message Passing in Virtual Environments

    DTIC Science & Technology

    2008-09-01


  13. Improved image guidance technique for minimally invasive mitral valve repair using real-time tracked 3D ultrasound

    NASA Astrophysics Data System (ADS)

    Rankin, Adam; Moore, John; Bainbridge, Daniel; Peters, Terry

    2016-03-01

    In the past ten years, numerous new surgical and interventional techniques have been developed for treating heart valve disease without the need for cardiopulmonary bypass. Heart valve repair is now being performed in a blood-filled environment, reinforcing the need for accurate and intuitive imaging techniques. Previous work has demonstrated how augmenting ultrasound with virtual representations of specific anatomical landmarks can greatly simplify interventional navigation challenges and increase patient safety. However, these techniques often complicate interventions by requiring additional manual steps to define and initialize the virtual models. Furthermore, overlaying virtual elements onto real-time image data can obstruct the view of salient image information. To address these limitations, a system was developed that uses real-time volumetric ultrasound alongside magnetically tracked tools, presented in an augmented virtuality environment, to provide a streamlined navigation guidance platform. In phantom studies simulating a beating-heart navigation task, procedure duration and tool path metrics achieved performance comparable to previous augmented virtuality techniques, and a considerable improvement over standard-of-care ultrasound guidance.
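
    Systems of this kind typically rely on a chain of rigid-body transforms to place a magnetically tracked tool tip inside the ultrasound volume. The sketch below illustrates that chain under assumed, placeholder transforms (identity rotations, made-up offsets); it is not the calibration or tracking code from the paper.

      # Illustrative coordinate-transform chain: a tracked tool tip is mapped
      # into the ultrasound volume's frame through a probe calibration matrix.
      import numpy as np

      def homogeneous(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
          T = np.eye(4)
          T[:3, :3] = rotation
          T[:3, 3] = translation
          return T

      # Poses reported by the tracker (tool and probe, both in tracker coordinates).
      T_tracker_tool = homogeneous(np.eye(3), np.array([10.0, 5.0, 30.0]))    # mm
      T_tracker_probe = homogeneous(np.eye(3), np.array([0.0, 0.0, 25.0]))    # mm
      # Probe-to-image calibration, obtained once offline (assumed known here).
      T_image_probe = homogeneous(np.eye(3), np.array([-2.0, -1.0, 0.0]))     # mm

      tip_in_tool = np.array([0.0, 0.0, 50.0, 1.0])  # tool tip offset along the shaft

      # Chain: tool -> tracker -> probe -> ultrasound image/volume coordinates.
      T_image_tool = T_image_probe @ np.linalg.inv(T_tracker_probe) @ T_tracker_tool
      tip_in_image = T_image_tool @ tip_in_tool
      print(tip_in_image[:3])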

  14. Brain activity during a lower limb functional task in a real and virtual environment: A comparative study.

    PubMed

    Pacheco, Thaiana Barbosa Ferreira; Oliveira Rego, Isabelle Ananda; Campos, Tania Fernandes; Cavalcanti, Fabrícia Azevedo da Costa

    2017-01-01

    Virtual reality (VR) has been contributing to neurological rehabilitation because of its interactive and multisensory nature, providing the potential for brain reorganization. Given the availability of mobile EEG devices, it is possible to investigate how the virtual therapeutic environment influences brain activity. The aim was to compare theta, alpha, beta, and gamma power in healthy young adults during a lower limb motor task in a virtual and a real environment. Ten healthy adults underwent EEG assessment while performing a one-minute task consisting of stepping up and down a step in a virtual environment - the Nintendo Wii virtual game "Basic step" - and in a real environment. The real environment produced an increase in theta and alpha power, with small to large effect sizes, mainly in the frontal region. VR produced a greater increase in beta and gamma power, although with small or negligible effect sizes across a variety of regions for the beta band, and medium to very large effect sizes in the frontal and occipital regions for the gamma band. Theta, alpha, beta, and gamma activity during the execution of a motor task therefore differs according to the environment to which the individual is exposed - real or virtual - and the effect sizes vary depending on the brain area and frequency band considered.
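
    For readers unfamiliar with band-power analysis, the sketch below shows one generic way to estimate theta, alpha, beta, and gamma power from a single EEG channel using Welch's method; the sampling rate, band edges, and synthetic signal are assumptions, and the study's actual processing pipeline may differ.

      # Generic band-power estimate from one EEG channel (not the study's pipeline).
      import numpy as np
      from scipy.signal import welch

      BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

      def band_powers(signal: np.ndarray, fs: float = 256.0) -> dict:
          freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
          df = freqs[1] - freqs[0]
          powers = {}
          for name, (lo, hi) in BANDS.items():
              mask = (freqs >= lo) & (freqs < hi)
              powers[name] = psd[mask].sum() * df   # approximate integral of PSD over the band
          return powers

      # Example with one minute of synthetic data (matching the one-minute step task).
      rng = np.random.default_rng(0)
      eeg = rng.standard_normal(60 * 256)
      print(band_powers(eeg))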

  15. Virtual to Reality: Teaching Mathematics and Aerospace Concepts to Undergraduates Using Unmanned Aerial Systems and Flight Simulation Software

    ERIC Educational Resources Information Center

    Aji, Chadia Affane; Khan, M. Javed

    2015-01-01

    Student engagement is an essential element for learning. Active learning has been consistently shown to increase student engagement and hence learning. Hands-on activities are one of the many active learning approaches. These activities vary from structured laboratory experiments on one end of the spectrum to virtual gaming environments and to for…

  16. Virtual Machine Language Controls Remote Devices

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Kennedy Space Center worked with Blue Sun Enterprises, based in Boulder, Colorado, to enhance the company's virtual machine language (VML) to control the instruments on the Regolith and Environment Science and Oxygen and Lunar Volatiles Extraction mission. Now the NASA-improved VML is available for crewed and uncrewed spacecraft, and has potential applications on remote systems such as weather balloons, unmanned aerial vehicles, and submarines.

  17. Virtual Laparoscopic Training System Based on VCH Model.

    PubMed

    Tang, Jiangzhou; Xu, Lang; He, Longjun; Guan, Songluan; Ming, Xing; Liu, Qian

    2017-04-01

    Laparoscopy has been widely used to perform abdominal surgeries, as patients experience less post-surgical trauma, shorter convalescence, and less pain compared to traditional open surgery. Laparoscopic surgery requires precision; therefore, it is imperative to train surgeons to reduce operative risk. Laparoscopic simulators offer a highly realistic surgical environment using virtual reality technology and can improve the training efficiency of laparoscopic surgery. This paper presents a virtual laparoscopic surgery system. The proposed system utilizes the Visible Chinese Human (VCH) to construct the virtual models and simulates real-time deformation with both an improved special mass-spring model and morph target animation. Meanwhile, an external device integrating two five-degrees-of-freedom (5-DOF) manipulators was designed and built to interact with the virtual system. In addition, the proposed system provides a modular tool based on Unity3D to define the functions and features of instruments and organs, which helps users build surgical training scenarios quickly. The proposed virtual laparoscopic training system offers two training modes: skills training and surgery training. In the skills training mode, surgeons are mainly trained in basic operations such as laparoscopic camera handling, needle handling, grasping, electrocoagulation, and suturing. In the surgery training mode, surgeons can practice cholecystectomy and removal of hepatic cysts with guided or non-guided teaching.
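
    The deformation approach rests on the mass-spring principle. The sketch below illustrates only the basic idea (Hooke's law plus damping, explicit Euler integration) with made-up stiffness and damping values; it does not reproduce the paper's improved special mass-spring model or its morph target animation.

      # Basic mass-spring deformation step: Hooke's law + velocity damping,
      # integrated with explicit Euler. Parameter values are illustrative.
      import numpy as np

      def mass_spring_step(pos, vel, springs, rest_len, k=50.0, damping=0.5,
                           mass=0.01, dt=1e-3):
          """pos, vel: (N, 3) arrays; springs: list of (i, j) index pairs."""
          forces = np.zeros_like(pos)
          for (i, j), L0 in zip(springs, rest_len):
              d = pos[j] - pos[i]
              length = np.linalg.norm(d) + 1e-9
              f = k * (length - L0) * (d / length)   # spring force on node i
              forces[i] += f
              forces[j] -= f
          forces -= damping * vel                     # simple velocity damping
          vel = vel + dt * forces / mass
          pos = pos + dt * vel
          return pos, vel

      # Two-node example: a single stretched spring relaxing toward its rest length.
      pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
      vel = np.zeros_like(pos)
      springs, rest_len = [(0, 1)], [1.0]
      for _ in range(1000):
          pos, vel = mass_spring_step(pos, vel, springs, rest_len)
      print(np.linalg.norm(pos[1] - pos[0]))  # approaches the rest length of 1.0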

  18. WeaVR: a self-contained and wearable immersive virtual environment simulation system.

    PubMed

    Hodgson, Eric; Bachmann, Eric R; Vincent, David; Zmuda, Michael; Waller, David; Calusdian, James

    2015-03-01

    We describe WeaVR, a computer simulation system that takes virtual reality technology beyond specialized laboratories and research sites and makes it available in any open space, such as a gymnasium or a public park. Novel hardware and software systems enable HMD-based immersive virtual reality simulations to be conducted in any arbitrary location, with no external infrastructure and little-to-no setup or site preparation. The ability of the WeaVR system to provide realistic motion-tracked navigation for users, to improve the study of large-scale navigation, and to generate usable behavioral data is shown in three demonstrations. First, participants navigated through a full-scale virtual grocery store while physically situated in an open grass field. Trajectory data are presented for both normal tracking and for tracking during the use of redirected walking that constrained users to a predefined area. Second, users followed a straight path within a virtual world for distances of up to 2 km while walking naturally and being redirected to stay within the field, demonstrating the ability of the system to study large-scale navigation by simulating virtual worlds that are potentially unlimited in extent. Finally, the portability and pedagogical implications of this system were demonstrated by taking it to a regional high school for live use by a computer science class on their own school campus.
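
    The redirected walking mentioned above is commonly implemented as a small curvature gain: the virtual view is rotated slightly each frame so the user's physical path bends back toward the centre of the tracked space while the virtual path stays straight. The sketch below illustrates that generic idea; the gain threshold and geometry are assumptions, not the WeaVR implementation.

      # Illustrative curvature-gain redirection (not WeaVR code).
      import math

      def redirected_yaw(physical_pos, physical_yaw, walking_speed, dt,
                         max_curvature=1.0 / 7.5):
          """Extra virtual yaw (radians) to inject this frame.

          max_curvature ~ 1/7.5 m^-1 reflects a commonly cited detection threshold:
          users can be steered along a circle of radius >= 7.5 m without noticing.
          """
          # Bearing from the user's position back to the tracked-area centre (origin).
          to_center = math.atan2(-physical_pos[1], -physical_pos[0])
          # Signed heading error in (-pi, pi]: which way the user should be bent.
          heading_error = (to_center - physical_yaw + math.pi) % (2 * math.pi) - math.pi
          # Curvature gain: rotate the virtual world at curvature * speed rad/s,
          # in the direction that bends the physical path toward the centre.
          curvature = math.copysign(max_curvature, heading_error)
          return curvature * walking_speed * dt

      # One frame: user 5 m from the centre, walking 1.3 m/s, 90 Hz update rate.
      print(math.degrees(redirected_yaw((5.0, 0.0), physical_yaw=0.0,
                                        walking_speed=1.3, dt=1 / 90)))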

  19. The effect of viewing a virtual environment through a head-mounted display on balance.

    PubMed

    Robert, Maxime T; Ballaz, Laurent; Lemay, Martin

    2016-07-01

    In the next few years, several head-mounted displays (HMDs) will be publicly released, making virtual reality more accessible. HMDs are expected to be widely popular at home for gaming, but also in clinical settings, notably for training and rehabilitation. HMDs can be used in both seated and standing positions; however, their impact on balance presently remains largely unknown. It is therefore crucial to examine the impact of viewing a virtual environment through an HMD on standing balance. The objective was to compare static and dynamic balance in a virtual environment perceived through an HMD and in the physical environment. The visual representation of the virtual environment was based on filmed images of the physical environment and was therefore highly similar to it. This is an observational study in healthy adults. No significant difference was observed between the two environments for static balance. However, dynamic balance was more perturbed in the virtual environment than in the physical environment. HMDs should be used with caution because of their detrimental impact on dynamic balance. A sensorimotor conflict possibly explains the impact of HMDs on balance. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Enhancing patient freedom in rehabilitation robotics using gaze-based intention detection.

    PubMed

    Novak, Domen; Riener, Robert

    2013-06-01

    Several design strategies for rehabilitation robotics have aimed to improve patients' experiences using motivating and engaging virtual environments. This paper presents a new design strategy: enhancing patient freedom with a complex virtual environment that intelligently detects patients' intentions and supports the intended actions. A 'virtual kitchen' scenario has been developed in which many possible actions can be performed at any time, allowing patients to experiment and giving them more freedom. Remote eye tracking is used to detect the intended action and trigger appropriate support by a rehabilitation robot. This approach requires no additional equipment attached to the patient and has a calibration time of less than a minute. The system was tested on healthy subjects using the ARMin III arm rehabilitation robot. It was found to be technically feasible and usable by healthy subjects. However, the intention detection algorithm should be improved using better sensor fusion, and clinical tests with patients are needed to evaluate the system's usability and potential therapeutic benefits.
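
    A simple form of gaze-based intention detection is dwell-time classification over regions of interest. The sketch below illustrates that generic approach; the object names, ROI coordinates, and dwell threshold are hypothetical, and the paper's actual detection algorithm is not described here.

      # Illustrative dwell-time intention detection over gaze regions of interest.
      from typing import Dict, Optional, Tuple

      ROIS: Dict[str, Tuple[float, float, float, float]] = {
          # name: (x_min, y_min, x_max, y_max) in normalized screen coordinates
          "cup": (0.10, 0.40, 0.25, 0.60),
          "kettle": (0.60, 0.35, 0.80, 0.65),
      }

      class DwellDetector:
          def __init__(self, dwell_threshold_s: float = 0.8):
              self.dwell_threshold_s = dwell_threshold_s
              self.current: Optional[str] = None
              self.elapsed = 0.0

          def update(self, gaze_xy: Tuple[float, float], dt: float) -> Optional[str]:
              """Feed one gaze sample; return an object name once dwell is reached."""
              hit = next((name for name, (x0, y0, x1, y1) in ROIS.items()
                          if x0 <= gaze_xy[0] <= x1 and y0 <= gaze_xy[1] <= y1), None)
              if hit != self.current:
                  self.current, self.elapsed = hit, 0.0
              elif hit is not None:
                  self.elapsed += dt
                  if self.elapsed >= self.dwell_threshold_s:
                      self.elapsed = 0.0
                      return hit      # intention detected -> trigger robot support
              return None

      detector = DwellDetector()
      for _ in range(60):             # one second of gaze samples at 60 Hz on "cup"
          intent = detector.update((0.18, 0.50), dt=1 / 60)
          if intent:
              print("Detected intention:", intent)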
